HK1180771B - Integrated eye tracking and display system - Google Patents
Integrated eye tracking and display system
- Publication number
- HK1180771B (application HK13107988.4A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- eye
- infrared
- display
- planar waveguide
- illumination
- Prior art date
- 2011-09-26
Description
Technical Field
The present invention relates to visual field blending, and more particularly to an integrated eye tracking and display system.
Background
Mixed or augmented reality is a technology that allows virtual imagery to be mixed with the user's actual field of view in the real world. Unlike other display devices, a feature of a transparent, mixed, or augmented reality near-eye (e.g., head-mounted) display device is that the displayed image is not exclusive to the user's field of view. With a transparent, mixed reality display device, a user can theoretically see through the display and interact with the real world while also seeing images generated by one or more software applications. Further, the user's field of view is not static as the user moves his or her head. Even if his or her head does not move, what the user is looking at in the field of view (a.k.a. the user's gaze) changes as the user turns his or her eyes. The ability to identify eye movement will enhance the placement of images in the display.
Disclosure of Invention
Techniques are provided for integrating eye tracking and display functions using a shared optical system of a transparent, near-eye, mixed reality display device. The embodiments described herein permit the eye tracking system to illuminate the eye and capture data along the optical axis of each display positioned to be seen through by the respective eye. Compared with eye tracking sensor systems that capture data off-axis, this results in simpler algorithms, better eye illumination, a higher probability of capturing more data of the eye, and less susceptibility to obstructions such as drooping eyelids and styes (hordeola) on the eyelids.
The present technology provides an integrated eye tracking and display system for a transparent, near-eye, mixed reality display device. The system includes a display optical system for each eye, each having an optical axis and a transparent, planar waveguide positioned to be seen through by the respective eye. One or more wavelength-selectable filters are positioned in the waveguide coaxially aligned with the optical axis of the respective display optical system. The one or more filters direct infrared and visible illumination out of the respective planar waveguide. In addition, the one or more filters direct infrared reflections into the planar waveguide. Some examples of wavelength-selectable filters are fixed and active diffraction grating elements, reflective grating elements, and other reflective elements that direct illumination of a predetermined wavelength or within a range of wavelengths.
The infrared illumination source is positioned such that its infrared illumination is optically coupled into the planar waveguide. An infrared sensor is optically coupled to the planar waveguide to receive the infrared reflections directed from the one or more wavelength-selectable filters. The image generation unit is optically coupled to emit visible illumination into the planar waveguide.
The present technology provides another integrated eye tracking and display system for a transparent, near-eye, mixed reality display device. As in the embodiments described herein, system embodiments include a display optical system for each eye. Each display optical system has an optical axis and a transparent planar waveguide positioned to be seen through by a respective eye. One or more wavelength-selectable filters are positioned within the waveguide coaxially aligned with the optical axis of each display optical system to direct infrared reflections into the planar waveguide. An array of optical elements including light sources for emitting infrared and visible illumination is optically coupled to the planar waveguide to direct its illumination into the planar waveguide. An infrared sensor is optically coupled to the planar waveguide to receive infrared reflections directed from the wavelength selectable filter.
The present technology also provides methods for processing visible and infrared wavelengths for image display and eye tracking in the optical path of a transparent planar waveguide positioned to be seen through in a display optical system of a transparent, near-eye, mixed reality display device. The method includes optically coupling visible and infrared illumination into the planar waveguide along a first direction of an optical path. The visible and infrared illumination is directed out of the planar waveguide and toward the eye by one or more wavelength-selectable filters coaxially aligned with the optical axis of the display optical system. The one or more filters also direct infrared reflections from the eye into the waveguide along a second direction of the optical path.
The method also includes optically coupling the infrared reflection from the planar waveguide to an infrared sensor. Data generated by an infrared sensor, such as a Charge Coupled Device (CCD) or CMOS pixel sensor array, is stored as eye tracking data. Some examples of eye tracking data are image data from an IR camera or the location of glints detected by a Position Sensitive Detector (PSD). In some embodiments, the first and second directions may be the same. In other embodiments, the first direction and the second direction may be different. An example of a different direction is the opposite direction.
The summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Drawings
FIG. 1 is a block diagram depicting example components of one embodiment of a transparent, mixed reality display device system.
Fig. 2A is a side view of the temple of a frame in an embodiment of a transparent, mixed reality display device embodied as eyeglasses that provides support for hardware and software components.
Fig. 2B is a top view of an embodiment of an integrated eye tracking and display optical system of a transparent, near-eye, mixed reality device.
Fig. 2C is a top view of another version of the embodiment of fig. 2B, in which infrared reflections traverse waveguide 112 in the same direction as the IR illumination travels, rather than in the opposite direction.
FIG. 3A is a block diagram of one embodiment of hardware and software components of a transparent, near-eye, mixed reality display device that can be used with one or more embodiments.
FIG. 3B is a block diagram depicting components of the processing unit.
Fig. 4 is a flow diagram of an embodiment of a method for processing visible and infrared wavelengths for image display and eye tracking in the optical path of a transparent, planar waveguide positioned to be seen through in a display optical system of a transparent, near-eye, mixed reality display device.
Fig. 5A is a view, from the perspective of an eye looking through a planar waveguide, of another embodiment of an integrated eye tracking and display optical system that uses an integrated linear light source array and a scanning mirror to generate an image.
Fig. 5B shows another version of the embodiment of fig. 5A, in which the infrared sensing elements are integrated into the array of light sources.
Fig. 6A illustrates an example layout of a linear integrated light source array that simultaneously generates visible and infrared illumination for use in an integrated eye tracking and display optical system of a transparent, near-eye, mixed reality device.
Fig. 6B illustrates an example layout of a linear integrated optical element array including infrared sensing elements and light sources that simultaneously generates visible and infrared illumination for use in an integrated eye tracking and display optical system of a transparent, near-eye, mixed reality device.
Fig. 6C shows another example layout of a linear integrated light source array that generates both visible and infrared illumination, in which infrared light sources are placed at the end of each row in an arrangement suitable for glint tracking.
FIG. 6D illustrates another example layout of a linear integrated optical element array including infrared sensing elements, visible light sources, and infrared light sources placed at the ends of the visible light rows in an arrangement suitable for glint tracking.
Fig. 7A is a view, from the perspective of an eye looking through a planar waveguide, of another embodiment of an integrated eye tracking and display optical system using a modulated light source and a 2D scanning mirror.
Fig. 7B shows another version of the embodiment of fig. 7A, which uses a single modulated light source and an active grating.
FIG. 8 is a block diagram of another embodiment of hardware and software components of a transparent, near-eye, mixed reality display device for use with an image generation unit that includes a light source and a scanning mirror.
FIG. 9 is a block diagram of an embodiment of an electronic module that may be used to control hardware components of an integrated eye tracking and display optical system using at least one light source and a scanning mirror.
Detailed Description
Embodiments of a transparent, near-eye, head-mounted mixed reality display device system with integrated eye tracking and display optics are described. The eye tracking system shares a portion of the see-through Head Mounted Display (HMD) optical path to project eye tracking infrared illumination onto the eye and to capture infrared reflections from the eye with an infrared sensor. The eye tracking system uses infrared (IR) illumination so that the illumination is not visible to the user. The IR sensor may be an IR camera that provides infrared image data of the eye, or an IR sensor that detects glints, i.e., reflections off the cornea of the eye produced by the IR illumination.
In the embodiments described below, there is one display optical system for each eye, and each includes a transparent, planar waveguide. In a practical implementation, the waveguide provides an optical path based on the principle of total internal reflection. One or more wavelength-selectable filters are positioned in the waveguide coaxially aligned with the optical axis of the display optical system. The optical axis of the display optical system is coaxial, or very nearly coaxial, with the nominal eye line of sight. The nominal eye line of sight is centered on the pupil and extends from the pupil center when the user looks straight ahead. With the one or more filters coaxially aligned with the optical axis, visible and infrared illumination representing the image is directed out of the waveguide along the optical axis and into the eye, while reflections from the eye centered on the optical axis are directed into the waveguide. Illuminating the eye and capturing the reflections from the eye centered around the optical axis, close to the eye line of sight, simplifies the image processing algorithms applied to the eye image data and is more tolerant of individual differences in human facial features. For example, a stye (hordeolum) or drooping eyelid may occlude more of the illumination coming from the upper corners of an eyeglass frame embodiment than it would occlude illumination directed along the optical axis of the respective display for that eye.
Better illumination and eye data capture centered on the optical axis can improve the results of many applications, such as corneal glint tracking and pupil tracking for gaze determination, blink tracking for user command interpretation, iris scanning and retinal vein tracking for biometric-based applications, measuring convergence, identifying pupil alignment with the optical axis and determining interpupillary distance (IPD), and structured light pattern techniques for corneal tracking. See the following documents for examples of some of these applications that can benefit from the embodiments presented below: U.S. patent application No. 13/221,739 to Lewis et al., entitled "Gaze Detection in a See-Through, Near-Eye, Mixed Reality Display"; U.S. patent application No. 13/221,662 to Lewis et al., entitled "Aligning Inter-Pupillary Distance in a Near-Eye Display System"; U.S. patent application No. 13/221,707 to Lewis et al., entitled "Adjustment of a Mixed Reality Display for Inter-Pupillary Distance Alignment"; and U.S. patent application No. 13/221,669 to Perez et al., entitled "Head Mounted Display with Iris Scan Profiling".
FIG. 1 is a block diagram depicting example components of one embodiment of a transparent, mixed reality display device system. System 8 includes a transparent display device as a near-eye, head-mounted display device 2 in communication with processing unit 4 via line 6. In other embodiments, head mounted display device 2 communicates with processing unit 4 through wireless communication. The processing unit 4 may take various embodiments. For example, the processing unit 4 may be implemented in a mobile device such as a smartphone, tablet or laptop computer. In some embodiments, processing unit 4 is a separate unit that may be worn on the user's body (e.g., the wrist in the illustrated example) or placed in a pocket, and includes most of the computing power for operating near-eye display device 2. The processing unit 4 may communicate wirelessly (e.g., WiFi, bluetooth, infrared, RFID transmission, Wireless Universal Serial Bus (WUSB), cellular, 3G, 4G, or other wireless communication devices) with one or more hub computing systems 12 over a communication network 50, whether located nearby as in this example or at a remote location. In other embodiments, the functionality of the processing unit 4 may be integrated in the software and hardware components of the display device 2.
The head mounted display device 2 (which in one embodiment has the shape of eyeglasses with frame 115) is worn on the head of a user so that the user can see through the display (embodied in this example as display optical system 14 for each eye) and thus have an actual direct view of the space in front of the user.
The term "actual direct view" is used to refer to the ability to see real-world objects directly with the human eye, rather than seeing a created image representation of the object. For example, looking through glasses in a room would allow a user to have an actual direct view of the room, whereas viewing a video of a room on a television set is not an actual direct view of the room. Based on the context of executing software (e.g., a gaming application), the system may project an image of a virtual object (sometimes referred to as a virtual image) on a display viewable by a person wearing the transparent display device, while the person also views real-world objects through the display.
The frame 115 provides a support for holding the elements of the system in place and a conduit for electrical connections. In this embodiment, the frame 115 provides a convenient frame for the glasses as a support for the elements of the system discussed further below. In other embodiments, other support structures may be used. Examples of such structures are a visor or goggles. The frame 115 includes a temple or side arm for resting on each ear of the user. The temple 102 represents an embodiment of a right temple and includes control circuitry 136 of the display device 2. The nose bridge 104 of the frame comprises a microphone 110 for recording sound and transmitting audio data to the processing unit 4.
Fig. 2A is a side view of the temple 102 of the frame 115 in an embodiment of a transparent, mixed reality display device embodied as eyeglasses providing support for hardware and software components. At the front of frame 115, a video camera 113 facing the physical environment (i.e., facing outward) captures video and still images that are sent to the processing unit 4.
Data from the camera may be sent to processor 210 of control circuitry 136, to processing unit 4, or to both, which may process the data; processing unit 4 may also send the data over network 50 to one or more computer systems 12. This processing identifies and maps the user's real-world field of view.
Control circuitry 136 provides various electronics that support other components of head mounted display device 2. More details of the control circuit 136 are provided below with reference to fig. 3A. The earpiece 130, inertial sensor 132, GPS transceiver 144, and temperature sensor 138 are internal to the temple 102 or mounted on the temple 102. In one embodiment, inertial sensors 132 include a three axis magnetometer 132A, three axis gyroscope 132B, and three axis accelerometer 132C (see FIG. 3A). Inertial sensors are used to sense the position, orientation, and sudden acceleration of head mounted display device 2. From these movements, the head position can also be determined.
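The patent does not specify how the inertial readings are combined; as one hedged illustration, a complementary filter is a common way to fuse three-axis gyroscope and accelerometer data into a head pitch/roll estimate. The function below is a minimal sketch under that assumption; the axis conventions, filter coefficient, and the use of the magnetometer only for heading (not shown) are illustrative choices, not details from the patent.

```python
import math

def fuse_head_orientation(pitch_prev, roll_prev, gyro, accel, dt, alpha=0.98):
    """Complementary filter: blend integrated gyro rates with the gravity
    direction from the accelerometer to estimate head pitch and roll.
    gyro = (gx, gy, gz) in rad/s; accel = (ax, ay, az) in m/s^2.
    Heading (yaw) would come from the magnetometer and is not shown."""
    gx, gy, _ = gyro
    ax, ay, az = accel
    # Tilt angles implied by gravity (valid while linear acceleration is small).
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)
    # Integrate gyro rates, then correct slow drift with the accelerometer estimate.
    roll = alpha * (roll_prev + gx * dt) + (1.0 - alpha) * roll_acc
    pitch = alpha * (pitch_prev + gy * dt) + (1.0 - alpha) * pitch_acc
    return pitch, roll
```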
The image source or image generation unit 120 may be mounted on the temple 102 or within the temple 102. In one embodiment, the image source includes an image microdisplay 120 for projecting one or more virtual objects and a lens system 122 for directing an image from microdisplay 120 into transparent planar waveguide 112. Lens system 122 may include one or more lenses. In one embodiment, lens system 122 includes one or more collimating lenses. In the example shown, reflective element 124 receives the image guided by lens system 122 and optically couples the image data into planar waveguide 112.
There are different image generation techniques that can be used to implement microdisplay 120. For example, microdisplay 120 can be implemented using a transmissive projection technology where the light source is modulated by optically active material and backlit with white light. These technologies are usually implemented using LCD-type displays with powerful backlights and high optical power densities. Microdisplay 120 can also be implemented using a reflective technology where external light is reflected and modulated by an optically active material. Digital Light Processing (DLP), Liquid Crystal on Silicon (LCOS), and Mirasol® display technology from Qualcomm, Inc. are all examples of reflective technologies. Additionally, microdisplay 120 can be implemented using an emissive technology in which light is generated by the display, such as the PicoP™ display engine from Microvision, Inc. Another example of an emissive display technology is a micro Organic Light Emitting Diode (OLED) display. eMagin and MicroOLED provide examples of micro OLED displays.
Fig. 2B is a top view of an embodiment of display optical system 14 of a transparent, near-eye, augmented, or mixed reality device. A portion of the frame 115 of the near-eye display device 2 will surround the display optics 14 to provide support and electrical connection for one or more optical elements as shown here and in subsequent figures. To illustrate the various components of display optical system 14 (in this case, right eye system 14 r) in head mounted display device 2, a portion of frame 115 around the display optical system is not depicted.
In one embodiment, display optical system 14 includes a planar waveguide 112, an optional opacity filter 114, a transparent lens 116, and a transparent lens 118. In one embodiment, opacity filter 114 is behind and aligned with transparent lens 116, planar waveguide 112 is behind and aligned with opacity filter 114, and transparent lens 118 is behind and aligned with planar waveguide 112. Transparent lenses 116 and 118 may be standard lenses used in eyeglasses and may be made to any prescription, including no prescription. In some embodiments, head mounted display device 2 will include only one transparent lens or no transparent lens. Opacity filter 114, which is aligned with planar waveguide 112, selectively blocks natural light from passing through planar waveguide 112, either uniformly or on a per-pixel basis. For example, the opacity filter enhances the contrast of the virtual imagery. Further details of an opacity filter are provided in U.S. patent application No. 12/887,426, "Opacity Filter For See-Through Mounted Display," filed September 21, 2010, the entire contents of which are incorporated herein by reference.
Planar waveguide 112 transmits visible light from microdisplay 120 to the eye 140 of a user wearing head mounted display device 2. The transparent planar waveguide 112 also allows visible light from the front of the head mounted display device 2 to be transmitted through the light guide optical element 112 to the user's eye, as indicated by arrow 142 representing the optical axis of display optical system 14r, thereby allowing the user to have an actual direct view of the space in front of head mounted display device 2 in addition to receiving a virtual image from microdisplay 120. Thus, the walls of planar waveguide 112 are transparent. The planar waveguide 112 includes a first reflective surface 124 (e.g., a mirror or other surface). Visible light from microdisplay 120 passes through lens 122 and is incident on reflective surface 124. The reflective surface 124 reflects the incident visible light from microdisplay 120 such that the visible light is trapped by internal reflection inside the planar substrate comprising planar waveguide 112, as described further below.
The infrared illumination and reflection also traverses the planar waveguide 112 to an eye tracking system 134 for tracking the position of the user's eye. The user's eyes will be directed to a subset of the environment, which is the user's focus or gaze area. The eye tracking system 134 includes an eye tracking illumination source 134A, in this example located on the temple 102, and an eye tracking IR sensor 134B located between the lens 118 and the temple 102. The present technique allows flexible placement of optical couplers into and out of the optical path of the waveguide for the image generation unit 120, illumination source 134A, and IR sensor 134B. Visible illumination, which presents an image, as well as infrared illumination, may enter from any direction relative to waveguide 112, and one or more wavelength-selective filters (e.g., 127) direct the illumination out of the waveguide centered on optical axis 142 of display optics 14. Similarly, the placement of the IR sensor is also flexible as long as it is optically coupled to receive infrared reflections directed by one or more wavelength selectable filters positioned to receive infrared reflections along optical axis 142.
In one embodiment, eye-tracking illumination source 134A may include one or more Infrared (IR) emitters, such as infrared Light Emitting Diodes (LEDs) or lasers (e.g., VCSELs) emitting at about a predetermined IR wavelength or range of wavelengths. In some embodiments, eye tracking sensor 134B may be an IR camera or an IR Position Sensitive Detector (PSD) for tracking the location of glints.
In this embodiment, the wavelength-selectable filter 123 is implemented as a grating 123 that passes visible spectrum light from microdisplay 120, as reflected by reflective surface 124, and directs infrared wavelength illumination from the eye tracking illumination source 134A into the planar waveguide 112, where the IR illumination is reflected inside the planar waveguide until it reaches another wavelength-selectable filter 127 aligned with the optical axis 142. The grating 123 may be a diffraction grating or a reflection grating. In this example, the IR sensor 134B is also optically coupled to a grating 125, which may likewise be a diffraction grating or a reflection grating. The gratings are stacked. However, at least grating 125 is unidirectional in that it passes IR illumination in the optical path from source 134A in a direction toward nose bridge 104, while infrared reflections, including those traveling from optical axis 142 toward grating 125, are directed out of waveguide 112 and toward IR sensor 134B. In some examples, a grating may be a fixed diffractive element such as an air cavity grating (air space grating), a fixed reflective grating, or an active or switchable grating for diffraction, reflection, or a combination thereof at different wavelengths.
From the IR reflections, the position of the pupil within the eye box can be identified by known imaging techniques when the IR sensor is an IR camera, and by glint position data when the IR sensor is a Position Sensitive Detector (PSD).
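As a hedged illustration of one such "known imaging technique," the sketch below estimates a pupil center from an IR camera frame by intensity thresholding and a centroid. The bright-pupil assumption (the pupil tends to retro-reflect when the illumination is near-coaxial with the sensor axis), the threshold value, and the NumPy-based interface are illustrative choices, not details from the patent.

```python
import numpy as np

def pupil_center(ir_frame, bright_threshold=200):
    """Estimate the pupil center in an IR eye image.
    ir_frame: 2-D uint8 array from the IR camera.
    With near-coaxial IR illumination the pupil tends to appear bright
    (retro-reflection), so a high-intensity threshold is used; a dark-pupil
    variant would invert the comparison."""
    mask = ir_frame > bright_threshold        # candidate pupil pixels
    ys, xs = np.nonzero(mask)                 # row (y) and column (x) indices
    if xs.size == 0:
        return None                           # no pupil candidate in this frame
    return float(xs.mean()), float(ys.mean())  # centroid (x, y) in pixels
```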
After coupling into waveguide 112, visible and IR illumination, which represents image data from microdisplay 120, is internally reflected within waveguide 112. In the example of fig. 2B, the planar waveguide is a reflective array planar waveguide. Other types of planar waveguides may also be used, for example, diffractive optical elements or planar waveguides with Total Internal Reflection (TIR) grooves. In the example of fig. 2B, after several reflections off the surfaces of the substrate, the trapped visible light waves reach an array of selectively reflective surfaces 126₁ to 126ₙ, implemented in this example as a wavelength-selective filter array. In addition, a wavelength-selectable filter 127 aligned with the optical axis of the display optical system is also positioned in the waveguide 112. The reflective surfaces 126 couple visible-wavelength light incident on them out of the substrate and direct it toward the user's eye 140.
Reflective surfaces 126 also pass infrared radiation within the waveguide. However, one or more wavelength-selective filters 127 are aligned with optical axis 142 of display optical system 14r, and the wavelength-selective filters 127 direct not only visible illumination but also the infrared illumination received from illumination source 134A. For example, if reflecting elements 126₁ to 126ₙ each reflect a different portion of the visible spectrum, the one or more wavelength-selective filters 127 may reflect wavelengths in the red visible spectrum and in the infrared spectrum. In other embodiments, the filters 127 may reflect wavelengths covering the entire visible spectrum, or a larger portion thereof, together with the infrared wavelengths of the IR reflections and of the illumination generated by the IR illumination source.
In addition, the one or more wavelength-selective filters 127 direct infrared reflections from the eye, passing through the transparent wall of the planar waveguide and centered on the optical axis 142, into the optical path of the planar waveguide, but in the opposite direction toward the wavelength-selective filter 125, which selectively filters the infrared reflections out of the waveguide and directs them to the IR sensor 134B. The filter 127 may comprise a bidirectional infrared filter. In addition, the visible and infrared filters may be stacked in the direction from lens 116 to lens 118 so that they are coaxial with the optical axis. For example, a hot mirror placed in front of the visible reflective element with respect to the eye passes visible light but reflects IR wavelengths. Additionally, the one or more filters 127 may be implemented as an active grating that is modulated between filtering wavelengths in the visible and infrared spectra. This is done at a rate fast enough that the human eye cannot detect it.
Fig. 2C is a top view of another version of the embodiment of fig. 2B, in which infrared reflections traverse waveguide 112 in the same direction as the IR illumination travels, rather than in the opposite direction. In this embodiment, IR sensor 134B is positioned within nose bridge 104. In addition to the wavelength-selectable filter 127 that directs IR illumination to the eye, another wavelength-selectable filter 125 is implemented as an IR reflecting element that passes visible light through the waveguide and directs the IR reflections received along optical axis 142 into the waveguide and toward IR sensor 134B. An example of such an IR reflecting element 125 is a hot mirror embodiment. In other examples, diffractive or reflective gratings may also be used. In addition, sensor 134B is located in a portion of waveguide 112 within nose bridge 104 so as not to obstruct the user's view. An electrical connection (not shown) may be made to sensor 134B in the nose bridge portion to read out the sensor data.
In one embodiment, each eye will have its own planar waveguide 112. When a head-mounted display device has two planar waveguides, each eye may have its own microdisplay 120, which microdisplay 120 may display the same image in both eyes or a different image in both eyes. In another example, there may be a planar waveguide with two optical axes, one for each eye, that spans the bridge of the nose and reflects visible light into both eyes simultaneously.
In the above embodiments, the specific number of lenses shown is merely an example. Other numbers and configurations of lenses operating according to the same principles may be used. In addition, figs. 2A, 2B, and 2C show only half of the head mounted display device 2. A complete head mounted display device may include, for example, another set of see-through lenses 116 and 118, another opacity filter 114, another planar waveguide 112 with one or more wavelength-selectable filters 127, another microdisplay 120, another lens system 122, another physical-environment-facing camera 113 (also referred to as an outward-facing or front-facing camera 113), another eye tracking assembly 134, another earpiece 130, gratings 123 and 125, and another temperature sensor 138. Additional details of the head mounted display 2 are shown in U.S. patent application No. 12/905,952, entitled "Fusing Virtual Content Into Real Content," filed October 15, 2010, which is incorporated herein by reference.
FIG. 3A is a block diagram of one embodiment of the hardware and software components of a transparent, near-eye, mixed reality display device 2 that may be used with one or more embodiments. Fig. 3B is a block diagram depicting components of processing unit 4. In this embodiment, near-eye display device 2 receives instructions regarding the virtual image from processing unit 4 and provides data from the sensor back to processing unit 4. Software and hardware components such as depicted in fig. 3B that may be implemented in processing unit 4 receive sensory data from display device 2 and may also receive sensory information from computing system 12 over network 50. Based on this information, the processing unit 4 will determine where and when to provide the virtual image to the user and send instructions to the control circuitry 136 of the display device 2 accordingly.
Notably, certain components of fig. 3A (e.g., the outward- or physical-environment-facing camera 113, eye camera 134, microdisplay 120, opacity filter 114, eye tracking illumination unit 134A, earpiece 130, optional active grating 127 implementing at least one of the one or more wavelength-selectable filters, and temperature sensor 138) are shown shaded to indicate that there may be at least two of each of those devices, at least one on the left side and at least one on the right side of head mounted display device 2. Fig. 3A shows the control circuit 200 in communication with the power management circuit 202. The control circuit 200 includes a processor 210, a memory controller 212 in communication with a memory 244 (e.g., D-RAM), a camera interface 216, a camera buffer 218, a display driver 220, a display formatter 222, a timing generator 226, a display output interface 228, and a display input interface 230. In one embodiment, all components of control circuit 200 communicate with each other via dedicated lines of one or more buses. In another embodiment, each component of the control circuit 200 is in communication with the processor 210.
The camera interface 216 provides an interface to the physical-environment-facing cameras 113 and to the IR camera (sensor 134B in this embodiment) and stores the respective images received from the cameras 113, 134B in camera buffer 218. Display driver 220 drives microdisplay 120. Display formatter 222 may provide information about the virtual image being displayed on microdisplay 120 to one or more processors of one or more computer systems (e.g., 4 and 12) performing the processing for the mixed reality system. Display formatter 222 may identify transmittance settings of display optical system 14 to opacity control unit 224. Timing generator 226 is used to provide timing data to the system. Display output interface 228 includes a buffer for providing images from the physical-environment-facing camera 113 and the eye camera 134B to the processing unit 4. Display input interface 230 includes a buffer for receiving images, such as virtual images to be displayed on microdisplay 120. The display output 228 and the display input 230 communicate with band interface 232, which is an interface to the processing unit 4.
Power management circuit 202 includes voltage regulator 234, eye tracking illumination driver 236, audio DAC and amplifier 238, microphone preamplifier and audio ADC 240, temperature sensor interface 242, active grating controller 237, and clock generator 245. Voltage regulator 234 receives power from processing unit 4 through band interface 232 and provides that power to the other components of head mounted display device 2. Illumination driver 236 controls eye tracking illumination unit 134A to operate at approximately a predetermined wavelength or within a certain wavelength range, e.g., via a drive current or voltage. Audio DAC and amplifier 238 provides audio data to earpiece 130. Microphone preamplifier and audio ADC 240 provides an interface for microphone 110. Temperature sensor interface 242 is an interface for temperature sensor 138. The active grating controller 237 receives data indicating the one or more wavelengths for which each active grating 127 is to act as a wavelength-selectable filter. Power management unit 202 also provides power to and receives data back from three-axis magnetometer 132A, three-axis gyroscope 132B, and three-axis accelerometer 132C. The power management unit 202 also provides power to, receives data from, and transmits data to the GPS transceiver 144.
FIG. 3B is a block diagram of one embodiment of the hardware and software components of the processing unit 4 associated with a transparent, near-eye, mixed reality display unit. Fig. 3B shows control circuitry 304 in communication with power management circuitry 306. Control circuitry 304 includes a Central Processing Unit (CPU) 320, a Graphics Processing Unit (GPU) 322, a cache 324, a RAM 326, a memory controller 328 in communication with a memory 330 (e.g., D-RAM), a flash controller 332 in communication with a flash memory 334 (or other type of non-volatile storage), a display output buffer 336 in communication with transparent, near-eye display device 2 via band interface 302 and band interface 232, a display input buffer 338 in communication with near-eye display device 2 via band interface 302 and band interface 232, a microphone interface 340 in communication with an external microphone connector 342 for connecting to a microphone, a PCI express interface for connecting to a wireless communication device 346, and a USB port 348.
In one embodiment, the wireless communication component 346 includes a Wi-Fi enabled communication device, a Bluetooth communication device, an infrared communication device, a cellular, 3G, or 4G communication device, a Wireless USB (WUSB) communication device, an RFID communication device, and so forth. The wireless communication device 346 thus allows peer-to-peer data transfer with, for example, another display device system 8, as well as connection to a larger network via a wireless router or base station. The USB port may be used to interface the processing unit 4 to another display device system 8. Additionally, processing unit 4 may be docked to another computing system 12 for loading data or software onto processing unit 4 and for charging processing unit 4. In one embodiment, CPU 320 and GPU 322 are the main workhorses used to determine where, when, and how to insert virtual images into the user's field of view.
Power management circuitry 306 includes a clock generator 360, an analog-to-digital converter 362, a battery charger 364, a voltage regulator 366, a transparent, near-eye display power supply 376, and a temperature sensor interface 372 that communicates with a temperature sensor 374 (located on the wrist band of processing unit 4). The AC-to-DC converter 362 is connected to a charging receptacle 370 to receive AC power and generate DC power for the system. The voltage regulator 366 communicates with a battery 368 to provide power to the system. The battery charger 364 is used to charge the battery 368 (via the voltage regulator 366) when power is received from the charging receptacle 370. The device power interface 376 provides power to the display device 2.
Before proceeding to other system embodiments, FIG. 4 is a flow chart of an embodiment of a method for processing visible and infrared wavelengths for image display and eye tracking in the optical path of a transparent, planar waveguide positioned to be seen through in a display optical system of a transparent, near-eye, mixed reality display device. At step 402, visible and infrared illumination is optically coupled into the planar waveguide along a first direction of an optical path. For example, in figs. 2B and 2C, the reflective element 124 couples visible light into the waveguide, and the grating 123 couples IR illumination into the waveguide, where it is directed to elements 126 and the one or more filters 127. At step 404, one or more wavelength-selectable filters coaxially aligned with the optical axis of each display optical system direct the visible and infrared illumination out of the planar waveguide and toward the eye. See, for example, grating 127 in FIG. 2B and grating 125 in FIG. 2C. By directing the IR illumination along the optical axis, the illumination is centered on the eye, providing the most illumination of the pupil for tracking or of the iris for scanning. There is also the assumption that the optical axis of the display optical system is aligned with the pupil of the user, or as closely aligned as possible, when the user is looking straight ahead.
At step 406, one or more wavelength-selective filters (e.g., 127, 125) direct infrared reflections from the eye into the planar waveguide along a second direction of the same optical path. At step 408, the infrared reflection is optically coupled from the planar waveguide to the infrared sensor. At step 410, data generated by an infrared sensor (e.g., a Charge Coupled Device (CCD) or CMOS pixel sensor array) is stored as eye tracking data. Some examples of eye tracking data are image data from an IR camera or the location of glints detected by a Position Sensitive Detector (PSD). In some embodiments, the first and second directions may be the same, as shown in fig. 2C. In other embodiments, the first and second directions may be different, as shown in fig. 2B. An example of a different direction is the opposite direction.
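A minimal sketch of one iteration of this flow is given below. The `ir_illuminator`, `ir_sensor`, and `eye_tracking_log` objects are hypothetical driver interfaces introduced only for illustration; the patent does not define a software API.

```python
def track_eye_frame(ir_illuminator, ir_sensor, eye_tracking_log):
    """One iteration of the FIG. 4 flow, written against hypothetical
    driver objects: illuminate along the display optical axis, read the
    IR reflections coupled back through the waveguide, and store the result."""
    ir_illuminator.pulse()                 # steps 402/404: IR coupled in and directed out along the optical axis
    reflection = ir_sensor.read_frame()    # steps 406/408: reflections coupled back to the IR sensor
    # Step 410: persist the sensor output as eye tracking data. For an IR camera
    # this is image data; for a PSD it would be glint position values.
    eye_tracking_log.append(reflection)
    return reflection
```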
Fig. 5A is a view, from the perspective of an eye looking through a planar waveguide, of another embodiment of an integrated eye tracking and display optical system that uses an integrated linear light source array and a scanning mirror to generate an image.
In this embodiment, the integrated light source array 504 and IR sensor 516 are electrically connected to an electronic module, such as a Printed Circuit Board (PCB) 502 located at the brow of the eyeglasses frame 115 and above the display optics 14. The electronic module is connected to the control circuit 136 via an electrical connection 140. Display optics 14 includes at least one transparent lens 118 and one transparent planar waveguide 112. In this embodiment, the waveguide 112 may be implemented as a TIR slotted planar waveguide.
The image generation unit is implemented by an integrated light source array 504, optical element 508, beam combiner 506, scanning mirror 510, and optical coupler 512. The electronics module 502, as described with reference to fig. 9 below, determines and sets the output wavelengths of the different visible light sources for generating an image. In addition, the light sources include infrared light emitters for generating infrared illumination for eye tracking. The output of the light sources 504, e.g., integrated LEDs or lasers (e.g., VCSELs), is combined into a single beam by beam combiner 506 and optically coupled to scanning mirror 510 by optical element 508, e.g., a reflective element. In some embodiments, the scanning mirror can be implemented with micro-electromechanical systems (MEMS) technology. The mirror may be moved to direct the received illumination along one axis for unidirectional scanning or along two axes (e.g., a horizontal axis and a vertical axis) for bidirectional scanning. The layout of the array is discussed next before returning to the optical coupling of the illumination into the planar waveguide.
Fig. 6A illustrates an example layout of a linear integrated light source array that simultaneously generates visible and infrared illumination for use in an integrated eye tracking and display optical system of a transparent, near-eye, mixed reality device. As shown in the example of fig. 6A, the light source array 504 may be implemented as a linear integrated array of visible light sources, such as LEDs or VCSELs. In this example, multiple rows of red sources indicated by "R", green sources indicated by "G", and blue sources indicated by "B" are used. Other color ranges may be used, such as cyan, magenta, and yellow. In other examples, each visible light source may be individually modulated to any color in the visible spectrum. In addition, the array includes rows of infrared emitters indicated by "I". The rows of the array repeat according to the size of the array. Thus, the image generation illuminators and the eye tracking illuminators are combined into one integrated array unit.
In this example, rows of red, green, blue, and infrared emitters are located in columns, and rows are scanned. Each of the red, blue and green light sources is modulated to represent a portion of an image, e.g., a picture element such as a pixel. Each set of red, green, blue, and infrared in a row may correspond to a portion of an image. The output of the integrated linear array 504 passes through a beam combiner 506 and an optical element 508, the optical element 508 directing the combined beam of both visible and IR illumination to a scanning mirror 510.
In other examples, fewer or more infrared emitters may be dispersed throughout the visible light source. For example, there may be one IR emitter for every twenty (20) visible light sources. This configuration can be used for structured lighting based applications.
Fig. 6C shows another example layout of a linear integrated light source array that generates both visible and infrared illumination, in which infrared light sources are placed at the end of each row in an arrangement suitable for glint tracking. Glints are reflections off one or more surfaces of the cornea. As the user's eye moves, specular reflections from different eye parts (e.g., the pupil, iris, and sclera) affect the intensity value of each glint received at an IR sensor, such as a Position Sensitive Detector (PSD) or photodetector. The pupil position may be determined from the glint data values generated by the sensor.
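As a hedged illustration, the sketch below shows how the four electrode currents of a duo-lateral PSD could be converted into a normalized glint position. Mapping that position to a pupil or gaze estimate would require a per-user calibration that is not shown, and the specific PSD type is an assumption rather than a detail from the patent.

```python
def psd_glint_position(i_left, i_right, i_bottom, i_top):
    """Convert the four electrode currents of a duo-lateral position
    sensitive detector (PSD) into a normalized glint centroid in [-1, 1]
    on each axis. The mapping from glint position to pupil position
    would come from a separate calibration step (not shown)."""
    x_sum = i_left + i_right
    y_sum = i_bottom + i_top
    if x_sum == 0 or y_sum == 0:
        return None                        # no detectable glint in this sample
    x = (i_right - i_left) / x_sum         # horizontal centroid of the light spot
    y = (i_top - i_bottom) / y_sum         # vertical centroid of the light spot
    return x, y
```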
Scanning mirror 510 scans each row and reflects each row onto optical surface 512, thereby reproducing an image on surface 512 of the optical element. In other examples, a column-by-column scan may also be used. Progressive scanning is an example of unidirectional scanning allowed using linear arrays. Embodiments employing bi-directional scanning may also be used, if desired. Visible and infrared wavelengths pass through the optical element 512 in the direction of the optical path from the light emitter to the planar waveguide 112. As discussed below, the other side of the optical element 512 includes a one-way wavelength-selective filter 511 for infrared wavelengths to direct infrared reflections along the optical path to an infrared sensor 518. An optical coupler (e.g., one or more collimating lenses 527) couples the image and the IR illumination into the waveguide 112.
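The sketch below illustrates such a row-by-row (unidirectional) scan loop. The `mirror` and `array_driver` objects and the normalized angle convention are hypothetical, introduced only to make the sequencing concrete; they are not interfaces defined by the patent.

```python
def scan_frame(rows_rgb_ir, mirror, array_driver):
    """Unidirectional (row-by-row) scan: for each image row, position the
    scanning mirror and drive the linear light source array so the row is
    reproduced on the optical surface (512 in fig. 5A)."""
    num_rows = len(rows_rgb_ir)
    for row_index, row in enumerate(rows_rgb_ir):
        # Normalized scan position in [0, 1] for this row.
        mirror.set_angle(row_index / max(num_rows - 1, 1))
        array_driver.emit(row)             # modulate the R, G, B (and IR) sources for this row
    mirror.retrace()                       # return to the start position for the next frame
```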
In this example, the grating 524 diffracts the visible and IR illumination in the planar waveguide for total internal reflection. In one example, grating 524 may be implemented as a stack of multiple gratings, one for directing visible spectrum wavelengths and another for directing the IR illumination. For example, a hot mirror may be laminated onto the surface of the visible spectrum reflecting element. Additionally, in this example, grating 524 includes a two-way IR reflection grating to couple IR reflections along path 529 to the wavelength-selective filter 511 (e.g., a hot mirror) of optical element 512 and to reflect the IR illumination toward the IR wavelength-selective filter 525 positioned coaxially with the optical axis 142 of the display optical system.
In this example, the stacked gratings 526 and 525 are both coaxially aligned with the optical axis 142. Grating 526 is located behind grating 525, grating 526 directing visible light out of the waveguide along optical axis 142, and grating 525 directing infrared illumination out of the waveguide and received IR reflections into the waveguide, e.g., by diffraction or reflection. Arrows 522, representing the IR illumination exiting the waveguide (in this example exiting the page) and the IR reflection entering the waveguide (in this example entering the page), are centered about the optical axis 142. In some embodiments, to better reduce backscattered IR illumination in IR reflection, both the visible grating 526 and the IR grating 525 can have an IR blocking coating on their longer right side surfaces to block IR illumination that is internally reflected back to the grating 524.
In this example, the IR grating 525 is bi-directional and directs infrared reflections back to the grating 524 which is also bi-directional for infrared wavelengths. The IR reflection 529 is directed back to the wavelength selective filter 511 of the optical element 512, which directs the IR reflection to another IR wavelength selective filter 514 (e.g., a hot mirror, a reflective grating, a diffraction grating), which directs the IR reflection 528 through the coupled lens 516 and to the IR sensor 518.
Fig. 5B shows another version of the embodiment of fig. 5A, in which the infrared sensing elements are integrated with the light sources into a single array of optical elements. Fig. 6B illustrates an example layout of such a linear integrated optical element array, including infrared sensing elements and light sources that simultaneously generate visible and infrared illumination, for use in an integrated eye tracking and display optical system of a transparent, near-eye, mixed reality device.
In the example shown, the array of optical elements 505 includes a plurality of rows of IR sensing elements denoted by "S". In some examples, the IR sensing element "S" may be an integrated photodetector. The order of the rows of light emitters represented is flexible for both the array of light sources 504 and the array of optical elements 505. For example, the sequence of red, green, blue, infrared is used for array 504 in FIG. 6A, while the sequence of red, infrared, green, sensor, blue is used in FIG. 6B.
Other geometries may be used for the integrated light source array. For example, instead of multiple rows of light sources, there may be multiple groups or clusters of light sources. Each group may include the color light sources and, if an optical element arrangement is employed, an IR sensor. Examples of geometric arrangements for each group are square, circular, or rectangular. In this embodiment, visible and infrared illumination is coupled into the waveguide as discussed above for the embodiment of fig. 5A. In this embodiment, optical element 612 may be a pass-through optical element: scanning mirror 510 recreates an image on optical element 612 by reflecting the visible illumination, while the infrared reflections are directed back into the integrated optical element array. The IR reflections directed from grating 524 pass through optical element 612. Because the reflections travel at the speed of light, scanning mirror 510 reflects the IR reflections back along the reverse optical path, through reflective element 508 and beam combiner 506, to the array of optical elements 505 that includes the infrared sensing elements ("S" elements). In response to the received IR photons, the sensing elements generate electrical signals. The electrical signals are converted into representative data signals that the electronics module 502 transmits (e.g., over electrical connection 140) to one or more processors of the control circuitry 136 and the processing unit 4 for applications such as gaze determination, biological monitoring, and biometric identification.
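As a hedged illustration of how the digitized sensing-element outputs might be assembled into an eye image, the sketch below steps through the mirror positions and reads the "S" rows at each position. The read-out interface and indexing are assumptions; the patent only states that the sensing elements produce electrical signals that are converted into representative data signals.

```python
def capture_eye_image(mirror_positions, sensor_array, num_sense_rows):
    """Assemble a 2-D IR image of the eye from the integrated 'S' sensing
    elements as the scanning mirror steps through its positions.
    sensor_array.read_row(...) is a hypothetical digitized read-out."""
    image = []
    for position in mirror_positions:
        # At each mirror position the reverse optical path maps one strip of
        # the eye onto the rows of sensing elements in the array.
        strip = [sensor_array.read_row(r, position) for r in range(num_sense_rows)]
        image.append(strip)
    return image   # indexed as image[scan_position][sense_row]
```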
FIG. 6D illustrates another example layout of a linear integrated array of optical elements including infrared sensing elements "S", visible light sources, and infrared light sources "I", where the infrared light sources and infrared sensing elements are placed at the ends of the rows of visible light sources in an arrangement suitable for glint tracking. The processing discussed above for the embodiment of fig. 5B also applies to this embodiment using an integrated array of optical elements as in fig. 6D. As with the example in fig. 6B, in some examples the IR sensing elements may be integrated photodetectors.
Fig. 7A is a view, from the perspective of an eye looking through a planar waveguide, of another embodiment of an integrated eye tracking and display optical system using a modulated light source and a 2D scanning mirror. This embodiment is another version of the embodiment of fig. 5A, except that the integrated light source array 504 is replaced by individual light sources 507. In this example, four light sources are shown, consistent with an example in which red, green, and blue are used to generate the different colors, but other combinations may of course be used. In this example, one of the light sources (e.g., a laser or LED) emits red spectrum illumination, another emits blue spectrum illumination, another emits green spectrum illumination, and another emits infrared spectrum illumination, so that the three visible light sources generate the colors making up each portion of the image. In this example, the scanning mirror 510 is a bidirectional scanning mirror that moves in both the vertical and horizontal directions to reproduce, on optical element 512, the portion of the image represented by the current output of the light sources, until the image is complete. The optical element 512 is similar to a projector screen.
Fig. 7B shows another version of the embodiment of fig. 7A, which uses a single modulated light source and an active grating. In this example, the set of individual light sources is replaced by a single light source 509, the light source 509 being modulated to one of several wavelengths in the visible spectrum and also being used to generate illumination in the infrared spectrum. The scan mirror is a bidirectional scan mirror. In this example, the light source emits infrared and visible illumination at different time intervals. The scanning mirror directs the IR illumination toward the center of the element, through collimating lens 516, and into waveguide 112.
A single switchable grating (also referred to as an active grating) may be used for the wavelength-selectable filter 525, which filters the visible and infrared wavelengths entering and exiting the waveguide 112 along the optical axis 142, directed to and from the eye. Within the eyeglass frame 115, an electrical connection 141 is made between the electronics module 502 and the planar waveguide 112. Control signals from active grating controller 237 switch the wavelength selectivity of grating 525 between the visible and infrared spectra in step with the different time intervals in which the single light source is modulated to generate the different visible and infrared wavelengths.
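A minimal sketch of this time multiplexing is shown below. The `light_source` and `active_grating` control objects, the 240 Hz switching rate, and the finite cycle count are illustrative assumptions; the patent only requires that the switching be faster than the human eye can detect.

```python
import time

def run_time_multiplex(light_source, active_grating, num_cycles, interval_s=1.0 / 240):
    """Alternate a single modulated light source and an active (switchable)
    grating between visible image generation and IR eye tracking illumination."""
    for _ in range(num_cycles):
        light_source.set_band("visible")        # emit image light
        active_grating.select_band("visible")   # grating couples visible light out along the optical axis
        time.sleep(interval_s)
        light_source.set_band("infrared")       # emit eye tracking illumination
        active_grating.select_band("infrared")  # grating couples IR out and IR reflections back in
        time.sleep(interval_s)
```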
FIG. 8 is a block diagram of another embodiment of hardware and software components of a transparent, near-eye, mixed reality display device for use with an image generation unit that includes a light source and a scanning mirror. This embodiment is another version of the hardware and software components of fig. 3A, in which the control circuitry interfaces with the electronics module 502. In this embodiment, the display driver 220 provides its image data to the display illumination controller 554 of the electronics module 502. In this embodiment, the IR sensor 518, or the IR sensing elements "S" in array 505, are IR image sensing elements, and the IR camera interface 558 of the electronics module 502 is coupled to the camera interface 216 to transfer IR image data for the eye tracking processing.
FIG. 9 is a block diagram of an embodiment of an electronic module that may be used to control hardware components of an integrated eye tracking and display optical system using at least one light source and a scanning mirror. The electronics module 502 is communicatively coupled to the control circuitry 136, for example via the electrical connections 140 shown in figs. 5A, 5B, 7A, and 7B. The display illumination controller 554 receives image data from the display driver 220 and stores it in the display data buffer 560. The display illumination controller 554 converts the image data into modulation signals for the visible light sources 568 (labeled here as color 1 light source 568₁, color 2 light source 568₂, and color N light source 568ₙ). The controller 554 sends the individual light source modulation signals to the individual drivers 570 that control the light sources 568. The display illumination controller 554 also provides the IR illumination driver 572 with a modulation signal for driving the IR light source 569 at a predetermined IR wavelength or within a range of IR wavelengths. The individual light sources 568 may be single, separately modulated light sources as in figs. 7A and 7B, or part of the integrated light source array 504 or the integrated optical element array 505. In some embodiments, the drivers 570, 572 may also be integrated in the arrays, with each driver driving a corresponding light source 568 in the respective array 504, 505. In other embodiments, a driver 570 may activate a row or a group of light sources 568 in the array in a sequence of time intervals. In the case of the single light source 509 in fig. 7B, there may be only one driver that generates modulation signals in both the IR and visible spectra, or a switch (not shown) may select between the modulation signals from the different drivers 570 and 572 for reception by the light source 509.
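As a hedged illustration, the sketch below converts one row of RGB image data into per-color drive levels for drivers 570, plus a drive level for the IR driver 572. The 0-255 input range, the 0-1 drive scale, and the fixed IR level are illustrative assumptions, not values from the patent.

```python
def modulation_signals_for_row(rgb_row, ir_level=0.5):
    """Turn one row of RGB image data (per-pixel values in 0..255) into
    per-source drive levels for the color drivers 570 and a drive level
    for the IR driver 572. The dictionary layout is illustrative only."""
    signals = {"color_1": [], "color_2": [], "color_3": [], "ir": ir_level}
    for r, g, b in rgb_row:
        signals["color_1"].append(r / 255.0)  # red source drive level for this pixel
        signals["color_2"].append(g / 255.0)  # green source drive level
        signals["color_3"].append(b / 255.0)  # blue source drive level
    return signals
```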
The electronics module 502 also includes a scanning mirror controller 556 for controlling the movement of the scanning mirror 510. In some examples, the scanning mirror controller 556 can be programmed to perform unidirectional or bidirectional scanning. In this example, the IR camera interface 558 receives data representing the photons received by the IR camera 564 and stores the data in the IR data buffer 562, and the interface 558 may transfer the data from the IR data buffer 562 to the camera interface 216 of the control circuitry.
The integrated eye tracking and display system described in the various embodiments above simplifies the eye tracking processing for many applications, such as measuring convergence, interpupillary distance (IPD), gaze determination, eye-movement-based commands, and biometric identification. In addition, an integrated eye tracking and display system as described herein may be implemented in a form factor suitable for generally available consumer products.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (4)
1. An integrated eye tracking and display system for a transparent, near-eye, mixed reality display device, the integrated eye tracking and display system comprising:
a display optical system (14) for each eye, the display optical system having an optical axis (142) and a transparent, planar waveguide (112) positioned to be seen through by the respective eye;
one or more wavelength-selectable filters (525, 526) located in each transparent, planar waveguide coaxially aligned with an optical axis of each display optical system for directing infrared and visible illumination (522) out of each planar waveguide and for directing infrared reflection (522) into the planar waveguide;
an integrated array (504, 505) of optical elements comprising light sources (504) for emitting infrared and visible illumination, the light sources being optically coupled (508, 510, 512, 527) to the planar waveguide to direct their illumination into the planar waveguide; and
the integrated array of optical elements further comprises one or more infrared sensors (518, 505) optically coupled ((527, 511, 514), (527, 612)) to the planar waveguide to receive infrared reflections directed from the one or more wavelength selectable filters.
2. The integrated eye tracking and display system for a transparent, near-eye, mixed reality display device of claim 1, wherein the one or more wavelength selectable filters comprise a bidirectional infrared wavelength selectable filter (525).
3. The integrated eye tracking and display system for a transparent, near-eye, mixed reality display device of claim 1, wherein the integrated array of optical elements is a linear array of rows of the light sources, the linear array being scanned by a scanning mirror (510) to couple visible and infrared illumination into the planar waveguide.
4. The integrated eye tracking and display system for a transparent, near-eye, mixed reality display device of claim 3, wherein the infrared reflections of the respective eye are optically coupled from at least one of the one or more wavelength-selectable filters in the planar waveguide to the scanning mirror along opposite optical paths and to the one or more infrared sensors of the integrated array of optical elements.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US 13/245,700 (US 8,998,414 B2) | 2011-09-26 | 2011-09-26 | Integrated eye tracking and display system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1180771A1 (en) | 2013-10-25 |
| HK1180771B (en) | 2016-01-22 |