HK1258061B - Near-eye device - Google Patents
- Publication number: HK1258061B
- Authority: HK (Hong Kong)
Description
This application is a divisional application of the patent application filed on 31 March 2014 with application number 201480033963.X, entitled 'Near-eye device'.
Technical Field
The present invention relates to near-eye devices such as goggles, glasses, and the like. Embodiments disclosed herein relate generally to near-eye devices for augmenting real scenes, that is, for augmented reality. More particularly, embodiments disclosed herein relate to holographic projection for augmented reality, such as phase-only holographic projection techniques for augmented reality.
Background
Augmented reality near-eye devices and the like have now been developed.
In fig. 1, a known near-eye device is shown. Fig. 1 shows a light source 101 and a collimator lens 103 arranged to illuminate a spatial light modulator 107 via a beam splitter 105. The spatial light modulator 107 comprises an array of amplitude modulation elements arranged to form an image. More specifically, the amplitude of light incident on the spatial light modulator 107 is spatially modulated to form an image. The image may be viewed through the beam splitter 105. More specifically, the image on the spatial light modulator 107 forms a first optical input of the beam combiner 109. The beam combiner 109 further comprises a second optical input 123 providing a field of view of the real scene.
The beam combiner 109 comprises a spherical surface 111, which causes the image of the spatial light modulator 107 to become divergent. The beam combiner 109 is further arranged to reflect at least part of the divergent image to the optical output 125 of the beam combiner.
Light received at the second optical input 123 is also directed to the optical output 125 of the beam combiner 109. In this regard, it will be understood that the beam combiner combines the real scene with the divergent image from the spatial light modulator 107. It can therefore be understood that the image from the spatial light modulator augments the real scene. Notably, the arrangement described with reference to fig. 1 provides the spherical surface 111 such that the image on the spatial light modulator appears to come from a fixed point in space in front of the beam combiner, that point being defined by the radius of curvature of the spherical surface 111.
The present invention seeks to provide an improved near-eye device.
Disclosure of Invention
Aspects of the invention are defined in the appended independent claims.
The present invention provides a near-eye device and corresponding method using a beam combiner arranged to receive spatially modulated light from a spatial light modulator and provide a field of view of the real world. Accordingly, the real scene may be supplemented or enhanced with additional information in the form of images. Known near-eye devices use physical optics to project real images into a scene to increase eye relief. The inventors have realised that images can advantageously be provided using computational phase-only holographic techniques. This technique is more energy efficient and enables additional optical elements to be efficiently encoded or embedded in the imaging data. Advantageously, no additional complex components are required. Also advantageously, the distance of the enhanced image can be controlled in real time.
Drawings
Embodiments will be described with reference to the following drawings.
FIG. 1 is a schematic diagram of a known near-eye device;
FIG. 2 illustrates a reflective SLM, such as an LCOS, arranged to produce a holographic reconstruction at a replay field position;
FIG. 3 illustrates an exemplary algorithm for computer-generated phase-only holograms;
FIG. 4 illustrates an exemplary random phase seed point of the exemplary algorithm shown in FIG. 3;
FIG. 5 illustrates an embodiment according to the present invention;
FIG. 6 illustrates an algorithm for computing a Fresnel hologram according to an embodiment;
FIG. 7 is a schematic diagram of an LCOS SLM configuration.
In the drawings, like reference numerals designate like parts.
Detailed Description
The present invention aims to overcome some of the disadvantages of known devices using an image on a spatial light modulator and a beam combiner having a profile for increasing the so-called eye relief, providing an improved near-eye device. In particular, the inventors have realised that improved near-eye devices may be provided by using general purpose computational holography techniques.
Light scattered from an object includes both amplitude and phase information. This amplitude and phase information can be captured on, for example, a photosensitive plate by known interference techniques to form a holographic recording, or "hologram", that comprises interference fringes. The "hologram" may be reconstructed by illumination with suitable light to form a holographic reconstruction, or replay image, representative of the original object.
It has been found that a holographic reconstruction of acceptable quality can be formed from a "hologram" that contains only phase information relating to the original object. Such holographic recordings may be referred to as phase-only holograms. Computer-generated holography techniques can numerically simulate an interference process using, for example, fourier techniques, to produce a computer-generated phase-only hologram. Computer-generated phase-only holograms may be used to produce a holographic reconstruction representing an object.
Thus, the term "hologram" relates to a record containing information about an object, which record can be used to form a reconstruction representing the object. The hologram may contain information of the object in the frequency or fourier domain.
It has been proposed to use holographic techniques in two-dimensional image projection systems. An advantage of projecting an image using a phase-only hologram is that many image attributes, such as the aspect ratio, resolution, contrast, and dynamic range of the projected image, can be controlled computationally. A further advantage of a phase-only hologram is that no optical energy is lost by amplitude modulation.
Computer-generated phase-only holograms may be "pixelated". That is, a phase-only hologram may be represented by an array of discrete phase elements. Each discrete phase element may be referred to as a "pixel". Each pixel acts as a light modulating element, such as a phase modulating element. A computer-generated phase-only hologram may therefore be represented on an array of phase modulating elements, such as a liquid crystal Spatial Light Modulator (SLM). The SLM may be reflective, meaning that modulated light is output from the SLM in reflection.
Each phase modulating element, or pixel, may be in a different state so as to provide a controllable phase delay to light incident on that phase modulating element. An array of phase modulating elements, such as a Liquid Crystal On Silicon (LCOS) SLM, may therefore represent (or "display") a computationally determined phase-delay profile. If the light incident on the phase modulating elements is uniform, the light will be modulated with the holographic information, or hologram. The holographic information may be in the frequency, or Fourier, domain.
Alternatively, the phase delay profile may be recorded on a kinoform. The term "kinoform" is commonly used to refer to phase-only holographic recordings or holograms.
The phase delay may be quantized. That is, each pixel may be set to one of a discrete number of phase levels.
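The quantization described above can be sketched as follows. This is a hypothetical numpy illustration, not taken from the patent; the function name, level count, and rounding scheme are all illustrative assumptions.

```python
import numpy as np

def quantize_phase(phase, levels=16):
    """Quantize a continuous phase map (in radians) to a discrete set of levels.

    Each pixel is snapped to the nearest of `levels` evenly spaced phase
    values in [0, 2*pi). The level count is illustrative; a real SLM's
    addressable phase levels depend on its drive electronics.
    """
    step = 2 * np.pi / levels
    return np.round(np.mod(phase, 2 * np.pi) / step) * step

# Example: a 4-level (2-bit) quantizer snaps 0.9*pi to the nearest level, pi.
phase = np.array([[0.9 * np.pi]])
quantized = quantize_phase(phase, levels=4)
```

Coarser quantization (fewer levels) lowers the diffraction efficiency and raises the noise floor of the reconstruction, so the level count trades image quality against SLM addressing complexity.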
The phase delay profile may be applied to an incident light wave (for example, by illuminating an LCOS SLM) and the hologram thereby reconstructed. The position in space of the reconstruction can be controlled by using a lens to perform an optical Fourier transform, forming the holographic reconstruction, or "image", in the spatial domain. Alternatively, if the reconstruction takes place in the far field, no lens is required.
A computer-generated hologram may be calculated in a number of ways, including by using the Gerchberg-Saxton algorithm. The Gerchberg-Saxton algorithm may be used to derive phase information in the Fourier domain from amplitude information in the spatial domain (such as a 2D image). That is, phase information about the object may be "retrieved" from intensity, or amplitude, information only in the spatial domain. Accordingly, a phase-only holographic representation of the object in the frequency domain can be calculated.
It is possible to form a holographic reconstruction by illuminating the hologram and, if necessary, performing an optical Fourier transform using, for example, a Fourier transform lens, so as to form an image (the holographic reconstruction) at a replay field such as a screen. In the case of a Fresnel hologram, the holographic reconstruction is formed at a predetermined location.
Fig. 2 shows an example of the use of a reflective SLM, such as an LCOS-SLM, to generate a fourier holographic reconstruction at the replay domain position, according to the invention.
A light source (210), for example a laser or laser diode, is arranged to illuminate the SLM (240) via a collimating lens (211). The collimating lens causes a generally planar wavefront of light to be incident on the SLM. The direction of the wavefront is slightly off-normal (e.g. two or three degrees away from being truly orthogonal to the plane of the transparent layer). The arrangement is such that light from the light source is reflected off a mirrored rear surface of the SLM and interacts with a phase modulating layer to form an exit wavefront (212). The exit wavefront (212) is applied to optics including a Fourier transform lens (220), the focus of which is at a screen (225).
A fourier transform lens (220) receives the beam of phase modulated light exiting the SLM and performs a frequency-to-space conversion to produce a spatial domain holographic reconstruction at the screen (225).
In this regard, when light is provided for an image projection system, light from the light source is distributed across the SLM (240) and across the phase modulating layer (i.e. the array of phase modulating elements). Light exiting the phase modulating layer may be distributed across the whole replay field. Each pixel of the hologram contributes to the replay image as a whole; that is, there is no one-to-one correlation between specific points on the replay image and specific phase modulating elements.
The Gerchberg-Saxton algorithm considers the phase retrieval problem in which intensity cross-sections of a light beam, I_A(x, y) and I_B(x, y), in planes A and B respectively, are known, and I_A(x, y) and I_B(x, y) are related by a single Fourier transform. With the given intensity cross-sections, approximations to the phase distributions in planes A and B, φ_A(x, y) and φ_B(x, y) respectively, are found. The Gerchberg-Saxton algorithm finds solutions to this problem by the following iterative process.
The Gerchberg-Saxton algorithm iteratively transfers a data set representing I_A(x, y) and I_B(x, y) between the spatial domain and the Fourier (spectral) domain, applying spatial and spectral constraints at each pass; I_A(x, y) lies in the spatial domain and I_B(x, y) in the spectral domain. The constraints in either domain are imposed upon the amplitude of the data set, and the corresponding phase information is retrieved through a series of iterations.
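The iterative loop just described can be sketched as follows. This is a hypothetical numpy rendering of the classic Gerchberg-Saxton scheme, not the patent's own implementation; the function name, random seed, and iteration count are illustrative assumptions.

```python
import numpy as np

def gerchberg_saxton(source_amp, target_amp, iterations=50):
    """Basic Gerchberg-Saxton phase retrieval.

    source_amp: known amplitude in plane A (e.g. the illumination beam).
    target_amp: known amplitude in plane B (e.g. the desired image).
    The two planes are assumed to be related by a single Fourier transform.
    Returns the retrieved phase in plane A.
    """
    rng = np.random.default_rng(0)
    # Random phase seed for the first iteration (cf. Fig. 4).
    phase_a = rng.uniform(0, 2 * np.pi, source_amp.shape)
    for _ in range(iterations):
        # Transfer to the spectral domain.
        field_b = np.fft.fft2(source_amp * np.exp(1j * phase_a))
        # Spectral constraint: keep the phase, impose the target amplitude.
        field_a = np.fft.ifft2(target_amp * np.exp(1j * np.angle(field_b)))
        # Spatial constraint: keep only the phase; the source amplitude is
        # re-imposed at the top of the next pass.
        phase_a = np.angle(field_a)
    return phase_a
```

As a usage sketch: with uniform illumination and a target consisting of a single bright pixel, the retrieved phase converges to a linear ramp whose Fourier transform concentrates the energy at that pixel.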
Algorithms based on modifications of the Gerchberg-Saxton algorithm have been developed: see, for example, co-pending published PCT application WO 2007/131650, which is incorporated herein by reference.
Fig. 3 shows a modified algorithm that retrieves the phase information Ψ[u, v] of the Fourier transform of a data set which gives rise to known amplitude information T[x, y] 362. The amplitude information T[x, y] 362 represents the target image. The phase information Ψ[u, v] is used to produce a holographic representation of the target image at an image plane.
Since amplitude and phase are intrinsically combined in the Fourier transform, the amplitude of the transform, like the phase, carries useful information about the accuracy of the calculated data set. The algorithm may therefore provide feedback on both the amplitude and the phase information.
The algorithm shown in fig. 3 can be considered as having a complex wave input (with amplitude information 301 and phase information 303) and a complex wave output (likewise with amplitude information 311 and phase information 313). Although the amplitude and phase information are intrinsically combined into a single data set, they are considered separately for ease of description. It should be remembered that both are themselves functions of spatial coordinates: (x, y) for the far-field image and (u, v) for the hologram; each may be considered an amplitude and phase distribution.
Referring to fig. 3, processing block 350 produces a Fourier transform from a first data set having amplitude information 301 and phase information 303. The result is a second data set Ψ_n[u, v] 305, having amplitude information and phase information. The amplitude information from processing block 350 is set to a distribution representative of the light source, while the phase information Ψ_n[u, v] 305 is retained. Phase information 309 is passed to processing block 356 after being combined with new amplitude information by processing block 352. The third data set 307, 309 is applied to processing block 356, which performs an inverse Fourier transform. This produces a fourth data set R_n[x, y] in the spatial domain, having amplitude information 311 and phase information 313.
Starting from the fourth data set, its phase information 313 forms the phase information of a fifth data set, which is used as the first data set of the next iteration 303'. Its amplitude information R_n[x, y] 311 is modified by comparison with the amplitude information T[x, y] 362 from the target image, producing amplitude information 315. The scaled amplitude information 315 (scaled by the gain α) is subtracted from the target amplitude information T[x, y] 362 to produce input amplitude information η[x, y] 301 for the fifth data set, which serves as the first data set of the next iteration. This process can be expressed mathematically by the following equations:
R_{n+1}[x, y] = F'{exp(iψ_n[u, v])}
ψ_n[u, v] = ∠F{η·exp(i∠R_n[x, y])}
η = T[x, y] − α(|R_n[x, y]| − T[x, y])
where:
F' is the inverse Fourier transform;
F is the forward Fourier transform;
R is the replay field;
T is the target image;
∠ is the phase (angle) information;
Ψ is the quantized version of the phase information;
η is the new target amplitude, η ≥ 0;
α is a gain element of order 1.
The gain element α may be predetermined according to the size and ratio of the incoming target image data.
In the absence of phase information from a previous iteration, the first iteration of the algorithm uses a random phase generator to provide random phase information as a starting point. Fig. 4 shows an example of a random phase seed.
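The update equations above can be sketched numerically as follows. This is a hypothetical numpy rendering, not the patent's implementation; the gain value, iteration count, and the amplitude normalisation step (which the equations do not specify but which keeps a discrete-FFT sketch stable) are all illustrative assumptions.

```python
import numpy as np

def modified_gs(target, iterations=30, alpha=0.9, seed=0):
    """Phase retrieval with amplitude feedback, following the equations above.

    target: real, non-negative target amplitude T[x, y].
    alpha: gain element applied to the amplitude error.
    Returns the retrieved Fourier-domain phase psi[u, v].
    """
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target.shape)  # random phase seed (Fig. 4)
    eta = target.astype(float).copy()                # initial input amplitude
    psi = np.zeros_like(phase)
    for _ in range(iterations):
        # psi_n[u, v] = angle( F{ eta * exp(i * angle(R_n[x, y])) } )
        psi = np.angle(np.fft.fft2(eta * np.exp(1j * phase)))
        # R_{n+1}[x, y] = F'{ exp(i * psi_n[u, v]) }
        replay = np.fft.ifft2(np.exp(1j * psi))
        phase = np.angle(replay)
        recon = np.abs(replay)
        # Scale |R| so it is comparable with T (normalisation assumption,
        # needed because np.fft.ifft2 carries a 1/N^2 factor).
        recon *= target.sum() / (recon.sum() + 1e-12)
        # eta = T[x, y] - alpha * (|R_n[x, y]| - T[x, y]), kept non-negative
        eta = np.clip(target - alpha * (recon - target), 0, None)
    return psi
```

Note that the quantization of ψ that the symbol list mentions is omitted here for brevity; it would be applied to `psi` before display on the SLM.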
In a variation, the resulting amplitude information from processing block 350 is not discarded. The target amplitude information 362 is subtracted from this amplitude information to produce a new amplitude information. A multiple of this amplitude information is subtracted from amplitude information 362 to produce the input amplitude information for processing block 356. As a further variation, the phase is not fed back in full; only a portion, proportional to its change over the last two iterations, is fed back.
Thus, fourier domain data representing the image of interest may be formed.
The embodiments relate, by way of example only, to phase-only holograms; it will be appreciated that the present invention is equally applicable to amplitude holograms.
The inventors have recognized limitations of the near-eye device shown in fig. 1:
the device of fig. 1 is inefficient because the light is amplitude modulated, which involves attenuating most of the light;
an image of an augmented real world field of view is perceived at a fixed (e.g., invariant) location in space;
additional optical equipment may be required in the optical components to compensate for optical aberrations, etc.;
the spherical surface that is part of the combiner is expensive to manufacture.
The inventors have overcome these problems by using general holographic techniques to form images that enhance real scenes. This image may be referred to as an "enhanced image".
Fig. 5 illustrates an embodiment of the present invention.
Fig. 5 shows a light source 501 and a collimator lens 503 arranged to illuminate a spatial light modulator 507 through a beam splitter 505. Spatial light modulator 507 comprises an array of phase modulating elements (or "pixels") arranged to render (or "display") a holographic domain representation. More specifically, the phase of light incident on the spatial light modulator 507 is spatially modulated to form spatially modulated light. The spatially modulated light forms a first optical input to a beam combiner 509.
The beam combiner 509 also includes a second optical input 523 that provides a field of view of the real scene. The beam combiner 509 is further arranged to direct the spatially modulated light to an optical output of the beam combiner 509. Light received on the second optical input 523 is likewise directed to the optical output 525 of the beam combiner 509. In this regard, it will be appreciated that the beam combiner combines the real-world view with the spatially modulated light, and that the light from the spatial light modulator therefore augments the real-world view.
It will be readily appreciated that the beam splitter 505 is optional, and that the spatial light modulator 507 may be backlit or illuminated in other geometries that do not require a beam splitter. Likewise, the collimating lens 503 may not be needed if, for example, the light emitted by the source is already collimated. Furthermore, it will be appreciated by those skilled in the art that other techniques for collimating the light may equally be employed.
In contrast to the device with reference to fig. 1, the spatial light modulator 507 according to the present invention spatially modulates the phase of the incident light instead of the amplitude. Thus, the device is more energy efficient since the light is not attenuated.
The present invention therefore provides a near-eye device comprising: a spatial light modulator comprising an array of phase modulating elements arranged to provide a phase delay profile to incident light; a beam combiner comprising a first optical input arranged to receive spatially modulated light from the spatial modulator, and a second optical input having a field of view of the real world.
In further contrast to the apparatus of fig. 1, spatial light modulator 507 displays a hologram corresponding to an image rather than a real image. That is, the spatially modulated light includes phase-only holographic domain data representing an image. Accordingly, the apparatus is energy efficient since all parts of the hologram contribute to all parts of the reconstruction.
In an embodiment, the hologram is a fourier hologram. That is, the holographic domain data is a Fourier hologram. A fourier hologram is a hologram that forms a reconstruction at infinity or at the focus of a lens that performs a fourier transform. Advantageously, by using a fourier hologram, it is possible for the eye of the viewer to perform a fourier transformation in order to form a holographic reconstruction on the retina. Accordingly, the distance between the viewer and the beam combiner is not important. That is, the viewer may move towards or away from the beam combiner while still seeing the holographic reconstruction. Thus, the device is more forgiving to movement of the viewer and provides a more flexible design. That is, in an embodiment, the spatially modulated light is arranged such that the eye of the user performs a fourier transform of the spatially modulated light such that the user sees a holographic reconstruction of the image.
The embodiments described herein are related to fourier holography by way of example only. The invention is generally applicable to fresnel holography, where the fresnel lens function is applied in the calculation of the hologram. FIG. 6 illustrates an example of a Fresnel holography algorithm for computing Fourier domain data representing a projected target image.
The starting condition 601 of the phase retrieval algorithm is that each pixel has uniform amplitude and a random phase provided by a random phase seed function. A Fresnel phase function 603 is added to the phase data. The resulting amplitude and phase functions are Fourier transformed 605. The target image is subtracted from the amplitude component 609 and a controllable gain 611 is applied. The target image 609 is added back to the amplitude component and an inverse Fourier transform 615 is performed. The Fresnel lens function 617 is subtracted and the phase is quantized 619. The resulting phase information forms a hologram 623. Further iterative cycles may be performed by adding the Fresnel lens function 621 again and repeating the Fourier transform 616 and subsequent steps until a hologram of "acceptable" quality is obtained.
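The "Fresnel phase function" added and later subtracted in the steps above is the quadratic phase profile of a thin lens. A minimal sketch of such a profile follows; this is a hypothetical numpy illustration using the paraxial lens phase, and the function name and parameter values are assumptions (the patent gives no numbers).

```python
import numpy as np

def fresnel_lens_phase(shape, pitch, wavelength, focal_length):
    """Paraxial (quadratic) phase profile of a thin lens.

    shape: (rows, cols) of the SLM pixel array.
    pitch: pixel pitch in metres; wavelength and focal_length in metres.
    A negative focal_length yields a diverging (negative) lens.
    """
    rows, cols = shape
    y = (np.arange(rows) - rows / 2) * pitch
    x = (np.arange(cols) - cols / 2) * pitch
    xx, yy = np.meshgrid(x, y)
    # Thin-lens phase: -pi * (x^2 + y^2) / (lambda * f)
    return -np.pi * (xx**2 + yy**2) / (wavelength * focal_length)
```

Adding this phase map (modulo 2π) to a Fourier hologram moves the replay plane from infinity to the focus of the simulated lens, which is how the Fresnel hologram forms its reconstruction at a predetermined location.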
In an embodiment, the hologram is a fresnel hologram. That is, the holographic domain data is a Fresnel hologram. In fresnel holography, the holographic reconstruction is formed at some predetermined points along the fourier path. That is, in an embodiment, the spatially modulated light is arranged to form a holographic reconstruction of the image at a replay plane in the near field. Advantageously, using fresnel holography, a reconstruction can be created from a single hologram at multiple planes in space. Thus, at the same time, multiple images may enhance the real scene.
In an embodiment, the hologram further comprises lens data. More specifically, holographic domain data having a lens effect is incorporated into, for example added to, the holographic domain data representing the image. This additional holographic data simulates a real lens, providing optical power. The lens data thereby controls the position in space at which the image is presented to the user. The skilled person knows how to calculate holographic domain data having a required lens effect, and how to add such data to further holographic domain data.
Thus, in an embodiment, the spatially modulated light further comprises phase-only holographic domain data having a lens effect.
In an embodiment, the hologram is a phase-only hologram and the lens effect is provided by a phase-only lens. The phase-only hologram may be computed in real time or retrieved from a repository such as a database. The hologram may be calculated using a Gerchberg-Saxton type algorithm or any other algorithm for generating suitable holographic domain data. It will be appreciated by those skilled in the art that the hologram may equally be an amplitude hologram, or an amplitude-and-phase hologram, and the lens effect may likewise be provided by an amplitude hologram, or an amplitude-and-phase hologram.
In an embodiment, the holographic domain data representing the image is combined with holographic domain data having a lens effect such that no additional elements such as spheres on the beam combiner are required to add eye relief to the system. Thus, the cost of manufacturing the system is reduced. In particular, spherical surfaces with the required tolerances are expensive to manufacture.
In an embodiment, the two holographic data sets are combined by simple vector addition. In this respect, the hologram on the spatial light modulator comprises first data representing the actual image for enhancement and second data comprising the lens function. It is noted that the lens function can easily be changed by simply adding different lens functions to the holographic data representing the image. Thus, the method also allows real-time adjustment of the perceived position of the image if, for example, the system is readjusted during use or the user wishes to display data at different distances.
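The vector addition described above can be sketched as follows. This is a hypothetical numpy illustration under the assumption that both data sets are phase-only maps in radians; the function name is illustrative.

```python
import numpy as np

def add_lens_to_hologram(hologram_phase, lens_phase):
    """Combine image hologram phase data with a lens phase function.

    Phase-only elements multiply as exp(i*a) * exp(i*b) = exp(i*(a + b)),
    so combining the two data sets reduces to adding the phase maps
    element-wise, modulo 2*pi.
    """
    return np.mod(hologram_phase + lens_phase, 2 * np.pi)
```

Because the combination is a simple addition, the lens term can be swapped for one of a different focal length on a frame-by-frame basis, which is what allows the perceived image distance to be adjusted in real time as described above.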
In an embodiment, the lens effect is a negative lens effect. Accordingly, the image augmenting the real scene can be effectively moved away from the user. That is, the image appears to originate from a location further away than it actually is; more specifically, from a point in space further away than the spatial light modulator. In this respect, the lens effect can be said to increase the eye relief. Advantageously, by using a negative lens effect, the image may therefore be presented to the user as lying within the real scene. In other words, in an embodiment, the spatially modulated light is divergent.
In further embodiments, additional optical power may be added to the hologram to compensate for other optical components. That is, in an embodiment, the spatially modulated light further comprises phase-only holographic domain data arranged to compensate for aberrations in other optical components of the near-eye device. Furthermore, it will be appreciated that the phase-only holographic domain data arranged to compensate for aberrations can be computationally controlled and therefore readily varied.
As can be appreciated from the above, in an embodiment, the image (the holographic reconstruction) augments the real scene. In an embodiment, the enhancement is optional. In an embodiment, the first optical input and the second optical input are collinear. In an embodiment, the beam combiner further comprises an optical output arranged to combine light received on the first optical input with light received on the second optical input. Accordingly, a simple and convenient means of enhancing a real scene is provided. More specifically, in an embodiment, the optical output is arranged to at least partially superimpose the light received on the first optical input with the light received on the second optical input.
In an embodiment, a near-eye device is arranged for a user of the near-eye device to receive light output from the beam combiner. Holographic reconstruction may be used to provide additional information to the user. Holographic reconstruction may also be used to provide artificial scenes. In an embodiment, the holographic reconstruction of the image enhances the field of view of the real world.
It will be appreciated that the device requires a light source, but the light source may not be external to the near-eye device, or may be integrated with the near-eye device. In an embodiment, the near-eye device further comprises a light source arranged to illuminate the spatial light modulator.
For simplicity, the incident light may be a plane wave. However, it will be appreciated that the hologram may be adapted to the actual incident light so as to form the desired spatially modulated light, which reconstructs to form the desired image. That is, the incident light need not be a plane wave. Nevertheless, in an embodiment, the light source is arranged to illuminate the spatial light modulator with a plane wave of light.
In an embodiment, the near-eye device is a pair of goggles or glasses. It will be appreciated by those skilled in the art that the near-eye device may take other known forms. The image may be a video image or may be time-varying. The image may move in time. The image may also be a static image.
Accordingly, the present invention provides a method of providing augmented reality using a near-eye device, the method comprising: providing holographic data comprising phase-only holographic domain data representing an image; spatially modulating light with the holographic data to form spatially modulated light; and combining the spatially modulated light with a real-world field of view using a beam combiner. In an embodiment, the holographic data further comprises phase-only holographic domain data having a lens effect. In an embodiment, the holographic domain data is at least one of a fourier hologram and a fresnel hologram.
It is possible to spatially modulate the light using a spatial light modulator, such as a liquid crystal on silicon SLM. It will be appreciated that holographic data is written to the SLM such that the incident plane wave of light is spatially modulated with the holographic data. In this regard, it may be considered that the pixels of the SLM "display" or "represent" holographic data.
It will be appreciated that the device may display a variety of information. The holograms corresponding to the many possible images to be displayed may therefore be pre-computed and stored in a repository, or computed in real time. In an embodiment, a repository of holographic domain data representing a plurality of images respectively is provided. Similarly, in embodiments, a repository of holographic domain data having different lensing effects is provided. In a further embodiment, a look-up table of optical powers for various sets of lens data is provided.
In fourier holography, the quality of the reconstructed image may be affected by the so-called zeroth order problem, which is caused by the diffractive properties of the reconstruction. Such zero order light may be referred to as "noise" and includes, for example, specularly reflected light, as well as other unwanted light from the SLM.
This "noise" is usually concentrated at the focus of the fourier lens, resulting in a bright spot in the center of the reconstructed image. Usually, the zeroth order light can simply be blocked, however this will clearly indicate that the bright spots are replaced by dark spots.
However, since the hologram contains three-dimensional information, the reconstruction can be transferred to a different plane in space, see for example published PCT application WO 2007/131649, which is incorporated herein by reference.
Alternatively, angularly selective filters may be used to remove only zero-order collimated light. Other methods of managing zeroth order may also be used.
Although the embodiments described herein relate to displaying a single hologram, the invention is not limited in this respect and more than one hologram may be displayed on the SLM at any one time.
For example, embodiments implement a "tiling" technique, in which the surface area of the SLM is divided up into a number of tiles, each of which is set to a phase distribution similar or identical to that of the original tile. Each tile therefore has a smaller surface area than if the whole allocated area of the SLM were used as one large phase pattern. The smaller the number of frequency components in the tile, the further apart the reconstructed pixels are separated when the image is produced. The image is created within the zeroth diffraction order, and it is preferred that the first and subsequent orders be displaced far enough so as not to overlap with the image, such that they may be blocked by means of a spatial filter.
As mentioned above, the image produced by this method (whether with tiling or without) comprises spots that form the image pixels. The larger the number of tiles used, the smaller these spots become. Taking the example of the Fourier transform of an infinite sine wave, a single frequency is produced; this is the optimum output. In practice, if just one tile is used, the input corresponds to a single cycle of a sine wave, with zero values extending in the positive and negative directions from the end nodes of the sine wave to infinity. Instead of the single frequency from its Fourier transform, the principal frequency component is produced with a series of adjacent frequency components on either side of it. The use of tiling reduces the magnitude of these adjacent frequency components and, as a direct result, less interference (constructive or destructive) occurs between adjacent image pixels, thereby improving the image quality.
Preferably, each tile is a complete tile, although portions of the tile may be used.
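The tiling scheme described above can be sketched in a few lines. The function below is a hypothetical illustration (not taken from the patent): it repeats a small phase distribution across the full SLM area, cropping any partial tiles at the edges.

```python
import numpy as np

def tile_hologram(phase_tile: np.ndarray, slm_shape: tuple) -> np.ndarray:
    """Repeat a small phase distribution (radians) across the full SLM
    surface, cropping any partial tiles at the edges."""
    reps_y = -(-slm_shape[0] // phase_tile.shape[0])  # ceiling division
    reps_x = -(-slm_shape[1] // phase_tile.shape[1])
    tiled = np.tile(phase_tile, (reps_y, reps_x))
    return tiled[:slm_shape[0], :slm_shape[1]]

# Example: a 64x64 phase pattern tiled onto a 256x256 SLM
tile = np.random.uniform(-np.pi, np.pi, (64, 64))
slm = tile_hologram(tile, (256, 256))
```

Repeating the pattern increases the number of identical periods across the aperture, which is what sharpens the reconstructed spots in the replay field.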
Although the embodiments refer to the Gerchberg-Saxton algorithm, it will be appreciated by those skilled in the art that other phase retrieval algorithms may implement the improved method of the present disclosure.
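As a rough illustration of the kind of phase retrieval loop referred to above, the following is a minimal textbook-style Gerchberg-Saxton sketch. It is an assumption of the simplest form only; the algorithm used in the embodiments may differ (e.g. with weighting, quantisation, or zero-order handling).

```python
import numpy as np

def gerchberg_saxton(target_amplitude: np.ndarray, iterations: int = 50) -> np.ndarray:
    """Return a phase-only hologram whose Fourier transform approximates
    the target amplitude (minimal illustrative form)."""
    rng = np.random.default_rng(0)
    phase = rng.uniform(-np.pi, np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Replay field: Fourier transform of a unit-amplitude, phase-only plane
        replay = np.fft.fft2(np.exp(1j * phase))
        # Impose the target amplitude, keep the replay phase
        replay = target_amplitude * np.exp(1j * np.angle(replay))
        # Back to the SLM plane; keep phase only (amplitude forced to unity)
        phase = np.angle(np.fft.ifft2(replay))
    return phase
```

Each round trip enforces the known amplitude constraint in one plane and the phase-only constraint in the other, converging towards a displayable phase distribution.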
It will be appreciated by those skilled in the art that the improved method disclosed herein is equally applicable to the calculation of holograms for forming three-dimensional reconstructions of objects.
Likewise, the present invention is not limited to projection of monochromatic images.
A colour 2D holographic reconstruction can be produced, and this can be achieved in two main ways. One method is known as "frame-sequential colour" (FSC). In an FSC system, three lasers (red, green and blue) are used, and each laser illuminates the SLM in succession to produce each frame of the video. The colours are cycled (red, green, blue, red, green, blue, etc.) at a rate fast enough for a human viewer to see a polychromatic holographic image from the combination of the three lasers. Each hologram is therefore colour-specific. For example, in a video at 25 frames per second, the first frame would be produced by firing the red laser for 1/75th of a second, then the green laser for 1/75th of a second, and finally the blue laser for 1/75th of a second. The next frame is then produced, starting with the red laser, and so on.
An alternative method, which will be referred to as "spatially separated colours" (SSC), involves all three lasers being fired at the same time but taking different optical paths, e.g. each using a different SLM, or different areas of a single SLM, and then being combined to form the colour image.
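The per-colour timing in the 25 frames-per-second example follows directly from dividing each frame equally between the colours; a trivial helper (hypothetical, for illustration only):

```python
from fractions import Fraction

def fsc_subframe(video_fps: int, n_colors: int = 3) -> Fraction:
    """Illumination time per colour sub-frame in a frame-sequential
    colour system: each frame is split equally between the colours."""
    return Fraction(1, video_fps * n_colors)

# 25 frames per second with three lasers -> 1/75 s per laser, as above
print(fsc_subframe(25))  # 1/75
```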
An advantage of the frame-sequential colour (FSC) method is that the whole SLM is used for each colour. This means that the quality of the three colour images produced is not compromised, because all pixels on the SLM are used for each colour image. However, a disadvantage of the FSC method is that the overall image produced will not be as bright as a corresponding image produced by the SSC method, by a factor of about 3, because each laser is only used for a third of the time. This drawback could potentially be addressed by overdriving the lasers or by using more powerful lasers, but that would require more power, would involve higher costs, and would make the system less compact.
An advantage of the SSC (spatially separated colours) method is that the image is brighter, because all three lasers are fired at the same time. However, if only one SLM can be used due to space limitations, the surface area of the SLM must be divided into three parts, which act in effect as three separate SLMs. The drawback of this is that the quality of each single-colour image is decreased, because the SLM surface area available for each monochromatic holographic image is reduced. The quality of the polychromatic image is correspondingly decreased. The decrease in available SLM surface area means that fewer pixels on the SLM can be used for each colour, reducing the resolution of the picture and, with it, the quality of the image.
In an embodiment, the SLM is a liquid crystal on silicon (LCOS) device. LCOS SLMs have the advantage that the signal lines, gate lines and transistors are below the mirrored surface, resulting in high fill factors (typically greater than 90%) and high resolution.
LCOS devices are now available with pixels between 4.5 μm and 12 μm.
The structure of the LCOS is shown in fig. 7.
An LCOS device is formed using a single crystal silicon substrate (802). A 2D array of square planar aluminium electrodes (801), separated by gaps (801a), is arranged on the upper surface of the substrate. Each electrode (801) can be addressed via circuitry (802a) buried in the substrate (802). Each electrode forms a respective planar mirror. An alignment layer (803) is disposed on the array of electrodes, and a liquid crystal layer (804) is disposed on the alignment layer (803). A second alignment layer (805) is disposed on the liquid crystal layer (804), and a planar transparent layer (806), e.g. of glass, is disposed on the second alignment layer (805). A single transparent electrode (807), e.g. of ITO, is disposed between the transparent layer (806) and the second alignment layer (805).
Each of the square electrodes (801) defines, together with the overlying region of the transparent electrode (807) and the intervening liquid crystal material, a controllable phase-modulating element (808), often referred to as a pixel. The effective pixel area, or fill factor, is the percentage of the total pixel area that is optically active, taking into account the gaps (801a) between pixels. By controlling the voltage applied to each electrode (801) with respect to the transparent electrode (807), the properties of the liquid crystal material of the respective phase-modulating element can be varied, thereby providing a variable retardation to light incident thereon. The effect is to provide phase-only modulation to the wavefront, i.e. no amplitude effect occurs.
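The fill-factor figure quoted above (typically greater than 90%) can be sanity-checked with a simple model of square pixels separated by a uniform gap; the pitch and gap values below are illustrative assumptions, not figures from the patent.

```python
def fill_factor(pixel_pitch_um: float, gap_um: float) -> float:
    """Fraction of each pixel footprint that is optically active,
    for square pixels of the given pitch separated by a uniform gap."""
    active_width = pixel_pitch_um - gap_um
    return (active_width / pixel_pitch_um) ** 2

# e.g. an 8 um pitch with a 0.4 um inter-pixel gap gives about 0.90,
# consistent with the "greater than 90%" figure quoted above
print(fill_factor(8.0, 0.4))
```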
A major advantage of using a reflective LCOS spatial light modulator is that the liquid crystal layer can be half the thickness that would be necessary if a transmissive device were used. This greatly improves the switching speed of the liquid crystal (a key advantage for the projection of moving video images). LCOS devices also have the unique ability to display a large array of phase-only elements in a small aperture. Small elements (typically of the order of 10 microns or smaller) result in practical diffraction angles (a few degrees), so that the optical system does not require a very long optical path.
It is easier to adequately illuminate the small aperture (a few square centimetres) of an LCOS SLM than the aperture of a larger liquid crystal device. LCOS SLMs also have a large aperture ratio, with very little dead space between the pixels (because the circuitry that drives them is buried under the mirrors). This is important for reducing optical noise in the replay field.
The device described above typically operates at temperatures in the range of 10°C to about 50°C, although, depending on the LC composition used, the preferred device operating temperature is about 40°C to 50°C.
A further advantage of using a silicon backplane is that the pixels are optically flat, which is important for a phase-modulating device.
Although the embodiments refer to a reflective LCOS SLM, it will be understood by those skilled in the art that any SLM may be used, including transmissive SLMs.
The present invention is not limited to the above-mentioned embodiments, and the protection scope thereof should be subject to the appended claims.
Claims (14)
1. A near-eye device comprising:
a spatial light modulator (507) comprising an array of light modulating elements arranged to apply a modulation comprising a phase delay profile to incident light to form spatially modulated light;
a beam combiner (509) comprising a first optical input arranged to receive the spatially modulated light from the spatial light modulator, and a second optical input (523) having a field of view of the real world;
wherein the spatial light modulator is configured to spatially modulate the phase of incident light with holographic data representing an image and data having a lens effect, and
wherein the lens effect is negative such that the spatially modulated light received by the beam combiner from the spatial light modulator is divergent.
2. A near-eye device as claimed in claim 1 wherein the holographic data is a fourier hologram, wherein the spatially modulated light is arranged such that a user's eye performs a fourier transform of the spatially modulated light such that the user sees a holographic reconstruction of the image.
3. A near-eye device as claimed in claim 1 wherein the holographic data is a fresnel hologram and wherein the spatially modulated light is arranged to form a holographic reconstruction of the image at a replay plane in the near-field.
4. A near-eye device as claimed in any preceding claim wherein the spatial light modulator modulates the phase of incident light with data configured to compensate for aberrations in other optical components in the near-eye device.
5. A near-eye device as claimed in any one of claims 1-3 wherein the beam combiner (509) further comprises an optical output (525) arranged to combine light received on the first optical input with light received on the second optical input (523).
6. A near-eye device as claimed in claim 5 wherein the optical output (525) is arranged to at least partially superimpose light received on the first optical input with light received on the second optical input (523).
7. A near-eye device as claimed in any of claims 1-3, wherein the near-eye device is configured such that a user of the near-eye device receives the optical output from the beam combiner (509).
8. A near-eye device as claimed in any one of claims 1-3 wherein holographic reconstruction of the image enhances the field of view of the real world.
9. A near-eye device as claimed in any one of claims 1-3 further comprising a light source (501) arranged to illuminate the spatial light modulator (507), wherein the light source is arranged to illuminate the spatial light modulator with a plane wave of light.
10. A near-eye device as claimed in any one of claims 1-3 wherein the near-eye device is a pair of goggles or glasses.
11. A method of providing augmented reality using a near-eye device, the method comprising:
providing holographic data representing an image and data having a lens effect;
spatially modulating, by a spatial light modulator, a phase of light with the holographic data and the data having a lens effect to form spatially modulated light; and
combining the spatially modulated light with a real world field of view using a beam combiner (509),
wherein the lens effect is negative such that the spatially modulated light received by the beam combiner from the spatial light modulator is divergent.
12. The method of providing augmented reality of claim 11, wherein the holographic data is a fourier hologram.
13. The method of providing augmented reality of claim 11, wherein the holographic data is a fresnel hologram.
14. The method of providing augmented reality of any one of claims 11 to 13, further comprising:
providing data arranged to compensate for aberrations in an optical component of the near-eye device; and
compensating for aberrations in the optical component of the near-eye device.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1306763.2 | 2013-04-12 | ||
| GB1306763.2A GB2515460B (en) | 2013-04-12 | 2013-04-12 | Near-eye device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1258061A1 HK1258061A1 (en) | 2019-11-01 |
| HK1258061B true HK1258061B (en) | 2022-04-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN108957746B (en) | Near-to-eye device | |
| US12013533B2 (en) | Holographic image projection with holographic correction | |
| EP3146377B1 (en) | Head-up display with diffuser | |
| GB2554575A (en) | Diffuser for head-up display | |
| HK1258061B (en) | Near-eye device |