
GB2641369A - Display device - Google Patents

Display device

Info

Publication number
GB2641369A
Authority
GB
United Kingdom
Prior art keywords
pixels
cell
display device
array
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2407535.0A
Other versions
GB202407535D0 (en)
Inventor
Kumar Shrestha Pawan
Georgiou Andreas
Chang Xin
Yadav Gyanendra
Preston Leon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Allfocal Optics Ltd
Original Assignee
Allfocal Optics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Allfocal Optics Ltd filed Critical Allfocal Optics Ltd
Priority to GB2407535.0A priority Critical patent/GB2641369A/en
Publication of GB202407535D0 publication Critical patent/GB202407535D0/en
Priority to PCT/GB2025/051150 priority patent/WO2025248234A1/en
Publication of GB2641369A publication Critical patent/GB2641369A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • G02F1/133602Direct backlight
    • G02F1/133606Direct backlight including a specially adapted diffusing, scattering or light controlling members
    • G02F1/133607Direct backlight including a specially adapted diffusing, scattering or light controlling members the light controlling member including light directing or refracting elements, e.g. prisms or lenses
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/1323Arrangements for providing a switchable viewing angle
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/1336Illuminating devices
    • G02F1/133602Direct backlight
    • G02F1/133603Direct backlight with LEDs
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/3406Control of illumination source
    • GPHYSICS
    • G02OPTICS
    • G02FOPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F1/00Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F1/01Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour 
    • G02F1/13Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour  based on liquid crystals, e.g. single liquid crystal display cells
    • G02F1/133Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F1/1333Constructional arrangements; Manufacturing methods
    • G02F1/1335Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F1/133526Lenses, e.g. microlenses or Fresnel lenses
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/068Adjustment of display parameters for control of viewing angle adjustment

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Nonlinear Science (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Mathematical Physics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Liquid Crystal (AREA)
  • Electroluminescent Light Sources (AREA)

Abstract

A directional display device includes an image forming array 2, e.g. a colour transmissive liquid crystal display (LCD), of first pixels P1. The display device also includes a directional backlight unit 3 arranged to illuminate the image forming array 2. The directional backlight unit 3 includes an array of cells 61-63, each cell including an array of second pixels P2 and an optical coupling element (i.e. lens) 8 arranged to receive light 9 from the second pixels P2 and to output collimated light 4 towards the image forming array 2. By controlling which of the second pixels P2 in each cell are illuminating their respective optical coupling element (lens), it is possible to control the output angle of the collimated output light from each cell. The array of second pixels P2 in the backlight unit may include emissive pixels 49 such as inorganic LEDs or organic LEDs (OLEDs). Alternatively, the second pixels P2 in the backlight unit may be formed by transmissive or reflective liquid crystal display pixels (23, figure 8) coupled with a light source (13, figure 8) to allow selective spatial control of pixel illumination. The directional display may be implemented in a head-mounted display as a near-eye display.

Description

Display device
Field of the invention
The present invention relates to display devices operable to produce highly directional output. The present invention also relates to use of the display devices in a number of applications, including near eye displays.
Background
Mixed Reality (MR) displays, both optical see-through Augmented Reality (AR) and occlusive Virtual Reality (VR) displays, suffer from Vergence Accommodation Conflict (VAC) and from the need to wear prescription spectacles under the device. In addition, the image quality is often degraded by the complex refractive and reflective optics necessary to create a large eye box for the eye, i.e. the area within which a user's eye can be positioned to see the image.
VR displays have been developed for some time. However, the optical hardware needs improvement before wider adoption is possible. One problem for users of VR displays is VAC. The conflict between accommodation and vergence may cause discomfort to a user. This conflict occurs because human brains link "vergence" and "accommodation".
However, in a Near Eye Display (NED), such as a VR or AR display, this link is broken, and a user's brain tries to focus on one plane while the optical system displays the image on another plane. This can cause a variety of symptoms in users of VR displays, including eyestrain and nausea.
Another problem limiting adoption is that many users require prescription glasses.
Users with prescription glasses have the option to:
(a) wear the display on top of their spectacles (this increases the eye relief, and hence the size of the headset, and causes discomfort);
(b) purchase custom-made clip-ons (which may be cumbersome and/or expensive); or
(c) correct only the spherical prescription using the optics of the display (which ignores astigmatism, and can lead to discomfort, increased complexity and/or cost).
Some prior attempts to address these issues of near eye displays are outlined below.
Fixed plane
Some approaches aim to minimise a display's vergence variation about a single fixed accommodation plane. The single accommodation plane is typically configured somewhere in the middle of the working space. If the working space covers all the depths on which a typical user can focus (30 cm to infinity), then the intermediate plane would be configured for somewhere between one meter and two meters. Such displays are designed to minimise the discrepancy between the accommodation plane (which is fixed) and the vergence plane (which is software controlled), and may lessen the degree of VAC-associated discomfort. With such approaches, a user still needs to wear their prescription glasses to correct their short- or long-sightedness.
Dynamic lenses
Another proposed approach is to use dynamic lenses to change the accommodation plane in real-time. The principle is to place a dynamic lens between the user and the display, and to change it according to the displayed content and/or the need to correct the user's prescription. Although theoretically promising, the technology for practical dynamic lenses is not yet mature. Existing dynamic lenses such as, for example, Alvarez lenses, liquid crystal lenses, liquid lenses, and so forth, may correct for spherical aberration over only a small field of view (FoV), may introduce scattering, are often bulky, and are typically too slow (low bandwidth) for the fast-moving experiences most likely to drive adoption (for example video games, movies and so forth).
Retinal Scanners
The concept for retinal scanners is to eliminate the VAC problem, and significantly reduce the need for prescription glasses, by delivering a very narrow beam of light into the user's eye pupil, forming a tiny eye box. Keeping the eye box small diminishes the aberrations of the optical system and of the user's lens. However, existing retinal scanner solutions require careful headset adjustment/calibration to deliver the light into the user's eye pupil. As the field of view (FoV) increases, this adjustment becomes increasingly complicated and impractical.
WO 2022/170287 A3 describes an optical subsystem of a near-eye display system which provides for projecting light of a virtual image of image content to an eye location, and provides for collecting light of the virtual image onto an exit pupil on a surface proximate to an outer surface of an eye when at the eye location. A subpupil modulator within an aperture in cooperation with the optical subsystem provides for forming a plurality of subpupils within the exit pupil, and provides for less than all of the light of the virtual image associated with one or more less than all of the plurality of subpupils to be projected to the eye location.
Wetzstein, G., Lanman, D., Hirsch, M., Raskar, R., "Tensor Displays: Compressive Light Field Synthesis using Multilayer Displays with Directional Backlighting", SIGGRAPH 2012, ACM Transactions on Graphics 31(4), https://web.media.mit.edu/~gordonw/TensorDisplays/, describes tensor displays, explained as a family of compressive light field displays comprising architectures employing a stack of time-multiplexed, light-attenuating layers illuminated by uniform or directional backlighting.
Summary
According to a first aspect of the invention there is provided a display device including an image forming array of first pixels. The display device also includes a directional backlight unit arranged to illuminate the image forming array. The directional backlight unit includes an array of cells. Each cell includes an array of second pixels; and an optical coupling element arranged to receive light from the second pixels and to output collimated output light. The display device is configured to control the output angle of the collimated output light from each cell by controlling which second pixels of that cell illuminate the respective optical coupling element.
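The geometry behind this control can be sketched as follows. With the second pixels in the focal plane of an ideal thin lens, a pixel at lateral offset x from the lens axis produces a collimated beam at angle atan(x/f) on the opposite side of the axis, so choosing the active pixel sets the cell's output angle. This is a minimal 1-D sketch; the function name, parameters and values are illustrative assumptions, not taken from the patent:

```python
import math

def pixel_for_output_angle(theta_deg, focal_length_mm, pixel_pitch_mm, n_pixels):
    """Index of the second pixel to activate so that the cell's lens
    collimates its light at theta_deg from the lens axis.

    Assumes an ideal thin lens with the second-pixel plane at its focal
    plane: a pixel offset x from the axis yields a collimated beam at
    atan(x / f) on the opposite side of the axis.
    """
    # Required lateral offset of the active pixel from the lens axis.
    x = -focal_length_mm * math.tan(math.radians(theta_deg))
    # Convert the offset to a pixel index (centre pixel sits on the axis).
    centre = (n_pixels - 1) / 2
    index = round(centre + x / pixel_pitch_mm)
    if not 0 <= index < n_pixels:
        raise ValueError("angle outside the cell's steering range")
    return index
```

For a cell of 21 pixels at 0.1 mm pitch behind a 5 mm focal-length lens, an on-axis (0°) beam uses the centre pixel, while steering to +10° selects a pixel near the opposite edge of the cell.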
A viewing angle of the display device may be less than or equal to 20° about a mean output angle. The viewing angle may be defined as the angle, measured from the output angle, at which a luminance of the display device falls to 50% of the maximum luminance. The term "viewing angle" may have the meaning used in the field of televisions and conventional computer displays. The viewing angle is not the same as the field of view when the display is incorporated into a near-eye display, which is determined by the range of angles from which light converges on the eye box.
The light leaving the image forming array may be diverging, converging or collimated.
The second array of each cell may be provided as a physically separate array of second pixels. Alternatively, the second arrays of all cells may be provided by a global array of pixels, with each second array corresponding to a region of that global array.
The display device may be configured such that the collimated output light from each cell illuminates 25% or less of the active area of the image forming array. The collimated output light from each cell may illuminate 20% or less of the active area of the image forming array. The collimated output light from each cell may illuminate 15% or less of the active area of the image forming array. The collimated output light from each cell may illuminate 10% or less of the active area of the image forming array.
Each second pixel may take the form of an emissive pixel. Each second pixel may take the form of an inorganic light-emitting diode. Each second pixel may take the form of an organic light-emitting diode.
The display device may also include a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels. Each second pixel may take the form of a transmissive pixel. For each cell, the device may be configured to control which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
A mask image may take the form of a single second pixel being set to a transmissive (or "ON") state to form an aperture. Alternatively, a mask image may take the form of a group of second pixels being set to a transmissive/reflective state to form an aperture. All other second pixels in the same cell are set to an opaque (or "OFF") state. The mask image may be a binary image.
The display device may be configured to control every cell to output the same mask image, having the same size and shape of aperture. However, this is not essential, and in some implementations the display device may be configured to control any or all of the cells to output mask images which differ in one or both of size and shape of the aperture formed.
The display device may be configured to control every cell to output a mask image having an aperture centred at the same relative location within the respective array of second pixels (the apertures may be the same sizes/shapes, or not, as already explained). Alternatively, the display device may be configured to control any or all of the cells to output mask images in which the apertures are centred at different relative locations within the respective array of second pixels.
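Such a mask image can be pictured as a small binary bitmap per cell. The sketch below is illustrative only (the square-aperture shape, pixel counts and all names are assumptions): ON pixels form the aperture, every other second pixel in the cell is OFF.

```python
def mask_image(n_rows, n_cols, centre, aperture=1):
    """Binary mask image for one cell of a transmissive/reflective backlight.

    Pixels inside a square aperture of side `aperture`, centred at
    `centre` = (row, col), are ON (1, transmissive); all others are
    OFF (0, opaque).
    """
    r0, c0 = centre
    half = aperture // 2
    return [[1 if abs(r - r0) <= half and abs(c - c0) <= half else 0
             for c in range(n_cols)]
            for r in range(n_rows)]
```

Shifting `centre` between frames moves the aperture within the cell, which (via the optical coupling element) steers the cell's collimated output.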
The display device may also include a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels. Each second pixel may take the form of a reflective pixel. For each cell, the device may be configured to control which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
The optical coupling element may be separated from the light source by the second pixels along an optical path which leads to the image forming array. Mask images used with reflective second pixels may be configured in any way described in relation to mask images used with transmissive second pixels.
The light source may include, or take the form of, a back-light unit. The back-light unit may include, or take the form of, a lightguide and one or more in-coupled light sources. The back-light unit may include, or take the form of, one or more organic light-emitting diodes. The back-light unit may include, or take the form of, an array of inorganic light-emitting diodes.
The back-light unit may be of any type known in relation to conventional liquid crystal displays (transmissive or reflective), liquid crystal on silicon displays, and/or electrochromatographic displays.
The back-light unit may be a segmented back-light unit. The back-light unit may include a separate segment corresponding to each cell, preferably arranged to illuminate only that cell. A segment of the back-light unit corresponding to a cell may have a smaller area than that cell. The segment of the back-light unit corresponding to a cell may be concentric with that cell.
Each second pixel may take the form of a liquid crystal pixel. Alternatively, each second pixel of the second array may take the form of an electrochromatographic pixel.
Each optical coupling element may include, or take the form of, a lens. A lens of a cell may be a refractive lens. A lens of a cell may be a pancake lens. A lens of a cell may be a Fresnel lens. A lens of a cell may be a diffractive lens. A lens of a cell may be a meta-lens. A lens of a cell may be a holographic lens.
The lenses corresponding to the array of cells may be arranged to form a lens array.
The lens array may be formed from individual lenses. The lens array may be integrally formed as a single piece of material, shaped to define the lenses for each of the cells.
The lens array may comprise one or more of refractive lenses, pancake lenses, Fresnel lenses, diffractive lenses, meta-lenses and holographic lenses. The lens array may include a single type of lens.
Each optical coupling element may include, or take the form of, one or more arrays of shutter pixels stacked in sequence between the second pixels and the image forming array. Each shutter pixel may include, or take the form of, a transmissive pixel. For each cell, the display device may be configured to control the one or more arrays of shutter pixels such that light from the second pixels of that cell can only illuminate the image forming array by passing through a sequence of apertures formed using the one or more arrays of shutter pixels.
The display device may include two or more arrays of shutter pixels. The apertures formed in the one or more arrays of shutter pixels may be disposed along a line originating at an active second pixel (or at the centroid of a group of active second pixels) and extending along the desired output angle for that cell. In this way, light originating from the active second pixel(s) and directed along the output angle passes through the apertures to illuminate the image forming array. Conversely, light originating from the active second pixel(s) but directed along other angles will be blocked (unless at a small enough angle to the output angle). Consequently, in combination with selecting which second pixels are active, the one or more arrays of shutter pixels may be controlled to produce collimated output light at a desired output angle.
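The aperture-line geometry above can be sketched in 1-D: a ray leaving the active second pixel at angle theta crosses a shutter layer at distance z at lateral position x0 + z*tan(theta), which determines the shutter pixel to open in that layer. All names, units and values below are illustrative assumptions:

```python
import math

def open_shutters(x0_mm, theta_deg, layer_z_mm, pitch_mm):
    """Indices (one per stacked shutter layer) of the shutter pixels to
    set transmissive so that a ray leaving the active second pixel at
    lateral position x0_mm propagates at theta_deg to the normal.

    Each shutter layer sits at distance z from the second-pixel plane;
    the ray crosses it at x0 + z*tan(theta), mapped to pixel round(x/pitch).
    """
    t = math.tan(math.radians(theta_deg))
    return [round((x0_mm + z * t) / pitch_mm) for z in layer_z_mm]
```

For an on-axis beam the same pixel index is opened in every layer; for an oblique beam the open apertures shift progressively with layer distance, tracing the desired output direction.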
The display device may also include optical baffles arranged to block light between the second pixels of each cell and the optical coupling elements of the cells adjacent to that cell. The optical baffles may span all, or part of, a space between the second pixels and the optical coupling elements. The optical baffles may additionally separate the optical coupling elements of adjacent cells. The optical baffles may take the form of walls forming a mesh and leaving areas corresponding to each cell open.
The second pixels of each cell may be surrounded by a buffer region. The buffer region may comprise pixels which are identical to the second pixels and which the display device is configured not to use. Second pixels may not be formed in the buffer region. In other words, second pixels may be formed exclusively in active areas of the cells. Conductive traces for addressing the second pixels of each cell may be routed through the buffer region(s) of the array of cells.
When the second pixels of each cell are provided as regions of a single, global array of pixels, buffer regions may correspond to pixels which do not belong to any cell, and these inactive second pixels may form a mesh surrounding each region of second pixels.
When the display device includes the optical baffles, the optical baffles may be aligned to overlie the buffer regions. For example, the walls providing an optical baffle may be formed to overlie the buffer regions of the adjacent cells which that optical baffle separates.
The directional backlight unit may be arranged to illuminate the image forming array with white light. The first array may be configured to output a colour image.
The second pixels may emit white light. The light source may emit white light. White light may take the form of a mixture of emission peaks, in the same way as a white LED or OLED. In other words, white light does not require a broadband/black body spectrum.
The first array may take the form of a colour transmissive liquid crystal array. Each first transmissive pixel may include three or more sub-first pixels, each of a different colour.
For example, each first pixel may include red, green and blue sub-first pixels.
Each optical coupling element may be configured to collimate the light received from each cell to within ±3° of an output angle set for that cell by the display device. The optical coupling element may be considered to collimate the light received from each cell to within ±3° of the output angle set by the display device if the luminance of light from the corresponding cell is 50% or less of its peak at an angle 3° away from the output angle set by the display device.
The optical coupling element may be configured to collimate the light received from each cell to within ±2° of an output angle set for that cell by the display device. The optical coupling element may be configured to collimate the light received from each cell to within ±1° of an output angle set for that cell by the display device. The optical coupling element may be configured to collimate the light received from each cell to within ±0.5° of an output angle set for that cell by the display device.
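The ±N° criterion above can be checked numerically from a measured angular luminance profile. The helper below is a sketch only; the dict-based profile format and all names are assumptions, not from the patent:

```python
def collimation_half_angle(profile, peak_angle):
    """Smallest |angle - peak_angle| at which luminance has fallen to
    50% or less of the peak, given `profile` as a mapping of sampled
    angle (deg) -> luminance. Returns None if no sample qualifies.
    """
    peak = profile[peak_angle]
    offsets = sorted(abs(a - peak_angle)
                     for a, lum in profile.items()
                     if a != peak_angle and lum <= 0.5 * peak)
    return offsets[0] if offsets else None
```

A cell whose profile drops below half the peak luminance within 3° of the set output angle would satisfy the ±3° collimation criterion as defined above.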
The display device may be configured to control the output angle of the collimated output light from each cell to the same angle. Alternatively, the display device may be configured to control the output angle of the collimated output light from each cell to a slightly different output angle. For example, when the display device is to be used in combination with a focussing/objective lens, the output angles of each cell may be adjusted to compensate for non-ideal behaviour, off-axis effects and so forth.
The display device may also include a diffuser disposed between the array of cells and the image forming array.
Alternatively, instead of a diffuser, a low-power lens may be used. In either case (diffuser or low-power lens), causing a slight spreading of the light from the cells may help to reduce or even remove visibility of boundaries/borders between adjacent cells of the backlight unit.
When used, the low-power lens may be of any type described as being usable for a lens providing an optical coupling element.
The diffuser (or low-power lens) may spread a distribution of incident light by at most 5°. The diffuser (or low-power lens) may spread a distribution of incident light by at most 4°. The diffuser (or low-power lens) may spread a distribution of incident light by at most 3°. The diffuser (or low-power lens) may spread a distribution of incident light by at most 2°. The diffuser (or low-power lens) may spread a distribution of incident light by at most 1°. The diffuser (or low-power lens) may spread a distribution of incident light by less than 1°.
A diffuser spreads a distribution of incident light by 5° if the angle(s) at which the luminance of that distribution is 50% of the luminance at the output angle are changed by 5° between before and after passage through the diffuser.
Each cell may be rectangular. Rectangular may encompass square. Every cell may have the same aspect ratio. Alternatively, the aspect ratios of cells may vary across the array of cells.
Each cell may be hexagonal.
When the optical coupling element comprises an array of lenses, each lens may have a shape matching the corresponding cell. For example, light from the second pixels of a rectangular cell may be collimated by a rectangular lens having a matching aspect ratio. Similarly, when a segmented backlight is used having segments corresponding to each cell, each backlight segment may have a shape matching the respective cell.
The second pixels of each cell may have a shape matching the shape of that cell. For example, hexagonal cells may include a plurality of hexagonal second pixels, whilst rectangular (or square) cells may include a plurality of rectangular or square second pixels. The aspect ratio of the second pixels of a rectangular (or square) cell need not match that of the cell, for example a rectangular cell may include square second pixels. Similarly, a hexagonal cell may include square second pixels forming a region bounded by or approximating a hexagon.
A near eye display may include the display device. The near eye display may also include an eye-tracking system configured to determine a location of a user eye pupil.
The near eye display may also include one or more imaging elements configured to focus light output from the image forming array to a display exit pupil. The display device may be configured to control the output angles of each cell such that the display exit pupil formed by the one or more imaging elements coincides with the location of the user eye pupil.
The near eye display may take the form of a head mounted display. The field of view at the display exit pupil may be greater than or equal to 70°.
According to a second aspect of the invention, there is provided a method of using a display device which includes an image forming array of first pixels; and a directional backlight unit arranged to illuminate the image forming array, the directional backlight unit comprising an array of cells, each cell comprising: an array of second pixels; and an optical coupling element arranged to receive light from the second pixels and to output collimated output light. The method includes controlling the output angle of the collimated output light from each cell by controlling which second pixels of that cell illuminate the respective optical coupling element. The method of the second aspect may include features corresponding to any features of the display device of the first aspect. Definitions applicable to the display device of the first aspect (and/or features thereof) may be equally applicable to the method of the second aspect (and/or features thereof).
Each second pixel may take the form of an emissive pixel.
The display device may also include a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels, and wherein each second pixel takes the form of a transmissive pixel. The method may also include, for each cell, controlling which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
The display device may also include a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels, and wherein each second pixel takes the form of a reflective pixel. The method may also include, for each cell, controlling which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
Each optical coupling element may include, or take the form of, one or more arrays of shutter pixels stacked in sequence between the second pixels and the image forming array. Each shutter pixel may include, or take the form of, a transmissive pixel. The method may also include, for each cell, controlling the one or more arrays of shutter pixels such that light from the second pixels of that cell can only illuminate the image forming array by passing through a sequence of apertures formed using the one or more arrays of shutter pixels.
The display device may be included in, or form a part of, a near-eye display which also comprises an eye-tracking system configured to determine a location of a user eye pupil, and one or more imaging elements configured to focus light output from the image forming array to a display exit pupil. The method may also include measuring a location of the user eye pupil. The method may also include controlling the output angles of each cell of the display device such that the display exit pupil formed by the one or more imaging elements coincides with the location of the user eye pupil.
Brief description of the drawings
Certain embodiments of the present invention will now be described, by way of example, with reference to the accompanying drawings in which:
Figure 1 schematically illustrates a display device;
Figure 2 schematically illustrates an array of second pixels for the display device shown in Figure 1;
Figures 3A and 3B schematically illustrate different states of an array of second pixels;
Figures 4A to 4C schematically illustrate examples of interactions between a user's eye and a display exit pupil of a near eye display;
Figure 5 schematically illustrates a near eye display system;
Figures 6A to 6C schematically illustrate interactions of the display device shown in Figure 1 with ideal and non-ideal imaging elements;
Figures 7A and 7B schematically illustrate steering the display exit pupil of the near eye display system shown in Figure 5 by controlling output angles of the display device shown in Figure 1;
Figure 8 schematically illustrates a first directional backlight unit for the display device shown in Figure 1;
Figure 9 is a ray-diagram for a portion of a single cell of the first directional backlight unit;
Figure 10 schematically illustrates a second directional backlight unit for the display device shown in Figure 1;
Figure 11 is a schematic plan view of a 3 by 3 block of arrays corresponding to cells of the second directional backlight unit;
Figures 12A to 12F schematically illustrate examples of shapes for cells of a directional backlight unit;
Figure 13 schematically illustrates a third directional backlight unit for the display device shown in Figure 1;
Figure 14 schematically illustrates a fourth directional backlight unit for the display device shown in Figure 1;
Figure 15 is a schematic plan view of a portion of the fourth directional backlight unit;
Figure 16 schematically illustrates a fifth directional backlight unit for the display device shown in Figure 1;
Figure 17 schematically illustrates using the first directional backlight array to illuminate a reflective image forming array;
Figure 18 schematically illustrates a sixth directional backlight unit for the display device shown in Figure 1;
Figure 19 schematically illustrates the operation of the optical coupling element of the sixth directional backlight unit;
Figure 20 schematically illustrates a seventh directional backlight unit for the display device shown in Figure 1; and
Figure 21 is a ray-diagram for a portion of a single cell of the first directional backlight unit.
Detailed description
In the following description, like elements will be denoted by like reference numerals.
The present specification describes a new display device incorporating a directional backlight unit. The display devices of the present specification may be particularly useful in near-eye displays (NEDs), whether head mounted or not, in order to form a smaller display exit pupil which may be steerable within an eye box to track the location of a user's eye pupil. In this way, the depth-of-field may be improved, the effect of vergence accommodation conflict (VAC) may be reduced, and the optical aberrations of a user's eye may be reduced, allowing a user to operate a NED without their prescription lenses.
In addition to the particular application to NEDs, the new display devices according to the present specification may also provide advantages and new capabilities in other areas including, without being limited to:
* Large displays;
* Mobile 3D displays;
* In-vehicle entertainment systems (for example forming an image for passengers without distracting a driver/operator); and
* Dynamic privacy screens.
Some definitions of terms commonly used in the field of NEDs may help to improve understanding of the descriptions hereinafter.
Vergence Accommodation Conflict (VAC): Vergence is the movement of the eyes in opposite directions when an object moves closer to them. The effect of vergence is best described when the object is on the optical axis of the user (alternatively "viewer"). As the object moves closer to the eyes, the eyes look more towards the nose. The movement of the muscles controlling vergence is linked to the eye's accommodation and the muscles bringing an object to focus. Human ocular muscles and visual systems have linked these two factors: thus, when they are in conflict, it is said that there is a vergence accommodation conflict (VAC), and without wishing to be bound by theory it is presently believed to be a cause of fatigue and sickness in users of NEDs.
Eye box: The eye box is the area of a NED within which the eye can be located and see the image being projected. While it is defined as an area, the eye box typically exists in many different planes and extends in three dimensions.
Eye Pupil: The eye pupil is the opening of a user's eye where the light enters. The iris determines the diameter of the eye pupil. Usually, the eye pupil diameter varies between 3 mm and 5 mm.
Display Exit Pupil: The display exit pupil is a small pupil formed by a NED in the corresponding eye box.
Display devices described herein are specially adapted to allow the display exit pupil to be steerable within the eye box.
Pupil Steering: Pupil steering refers to steering the display exit pupil to different positions within the eye box so that it illuminates the eye pupil of a user of a NED as they move their eyes.
There will also preferably be feedback to a display engine so that the movement of the user's pupil causes the appropriate adaptations of a scene to be rendered and output.
Field of View (FoV):
FoV defines the image size perceived by a user. For a NED, the range of angles from which light converges to the user eye pupil represents the image size on the retina and the perceived image size. FoV may be described as an angle in degrees. Field of view does not correspond to the viewing angle of a display.
F#, Depth of Field (DoF) and Aberrations:
The "F-number" (also F# or f/#) is the focal length f of a lens divided by the diameter of that lens. The depth of field (DoF) relates to how fast an image becomes defocused as the optics move away from the focal plane.
Increasing the F# value increases the depth of field (DoF), and reduces the effect of aberrations on the image quality. When the F# value is large (i.e. for a small beam diameter), the DoF is large, and the image remains in focus further away from the focal plane.
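The F-number definition above can be checked with a trivial calculation. The following sketch is purely illustrative; the function name and numerical values are assumptions, not taken from the specification:

```python
# Sketch: F-number as defined above (focal length / beam diameter).
# A smaller beam diameter gives a larger F#, and hence a larger DoF.

def f_number(focal_length_mm, diameter_mm):
    """Return F# for a lens of the given focal length and aperture diameter."""
    return focal_length_mm / diameter_mm

assert f_number(50.0, 25.0) == 2.0                       # f/2 lens
assert f_number(50.0, 12.5) > f_number(50.0, 25.0)       # smaller beam -> larger F#
```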
Note that the DoF is directly linked to the "Depth of Focus". DoF refers to the object side (i.e., the real world in this case) and the "Depth of Focus" refers to the image side (i.e., for a near-eye display this will be the eye's retina) as the distance between the image plane and the lens changes.
Beam diameter:
As the eye pupil diameter decreases, aberrations associated with any display and the user's eye lens reduce, and the DoF increases. While the eye will automatically reduce its iris diameter (and hence eye pupil diameter) in a bright environment, this is not controlled and cannot be utilised in a practical system.
Conventional NEDs create a large eye box that often covers an area of a few square centimetres. Therefore, the eye pupil diameter is not controlled by the optical system but by the individual's physiology.
A similar effect may be obtained by instead reducing the display exit pupil diameter. The present specification concerns a display device which may be controlled to provide highly directional output. Such display devices have many applications, one of which is to be used in a NED in combination with an eye/pupil tracking system, in order to form a display exit pupil of small size which may be steered to track the user's eye pupil. This may allow advantages including reduced aberrations (so a user may not require their prescription lenses), increased DoF and reduced or eliminated VAC.
Display device
Referring to Figure 1, a display device 1 is schematically shown.
The display device includes an image forming array 2 of first pixels P1 (Figure 8), and a directional backlight unit 3 arranged to illuminate the image forming array 2. The image forming array 2, for example an array of transmissive liquid crystal pixels forming a colour display, is controlled to form an output image Imout. Collimated and directional output light 4 from the directional backlight unit 3 passes through (or reflects from, see Figure 17) the transmissive (or reflective) image forming array 2 to produce output image light 5 which projects the output image whilst remaining highly directional (though there will be some small scattering in the image forming array 2).
The light leaving the image forming array 2 may be diverging, converging or collimated. For example, the choice of converging or diverging may depend on whether further optical elements will be used to focus or spread the image light 5, and on the position of the virtual image plane.
As shall be explained herein, the output direction θout in which the output light 4 is directed may be constant across the display device 1, though more typically there will be some variation in the output direction θout across the display device, for example about a central output direction θ0 (for example, the mean of the output direction θout across the display device 1). Nonetheless, the overall viewing angle of the display device 1 is relatively low: for example less than or equal to 20° about the central output direction θ0. The viewing angle may be defined as the angle to the central output direction θ0 at which a luminance of the display device is 50% of the maximum luminance, i.e. the viewing angle relates to the drop-off in luminance as the observation angle moves away from the central output direction θ0. Viewing angle is therefore used to have the same meaning as this term has in relation to more conventional televisions/displays. The viewing angle of the display device 1 is entirely different from the field-of-view (FoV) which will be perceived by a user when the display device 1 is incorporated into a near-eye display system 17 (Figure 5). The FoV in such a situation will depend on the range of angles from which light arrives into the eye box 15 (Figure 5), which largely depends on an imaging element 21 which receives light 5 output from the display device 1. In addition to the near-eye display applications explained hereinafter, this property also makes the display device 1 well suited to applications such as dynamic privacy displays or similar directional display applications.
The directional backlight unit 3 comprises an array of cells 6. Each cell 6 includes an array 7 of second pixels P2 (Figure 2) and an optical coupling element 8 arranged to receive incident light 9 from the corresponding array 7 of second pixels P2, and to output collimated output light 4.
In the example shown in Figure 1, the directional backlight unit 3 includes an N by M array of cells 6 (each of N and M is a positive integer), with the array 7 belonging to the cell 6 at the intersection of the nth of N rows and mth of M columns denoted as SP(n,m). Each array 7, SP(n,m) provides incident light 9 to one corresponding optical coupling element 8. That optical coupling element 8 then provides collimated output light 4 at an output angle θout(n,m) corresponding to that cell 6.
In reality, the output angle of collimated output light 4 will be in three dimensions, and will be characterised by a pair of angles: a polar output angle θout(n,m) made with a normal 10 to a plane of the optical coupling element 8, and an azimuthal angle φout(n,m) made between the projection of the collimated output light 4 onto a plane defined by the normal 10 and an arbitrary reference direction (which for the sake of discussion is taken to be the vertical direction in Figure 1). Thus the collimated output light 4 is output in a direction [θout(n,m), φout(n,m)]. However, in the interests of brevity, unless the context clearly shows otherwise, reference to output angle θout(n,m) should be understood as referring to the direction [θout(n,m), φout(n,m)] of collimated output light 4 from the cell 6 with coordinate (n,m) in the array of cells 6.
Referring also to Figure 2, an example of an array 7, SP(n,m) of second pixels P2 is shown.
In the example shown in Figure 2, the array 7, SP(n,m) is a rectangular array of J by K second pixels P2 (each of J and K is a positive integer). The second pixel P2 at the intersection of the jth of J rows and the kth of K columns is denoted P2(j,k).
The display device 1 is configured to control the output angle θout(n,m) of the collimated output light 4 from each cell 6 by controlling which second pixels P2(j,k) of that cell 6 illuminate the respective optical coupling element 8. For example, the directional backlight unit 3 illustrated in Figure 1 includes a controller 11 which controls the arrays 7, SP(n,m) of second pixels P2(j,k). When included, the controller 11 may also control the image forming array 2. However, in other examples the image forming array 2 may be driven by a separate display controller (not shown).
For example, referring also to Figures 3A and 3B, two example states of second pixels P2(j,k) are shown for the array 7, SP(n,m) of a cell 6.
In Figures 3A and 3B, the array 7, SP(n,m) is formed from J=9 by K=9 second pixels P2(j,k). Figure 3A shows a first output condition in which second pixel P2(5,5) is set to an illuminated, or "ON", state, whilst every other second pixel P2(j,k) is set to an unilluminated, or "OFF", state. This may be compared to a second output condition, shown in Figure 3B, in which the second pixel P2(4,3) is set to ON whilst all other second pixels P2(j,k) are set to OFF.
The physical meaning of a second pixel P2(j,k) being ON/OFF depends on the way it is implemented. For example, if the second pixels P2(j,k) are emissive, for example inorganic light-emitting diodes (LED) or organic light-emitting diodes (OLED), then ON corresponds to driving the second pixel P2(j,k) to emit light and OFF corresponds to not driving the second pixel P2(j,k). Alternatively, the second pixels P2(j,k) may be transmissive (or reflective), for example liquid crystal (LC) pixels, in which case an ON state would correspond to maximum transmission (or reflectance) and an OFF state would correspond to minimum transmission (or reflectance), so as to form an aperture (or "pinhole") to pass light 12 emitted from a backlight source 13. This may alternatively be thought of as using the second pixels P2(j,k) to output a binary "mask" image defining the aperture (though more complex, non-binary filter-shapes may of course be used instead).
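The binary-mask view of a cell can be sketched in a few lines. This is a minimal illustration only; the function name, array size and zero-indexed coordinate convention are assumptions, and a real driver would write these states to the actual pixel array:

```python
# Sketch: build a binary "mask" image for one cell's J x K array of second
# pixels, opening a single aperture (pinhole) at pixel (j, k).
# 1 = ON (transmissive/emitting), 0 = OFF (opaque/not driven).

def cell_mask(j, k, rows=9, cols=9):
    """Return a rows x cols binary mask with a single ON pixel at (j, k)."""
    if not (0 <= j < rows and 0 <= k < cols):
        raise ValueError("aperture outside cell")
    return [[1 if (r, c) == (j, k) else 0 for c in range(cols)]
            for r in range(rows)]

mask = cell_mask(4, 4)           # central aperture, cf. Figure 3A
assert sum(map(sum, mask)) == 1  # exactly one pixel is ON
```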
The second array 7, SP(n,m) of each cell 6 may be provided as a physically separate array of second pixels P2(j,k). Alternatively, the second arrays 7, SP(n,m) of every cell 6 may be provided by a global array of pixels, with each second array 7, SP(n,m) corresponding to a region of that global array. For example, the global array may be a LED/OLED display or a LC display covering the entire active area of the display device 1, with each second array 7, SP(n,m) being defined as a separately controllable region. This may require modification to standard addressing schemes for some LED/OLED/LC displays. When a global array is used, the states of every cell 6 may be simultaneously output by driving the global array with an image defining the emission regions/apertures for every cell 6. In this way, a conventional display and driver may be adapted to provide the second arrays 7, SP(n,m).
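The global-array approach can be sketched as tiling the per-cell masks into a single frame that drives the whole panel at once. The cell dimensions, layout and names below are illustrative assumptions:

```python
# Sketch: compose one full-panel frame from per-cell binary masks, for a
# global pixel array supplying every cell. cell_masks[(n, m)] is the
# cell_h x cell_w mask for the cell at array position (n, m).

def compose_frame(cell_masks, n_rows, n_cols, cell_h, cell_w):
    """Tile per-cell masks into an (n_rows*cell_h) x (n_cols*cell_w) frame."""
    frame = [[0] * (n_cols * cell_w) for _ in range(n_rows * cell_h)]
    for (n, m), mask in cell_masks.items():
        for r in range(cell_h):
            for c in range(cell_w):
                frame[n * cell_h + r][m * cell_w + c] = mask[r][c]
    return frame
```

Driving the global array with one such frame sets the aperture of every cell simultaneously, as described above.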
When used, the backlight source 13 may be controlled by the controller 11. If the backlight source 13 is segmented, the controller 11 may control which segments are illuminated to allow improved dark contrast (in the same way as a conventional LCD with segmented backlight). The backlight source 13 is not required if the second pixels P2 are emissive (e.g. LED/OLED).
In this way, the array 7, SP(n,m) of second pixels P2(j,k) of each cell 6 functions as a localised light source having a controllable relative coordinate (x, y) within that cell.
Specifically, the relative coordinate (x, y) is controlled by the controller 11 determining which second pixel(s) P2(j,k) are set to the ON state within the array 7, SP(n,m) of each cell 6. The relative coordinates (x, y) which are illuminated may be the same in each cell 6, but more typically the relative coordinates within each cell 6 will be controlled in dependence upon the position n, m of that cell 6 within the overall array, i.e. (x(n,m), y(n,m)), as illustrated in Figure 1.
The optical coupling element 8 receives the incident light 9 from the array 7, SP(n,m) controlled to provide a localised light source with relative coordinate (x, y), and converts this to collimated output light 4 at an output angle θout(n,m) which is a function of the relative coordinate (x, y). In other words, for each cell 6, the controller 11 sets which second pixels P2(j,k) of the array 7, SP(n,m) of that cell 6 are switched ON, and the location (x, y) of those second pixels P2(j,k) within that cell 6 controls the output angle θout(n,m) of collimated output light 4 leaving the optical coupling element 8 to illuminate the image forming array 2. The total range across which a cell 6 is able to steer the output angle θout(n,m) will vary depending on a number of factors including but not limited to the choice of optical coupling element 8, and may typically be within a range of ±20° of the normal 10 to the plane of the optical coupling element 8.
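For a lens-type optical coupling element with the second pixels at its focal plane, a source displaced from the lens axis is collimated to a direction making an angle of roughly atan(r / f_lens) with the normal. The following sketch assumes that simple paraxial arrangement; the focal length value and function names are illustrative only:

```python
import math

# Sketch: map a localised-source coordinate (x, y) within a cell to the
# output direction [theta_out, phi_out], assuming a lens-type coupling
# element with the second pixels at its focal plane (illustrative model).

def output_angles(x_mm, y_mm, f_lens_mm):
    """Return (polar, azimuthal) output angles in degrees for source (x, y)."""
    r = math.hypot(x_mm, y_mm)                         # off-axis displacement
    theta_out = math.degrees(math.atan2(r, f_lens_mm))  # polar angle to normal
    phi_out = math.degrees(math.atan2(y_mm, x_mm))      # azimuthal angle
    return theta_out, phi_out

theta, phi = output_angles(1.0, 0.0, 5.0)  # 1 mm off-axis, 5 mm focal length
```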
Preferably the collimated output light 4 from each cell 6 illuminates 25% or less of the active area of the image forming array 2. Typically, the fraction of the active area of the image forming array 2 illuminated by a single cell 6 will be less than 25%, for example as low as 10% or less.
The directional backlight unit 3 is preferably configured to illuminate the image forming array 2 with white collimated output light 4, and the image forming array 2 preferably includes colour first pixels P1 so that the image light 5 encodes the output image Imout in colour. For example, the image forming array 2 may take the form of a colour transmissive liquid crystal array (i.e. each first transmissive pixel may include three or more sub-first pixels of different colours such as, for example, red, green and blue sub-first pixels).
White light does not require broadband/black body type spectra, and indeed such light sources are often impractically bulky/inefficient. Typically white light will have a spectrum obtainable using LEDs/OLEDs/plasma display elements, fluorescent elements (for example of a backlight source 13) and so forth.
The collimation provided by each optical coupling element 8 is preferably narrow, for example each optical coupling element 8 may be configured to collimate the light received from each cell 6 to within ±3° of an output angle θout(n,m) set for that cell 6 by the display device 1 (e.g. using controller 11). The optical coupling element 8 may be considered to be configured to meet this condition if the luminance of light from the corresponding cell 6 is 50% or less at an angle ±ε° (e.g. ε = 3°) away from the output angle θout(n,m) set by the display device 1. Smaller values of the collimation angle ε are generally preferred.
Near-eye display
Referring also to Figures 4A, 4B and 4C, examples of interactions of a user's eye 14 with the display exit pupil of a NED are shown.
Referring in particular to Figure 4A, the configuration of a conventional NED is shown, in which a relatively large display exit pupil 15 is formed to define a large eye box. This allows movement of the user pupil 16, and at any point within the display exit pupil 15 the user will be able to see the output image Imout. However, as explained hereinbefore, the challenges of optimising the output across a large display exit pupil lead to VAC, poor DoF, significant influence of aberrations from the user's eye 14 (so that they must wear any prescription lenses they may require), and so forth.
Referring in particular to Figure 4B, as described hereinbefore, these issues can be reduced by reducing the size of the display exit pupil 15 of the NED. For example, making the display exit pupil 15 of a NED similar to, or preferably smaller than, the user eye pupil 16 may reduce or remove VAC, improve the DoF, and may allow a user to use the NED without needing their prescription lenses (there may of course be upper limits for particularly strong prescriptions).
However, referring now to Figure 4C, this approach has the problem that even a small displacement (e.g. due to jogging a headset or switching between users) and/or rotation of the user eye 14 will lead to the display exit pupil 15 of the NED fully or partly missing (clipping) the user pupil 16.
Referring also to Figure 5, a near eye display system 17 is shown.
The near eye display system 17 includes a directional display system 18 and an eye-tracking system 19 configured to determine a location of the user eye pupil 16. The eye tracking system 19 may take the form of a camera, and conventional eye/pupil tracking systems may be used. The near eye display system 17 may be for a head mounted display, but is not limited to this application.
The directional display system 18 is configured to output a beam 20 which forms a narrow display exit pupil 15 which is of the order of the eye pupil 16 diameter or preferably smaller, for example in the range between 2 mm and 7 mm. The near eye display system 17 solves the issue of the user pupil 16 moving out of the small display exit pupil 15 by using the eye-tracking system 19 to determine a position of the user pupil 16 in real time, and controlling the directional display system 18 to steer the beam 20 to keep the display exit pupil 15 coincident with the user pupil 16.
For example, if the user eye 14 was to move to a second position 14b, the user pupil 16 would be moved to new position 16b. The eye tracking system 19 detects and relays the new position 16b to the directional display system 18, which steers the beam 20 to new position 20b to move the display exit pupil 15 into coincidence with the new user pupil position 16b. In this way, the eye box of the near eye display system 17 may be expanded beyond the size of the display exit pupil 15.
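The track-and-steer loop described above can be summarised in a short sketch. All names here are hypothetical placeholders; in particular, the mapping from pupil position to steering angle stands in for the calibrated relationship discussed later in the specification:

```python
# Sketch of the pupil-tracking loop: read the pupil position from the
# eye-tracking system, convert it to a mean steering angle via a calibrated
# mapping, and command the directional display system to re-steer the beam.

def track_and_steer(eye_tracker, steer_beam, calibration):
    """One iteration of the loop; returns the commanded mean output angle."""
    pupil_xy = eye_tracker()        # e.g. (x, y) pupil position in the eye box
    theta0 = calibration(pupil_xy)  # calibrated position -> mean output angle
    steer_beam(theta0)              # re-steer the beam about theta0
    return theta0
```

In practice this would run continuously, so the display exit pupil follows the user pupil in real time.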
The difficulty in implementing the near eye display system 17 resides in providing a directional display system 18 which is capable of producing a steerable beam 20 as described. This may be accomplished using movable mirrors, lenses and so forth, but this may quite quickly become too bulky and/or heavy for practical use in key applications such as near-eye displays. The added complexity and cost also provide barriers to adoption of such technologies.
The directional display device 1 of the present specification may be used to avoid such issues. In particular, referring also to Figure 6A, the directional display system 18 may be provided by combining the display device 1 with one or more imaging elements 21 configured to focus image light 5 output from the image forming array 2 of the display device 1 to form a display exit pupil 15.
As shown in Figure 6A, if the imaging element 21 is ideal, then all cells 6 may be controlled to output at a single angle θout(n,m) = θ0 for all n and m, with the output then focussed by the imaging element 21 to form the display exit pupil 15 at the focal length f of the imaging element 21. The position of the display exit pupil 15 on the plane at the focal length f will depend on the incident angle θ0 of the image light 5 (as shown by consideration of regular geometric optics for a convex lens interacting with collimated beams). If sufficiently ideal imaging elements 21 can be obtained, then the display device 1 may simply control all cells 6 to a single output angle θ0.
However, in practice, and in particular at the compact focal lengths f typically required for a near eye display system 17 used in a head mounted display, imaging elements 21 are likely to be non-ideal for image light 5 which is not directed parallel to the optical axis. Referring also to Figure 6B, if all cells 6 output at a single angle θ0 then this will result in aberrations of the image formed in the display exit pupil 15.
Referring also to Figure 6C, the imperfections of practical imaging elements 21 may be corrected by modifying the angles output by each cell 6 about the mean output angle θ0. For example, the four beams of image light 5 corresponding to the four cells 6 illustrated in Figure 6C are set to angles θ1 to θ4 respectively to correct for non-ideal and/or off-axis effects of the imaging element 21. The output angles θout(n,m) will average (mean) to the output angle θ0 across the display device 1, i.e. θ0 = mean(θout(n,m)).
The compensation required may in general be a function of the desired display exit pupil 15 position, and can be calibrated for a specific imaging element 21. For example, one option for calibration during manufacturing is to use a camera sensor (not shown) in place of the user's eye 14. By turning ON a small group of second pixels P2(j,k) at a time, and varying the output angle θout(n,m), the relationship between output angle θout(n,m) and pixel coordinate (j,k) may be found for the corresponding cell 6. By carrying out the same procedure for each cell 6, the relationship that links the display exit pupil 15 position to the output angles θout(n,m) and pixel coordinates (j,k) (or FoV) can be determined across the display device 1.
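The per-cell calibration procedure described above amounts to building a lookup table from pixel coordinate to measured output angle. A minimal sketch follows; `set_pixel_on` and `measure_angle` are stand-ins for the real pixel driver and the camera-based measurement, and all names are assumptions:

```python
# Sketch of the calibration sweep for one cell (n, m): switch ON one second
# pixel P2(j, k) at a time, measure the resulting output direction with the
# camera standing in for the user's eye, and record the mapping.

def calibrate_cell(n, m, rows, cols, set_pixel_on, measure_angle):
    """Return a lookup table {(j, k): measured output angle} for cell (n, m)."""
    lut = {}
    for j in range(rows):
        for k in range(cols):
            set_pixel_on(n, m, j, k)       # drive only P2(j, k) of this cell
            lut[(j, k)] = measure_angle()  # e.g. (theta_out, phi_out)
    return lut
```

Repeating the sweep over every cell yields the display-wide relationship between exit-pupil position, output angles and pixel coordinates.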
Aside from non-ideal behaviours, it may also be desired to use imaging elements 21 which depart from circular symmetry. For example, in a near-eye display system 17 providing a head mounted display, imaging elements having a more rectangular shape may be used. The ability of the display device 1 to control the output angle θout(n,m) of each cell 6 independently may provide great flexibility in the choice of imaging elements 21.
Examples hereinafter may be drawn showing a single output angle θ0 for visual clarity; however, any of these examples may employ output angles θout(n,m) which vary across the array of cells 6 as described.
Steerable display exit pupil
Referring also to Figures 7A and 7B, steering of the display exit pupil 15 position by changing an average output angle θ0 is illustrated.
For visual clarity, Figure 7A only shows the output collimated light 4 from a pair of cells 6 illuminating the image forming array 2. Although drawn as perfectly collimated, the output light 4 may in practice diverge (or converge) slightly. When a first pixel P1 of the image forming array 2 is illuminated with collimated output light 4, the light emerging from the first pixel P1 is also collimated, though there will be some spreading due to diffractive effects (expected to be relatively minor for practical pixel sizes). The illustrated spreading of image light 5 has been exaggerated for visualisation purposes. The angle of a light cone leaving a first pixel P1 of the image forming array 2 is mainly controlled by two factors: the quality of collimation of the collimated output light 4, in combination with the diffractive effects in the first pixel P1. Unless deliberately increased, the diffractive effects are expected to be minimal, so the collimation angles of the collimated output light 4 will dominate the cone angle of the image light 5, and to a reasonable first approximation the collimation angle of the image light 5 may be considered to be equal to the collimation angle ε of the collimated output light 4.
After focussing by an imaging element 21 with focal length f, the spread in angles for light which passed a first pixel P1 may be approximated as: δθ(n,m) = f · tan(ε(n,m)) (1) In which δθ(n,m) is the angle of the cone of light leaving the imaging element 21, f is the focal length of the imaging element 21, and ε(n,m) is the collimation angle about the output angle θout(n,m) for the output light 4 and image light 5 from the cell 6 at position (n,m).
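Equation (1) may be sketched numerically as follows; the focal length and collimation angle used below are illustrative assumptions only, not values taken from this description:

```python
import math

def cone_spread(f_mm: float, epsilon_deg: float) -> float:
    """Evaluate equation (1): delta-theta(n,m) = f * tan(epsilon(n,m)).

    f_mm is the focal length f of the imaging element 21 and epsilon_deg
    is the collimation angle epsilon(n,m) of the cell at position (n,m).
    """
    return f_mm * math.tan(math.radians(epsilon_deg))

# Illustrative values (assumed): f = 20 mm, epsilon = 0.5 degrees.
spread = cone_spread(f_mm=20.0, epsilon_deg=0.5)
```

For well-collimated output light the spread scales approximately linearly with both f and ε, since tan(ε) ≈ ε for small angles.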
Referring now to Figure 7B, if the user eye 14 moves (the position from Figure 7A is shown with a dashed outline), the user pupil 16 will no longer receive the display exit pupil 15 focussed by the imaging element 21. As illustrated in Figure 7B, the display exit pupil 15 tracks the user pupil 16 by changing the average output angle θ0.
First directional backlight unit Referring also to Figure 8, a first example of a directional backlight unit 22 is shown in a schematic cross-section (hereinafter the "first directional backlight unit").
The first directional backlight unit 22 includes a light source 13, in this example a backlight source 13, arranged to illuminate the array of cells 6 such that each optical coupling element 8 is separated from the light source 13 by the respective second pixels P2(n,m). Each second pixel P2(n,m) of each cell 6 takes the form of a transmissive pixel. In this example, an array of monochrome liquid crystal pixels 23 is common to all of the cells 6, with each second array 7, SP(n,m) provided by a separate region of the array of monochrome liquid crystal pixels 23. Three cells 61, 62, 63 are illustrated in Figure 8, but in general the array of cells 6 will extend beyond the portion illustrated in Figure 8.
The backlight source 13 outputs uncollimated backlight 24. As described hereinbefore in relation to Figures 3A and 3B, within each cell 61, 62, 63, if it is active then one, or a small group, of second pixels P2(j,k) will be controlled to be transmissive, whilst the remainder are kept opaque. In other words, each cell 61, 62, 63 outputs a mask image which defines an aperture through which the uncollimated backlight 24 may pass.
In some examples, for instance as illustrated for the second cell 62 in Figure 8, the mask image need not include an aperture: for example, if a corresponding region of the image forming array 2 is outputting black, this may improve the dark contrast. However, in general the size of cells 61, 62, 63 will be larger, or significantly larger, than the size of first pixels P1, so that typically every cell 61, 62, 63 will output an aperture (second pixels P2 will generally be of similar dimensions to first pixels P1, or at least to sub-pixels of first pixels P1, in a colour image forming array 2).
Each aperture formed in an array 7, SP(n,m) of second pixels P2(j,k) acts as a point source (or "pinhole") of incident light 9. Typically the divergence of incident light 9 is expected to be dominated by the divergence of the uncollimated backlight 24, though if the second pixels P2(j,k) are made small enough then diffractive effects may modify this.
Each cell 61, 62, 63 includes an optical coupling element 8 in the form of a lens, arranged so that its focal plane corresponds to an implied origin of incident light 9 diverging from the aperture formed by the transmissive second pixel(s) P2(j,k). The spacing between the array of monochrome liquid crystal pixels 23 and the optical coupling elements 8 may typically be in the range of several mm, down to about 0.5 mm. The optical coupling elements 8 of each cell 61, 62, 63 may be formed as separate pieces.
However, it is expected to be more practical to form all the optical coupling elements 8 as a single piece having thickness modulated to form a lens array (sometimes also termed a "lenslet array") to provide the optical coupling elements 8. Such a lens array is then aligned relative to the array of monochrome liquid crystal pixels 23 to form the cells 61, 62, 63.
The lenses 8 (or array thereof) of cells 6 may be of any suitable type (or combination of types) including, for example, refractive lens(es), multi-material achromatic lens(es), pancake lens(es), Fresnel lens(es), diffractive lens(es), meta-lens(es) and holographic lens(es).
Referring also to Figure 9, the control of the angle θout(n,m) of the collimated output light 4 is illustrated within a single cell 6.
Figure 9 is a schematic, geometric optics diagram for an optical coupling element 8 in the form of a convex lens. For the sake of visual clarity, a column of the array 7, SP(n,m) of second pixels P2(1,k) to P2(11,k) has been positioned on the focal plane of the lens 8.
For a first case, in which an aperture is formed by switching second pixel P2(10,k) to ON, a pair of rays 25a, 26a are drawn with solid lines: a first ray 25a of incident light 9 travelling parallel to the optic axis 27 of the cell 6 and a second ray 26a of incident light 9 which passes through the optical centre of the lens 8. These rays 25a, 26a are drawn for visual clarity, though all other rays originating from the ON second pixel P2(10,k) will be deflected to the same angle θout(n,m) of collimated output light 4.
In a second case, the aperture is moved by switching second pixel P2(10,k) to OFF and switching second pixel P2(4,k) to ON. A pair of rays 25b, 26b are drawn with chained lines: a first ray 25b of incident light 9 travelling parallel to the optic axis 27 of the cell 6 and a second ray 26b of incident light 9 which passes through the optical centre of the lens 8. However, for the second pixel P2(4,k), it may be observed that the direction of collimated output light 4 is shifted from θout(n,m) to θ'out(n,m).
In this way, the display device 1 (e.g. the controller 11 thereof) controls which second pixels P2(j,k) of each cell 6 illuminate the respective optical coupling element 8, and hence the output angle θout(n,m), by causing the array 7, SP(n,m) of that cell 6 to output a mask image defining an aperture having a location (x, y) relative to that cell 6.
In general, a "mask" image defining the aperture may take the form of a single second pixel P2(j,k) being set to a transmissive (or "ON") state to form the aperture. However, in some examples mask images may take the form of a group of second pixels P2(j,k) being set to a transmissive/reflective state to form an aperture. For example a 2 by 2 block of second pixels P2(j,k), a cross formed by a central second pixel P2(j,k) and its nearest neighbours P2(j÷1,k±1). In all cases, any other second -26 -pixels P2(j,k) in the same cell not defining the aperture are set to the OFF (i.e. opaque) state.
The mask image may be a binary image, but this is not essential. For example, a central pixel of an aperture P2(j,k) may be set fully ON, i.e. maximum transmission, and a surrounding square of second pixels P2(j,k) may be set to a level of transmission corresponding to a Gaussian (or other) filter function. In this way, sharp-edged, often square second pixels P2(j,k) may be used to approximate more rounded/softer-edged apertures.
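A greyscale mask of this kind might be generated as follows; the Gaussian width and cut-off threshold are arbitrary illustrative choices, not values from this description:

```python
import math

def soft_aperture_mask(rows, cols, centre_j, centre_k, sigma):
    """Build a greyscale mask image: the central second pixel is fully ON
    and surrounding pixels follow a Gaussian fall-off, approximating a
    soft-edged aperture using sharp-edged square pixels."""
    mask = [[0.0] * cols for _ in range(rows)]
    for j in range(rows):
        for k in range(cols):
            r2 = (j - centre_j) ** 2 + (k - centre_k) ** 2
            level = math.exp(-r2 / (2.0 * sigma ** 2))
            # Pixels below an (arbitrary) threshold stay fully OFF (opaque).
            mask[j][k] = level if level >= 0.01 else 0.0
    return mask

mask = soft_aperture_mask(5, 5, 2, 2, sigma=1.0)
```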
The display device 1 (e.g. controller 11 thereof) may be configured to control every cell 6 to output the same mask image, having the aperture at the same relative location (x0, y0), so that every cell 6 outputs collimated output light 4 at the same output angle θout(n,m) = θ0. However, for the reasons explained in relation to Figures 6A to 6C, this is neither essential nor expected, and in most implementations the display device 1 will control the cells 6 so that the output angle θout(n,m) varies about the average θ0 across the display device 1. This may be accomplished by causing each array 7, SP(n,m) to output mask images which have apertures centred at different relative locations (x(n,m), y(n,m)).
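As a one-dimensional sketch of how such per-cell aperture locations might be derived, each cell can aim its output at a common convergence point (for example the user pupil 16). The geometry and parameter values below are assumptions for illustration, not part of this description:

```python
import math

def aperture_offset_mm(cell_x_mm: float, target_x_mm: float,
                       target_z_mm: float, focal_length_mm: float) -> float:
    """Aperture offset x(n,m) for a cell whose output should be directed at
    a target point (e.g. the user pupil) at lateral position target_x and
    distance target_z from the cell.

    The cell must output at theta_out = arctan((target_x - cell_x)/target_z);
    with the array on the lens focal plane, the aperture must be displaced
    by -f * tan(theta_out) from the cell's optic axis.
    """
    theta = math.atan2(target_x_mm - cell_x_mm, target_z_mm)
    return -focal_length_mm * math.tan(theta)

# Illustrative: a cell 10 mm off-axis from a target 100 mm away, f = 2 mm.
offset = aperture_offset_mm(0.0, 10.0, 100.0, 2.0)
```

Cells further off-axis need larger aperture offsets, so the mask images naturally vary across the array of cells 6.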
Similarly, the display device 1 (e.g. controller 11 thereof) may be configured to control every cell 6 to output the same mask image, having the same size and shape of aperture. This may be appropriate if the optical coupling element 8 has an efficiency, in terms of the fraction of incident light 9 converted to collimated output light 4, which is largely invariant to output angle θout(n,m). In practice, there is likely to be some variation in efficiency with output angle θout(n,m), which may be compensated for by controlling each array 7, SP(n,m) to output mask images which differ in one or both of the size and shape of the aperture formed. For example, the total amount of incident light 9 for each cell 6 may be controlled with precision by setting second pixels P2(j,k) surrounding a central second pixel P2(j,k) of an aperture to transparencies intermediate between fully ON (transparent) and fully OFF (opaque/extinction).
In a modification of the first directional backlight unit 22, the second pixels P2(j,k) may be provided by reflective pixels (for example liquid crystal on silicon, LCOS) instead of transmissive liquid crystal pixels. Transmissive second pixels P2 are expected to be more compact and are preferred for near-eye display systems 17 (such as a head mounted display); however, reflective second pixels P2 could be used, and may also be useful in other applications such as head-up displays.
As an alternative to liquid crystal pixels providing the second pixels P2(j,k), electrochromic pixels may be used instead. The choice may depend on the refresh rates required for the display device 1.
The backlight source 13 may in general be of any type presently known for use as a backlight for an LC display device such as, for example, an LED backlight, an OLED backlight, a lightguide backlight with input from any suitable light source, a segmented backlight unit and so forth.
When the backlight source 13 is a segmented backlight unit, it may include a separate segment corresponding to each cell 6, and preferably arranged to illuminate only that cell 6. This may allow for improved dark contrast, by switching off the corresponding segment of the backlight unit when a corresponding area of the image forming array 2 should be outputting black.
Although illustrated in Figure 8 as illuminating an imaging element 21, the imaging element 21 is not part of the first directional backlight unit 22, and may be omitted when the first directional backlight unit 22 is used in types of device other than a near-eye display system 17, for example a dynamic privacy display.
Second directional backlight unit Referring also to Figures 10 and 11, a second example of a directional backlight unit 28 is shown (hereinafter the "second directional backlight unit").
Figure 10 shows a single cell 6 of the second directional backlight unit 28. Figure 11 shows a plan view of a 3 by 3 block of arrays 7, SP(n,m) of second pixels P2(j,k).
The second directional backlight unit 28 is the same as the first directional backlight unit 22, except that the second pixels P2(j,k) of each cell 6 are arranged as arrays 7, SP(n,m) surrounded by a buffer region 29.
As illustrated in Figure 10, the buffer regions 29 may be formed of unused pixels 30 which are identical to the second pixels P2(j,k), but which the display device 1 is configured not to use. For example, when the second pixels P2(j,k) are provided by regions of a single array of monochrome liquid crystal pixels 23, the unused pixels 30 are pixels of that array which are permanently switched OFF.
Alternatively, as illustrated in Figure 11, no pixels may be formed in the buffer region 29, so that second pixels P2(j,k) are formed exclusively in active areas of each cell 6 corresponding to the array 7, SP(n,m). This may require customisation of a substrate providing the arrays 7, SP(n,m), rather than using an existing array of monochrome liquid crystal pixels 23. In such cases, conductive traces for addressing the second pixels P2(j,k) of each array 7, SP(n,m) may be routed through the buffer region(s) 29, which may enable a greater variety of control sequences when compared to using regions of a single array of monochrome liquid crystal pixels 23 to provide the arrays 7, SP(n,m).
The buffer regions 29, for example unused pixels 30, may be used because of the paths of light between the backlight source 13 and the optical coupling element 8. Referring in particular to Figure 10, the projected area of the cell 6 onto the backlight source 13 may be divided into three types of area. A first, or "full", illumination zone 31 would illuminate the entire optical coupling element 8 of the corresponding cell 6 if all second pixels P2(j,k) of the array 7, SP(n,m) were switched ON concurrently. A second, or "part", illumination zone 32 would illuminate at least a portion of the optical coupling element 8 of the corresponding cell 6 if all second pixels P2(j,k) of the array 7, SP(n,m) were switched ON concurrently. Outside of the plane of Figure 10, the part illumination zone 32 surrounds the full illumination zone 31. A third, or "zero", illumination zone 33 would not illuminate any portion of the optical coupling element 8 of the corresponding cell 6 even if all second pixels P2(j,k) of the array 7, SP(n,m) were switched ON concurrently.
In this way, defining the second pixels P2(j,k) of each array 7, SP(n,m) to be surrounded by a buffer region 29 may improve efficiency by only using the most relevant area, and may also reduce cross-talk between adjacent cells 6. Light from the zero illumination zone 33 will reach adjacent cells 6, but the high angles mean that the luminance of any cross-talk will be greatly reduced compared to defining the array 7, SP(n,m) all the way up to the edge of the cell 6.
Shapes of cells The cells 6 illustrated and described in relation to the first directional backlight unit 22 and the second directional backlight unit 28 have been generally square in shape, with square arrays 7, SP(n,m) formed of square second pixels P2(j,k). Optical coupling elements 8 may be either circular with sufficient radius to cover arrays 7, SP(n,m), or more preferably square. An optical coupling element 8 in the form of a lens may have a square perimeter whilst still having circularly symmetric optics up to that perimeter.
However, directional backlight units 3, including either of the first 22 and/or second 28 directional backlight units, are not limited to square-shaped cells 6. In general, cells 6 may be of any shape, though in order to avoid gaps in the illumination of the image forming array 2, shapes of cells 6 which tessellate are preferred.
Referring also to Figures 12A through 12F, different cell 6 shapes are illustrated.
Referring in particular to Figures 12A, 12C and 12E, the cells 6 may be square, rectangular or hexagonal respectively, and may have correspondingly shaped optical coupling elements 8, for example lenses as illustrated. Corresponding arrays 7, SP(n,m) of second pixels P2(j,k) are shown in Figures 12B, 12D and 12F respectively. The examples shown in Figures 12B, 12D and 12F include buffer regions 29, but in other examples buffer regions 29 may be omitted. Optical coupling elements 8 in the form of lenses may have circularly symmetric optical properties, simply truncated by a non-circular perimeter.
Referring in particular to Figures 12A and 12B, there is no requirement that the shape of the optical coupling element 8 must match the shape of the array 7, SP(n,m) within a cell 6. For example, the square optical coupling element 8 shown in Figure 12A may be used with a rectangular array 7, SP(n,m) of square second pixels P2(j,k).
Additionally, when a segmented backlight source 13 is used having segments corresponding to each cell 6, each segment of that backlight source 13 may have a shape matching the respective cell 6.
Whilst it will typically be spatially efficient if the second pixels P2(j,k) of each cell 6 have a shape matching the shapes of that cell 6, this is not essential. For example, instead of the hexagonal array 7, SP(n,m) of hexagonal second pixels P2(j,k) shown in Figure 12F, the hexagonal optical coupling element 8 of Figure 12E may be combined with an array of square second pixels P2(j,k) approximating the (pixelated) outline shape of the hexagonal optical coupling element 8.
Third directional backlight unit Referring also to Figure 13, a third example of a directional backlight unit 34 is shown (hereinafter the "third directional backlight unit").
Figure 13 shows a single cell 6 of the third directional backlight unit 34. The third directional backlight unit 34 is a modification of the second directional backlight unit 28 in which the backlight source 13 does not emit uncollimated backlight 24 across its entire area.
One way to implement this is as illustrated in Figure 13, using a backlight source 13 in the form of a lightguide 35 and one or more in-coupled light sources 36. In an area of the lightguide 35 corresponding to each cell 6, an outcoupling element 37 is provided to extract light from the lightguide 35. Specifically, the outcoupling element 37 corresponding to each cell 6 has a shape and area corresponding to the union of the full illumination zone 31 and the part illumination zone 32 of the array 7, SP(n,m) of that cell 6.
In this way, light is not output in the zero illumination zone 33, improving the light efficiency of the display device 1 and reducing power consumption without decreasing the observable brightness of the display device 1 to a user. This may also help to improve the dark contrast of the display device 1.
In a case where the backlight source 13 is provided by an array of inorganic LEDs or an array of OLEDs, the third directional backlight unit 34 may be implemented by not illuminating LEDs/OLEDs in the zero illumination zone 33. Alternatively, the third directional backlight unit 34 may be implemented by forming the LEDs/OLEDs of a backlight source 13 only in the regions which correspond to the union of the full illumination zones 31 and the part illumination zones 32 of the arrays 7, SP(n,m) of cells 6.
Fourth directional backlight unit Referring also to Figure 14, a fourth example of a directional backlight unit 38 is shown (hereinafter the "fourth directional backlight unit").
Figure 14 shows a single cell 6 of the fourth directional backlight unit 38. The fourth directional backlight unit 38 is the same as the third directional backlight unit 34, except that it further includes optical baffles 39 arranged to block light between the array 7, SP(n,m) of second pixels P2(j,k) of each cell 6 and the optical coupling elements 8 of the adjacent cells 6.
Referring also to Figure 15, a schematic plan view of a portion of the fourth directional backlight unit 38 is shown.
The optical baffles 39 take the form of walls shaped as a mesh, surrounding and leaving gaps for the optical coupling elements 8 of each cell 6.
In this way, cross-talk between adjacent cells 6 may be further reduced.
The optical baffles 39 illustrated in Figure 14 span a space between the arrays 7, SP(n,m) of second pixels P2(j,k) and the corresponding optical coupling elements 8. In the example shown in Figure 14 this space is roughly equal to the focal length of the optical coupling element 8, but this is not essential. For example, the optical baffles 39 may be formed on an array of monochrome liquid crystal pixels 23 by printing or similar techniques. The optical baffles 39 may additionally serve to mechanically stabilise and control the separation between the arrays 7, SP(n,m) of second pixels P2(j,k) and the corresponding optical coupling elements 8.
Alternatively, the optical baffles 39 could extend further from the second pixels P2(j,k), separating and/or supporting the optical coupling elements 8 of adjacent cells.
However, whilst the optical baffles 39 illustrated in Figure 14 span the space between the second pixels P2(j,k) and the optical coupling elements 8, this is not essential.
Optical baffles 39 may still be effective even if they only extend part-way across the space (and they may extend from either the second pixels P2(j,k) or the optical coupling elements 8).
Although illustrated as a modification of the third directional backlight unit 34, optical baffles 39 may be added to any directional backlight unit 3, including but not limited to the first directional backlight unit 22 and the second directional backlight unit 28 (and described modifications thereof).
When the display device 1 includes optical baffles 39 and uses buffer regions 29, the optical baffles 39 may be aligned to overlie the buffer regions 29.
Fifth directional backlight unit Referring also to Figure 16, a fifth example of a directional backlight unit 40 is shown (hereinafter the "fifth directional backlight unit").
Figure 16 shows multiple cells 6 of the fifth directional backlight unit 40 in the context of a near-eye display system 17. Although component parts of the cells 6 have only been labelled for the topmost cell 6 in Figure 16, all the illustrated cells 6 have the same configuration. The fifth directional backlight unit 40 is the same as the fourth directional backlight unit 38, except that it further includes an optical diffuser layer 41 positioned between the cells 6 forming the directional backlight unit and the image forming array 2.
The optical diffuser layer 41 is included to add a small, controlled spreading of the collimated output light 4 which may help to reduce or even remove visibility of boundaries/borders between adjacent cells 6 of the fifth directional backlight unit 40.
The optical diffuser layer 41 may spread a distribution of collimated output light 4 by at most 5°, and preferably less, in some examples even less than 1°. The amount of spreading to add will depend on how closely the active areas of cells 6, i.e. the areas of the optical coupling elements 8 which emit useful light, can be packed together. The larger the gaps between the active areas of cells 6, the more diffusion may be useful to conceal the boundaries from being perceptible in the image seen by a user.
A diffuser spreads a distribution of incident light by δθ if the angle(s) at which the luminance of that distribution is 50% of the luminance at the output angle are changed by δθ before and after passage through the diffuser. For example, collimated output light 4 having a dispersion angle of ε before passing through the optical diffuser layer 41 would have an angle of ε + δθ afterwards when illuminating the image forming array 2.
The optical diffuser layer 41 is illustrated as being contiguous across the array of cells 6. This provides for simpler assembly, though there is no technical reason why the same function could not be provided by a separate optical diffuser (not shown) included in each cell 6.
Alternatively still, the functionality of the optical diffuser layer 41 could be integrated directly into the optical coupling elements 8. For example, an array of lenses could be formed to provide the optical coupling elements 8, then subjected to chemical, thermal and/or mechanical treatments so as to add a controlled amount of scattering. In other cases an engineered diffuser can be embossed on the surface of each optical coupling element 8.
Another alternative approach to provide the same function would be to replace the optical diffuser layer 41 with a low-power lens (not shown). When used, the low-power lens (not shown) may be of any type described as being usable for a lens providing an optical coupling element 8 and/or an imaging element 21.
Although illustrated as a modification of the fourth directional backlight unit 38, an optical diffuser layer 41 (or equivalent, for example as described hereinbefore) may be added to any directional backlight unit 3, including but not limited to the first directional backlight unit 22, the second directional backlight unit 28 and the third directional backlight unit 34 (and described modifications thereof).
Reflective image forming array Each of the first 22, second 28, third 34, fourth 38 and fifth 40 directional backlight units has been described and illustrated in the context of illuminating a transmissive image forming array 2.
However, the display device 1 is not limited to using a transmissive image forming array 2, and a directional backlight unit 3 may instead be used to illuminate a reflective image forming array 2b (Figure 17) such as a liquid-crystal on silicon (LCOS) display.
Referring also to Figure 17, the first directional backlight unit 22 is illustrated illuminating a reflective image forming array 2b, in the context of a near-eye display system 17.
The collimated output light 4 from the first directional backlight unit 22 is reflected toward the reflective image forming array 2b by a beam-splitter 42. The light reflected from the reflective first pixels P1 of the reflective image forming array 2b then passes through the beam-splitter 42 (though there will be some losses to reflection) before being focused to form the output image in the display exit pupil 15 by the imaging element 21.
Although illustrated in relation to the first directional backlight unit 22, any of the second 28, third 34, fourth 38 and fifth 40 directional backlight units may be used instead to illuminate the reflective image forming array 2b.
Sixth directional backlight unit Referring also to Figure 18, a sixth example of a directional backlight unit 43 is shown (hereinafter the "sixth directional backlight unit").
The sixth directional backlight unit 43 is the same as the first directional backlight unit 22, except that instead of lenses, the optical coupling element 8 of each cell 6 is provided by one or more arrays of shutter pixels 44 stacked in sequence between the array 7, SP(n,m) of second pixels P2(j,k) and the image forming array 2. In the example shown in Figure 18, a single array of shutter pixels 44 forms the optical coupling element 8; however, two or more arrays of shutter pixels 44 may be used (see also Figure 19). Each third, or "shutter", pixel P3 in an array of shutter pixels 44 is a transmissive pixel, for example a liquid crystal pixel. For example, each array of shutter pixels 44 may be structurally identical to the array of monochrome liquid crystal pixels 23 providing the second pixels P2(j,k).
For each cell 6, the display device 1 is configured to control (for example using controller 11) the one or more arrays of shutter pixels 44 such that light from the second pixels P2(j,k) of that cell can only illuminate the image forming array 2 by passing through a sequence of apertures formed using the one or more arrays of shutter pixels 44. In other words, the collimated output light 4 is generated by positioning an aperture formed by switching one or more third pixels P3 ON relative to the second pixels P2(j,k) which are switched ON, to block light from the backlight source 13 unless it is travelling in a direction so as to pass through both apertures.
This naturally generates collimation because only light within a few degrees of an angle between the apertures can pass through to the image forming array 2.
Referring also to Figure 19, a cell of the sixth directional backlight unit 43 is illustrated for an example including two arrays of shutter pixels 44a, 44b.
The arrays of shutter pixels 44a, 44b and the array 7, SP(n,m) of second pixels P2(j,k) are spaced one after the other with separation s. The separation may be between about 0.1 mm and several mm, for example about 0.5 mm.
The second pixel P2 illustrated without hatching is switched on to form an aperture through which the uncollimated backlight 24 passes to form a first cone 45 of incident light 9 centred roughly perpendicular to the array 7, SP(n,m).
The third pixel(s) P3 of the first array of shutter pixels 44a which lie along a line originating from the ON second pixel P2 and oriented at output angle θout is (are) switched ON and illuminated by the first cone 45 of incident light 9. Incident light 9 having a range of angles which passes through the ON third pixel(s) P3 passes through to form a second cone 46 of light. The second cone 46 of light will already be collimated, and depending on the application this could provide the collimated output light 4 (as shown in Figure 18).
If tighter collimation is required, then as illustrated in Figure 19, a further, second array of shutter pixels 44b may be used. Again, the third pixel(s) P3 of the second array of shutter pixels 44b which lie along the line originating from the ON second pixel P2 and oriented at output angle θout is (are) switched ON and illuminated by the second cone 46 of light. Only light within a range of angles which can pass through the apertures in the array 7, SP(n,m), the first array of shutter pixels 44a and the second array of shutter pixels 44b will pass through to emerge as a third cone 47 of collimated output light 4.
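The selection of which third pixels P3 to switch ON in each stacked array might be sketched as follows; the pixel pitch and layer separation are assumed values, not taken from this description:

```python
import math

def shutter_indices(source_index: int, theta_out_deg: float,
                    pitch_mm: float, separation_mm: float,
                    n_layers: int) -> list:
    """Indices of the third pixels P3 to switch ON in each successive array
    of shutter pixels, so the apertures lie along a line from the ON second
    pixel P2 oriented at output angle theta_out."""
    shift_per_layer_mm = separation_mm * math.tan(math.radians(theta_out_deg))
    return [source_index + round(layer * shift_per_layer_mm / pitch_mm)
            for layer in range(1, n_layers + 1)]

# Illustrative: 0.05 mm pitch, 0.5 mm layer separation, two shutter arrays.
indices = shutter_indices(10, math.degrees(math.atan(0.1)), 0.05, 0.5, 2)
```

Each successive layer's ON aperture is displaced by a further multiple of the per-layer shift, so the apertures stay on the line at θout.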
Each successive array of shutter pixels 44 will provide tighter collimation of the output light 4, at the cost of relatively reduced luminance. The number of arrays of shutter pixels 44 used to form the optical coupling element 8 may be selected for a particular display device based on the required degree of collimation balanced against the luminance which may be provided by the backlight source 13.
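The trade-off between added shutter arrays and collimation can be estimated with a simple pinhole-pair model: the half-angle passed by the first and last apertures in the stack is roughly arctan(aperture width / total separation). This is an illustrative approximation, not a formula from this description:

```python
import math

def collimation_half_angle_deg(aperture_mm: float, n_layers: int,
                               separation_mm: float) -> float:
    """Approximate collimation half-angle after passing a second-pixel
    aperture and n_layers stacked shutter apertures of equal width,
    spaced separation_mm apart (simple pinhole-pair estimate)."""
    total_separation_mm = n_layers * separation_mm
    return math.degrees(math.atan(aperture_mm / total_separation_mm))

# Each added shutter array lengthens the stack and tightens collimation.
one_layer = collimation_half_angle_deg(0.05, 1, 0.5)
two_layers = collimation_half_angle_deg(0.05, 2, 0.5)
```

In this model the collimation tightens roughly in proportion to the number of layers, at the cost of the reduced luminance noted above.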
The sixth directional backlight unit 43 may be modified in any of the ways described hereinbefore in relation to the first directional backlight unit 22. Similarly, any of the second 28, third 34, fourth 38 and fifth 40 directional backlight units may be modified to replace optical coupling elements 8 in the form of lenses with stacks of one or more arrays of shutter pixels 44.
Although illustrated in Figure 19 with the shutter pixels P3 being the same size as, and coincident with, one another and with the second pixels P2(j,k), this is not essential. For example, the shutter pixels P3 may be offset relative to the second pixels P2(j,k), or relative to one another between different arrays of shutter pixels 44. Similarly, the third pixels P3 may have a different size (for example edge length/diameter) to the second pixels P2(j,k), or relative to one another between different arrays of shutter pixels 44.
Seventh directional backlight unit Referring also to Figure 20, a seventh example of a directional backlight unit 48 is shown (hereinafter the "seventh directional backlight unit").
Figure 20 shows three cells 61, 62, 63 of the seventh directional backlight unit 48. The seventh directional backlight unit 48 is the same as the first directional backlight unit 22, except that instead of the array 7, SP(n,m) of second pixels P2(j,k) of each cell 6 being provided by the combination of the backlight source 13 and a region of the array of monochrome liquid crystal pixels 23, each second pixel P2(j,k) instead takes the form of an emissive pixel. For example, each second pixel P2(j,k) may be an inorganic light-emitting diode or an organic light-emitting diode. In the example illustrated in Figure 20, the array 7, SP(n,m) of second pixels P2(j,k) for each cell 61, 62, 63 is provided by a region of an addressable array of emissive pixels 49.
Referring also to Figure 21, and comparing it with Figure 9, it is immediately apparent that changing the nature of the source of the incident light 9 (from transmissive to emissive second pixels P2) does not change the interaction with an optical coupling element 8 in the form of a lens.
Equally, arrays 7, SP(n,m) of emissive second pixels P2(j,k) could be used with a stack of one or more arrays of shutter pixels 44 to provide collimated output light 4.
In this way, any of the second 28, third 34, fourth 38, fifth 40 and sixth 43 directional backlight units may be modified to replace the combination of a backlight source 13 and arrays 7, SP(n,m) of transmissive second pixels P2(j,k) with arrays of emissive second pixels P2(j,k), as described in relation to the seventh directional backlight unit 48.
Modifications It will be appreciated that various modifications may be made to the embodiments hereinbefore described. Such modifications may involve equivalent and other features which are already known in the design, manufacture and use of display apparatuses, devices, systems and component parts thereof and which may be used instead of or in addition to features already described herein. Features of one embodiment may be replaced or supplemented by features of another embodiment.
The directional backlight units 3, 22, 28, 34, 38, 40, 43, 49 described herein, and display devices 1 containing them, have primarily been described in the context of a directional display system 18 of a near-eye display system 17 (such as a head-mounted display). However, the directional backlight units 3, 22, 28, 34, 38, 40, 43, 49 and modifications thereof described herein are not limited to use in near-eye display systems 17. Other applications include, without being limited to:
* Large displays: for example to reduce power consumption by directing output luminance at observers/viewers;
* Mobile 3D displays;
* In-car (or generally in-vehicle) entertainment systems: for example so that stray light does not distract the driver/operator;
* Dynamic (or static) privacy screens, for example combined with a camera that tracks the head and eyes of a user, so that only they see the output, without being required to keep their head in a fixed, static volume.
Backlight units/sources 13 are not limited to any particular type, and any sort known in relation to conventional liquid crystal displays (transmissive or reflective), liquid crystal on silicon displays, and/or electrochromatographic displays may be used.
The directional backlight units 3, 22, 28, 34, 38, 40, 43, 49 described herein have cells 6 which are arranged in planar arrays. However, in other examples the cells 6 may be slightly angled relative to one another to form (or follow) an approximately curved surface, which could be concave or convex depending on the application. In this way, directional backlight units 3, 22, 28, 34, 38, 40, 43, 49 may be incorporated into curved display devices 1.
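A minimal sketch (not from the patent; the cell width and radius of curvature are hypothetical) of the per-cell tilt needed for cells 6 to follow a cylindrical surface:

```python
import math

def cell_tilt_deg(i: int, n_cells: int, cell_width_mm: float, radius_mm: float) -> float:
    """Tilt, in degrees, of cell i relative to the central cell when n_cells
    of width cell_width_mm follow a cylinder of radius radius_mm.
    Each cell subtends cell_width / radius radians at the centre of curvature,
    so the tilt grows linearly with distance from the central cell."""
    centre = (n_cells - 1) / 2
    return math.degrees((i - centre) * cell_width_mm / radius_mm)
```

With hypothetical 10 mm cells on a 500 mm radius, the outermost of five cells is tilted about 2.3°; taking the radius as positive or negative distinguishes a concave from a convex arrangement.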
Although claims have been formulated in this application to particular combinations of features, it should be understood that the scope of the disclosure of the present invention also includes any novel features or any novel combination of features disclosed herein either explicitly or implicitly or any generalization thereof, whether or not it relates to the same invention as presently claimed in any claim and whether or not it mitigates any or all of the same technical problems as does the present invention. The applicants hereby give notice that new claims may be formulated to such features and/or combinations of such features during the prosecution of the present application or of any further application derived therefrom.

Claims (25)

  1. 1. A display device comprising: an image forming array of first pixels; a directional backlight unit arranged to illuminate the image forming array, the directional backlight unit comprising an array of cells, each cell comprising: an array of second pixels; and an optical coupling element arranged to receive light from the second pixels and to output collimated output light; wherein the display device is configured to control the output angle of the collimated output light from each cell by controlling which second pixels of that cell illuminate the respective optical coupling element.
  2. 2. The display device of claim 1, configured such that the collimated output light from each cell illuminates 25% or less of the active area of the image forming array.
  3. 3. The display device of claims 1 or 2, wherein each second pixel takes the form of an emissive pixel.
  4. 4. The display device of claims 1 or 2, further comprising a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels; wherein each second pixel takes the form of a transmissive pixel; and wherein for each cell, the device is configured to control which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
  5. 5. The display device of claims 1 or 2, further comprising a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels; wherein each second pixel takes the form of a reflective pixel; and wherein for each cell, the device is configured to control which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
  6. 6. The display device of claims 4 or 5, wherein the light source comprises a back-light unit.
  7. 7. The display device of claim 6, wherein the back-light unit is a segmented backlight unit.
  8. 8. The display device according to any one of claims 4 to 7, wherein each second pixel is a liquid crystal pixel.
  9. 9. The display device of any one of claims 1 to 8, wherein each optical coupling element comprises a lens.
  10. 10. The display device of any one of claims 1 to 9, wherein each optical coupling element comprises one or more arrays of shutter pixels stacked in sequence between the second pixels and the image forming array; wherein each shutter pixel comprises a transmissive pixel; wherein for each cell, the display device is configured to control the one or more arrays of shutter pixels such that light from the second pixels of that cell can only illuminate the image forming array by passing through a sequence of apertures formed using the one or more arrays of shutter pixels.
  11. 11. The display device of any one of claims 1 to 10, further comprising optical baffles arranged to block light between the second pixels of each cell and the optical coupling elements of the cells adjacent to that cell.
  12. 12. The display device of any one of claims 1 to 11, wherein the second pixels of each cell are surrounded by a buffer region.
  13. 13. The display device of any one of claims 1 to 12, wherein the directional backlight unit is arranged to illuminate the image forming array with white light, and wherein the first array is configured to output a colour image.
  14. 14. The display device of any one of claims 1 to 13, wherein each optical coupling element is configured to collimate the light received from each cell to within ±3° of an output angle set for that cell by the display device.
  15. 15. The display device of any one of claims 1 to 14, wherein the display device is configured to control the output angle of the collimated output light from each cell to the same angle.
  16. 16. The display device of any one of claims 1 to 15, further comprising a diffuser disposed between the array of cells and the image forming array.
  17. 17. The display device of any one of claims 1 to 16, wherein each cell is rectangular.
  18. 18. The display device of any one of claims 1 to 16, wherein each cell is hexagonal.
  19. 19. A near eye display comprising the display device of any one of claims 1 to 18 and further comprising: an eye-tracking system configured to determine a location of a user eye pupil; and one or more imaging elements configured to focus light output from the image forming array to a display exit pupil; wherein the display device is configured to control the output angles of each cell such that the display exit pupil formed by the one or more imaging elements coincides with the location of the user eye pupil.
  20. 20. A method of using a display device which comprises: an image forming array of first pixels; a directional backlight unit arranged to illuminate the image forming array, the directional backlight unit comprising an array of cells, each cell comprising: an array of second pixels; and an optical coupling element arranged to receive light from the second pixels and to output collimated output light; the method comprising: controlling the output angle of the collimated output light from each cell by controlling which second pixels of that cell illuminate the respective optical coupling element.
  21. 21. The method of claim 20, wherein each second pixel takes the form of an emissive pixel.
  22. 22. The method of claim 20, wherein the display device further comprises a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels, and wherein each second pixel takes the form of a transmissive pixel, the method further comprising, for each cell: controlling which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
  23. 23. The method of claim 20, wherein the display device further comprises a light source arranged to illuminate the array of cells such that each optical coupling element is separated from the light source by the respective second pixels, and wherein each second pixel takes the form of a reflective pixel, the method further comprising, for each cell: controlling which second pixels of that cell illuminate the respective optical coupling element by causing the second pixels of that cell to output a mask image defining an aperture.
  24. 24. The method of any one of claims 20 to 23, wherein each optical coupling element comprises one or more arrays of shutter pixels stacked in sequence between the second pixels and the image forming array, and wherein each shutter pixel comprises a transmissive pixel, the method further comprising, for each cell: controlling the one or more arrays of shutter pixels such that light from the second pixels of that cell can only illuminate the image forming array by passing through a sequence of apertures formed using the one or more arrays of shutter pixels.
  25. 25. The method of any one of claims 20 to 24, wherein the display device is comprised in a near-eye display which also comprises an eye-tracking system configured to determine a location of a user eye pupil, and one or more imaging elements configured to focus light output from the image forming array to a display exit pupil, the method further comprising: measuring a location of the user eye pupil; and controlling the output angles of each cell of the display device such that the display exit pupil formed by the one or more imaging elements coincides with the location of the user eye pupil.
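The per-cell steering recited in claims 19 and 25 can be sketched in simplified one-dimensional form. This is an illustrative assumption, not the claimed method: the imaging elements of claim 19 are ignored, and each cell is assumed to aim its collimated beam directly at the tracked pupil.

```python
import math

def steering_angles(cell_x_mm, pupil_x_mm, eye_z_mm):
    """Output angle, in degrees, for each cell in a 1-D row at lateral
    positions cell_x_mm, such that each cell's collimated beam passes through
    a pupil tracked at lateral position pupil_x_mm and distance eye_z_mm."""
    return [math.degrees(math.atan2(pupil_x_mm - x, eye_z_mm)) for x in cell_x_mm]
```

For three hypothetical cells at -10 mm, 0 mm and +10 mm and a pupil on-axis at 30 mm, the outer cells steer by about ±18.4° while the central cell outputs on-axis; re-evaluating this whenever the eye tracker reports a new pupil location keeps the display exit pupil coincident with the user eye pupil.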
GB2407535.0A 2024-05-28 2024-05-28 Display device Pending GB2641369A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB2407535.0A GB2641369A (en) 2024-05-28 2024-05-28 Display device
PCT/GB2025/051150 WO2025248234A1 (en) 2024-05-28 2025-05-27 Display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2407535.0A GB2641369A (en) 2024-05-28 2024-05-28 Display device

Publications (2)

Publication Number Publication Date
GB202407535D0 GB202407535D0 (en) 2024-07-10
GB2641369A true GB2641369A (en) 2025-12-03

Family

ID=91759169

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2407535.0A Pending GB2641369A (en) 2024-05-28 2024-05-28 Display device

Country Status (2)

Country Link
GB (1) GB2641369A (en)
WO (1) WO2025248234A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090213147A1 (en) * 2008-02-21 2009-08-27 Sharp Kabushiki Kaisha Single view display
US20140016051A1 (en) * 2010-12-22 2014-01-16 Seereal Technologies S.A. Combined light modulation device for tracking users

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2267579A (en) * 1992-05-15 1993-12-08 Sharp Kk Optical device comprising facing lenticular or parallax screens of different pitch
US11493773B2 (en) 2021-06-07 2022-11-08 Panamorph, Inc. Near-eye display system


Also Published As

Publication number Publication date
WO2025248234A1 (en) 2025-12-04
GB202407535D0 (en) 2024-07-10
