HK1189059B - Pixel opacity for augmented reality
- Publication number: HK1189059B
- Application number: HK14102105.2A
- Authority: HK (Hong Kong)
- Prior art keywords: image, display panel, virtual image, display, imaging system
Description
Technical Field
The invention relates to imaging technology, in particular to virtual reality technology.
Background
Virtual reality can be viewed as a computer-generated simulated environment in which a user has an apparent physical presence. A virtual reality experience may be generated in 3D and viewed with a Head Mounted Display (HMD), such as glasses or another wearable display device that has near-eye display panels as lenses, to display the virtual reality environment in place of the actual environment. With augmented reality, however, the user can still view the surrounding environment through the display lenses of the glasses or other wearable display device, while also seeing images of virtual objects that are generated for display and appear as part of the environment. Augmented reality can include any type of input, such as audio and tactile inputs, as well as virtual images, graphics, and video that enhance or augment the environment the user experiences. As an emerging technology, augmented reality presents many challenges and design constraints, from generating the virtual objects and images so that they appear realistic in a real environment, to developing optics small and precise enough to be implemented in wearable display devices.
Disclosure of Invention
This summary introduces simplified concepts of pixel opacity for augmented reality, which are further described in the following detailed description and/or illustrated in the accompanying drawings. This summary is not intended to describe essential features of the claimed subject matter, nor is it intended to be used to determine or limit the scope of the claimed subject matter.
Pixel opacity for augmented reality is described. In embodiments, a display lens system includes a first display panel that displays a virtual image generated to appear as part of an environment when viewed through an optical lens. A second display panel displays an ambient image of the environment viewed through the optical lens, and the ambient image includes opaque pixels that form a black outline of the virtual image. The display lens system also includes a beam splitter panel that transmits light of the ambient image and reflects light of the virtual image to form a composite image that appears as the virtual image displayed on the opaque pixels of the ambient image.
In other embodiments, the imaging application generates a virtual image from the virtual image data for display on the first display panel. The imaging application is implemented to correlate the position of the opaque pixels on the second display panel with the display position of the virtual image on the first display panel, and to control the pixel illumination of the second display panel to turn off the opaque pixels or otherwise spatially modulate the second display panel.
Drawings
Embodiments of pixel opacity for augmented reality are described with reference to the following figures. The same numbers may be used throughout to reference like features and components shown in the figures:
FIG. 1 illustrates an example imaging system in which embodiments of pixel opacity for augmented reality can be implemented.
FIGS. 2-4 illustrate additional examples of imaging systems in which embodiments of pixel opacity for augmented reality may be implemented.
FIG. 5 illustrates an example system including an example of a wearable display device in which embodiments of pixel opacity for augmented reality can be implemented.
FIG. 6 illustrates one or more example methods for pixel opacity for augmented reality in accordance with one or more embodiments.
FIG. 7 illustrates components of an example device that can implement embodiments of pixel opacity for augmented reality.
Detailed Description
Embodiments of pixel opacity for augmented reality are described. As described above, one challenge in implementing augmented reality is generating virtual images such that they appear realistic in a real environment when viewed by a user through a wearable display device, such as a head-mounted display (HMD). When a virtual image, such as any type of object, video, text, graphics, etc., is displayed in a real environment, the virtual image appears semi-transparent to the user. Pixel opacity for augmented reality provides a technique for enhancing the contrast of a virtual image relative to an ambient image, so that the virtual image does not appear semi-transparent when it is displayed to appear as part of the environment.
In embodiments, an imaging application is implemented to generate a virtual image for display as part of the surrounding environment when viewed by a user through the left and right display lens systems of a wearable display device. The imaging application is also implemented to correlate the display position of the virtual image with the environment image, and then to extinguish (black out) the pixels correlated with the display position of the virtual image on the display panel that displays the environment image. The extinguished pixels are opaque and form a filled outline, or black outline, of the virtual image in the ambient image. Thus, the virtual image does not appear semi-transparent because there is no illumination from the opaque pixels behind the virtual image.
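For illustration only, the following is a minimal NumPy sketch of the correlation step described above. It assumes the virtual image is available as an RGBA buffer and that the filled outline is derived by thresholding the alpha channel at the virtual image's display position; the function name, buffer layout, and threshold are assumptions made for this sketch rather than the patent's specified implementation.

```python
import numpy as np

def opacity_mask(virtual_rgba: np.ndarray,
                 display_pos: tuple[int, int],
                 panel_shape: tuple[int, int],
                 alpha_threshold: float = 0.5) -> np.ndarray:
    """Return a boolean mask of ambient-panel pixels to extinguish (True = opaque).

    The mask is the filled ("black") outline of the virtual image, registered at
    the virtual image's display position in ambient-panel coordinates.
    """
    mask = np.zeros(panel_shape, dtype=bool)
    h, w = virtual_rgba.shape[:2]
    row, col = display_pos  # top-left corner of the virtual image on the ambient panel
    covered = virtual_rgba[..., 3] > alpha_threshold  # pixels the virtual image occupies
    # Clip to the panel so a partially off-screen virtual image is handled safely.
    r0, c0 = max(row, 0), max(col, 0)
    r1, c1 = min(row + h, panel_shape[0]), min(col + w, panel_shape[1])
    mask[r0:r1, c0:c1] = covered[r0 - row:r1 - row, c0 - col:c1 - col]
    return mask
```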
Although the features and concepts of pixel opacity for augmented reality can be implemented in any number of different devices, systems, environments, and/or configurations, embodiments of pixel opacity for augmented reality are described in the context of the following example devices, systems, and methods.
FIG. 1 illustrates an example imaging system 100 in which embodiments of pixel opacity for augmented reality can be implemented. The example imaging system represents a refractive telescope with one-to-one magnification through which a user can view the surrounding environment, and includes the individual optical lenses of the refractive telescope. The objective lens 102 and the erecting lens 104 project an ambient image 106 onto a display panel 108, such as a transparent LCD (liquid crystal display) or any other type of transmissive or reflective display panel. The imaging system further comprises a display panel 110 for displaying a virtual image 112, which is generated so as to appear as part of the environment when viewed through the optical lenses of the imaging system. The virtual image may be any type of object, video, text, graphics, etc. that is generated for display as part of the environment in an augmented reality implementation.
In various embodiments, the example imaging system 100 may be implemented as left and right display lens systems of a wearable display device such as described with reference to fig. 5. The wearable display device may be implemented as any type of glasses or Head Mounted Display (HMD) including an implementation of an imaging system 100 (e.g., left and right display lens systems) through which a user may view the surrounding environment, but also see virtual images generated for display and rendering as part of the environment. The wearable display device and/or controller unit for the wearable display device implement an imaging application, such as a software application, to implement embodiments of pixel opacity for augmented reality as described herein.
In embodiments, the imaging application generates a virtual image 112 for display on the display panel 110. The imaging application is also implemented to spatially modulate or otherwise individually control pixels of the display panel 108 for pixel-level opacity of those pixels correlated with the display location of the virtual image 112 in the ambient image 106. For example, the display panel 108 may be configured for pixel on-off control to block light, while the imaging application controls pixel illumination to turn off the opaque pixels 114 correlated with the display position of the virtual image. The opaque pixels form a filled outline, or black outline, of the virtual image in the ambient image. In an alternative embodiment, the imaging application may blank out the entire display panel 108 and generate an entirely new visual as the virtual image for display on the display panel 110.
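Continuing the sketch, per-pixel on-off control of the ambient display panel can be modeled as zeroing the masked pixels; the blank_all flag corresponds to the alternative embodiment in which the entire panel is blanked. This is a hypothetical illustration of the behavior described above, not driver-level code for any particular display panel.

```python
import numpy as np

def spatially_modulate(ambient_rgb: np.ndarray,
                       opaque_mask: np.ndarray,
                       blank_all: bool = False) -> np.ndarray:
    """Model pixel on-off control of the ambient display panel (e.g., panel 108).

    Pixels flagged in opaque_mask are turned off so they emit no light, forming
    the black outline behind the virtual image. When blank_all is True, the whole
    panel is blanked, as in the alternative embodiment where an entirely new
    visual is generated as the virtual image.
    """
    if blank_all:
        return np.zeros_like(ambient_rgb)
    out = ambient_rgb.copy()
    out[opaque_mask] = 0  # extinguished (opaque) pixels render black
    return out
```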
The example imaging system 100 includes a beam splitter panel 116 (also commonly referred to as a 50/50 mirror, or polarizing beam splitter). The beam splitter panel is implemented to transmit light of the ambient image 106 and reflect light of the virtual image 112 to form a composite image 120 through eyepiece optics 118, where the composite image 120 appears as the virtual image displayed on the opaque pixels 114 of the ambient image. Because the opaque pixels are extinguished, the virtual image does not appear semi-transparent, but instead appears in the composite image with high contrast relative to the ambient image.
FIG. 2 illustrates another example imaging system 200 in which embodiments of pixel opacity for augmented reality can be implemented. Similar to the example imaging system described with reference to FIG. 1, the example imaging system 200 represents a refractive telescope with a one-to-one magnification through which a user may view the surrounding environment. The example imaging system may also be implemented as left and right display lens systems of a wearable display device such as described with reference to fig. 5.
The example imaging system 200 includes an eyepiece 202 and an erecting lens 204, the erecting lens 204 projecting an ambient image 206 onto a display panel 208, such as a reflective LCOS (liquid crystal on silicon) display, or any other type of reflective display panel (e.g., DLP or DMD). The ambient image is reflected from the beam splitter panel 210 (or polarizing beam splitter panel) onto the display panel 208, and is then reflected from the display panel back through the beam splitter panel 210 to the mirror panel 212. The imaging system further comprises a display panel 214 for displaying a virtual image 216, which is generated so as to appear as part of the environment when viewed through the optical lenses of the imaging system. A second beam splitter panel 218 is implemented to transmit light of the ambient image 206 and reflect light of the virtual image 216 to form a composite image 222 through eyepiece optics 220, where the composite image 222 appears as the virtual image displayed on opaque pixels of the ambient image.
Examples of an environmental image, a virtual image, a composite image, and opaque pixels of the environmental image are described with reference to the example imaging system shown in fig. 1. Further, the imaging application is implemented to spatially modulate or otherwise individually control the pixels of the display panel 208 for pixel-level opacity of those pixels that are correlated to the display location of the virtual image in the ambient image.
FIG. 3 illustrates another example imaging system 300 in which embodiments of pixel opacity for augmented reality can be implemented. This example imaging system includes similar components as described with reference to the imaging system shown in fig. 2, but in a different configuration. The example imaging system 300 may also be implemented as left and right display lens systems of a wearable display device such as described with reference to fig. 5.
The example imaging system 300 includes an eyepiece 302 and an erecting lens 304, the erecting lens 304 projecting an ambient image 306 onto a display panel 308, such as a reflective LCOS (liquid crystal on silicon) display, or any other type of reflective display panel (e.g., DLP or DMD). The ambient image is reflected from the mirror panel 310 onto the display panel 308, and is then reflected from the display panel back to the mirror panel 312. The imaging system further comprises a display panel 314 for displaying a virtual image 316, which is generated so as to appear as part of the environment when viewed through the optical lenses of the imaging system. A beam splitter panel 318 is implemented to transmit light of the ambient image 306 and reflect light of the virtual image 316 to form a composite image 322 through eyepiece optics 320, where the composite image 322 appears as the virtual image displayed on opaque pixels of the ambient image.
Examples of an environmental image, a virtual image, a composite image, and opaque pixels of the environmental image are described with reference to the example imaging system shown in fig. 1. Further, the imaging application may be implemented to spatially modulate or otherwise individually control the pixels of the display panel 308 for pixel-level opacities of those pixels that are correlated to the display location of the virtual image in the ambient image.
FIG. 4 illustrates another example imaging system 400 in which embodiments of pixel opacity for augmented reality can be implemented. This example imaging system is implemented using reflective and refractive optical power components, such as irregular prisms in an inverse configuration. Alternatively, the example imaging system may be implemented using reflective optical power components. The example imaging system 400 may also be implemented as left and right display lens systems of a wearable display device such as described with reference to fig. 5.
The example imaging system 400 includes a first reflective and refractive optical power component 402 that projects an ambient image 404 upward onto a display panel 406, such as a reflective LCOS (liquid crystal on silicon) display, or any other type of reflective display panel (e.g., DLP or DMD). The ambient image is reflected from the first reflective and refractive optical power component 402 to the mirror panel 408, and from the beam splitter panel 410 to the display panel 406. The ambient image is then reflected from the display panel back to the beam splitter panel 410. The imaging system further comprises a display panel 412 for displaying a virtual image 414, which is generated so as to appear as part of the environment when viewed through the reflective and refractive optical power components. The beam splitter panel 410 is implemented to transmit light of the ambient image 404 and to reflect light of the virtual image 414 to the second reflective and refractive optical power component 416 to form a composite image 418, where the composite image 418 appears as the virtual image displayed on opaque pixels of the ambient image. In this example, the first and second optical power components are implemented for total internal reflection (FTIR) in order to reflect light of the ambient image and/or the virtual image.
Examples of an environmental image, a virtual image, a composite image, and opaque pixels of the environmental image are described with reference to the example imaging system shown in fig. 1. Further, the imaging application may be implemented to spatially modulate or otherwise individually control the pixels of display panel 406 for pixel-level opacity of those pixels that are correlated to the display location of the virtual image in the ambient image.
In the example imaging system 400, the display panels 406 and 412, along with the mirror panel 408 and the beam splitter panel 410, form an imaging unit 420 of the imaging system. In various embodiments, the imaging unit 420 may be mounted in and/or integrated into a frame of a wearable display device (such as the wearable display device described with reference to fig. 5). The imaging unit 420 may be integrated into the frame of the wearable display device, or mounted on top of the frame over the display lenses of the wearable display device (also referred to as forehead-mounted). Alternatively, the imaging unit 420 may be integrated into or mounted on the side of the frame of the wearable display device (also referred to as temple-mounted).
FIG. 5 illustrates an example system 500 including an example wearable display device 502 in which embodiments of pixel opacity for augmented reality can be implemented. The wearable display device may be implemented as any type of glasses or Head Mounted Display (HMD) that includes a display lens system 504 (e.g., left and right display lens systems) through which a user may view the surrounding environment, but also see virtual images (e.g., any type of objects, video, text, graphics, etc.) that are generated for display and rendering as part of the environment.
Wearable display device 502 may be implemented as a stand-alone portable system that includes memory, software, a processor, and/or a power source. Alternatively or additionally, the wearable display device may be communicatively connected to a controller 506, which includes any one or combination of memory, software, a processor, and/or a power source (such as a battery unit). The controller may be implemented for wired or wireless communication with the wearable display device. The controller and/or wearable display device may also be implemented with any number and combination of differing components as further described with reference to the exemplary device shown in fig. 7. For example, the controller and/or wearable display device includes an imaging application, such as a software application, implemented as computer-executable instructions and executed by a processor to implement embodiments of pixel opacity for augmented reality described herein.
In embodiments, the controller may be implemented as a dedicated device (e.g., wired controller 506), a mobile phone 508, a tablet or other portable computer device, a gaming system 510, or any other type of electronic device that can be implemented to process and generate virtual images for display as part of the environment viewed through the display lens system of the wearable display device. The controller may communicate wirelessly with the wearable display device via WiFi™, Bluetooth™, infrared (IR), RFID transmission, Wireless Universal Serial Bus (WUSB), cellular, or other wireless communication techniques.
Exemplary system 500 also includes a data server 512 or data service that communicates or otherwise distributes virtual image data 514 to wearable display device 502 over communication network 516. For example, the data server may be part of a network-based gaming system that generates virtual images for augmented reality display at a wearable display device. Alternatively, the data server may be part of a navigation system that delivers navigation guidance and information for display in the display lens system 504 of the wearable display device. In another example, the data server may be part of a messaging service, such as an email or text messaging system, that delivers email and/or text messages to a wearable display device for display in a display lens system, where a user may read the messages as augmented reality images displayed over an environment viewed through the wearable display device.
Any of the devices, servers, and/or services may communicate via a communication network 516, which communication network 516 may be implemented to include wired and/or wireless networks. The communication network may also be implemented using any type of network topology and/or communication protocol, and may be represented or otherwise implemented as a combination of two or more networks, to include an IP-based network and/or the internet. The communication network may also include a mobile carrier network managed by a mobile carrier, such as a communication service provider, a cellular telephone provider, and/or an internet service provider.
Wearable display device 502 includes a frame 518, such as in the form of glasses, goggles, or other structure, that supports and incorporates the various components of the device and acts as a conduit for electrical and other component connections. The component modules 520 (or component modules on the left, right, and/or both sides of the device frame) incorporate any of a variety of components, such as processing and control circuitry, memory, software, a processor, a GPS transceiver, and/or a power supply. The wearable display device may also include a microphone 522 for recording audio data from the surrounding environment, and headphones for audio feedback as part of the augmented reality experience.
Wearable display device 502 also includes various cameras 524 that capture video and still images of the surrounding environment. The images and video may be processed on the device and/or by a controller device (e.g., controller 506) and used to create a mapping field to orient and track the user in the environmental space. The wearable display device may also include an eye tracking camera for determining a user's eye position and tracking eye movement. The wearable display device may also include a temperature sensor, and an inertial sensor for sensing position, orientation, and/or acceleration of the wearable display device.
An example of display lens system 504 is shown from a viewer perspective 526 of wearable display device 502 as if the display lens system were viewed from the top of the device. The display lens system includes an imaging system 528, which imaging system 528 may be implemented with any number of micro-display panels, lenses, and reflective elements to display and project a virtual image onto the see-through and reflective waveguide 530. Display lens system 504 and/or imaging system 528 may be implemented as any of the various imaging systems described with reference to fig. 1-4 above to implement embodiments of pixel opacity for augmented reality. The see-through reflective waveguide 530 is implemented for internal reflection and conducts visible light 532 of a virtual image generated by the imaging unit for viewing by a user, and also passes light 534 from the surrounding environment for viewing by the user.
The micro-display panel, lenses and/or reflective elements of the imaging system 528 may be implemented with various display technologies, such as with a transparent LCD or using a transmissive projection technology, where the light source is modulated by an optically active material and backlit with white light. These techniques can generally be implemented using LCD-type displays with powerful backlights and high optical power densities. Alternatively, the micro-display and/or reflective elements may be implemented using reflective technologies, such as Digital Light Processing (DLP) and Liquid Crystal On Silicon (LCOS), which reflect external light that is reflected and modulated by the optical material.
In an embodiment, the imaging system 528 (or other components of the display lens system 504) may be implemented to include an infrared (IR) laser used for system calibration and/or as an illumination source for an eye tracking system and camera that track the position of the user's eyes. The eye tracking system includes an eye tracking illumination source (not visible light) and an eye tracking IR sensor. The IR sensor may be implemented as an IR camera that provides infrared image data of the eye for eye tracking processing, or as an IR sensor that detects eye reflections when the eye is illuminated. The see-through and reflective waveguide 530 may also be used for the infrared illumination and for the eye reflections that the eye tracking system uses to track the position of the user's eyes.
In this example, display lens system 504 includes an optional opacity filter 536 and a see-through lens 538 on each side of the waveguide 530. The see-through lenses may be standard spectacle lenses and may be made to prescription (or not). The opacity filter selectively blocks natural light from passing through the see-through and reflective waveguide, either uniformly or on a per-pixel basis, to enhance the contrast of the displayed virtual image.
An example method 600 in accordance with one or more embodiments of pixel opacity for augmented reality is described with reference to fig. 6. Generally, any of the services, functions, methods, procedures, components, and modules described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or any combination thereof. A software implementation represents program code that performs specified tasks when executed by a computer processor. The example methods may be described in the general context of computer-executable instructions, which may include software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and the like. The program code can be stored in one or more computer-readable storage media devices that are local and/or remote to a computer processor. The method may also be practiced in a distributed computing environment by multiple computer devices. Furthermore, the features described herein are platform-independent and may be implemented on a variety of computing platforms having a variety of processors.
Fig. 6 illustrates one or more example methods (600) for pixel opacity for augmented reality. The order in which the method blocks are described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternate method.
At block 602, a virtual image is displayed on a first display panel. For example, the display panel 110 (fig. 1) displays a virtual image 112, the virtual image 112 being generated to appear as part of the environment when viewed through the optical lenses of the imaging system 100. At block 604, an environment image of the environment is displayed on a second display panel. For example, the display panel 108 displays the ambient image 106, the ambient image 106 including opaque pixels 114 forming a black outline of the virtual image. In embodiments, imaging application 720 (fig. 7) generates a virtual image for display as appearing as part of the surrounding environment when viewed by a user through the left and right display lens systems of wearable display device 502 (fig. 5).
The position of the opaque pixels on the second display panel is correlated with the display position of the virtual image on the first display panel at block 606, and the pixel illumination of the second display panel is controlled at block 608. For example, the imaging application 720 spatially modulates or otherwise individually controls the pixels of the display panel 108 for pixel-level opacity of those pixels associated with the display location of the virtual image 112 in the ambient image 106. The display panel 108 may be configured for pixel on-off control to block light, while the imaging application controls pixel illumination to turn off the opaque pixels 114 associated with the display position of the virtual image.
At block 610, light of the ambient image is transmitted through eyepiece optics, and at block 612, light of the virtual image is reflected through the eyepiece optics. At block 614, a composite image is formed that appears as the virtual image displayed on the opaque pixels of the ambient image. For example, the beam splitter panel 116 transmits light of the environment image 106 (e.g., illuminated from the display panel 108) and reflects light of the virtual image 112 (e.g., illuminated from the display panel 110) through the eyepiece optics 118 to form a composite image 120, which appears as the virtual image 112 displayed on the opaque pixels 114 of the environment image. Because the opaque pixels in the ambient image are extinguished, the virtual image does not appear semi-transparent, but instead appears in the composite image with high contrast relative to the ambient image.
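To tie blocks 602-614 together, the sketch below approximates the composite image by additively superimposing the ambient-panel light and the reflected virtual-image light; modeling the beam splitter as a simple addition is an illustrative assumption, not a radiometric model of the optics. It shows why the virtual image bleeds through and looks semi-transparent when the ambient pixels behind it stay lit, and why extinguishing them yields the opaque, high-contrast result.

```python
import numpy as np

def composite(ambient_rgb: np.ndarray,
              virtual_rgba: np.ndarray,
              display_pos: tuple[int, int],
              use_opaque_pixels: bool = True) -> np.ndarray:
    """Approximate the composite image formed through the eyepiece optics."""
    out = ambient_rgb.astype(np.float32)
    h, w = virtual_rgba.shape[:2]
    row, col = display_pos
    alpha = virtual_rgba[..., 3:4]
    region = out[row:row + h, col:col + w]   # view into the ambient image
    if use_opaque_pixels:
        # Blocks 606/608: extinguish ambient-panel pixels behind the virtual image.
        region[alpha[..., 0] > 0.5] = 0.0
    # Blocks 610-614: transmitted ambient light and reflected virtual-image light
    # superimpose at the beam splitter (modeled here as a simple addition).
    region += virtual_rgba[..., :3] * alpha
    return np.clip(out, 0.0, 1.0)

# Without opaque pixels the ambient light adds to the virtual image, so it looks
# washed out; with them, the virtual image is displayed on black and keeps contrast.
ambient = np.full((480, 640, 3), 0.6, dtype=np.float32)   # bright surrounding scene
virtual = np.zeros((100, 100, 4), dtype=np.float32)
virtual[..., 2] = 0.8                                      # blue square ...
virtual[..., 3] = 1.0                                      # ... fully opaque
washed_out = composite(ambient, virtual, (200, 300), use_opaque_pixels=False)
high_contrast = composite(ambient, virtual, (200, 300), use_opaque_pixels=True)
```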
Fig. 7 illustrates various components of an example device 700, which may be implemented as any of the devices described with reference to the previous figs. 1-6, such as a wearable display device and/or a controller for a wearable display device. In various embodiments, the device may be implemented as any one or combination of a stationary or mobile device in any form of a consumer device, computer device, portable device, communication device, telephone device, navigation device, appliance device, gaming device, media playback device, and/or electronic device. The device may also be associated with a user (i.e., a person) and/or an entity that operates the device, such that a device describes a logical device that includes users, software, firmware, hardware, and/or a combination of devices.
The device 700 includes a communication device 702 that enables wired and/or wireless communication of device data 704, such as virtual image data as well as video and image data, and other media content stored on the device. The media content stored on the device may include any type of audio, video, and/or image data. The device includes one or more data inputs 706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs as well as any other type of audio, video, and/or image data received from any content and/or data source.
The device 700 also includes communication interfaces 708, such as any one or more of a serial, parallel, network, or wireless interface. The communication interfaces provide a connection and/or communication links between the device and a communication network by which other electronic, computing, and communication devices communicate data with the device.
The device 700 includes one or more processors 710 (e.g., any of microprocessors, controllers, and the like) which process computer-executable instructions to control the operation of the device. Alternatively or in addition, the device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 712. Although not shown, the device may include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Device 700 also includes one or more memory devices 714 (e.g., computer-readable storage media) that allow for storage of data, such as Random Access Memory (RAM), non-volatile memory (e.g., Read Only Memory (ROM), flash memory, etc.), and disk storage devices. A disk storage device can be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable disk, and the like. The device may also include a mass storage media device. Computer readable storage media can be any available medium or media that can be accessed by a computing device.
The memory device 714 provides data storage mechanisms to store the device data 704, other types of information and/or data, and device applications 716. For example, an operating system 718 can be maintained as a software application in the memory device and executed on the processors. The device applications can also include a device manager or controller, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. In this example, the device applications also include an imaging application 720 that implements embodiments of pixel opacity for augmented reality as described herein.
Device 700 can also include an audio and/or video processing system 722 that generates audio data for an audio system 724 and/or generates display data for a display system 726. In implementations, the audio system and/or the display system are external components of the device. Alternatively, the audio system and/or the display system are integrated components of the example device.
Although embodiments of pixel opacity for augmented reality have been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of pixel opacity for augmented reality.
Claims (10)
1. An imaging system, comprising:
a first display panel configured to display a virtual image, the virtual image being generated so as to appear as part of an environment when viewed through an optical lens, the optical lens being configured to implement a refractive telescope comprising an objective lens and an erecting lens;
a second display panel configured to display an ambient image of the environment viewed through the optical lens, the ambient image including opaque pixels forming a black outline of the virtual image;
a first reflective and refractive optical power component configured to project the ambient image onto the second display panel;
a beam splitter panel configured to transmit light of the ambient image and reflect light of the virtual image to form a composite image that appears as the virtual image displayed on opaque pixels of the ambient image; and
a second reflective and refractive optical power component configured in reverse to the first reflective and refractive optical power component, configured to reflect the composite image for viewing.
2. The imaging system of claim 1, wherein the optical lens implements a refractive telescope having a one-to-one magnification, the refractive telescope configured to form the environmental image of the environment as viewed through the refractive telescope.
3. The imaging system of claim 1, wherein the virtual image appears in the composite image to have a high contrast relative to the environmental image.
4. The imaging system of claim 1, further comprising eyepiece optics through which the composite image is formed for viewing.
5. The imaging system of claim 1, further comprising a camera configured to capture the environmental image for pixelated display on the second display panel.
6. The imaging system of claim 1, wherein the second display panel is one of a transparent LCD or a reflective LCOS, the second display panel including pixels configured for on-off control, wherein the opaque pixels are turned off.
7. The imaging system of claim 1, further comprising an imaging application configured to spatially modulate the second display panel to correlate the location of the opaque pixels with the display location of the virtual image on the first display panel.
8. The imaging system of claim 1, further comprising an imaging application configured to:
generating the virtual image from virtual image data for display on the first display panel;
correlating the position of the opaque pixels on the second display panel with the display position of the virtual image on the first display panel; and
controlling pixel illumination of the second display panel to turn off the opaque pixels.
9. An imaging method, comprising:
displaying a virtual image on a first display panel, the virtual image being generated so as to appear as part of an environment when viewed through an optical lens configured to implement a refractive telescope comprising an objective lens and an erecting lens;
displaying an ambient image of the environment on a second display panel, the ambient image comprising opaque pixels forming a black outline of the virtual image;
projecting, by a first reflective and refractive optical power component, the ambient image onto the second display panel; forming a composite image that appears as the virtual image displayed on opaque pixels of the ambient image; and
reflecting the composite image for viewing by a second reflective and refractive optical power component configured in reverse to the first reflective and refractive optical power component.
10. The method of claim 9, wherein forming the composite image comprises:
transmitting light of the ambient image through eyepiece optics; and reflecting light of the virtual image through the eyepiece optics.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/336,873 | 2011-12-23 | | |
| US13/336,873 US9223138B2 (en) | 2011-12-23 | 2011-12-23 | Pixel opacity for augmented reality |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1189059A1 HK1189059A1 (en) | 2014-05-23 |
| HK1189059B true HK1189059B (en) | 2016-12-09 |
Similar Documents
| Publication | Title |
|---|---|
| US9223138B2 (en) | Pixel opacity for augmented reality |
| US8810600B2 (en) | Wearable display device calibration |
| US8638498B2 (en) | Eyebox adjustment for interpupillary distance |
| US8989535B2 (en) | Multiple waveguide imaging structure |
| US9297996B2 (en) | Laser illumination scanning |
| US8917453B2 (en) | Reflective array waveguide |
| US10063846B2 (en) | Selective illumination of a region within a field of view |
| US10502876B2 (en) | Waveguide optics focus elements |
| US9674436B2 (en) | Selective imaging zones of an imaging sensor |
| US9151984B2 (en) | Active reflective surfaces |
| CN104871068B (en) | Automatic stereo augmented reality display |
| US10274731B2 (en) | Optical see-through near-eye display using point light source backlight |
| US20230341683A1 (en) | Near eye 3d display with separate phase and amplitude modulators |
| CN112236711A (en) | Apparatus and method for image display |
| CN110967828B (en) | Display system and head-mounted display device |
| HK1189059B (en) | Pixel opacity for augmented reality |
| Kiyokawa | Occlusion displays |
| HK1184231A (en) | Reflective array waveguide |