CA2388766A1 - Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames
Info
- Publication number: CA2388766A1
- Authority: CA (Canada)
- Prior art keywords: wearer, media device, visual media, lens
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G02C 11/00 — Non-optical adjuncts; Attachment thereof
- G02B 27/0172 — Head-up displays; head mounted, characterised by optical features
- G02B 2027/0123 — Head-up displays characterised by optical features comprising devices increasing the field of view
- G02B 2027/0138 — Head-up displays characterised by optical features comprising image capture systems, e.g. camera
- G02B 2027/0178 — Head-up displays; head mounted; eyeglass type
- G02B 6/00 — Light guides; structural details of arrangements comprising light guides and other optical elements, e.g. couplings
Abstract
A wearable video or information display or fixed display or aimer is mounted in a portion of an eyeglass concealment element such as an eyeglass frame or similar structure that passes within a foveally viewable region of the wearer's field of view.
The concealment element is one which is ordinarily not considered to be transparent or viewable, e.g. not ordinarily considered to be something that a wearer would look into or at or through. In some embodiments the concealment element is an eyeglass frame, because eyeglass frames are ordinarily not elements that we look through, or at, or into. The concealment element is for being present within a foveally directable portion of the wearer's field of view, so that the wearer can look directly at, into, or through the concealment member. In one embodiment the concealer is an eyeglass frame, consisting of a thin horizontal metal band that passes directly in front of the wearer's eyes. A single two-eyed eyeglass lens, or two eyeglass lenses borne by the band can be functional (for a wearer's prescription) or merely decorative, to visually justify the existence of the metal band, making it appear to take on the role of a mere physical support for the eyeglass lens(es). In another embodiment, plastic frames having viewability or transparency in some spectral band are used. In the preferred embodiment, one, two, or four eyeglass lenses are mounted to or borne by the frame in such a manner as to extend above, as well as below, the frame, so that the appearance of one or two eyeglass lenses borne above and below the central concealer (e.g. front portion of the frame) is attained.
Description
Patent Application of W. Steve G. Mann for EYEGLASS FRAMES BASED VISUAL INTERMEDIARY OR
EYEGLASSES WITH OPERATIONALLY, ACTUALLY, OR
COMPUTATIONALLY, TRANSPARENT OR VIEWABLE FRAMES
of which the following is a specification:
FIELD OF THE INVENTION
The present invention pertains generally to a wearable video or information display apparatus or aimer that provides the wearer with a computer data or video display, or aimer, concealed in what appear to others to look like ordinary eyeglasses.
BACKGROUND OF THE INVENTION
There are many uses for a normal-looking computer screen, or display, or even just an aimer. A clock built into eyeglasses can tell us the time. A pager in eyeglasses can allow us to see messages without taking our eyes off our business associates.
Being able to see messages without breaking eye to eye contact with other people can help those suffering from memory disability, because an appointment, book, scheduler, or name prompter can appear in view.
Even an aimer (crosshairs, reticle, graticule, or the like) can help persons with Parkinson's disease decide the next step to take. A simple aimer, even just a horizontal line drawn on top of reality, can help, and if it can be done in eyeglasses that look normal, it can help all the more. Like hearing aids, such devices should be subtle (not calling attention to themselves).
Aimers are also useful in photography and movie and video production, where it is desirable to capture events in a natural manner with minimal intervention and disturbance. Current state-of-the-art photographic or video apparatus, even in its most simple "point and click" form, creates a visual disturbance to others and attracts considerable attention on account of the gesture of bringing the camera up to the eye. Even if the size of the camera could be reduced to the point of being negligible (e.g. no bigger than the eyecup of a typical camera viewfinder, for example), the very gesture of holding a display device up to, or bringing a display device up to the eye is unnatural and attracts considerable attention, especially in corrupt establishments such as certain gambling casinos, pawnbroker jewellery stores, or criminal-owned establishments where photography is often prohibited.
Although there exist a variety of unobtrusive cameras, such as a camera concealed beneath the jewel of a necktie clip, cameras concealed in baseball caps, and cameras concealed in eyeglasses, these cameras tend to produce inferior images, not just because of the technical limitations imposed by their small size, but, more importantly, because they lack a means of accurately determining which objects in the scene are within the field of view of the camera, so as to aim the camera for obtaining a picture having good photographic or videographic composition, and because they do not capture exactly what the wearer is looking at. Because of the lack of a viewfinder, investigative video and photojournalism made with such cameras suffers from poor composition. Accordingly, such covert cameras are often fitted with very wide angle lenses so that the poor aim will not result in missing important subject matter. As a result of these wide angle lenses, details in the scene are typically much more poorly rendered than they would be if a normal or tele lens were used. Moreover, there is a discrepancy between the location of the camera and the wearer's eye, so that these cameras fail to capture a true first-person perspective, especially at close range, such as when the wearer looks into a microscope or telescope and the camera fails to record this experience.
Traditional camera viewfinders often include the ability to overlay virtual objects, such as camera shutter speed, or the like, on top of reality, as described in U.S. Pat. No. 5,664,244, which describes a viewfinder with additional information display capability. In this case, it is desired that both the virtual objects that exist within the viewfinder and the real objects that exist beyond the viewfinder appear in sharp focus.
Computer information displays may be used for simple tasks like reading messages while walking around, up to and including more advanced concepts such as Mediated Reality to assist the visually impaired, for electronic newsgathering, personal safety (evidence collection) and the like.
Mixed Reality exists in many forms along a continuum from Augmented Reality (reality enhanced by graphics, such as Sutherland's work from more than 30 years ago) to more recent efforts at Augmented Virtuality (graphics enhanced by reality, graphics enhanced by video, etc.). Mixed Reality provides numerous ways to add together (mix) various proportions of real and virtual worlds. However, Mediated Reality is an even older tradition, introduced by Stratton more than 100 years ago.
Stratton presented two important ideas: (1) the idea of constructing special eyeglasses to modify how he saw the world; and (2) the ecologically motivated approach to conducting his experiments within the domain of his own everyday personal life.
Stratton's seminal work has inspired a wide variety of devices that can be used to augment, deliberately diminish, or otherwise alter human visual perception.
Problems with the existing taxonomies (e.g. mixed reality) and distinctions (e.g. optical versus video see-through) arise when we consider reality-modifying devices. In this case, it is necessary to use the expanded taxonomy of mediated reality, and the notion of illusory transparency rather than video transparency (e.g. a more general framework is needed).
The need for various new designs for reality-modifying devices, especially those used to modify only a portion of the visual reality stream, suitable for use in everyday life, is becoming more and more evident.
In 1989, Jaron Lanier, CEO of VPL, coined the term "virtual reality" to bring a wide variety of virtual projects under a single rubric. Tom Caudell coined the term "augmented reality" at Boeing in the early 1990s, while working together with David Mizell, researching ways to superimpose diagrams and markings to guide workers on a factory floor. Augmented reality, involving superposition of computer graphics onto a view of the real world, was initially proposed and explored by Ivan Sutherland and his students at Harvard University and University of Utah, in the 1960s.
Augmented reality is generally defined as computer displays that add virtual information to a person's perception of the real world. One way to add virtual information to our visual field of view is to display the information on some kind of beamsplitter or other similar device, and thus use the human eye itself as the adder that achieves the superposition of the real and virtual information.
When the real world, as well as the virtual world, is to be experienced electronically (e.g. by seeing the real world through a camera), the virtual world and real world are blended together using well-known methods. Such superposition of video signals is commonly used in broadcast television, for example, to overlay virtual information (closed captioning, title credits, weather information, etc.) from a computer onto a broadcast video signal from a television camera. This superposition is typically achieved by way of a device called a "video mixer" that adds two (or more) video signals. In fact, broadcast television video signals are amplitude modulated (AM) because one of the original reasons for selecting AM (in addition to reduced spectral requirements, etc.) was to simplify the design of video mixers, so that simple adjustable voltage dividers (potentiometers) could be used to blend two or more video signals. AM broadcasts allow for easy crossfades between two received video signals.
The concept of illusory transparency continues to apply to systems that do not involve video. Examples of illusory transparency include computational see-through by way of laser EyeTap devices, as well as optical computing to produce a similar illusory transparency under computer program control. With optical computing, the concept of illusory transparency continues to provide a distinction between devices that can modify (not just add to) our visual perception and devices that cannot.
Reality-modifying devices that do not involve video therefore problematize the notion of video see-through.
Consider audio as a simple example of illusory transparency that is clearly not an example of video see-through. In a small classroom, a professor will often speak to the students and augment a lecture with some virtual (e.g. computer-generated) audiovisual materials, using small speakers connected to a laptop computer, so that the students hear the virtual world of the computer through the speakers while they hear the professor's narration directly. The summation of the real (narration) and the virtual takes place inside the ear. However, in a large auditorium, the professor may use a large public address system, in which case the professor's voice, as typically received by a microphone, is mixed with the audio output of the computer, by way of an audio mixer.
(An audio mixer is a device that adds together a variety of different audio inputs to produce a weighted sum of the inputs for use in a public address system, broadcast, or audio recording. Typically each input is multiplied by an adjustable constant prior to the addition, and these weighting constants are each adjusted by a separate volume control. Alternatively, mixers used for only two inputs (disco mixers used by disc jockeys, as well as many video mixers) often have a single slider to provide a crossfade from one input to the other, allowing the user to select any blend of the two inputs.) Such an example can also be applied to headphones that have built-in microphones to allow the wearer to hear reality mixed in with some virtuality, using this concept of illusory transparency applied to audio signals.
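By way of illustration, the weighted-sum mixing and single-slider crossfade described above can be sketched in a few lines of Python; the sample values, weights, and slider position are arbitrary illustrative assumptions, and the same arithmetic applies whether the samples are audio amplitudes or video pixel values.

```python
# Minimal sketch of a signal mixer: a weighted sum of several inputs, and a
# single-slider crossfade between two inputs (all values are illustrative).

def mix(inputs, weights):
    """Weighted sum of input samples, one adjustable gain per input."""
    return sum(w * x for w, x in zip(weights, inputs))

def crossfade(a, b, slider):
    """Single-slider blend: slider=0 gives all of a, slider=1 gives all of b."""
    return (1.0 - slider) * a + slider * b

# Blend a "real" sample with a "virtual" sample, 30% virtual.
real, virtual = 0.80, 0.25
print(mix([real, virtual], [0.7, 0.3]))   # 0.635
print(crossfade(real, virtual, 0.3))      # 0.635 (the same blend)
```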
Although addition is the correct operation for audio signals, it has been conclusively shown that the addition of video signals, in augmented reality, is fundamentally flawed, because cameras operate approximately logarithmically and displays (including most head mounted displays) operate anti-logarithmically. Thus video mixers effectively perform something that is closer to multiplication than to addition, because they operate on approximately the logarithms of the underlying true physical quantities of light. Accordingly, antihomomorphic filters have been proposed for augmented reality, wherein this undesirable effect is cancelled.
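A minimal sketch of this idea follows, assuming a simple power-law (gamma) response as a stand-in for the true camera and display response curves; the gamma value and pixel values are illustrative assumptions. The naive mixer adds the nonlinear pixel values (effectively adding logarithm-like quantities), whereas the antihomomorphic mixer first estimates the underlying quantities of light, adds those, and then re-encodes the result.

```python
# Sketch of antihomomorphic (lightspace) mixing versus naive pixel mixing,
# assuming a power-law response with gamma = 2.2 as a simple response model.

GAMMA = 2.2

def to_light(pixel):
    """Inverse response: pixel value -> estimated quantity of light."""
    return pixel ** GAMMA

def to_pixel(light):
    """Forward response: quantity of light -> pixel value."""
    return light ** (1.0 / GAMMA)

def naive_mix(a, b, wa=0.5, wb=0.5):
    return wa * a + wb * b                                 # adds nonlinear values

def antihomomorphic_mix(a, b, wa=0.5, wb=0.5):
    return to_pixel(wa * to_light(a) + wb * to_light(b))   # adds light itself

print(naive_mix(0.9, 0.1))              # 0.5
print(antihomomorphic_mix(0.9, 0.1))    # ~0.66, closer to how light actually adds
```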
In the theory of radio receivers, the concept of a mixer is typically extended to include multipliers, and other more general nonlinear elements such as signal diodes (detectors) that form the essence of the mixer stage of a superheterodyne radio receiver, to give rise to an additivity of frequency spectral components. Thus the term mixer is commonly used to describe signal adders as well as signal multipliers.
Mixed reality covers a continuum between graphics enhanced by video and video enhanced by graphics. However, there are important categories of visual information processors that do not fit within this taxonomy or continuum.
One of the earliest and most important such examples was the eyeglasses George Stratton built and wore. His eyewear, made from two lenses of equal focal length, spaced two focal lengths apart, was basically an inverting telescope with unity magnification, so that he saw the world upside-down.
These eyeglasses, in many ways, actually diminished his perception of reality.
His deliberately diminished perception of reality was neither graphics enhanced by video, nor was it video enhanced by graphics, nor any linear combination of these two. (Moreover it was an example of optical see-through that is not an example of registered illusory transparency, e.g. it problematizes the notion of the optical see-through concept because both the mediation zone, as well as the space around it, are examples of optical-only processing.) Others (Kohler, Dolezal, etc.) followed in Stratton's footsteps by living their day-to-day lives (eating, swimming, cycling, etc.) through left-right reversing eyeglasses, prisms, and other optics that were neither examples of augmented reality nor augmented virtuality (and for which the optical see-through versus video see-through distinction fails to properly address important similarities and differences among various such devices).
Thus a more general concept than that of mixed reality is needed.
Another class of devices that suggest a need for a more general concept are systems that provide synthetic synesthesia. Synesthesia is an empairment that affects many individuals naturally, but can be deliberately synthesized for anyone, using special devices that accept input from one (or more) senses and translate it to an output to one or more other senses.
Here the word "empairment" is used because it is not agreed upon whether synesthesia is an impairment or an empowerment. There are mixed feelings in the community as to whether naturally occurring synesthesia is an asset or a detriment.
Examples of devices for providing synthetic synesthesia include the vision substitution devices used by the visually impaired, such as Leslie Kay's SonicVisioN product, which converts sonar (reflected ultrasound) to audible sound, and Peter Meijer's "vOICe" program.
Because of the existence of a broad range of devices that modify human perception, mediated reality has been proposed: a more general framework that includes the reality-virtuality continuum, as well as devices for modifying, as well as mixing, these various aspects of reality and virtuality.
Mediated Reality, therefore, refers to a general framework for artificial modification of human perception by way of devices for augmenting, deliberately diminishing, and more generally, for otherwise altering sensory input.
SUMMARY OF THE INVENTION
The eyeglass concealer-as-intermediary devices of the invention use a concealer such as eyeglass frames, sticker, pattern, or the like, that has an external actual or apparent opacity, as a mediating or intermediary element for being within a field of view of the wearer, such that the concealer bisects the wearer's field of view to define areas that the wearer can see both above and below the concealer.
In one embodiment, the concealer contains at least one camera system or similar imaging system or sensor, for being effectively at an eye of a wearer. In some embodiments, the camera is connected to a display borne by the apparatus of the invention.
In another embodiment, a display is concealed in or behind the concealer, where the concealer is intermediate between at least one eye of the wearer, and subject matter that remains at least partially within a field of view of the wearer.
In some embodiments a camera, borne by the apparatus of the invention, supplies a video signal to the display.
In a third embodiment, the intermediary concealer contains both camera and display, to provide a reality mediator within the intermediary.
In any of the above five (three main) embodiments, the concealer may consist of eyeglass frames that pass into or near a central region of the wearer's visual field of view.
A new kind of reality mediator that uses the eyeglass frames themselves as the mediating element is thus possible. This design makes it impossible for those other than the wearer, even upon very close inspection, to determine whether or not the eyeglasses contain a reality mediator. Such designs make possible lifelong experiments and mediated reality experiences in personal everyday life, without the social stigma associated with obvious reality-modifying devices.
Since the 1970s the author has been exploring electronically mediated environments using body-borne computers. These explorations in Computer Mediated Reality were an attempt at creating a new way of experiencing the perceptual world, using a variety of different kinds of sensors, transducers, and other body-borne devices controlled by a wearable computer, as described in Intelligent Image Processing, published by John Wiley and Sons (S. Mann, November 2001).
Early on, the author recognized the utility of computer mediated perception, such as the ability to see in different spectral bands and to share this computer mediated vision with remote experts in real time.
Such devices can be used to modify the visual perception of reality within certain mediation zones (e.g. only one eye rather than both eyes, or only a portion of that eye), giving rise to partially mediated reality. Moreover these devices can also be worn with prescription eyeglass lenses, or even have prescription eyeglass lenses incorporated into the design. Prescription eyeglasses are themselves partial reality mediators, having a peripheral zone, a transition zone (frames or lens edges), and one or more mediation zones (one or more lenses, or lens zones as in bifocal, trifocal, etc., lenses).
In this way the mediated reality framework can be used to describe many layers of mediation.
This general framework can, for example, describe computer mediated communications, such as when a community of users modify each other's visual perception of reality as a way of communicating with each other (e.g. a spouse remotely scribbling a long nose on the face of a used car salesman, or an experience which the author actually had in a cafe, when a visitor to the author's web site modified the face of a police officer and sent the modified result back into the author's visual reality stream).
It can also, for example, describe how biofeedback-controlled night vision goggles are seen through the wearer's own corrective eyewear or contact lenses.
Mediated Reality has much of its origins in ecologically based practice (in the sense of taking place in the real world, within the context of day-to-day life, with ecological validity as a special case of external validity).
For example, an important element of Stratton's work was that he wore the device in his ordinary everyday life. If performed on other subjects, such work might have far outstripped the ability of university ethics committees, the protocols required of "informed consent", and the tendency for many academics to work within labs, controlled spaces, and the existing literature.
Unlike traditional scientific experiments that take place in a controlled lab-like setting (and therefore do not always translate well into the real world), Stratton's approach required a continuous rather than intermittent commitment. For example, he would remove the eyewear only to bathe or sleep, and he even kept his eyes closed during bathing, to ensure that no un-mediated light from the outside world could get into his eyes directly. This work involved a commitment on his part, to devote his very existence - his personal life - to science.
Stratton captured a certain important human element in his broad seminal work, which laid the foundation for others to later do carefully controlled lab experiments within narrower academic disciplines. Moreover, his approach was one that broke down the boundaries between work and leisure activity, as well as the boundaries between the laboratory and the real world.
Similarly, it is desired to integrate computer mediated reality into daily life. The past 22 years of wearing computerized reality mediators in everyday life has provided the author with some insight into some of the sociological factors such as how others react to such devices. In particular, this has given rise to a desire to design and build reality mediators that do not have an unusual appearance.
Typical virtual reality headsets, and other cumbersome devices, are not well suited to ordinary everyday life because of their bulky, constrained, and tethered operation, as well as their unusual appearance.
Indeed, it is preferable that commonly used reality mediators, such as hearing aids and personal eyeglasses, have an unobtrusive (or hidden) appearance, or be designed to be sleek, slender, and fashionable.
EyeTap devices, such as an EyeTap infrared night vision computer system, and 1970s or 1980s style wearable computers with visible components, tend to have a frightening appearance. For example, the "glass eye" effect, in which optics projected to the center of projection of a lens of the tapped eye are visible to others, is often disturbing to some individuals when they look a cyborg in the eye.
1990s embodiments were typically covert; building the systems into dark glasses tends to mitigate this undesirable social effect. The eyeglass lenses are also transparent in the infrared, allowing the night vision portion of the apparatus to take over in low light, without loss of gain. In the visible portion of the spectrum, a 10 dB loss of sensitivity is typically incurred to conceal the color sensor elements and optics.
The author's wearable computer reality mediators have evolved from headsets of the 1970s, to eyeglasses with optics outside the glasses in the 1980s, to eyeglasses with the optics built inside the glasses in the 1990s.
Presently, however, in the context of the present invention, it is desired to have eyeglasses with mediation zones built into the frames, lens edges, or the cut lines of bifocal lenses (e.g. exit pupil and associated optics concealed by the transition regions between frames and surroundings, etc.).
Reality mediators that have the capability to measure and resynthesize electromagnetic energy that would otherwise pass through the center of projection of a lens of an eye of a user are referred to as EyeTap devices. These devices divert at least a portion of eyeward bound light into a measurement system that measures how much light would have entered the eye in the absence of the device. Some EyeTap devices use a focus control to reconstruct light in a depth plane that moves to follow subject matter of interest. Others reconstruct light in a wide range of depth planes, in some cases having infinite or near infinite depth of field.
The present invention covers both EyeTaps as well as mere information displays that do not necessarily have or involve any means of analysis or measurement or responsiveness to light in the environment.
In embodiments of the present invention, a partial reality mediator is constructed within the transition zone of eyeglasses, eyeglass lenses, frames, fake stickers (like "UV 100%"), and the like.
In the preferred embodiment, transition elements are eyeglass frames that pass into a visually directable portion of the wearer's field of view (e.g. a portion to which the wearer can direct visual attention).
Typical EyeTap eyeglasses are custom made for each wearer, and therefore fit very accurately, making it possible to have a zero size, or near zero size exit pupil, so that an aremac functions nearly like a pinhole camera in reverse (e.g. having infinite depth of displayed focus). Especially in the context of lower resolution displays (e.g.
quarter VGA), such an arrangement has been shown to be useful.
The concept of the aremac is well known in the art (Intelligent Image Processing, Wiley 2001).
Moreover, with the aremac, depth of focus can be controlled or enhanced, either by the end-user (by adjusting an iris on the aremac), under computer program control (automatically adjusting the iris in response to information obtained by the camera's autofocus system and knowledge of the depth of field of the scene), or in manufacture (e.g. simply including a small enough aperture stop that it need not be adjusted). A
small aperture stop, installed during manufacture, can be so made that it provides depth of focus equal to the depth of focus of the eye (e.g. so that the aremac presents an image that appears sharp when the eye is focused anywhere from 4 inches to infinity, which is the normal range of a healthy eye).
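The order of magnitude of such an aperture stop can be estimated with a rough geometric-optics sketch, under assumptions that are illustrative rather than taken from this disclosure: the exit pupil acts as the limiting aperture, the eye resolves roughly one arcminute, and the displayed focus is balanced between the 4 inch near point and infinity.

```python
# Rough estimate of the exit pupil (aperture stop) diameter that keeps the
# blur below the eye's resolution over a 4-inch-to-infinity range
# (illustrative assumptions: 1 arcminute eye resolution, focus balanced at
# twice the near point, simple geometric blur model theta ~ a*|1/d - 1/d0|).

import math

EYE_RESOLUTION = math.radians(1.0 / 60.0)   # ~2.9e-4 rad (about 1 arcminute)
d_near = 4 * 0.0254                         # 4 inches, in metres

# With focus set at 2*d_near, points at d_near and at infinity are blurred by
# the same angle, roughly a / (2*d_near) for an aperture of diameter a.
a_max = 2 * d_near * EYE_RESOLUTION
print(f"maximum exit pupil diameter ~ {a_max * 1e6:.0f} micrometres")   # ~59 um
```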
Many modern miniature surveillance cameras have very small apertures, and thus provide essentially unlimited depth of focus. Pinhole cameras (which were traditionally only of historical or artistic interest for use with film) have made a comeback in the world of covert video surveillance. Pinhole cameras provide essentially infinite depth of focus. Because video resolution is so much lower than film resolution, the diffraction effects of the pinhole are less evident. Similarly, in applications where NTSC resolution is sufficient (e.g. for providing a low to moderate resolution reality mediator based on a small pinhole camera, for a computer mediated reality or partially mediated reality experience based on imagery that originated from NTSC
resolution image capture), the pinhole aremac is very well suited.
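The point that diffraction is less evident at video resolution can be checked with a back-of-envelope comparison; the wavelength, pinhole diameter, and field of view used below are assumed illustrative values rather than figures from this disclosure.

```python
# Compare the diffraction blur of a small pinhole with the angular size of
# one NTSC scan line (all numbers are illustrative assumptions).

import math

wavelength = 550e-9      # green light, metres
pinhole    = 0.3e-3      # 0.3 mm pinhole diameter, metres
fov_deg    = 40.0        # assumed field of view covered by the image
ntsc_lines = 480         # active NTSC lines

diffraction_blur = 1.22 * wavelength / pinhole          # ~2.2e-3 rad
line_angle       = math.radians(fov_deg) / ntsc_lines   # ~1.5e-3 rad

print(f"diffraction blur : {diffraction_blur * 1e3:.2f} mrad")
print(f"one NTSC line    : {line_angle * 1e3:.2f} mrad")
# The two are of the same order, so the pinhole's diffraction is largely
# hidden by the video format, whereas film resolution would reveal it.
```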
The pinhole aremac allows for ultra miniaturization of the devices, using solid state lasers, and miniature optics.
However, even a very small optical element, when placed within the open area of an eyeglass lens, looks unusual. Thus eyeglasses having display optics embedded in an eyeglass lens, such as those made by Microoptical (http://www.microopticalcorp.com), still appear unusual. In normal conversation, people tend to look one another right in the eye, and therefore will notice even the slightest speck of dust on an eyeglass lens.
Therefore, within the scope of the invention, any intermediary elements may be installed in an eyeglass lens in the transition zones, e.g. between peripheral vision and the eyeglass lens itself (e.g. in the frames), or between different portions of a multifocal (bifocal, trifocal, etc.) eyeglass lens.
Consider first the implementation of a reality mediator for installation in bifocal eyeglasses.
In the case in which a monocular version of the apparatus is being used, the apparatus is built into one lens, and a dummy version of the diverter portion of the apparatus is positioned in the other lens for visual symmetry. It was found that such an arrangement tended to call less attention to itself than when only one diverter was used for a monocular implementation.
These diverters may be integrated into the lenses in such a manner to have the appearance of the lenses in ordinary bifocal eyeglasses.
Where the wearer does not require bifocal lenses, the cut line in the lens can still be made, such that the EyeTap defines a transition between two lenses of equal focal length.
Transition zone reality mediators: Reversal of the roles of eyeglass frames and eyeglass lenses
The size of the mediation zone that can be concealed in the cut-line of a bifocal eyeglass lens is somewhat limited.
The peripheral transfer function provides a more ideal location for a partial reality mediator, because the device can be concealed directly within the frames of the eyeglasses.
Reality mediators incorporated into the eyeglass frames, upon close inspection with the unaided eye, have a normal appearance. Even using a magnifier, zoom lens, or the like, for close inspection, other people cannot see the device.
A typical coherent fiber or fiber bundle can bring 30,000 array elements into a 1 mm wide eyeglass frame, and remain completely concealed, especially if hollow frames are used, together with a shrouding element, e.g. a front-shroud made of smoked acrylic or infrared-only transparent material.
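A quick arithmetic check, assuming an illustrative 6 micrometre centre-to-centre fiber pitch (a value not given here), shows how 30,000 elements can be packed into a channel about 1 mm wide.

```python
# Packing check: cross-section needed to carry 30,000 fiber elements in a
# channel 1 mm wide, assuming square packing at an illustrative 6 um pitch.

width_mm = 1.0        # usable channel width stated above
pitch_um = 6.0        # assumed centre-to-centre fiber pitch
elements = 30_000

per_row  = int(width_mm * 1000 / pitch_um)   # fibers across the 1 mm width
rows     = -(-elements // per_row)           # ceiling division
depth_mm = rows * pitch_um / 1000            # other cross-section dimension

print(f"{per_row} fibers per row, {rows} rows -> {width_mm} mm x {depth_mm:.2f} mm")
# ~166 per row and ~181 rows: the whole bundle fits within roughly a
# 1 mm x 1.1 mm cross-section, easily hidden along a hollow frame.
```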
In view of such a concealment opportunity, a new kind of eyeglass design in which the frames come right through the center of the visual field, emerges.
Photochromic prescription lenses are drilled in two places on the left eye, and four places on the right eye, to accommodate a break in the eyeglass frame along the right eye (the right lens being held on with two miniature bolts on either side of the break).
Fiber optic bundles concealed by the frames are then bonded in place to locate a camera and aremac in back of the device.
Although the resolution is poor, some research prototypes were built to prove the viability of using eyeglass frames as a mediating element. The frames, being slender enough (e.g. typically one or two millimeters wide), do not appreciably interfere with normal vision, being close enough to the eye to be out of focus.
But even if the frames are to be made wider, they can be made out of a see-through material, or they can be seen through by way of the illusory transparency afforded by the EyeTap principle described in the aforementioned John Wiley textbook. Therefore, there is definite merit in seeing the world through eyeglass frames.
In particular, a more sophisticated design uses a plastic coating to completely conceal all the elements, so that even when examined under a microscope, evidence of the EyeTap is not visible.
Eyeglasses having a transparent frame may be used to hide cameras, displays, and other materials, without appreciably blocking the wearer's vision. EyeTap devices consisting of optical rails concealed in eyeglass frames are also possible within the scope of the invention. In a preferred embodiment the frames are completely transparent in the infrared and dark (but still partially transparent) in the visible portion of the spectrum.
The eyeglasses have a normal appearance, with clear lenses in which others can see both of the wearer's eyes. When mediation is desired, the wearer simply looks through the eyeglass frames.
In one embodiment, inside the eyeglasses is an optical rail. The rail is preferably 7 mm wide, and allows for rapid prototyping with various 7 mm wide optical components that can be inserted and slid along the rail. A broadcast quality color camera (Elmo aN401) is inserted with an aremac, diverter, and magnifying lens. An infrared camera may also be inserted into the optical rail, when desired, because the system is completely modular and re-configurable in the field.
The rail contains a camera, diverter, condenser lens, and pinhole (laser) aremac for providing a near-zero exit pupil size. Optical elements are critically aligned using a small piece of cardboard or paper to wedge each optical element, which is then moved with fine tweezers. The near-zero exit pupil size also makes it necessary to custom-make the nosebridge to fit the wearer's face exactly.
In one embodiment it is desired that the material be completely transparent in the infrared and partially transparent in the visible portion of the spectrum.
In the visible portion of the spectrum the frames function like dark sunglasses, so that the wearer can see out, but others cannot see in. Additionally, a broadcast quality color camera (a satisfactory camera being the Elmo aN401) provides an EyeTap camera, together with optics that position the effective center of projection of the camera at the center of the lens of the right eye of the wearer when the wearer looks slightly up.
A 7 mm slot is milled into the inside of the frame, which is then polished, so that various optical elements can be slid back and forth in the hidden slot in the frame.
Because the frame is transparent in the infrared, the light loss is much lower when an infrared camera is inserted into the slot.
Clear lenses hang down from the frame which sits slightly above eye level, so that the wearer looks upward, or tips the head forward in order to bring the mediation zone directly in front of the field of view.
The lenses also rise up above the frame slot, so that the eyeglasses do not look unusual, as they might if the lenses only hung below the frames.
When the lenses only hang below the frames, the eyeglasses look too much like reading glasses, and people wonder why the wearer is wearing reading glasses all the time in regular daily life. It looks strange to be wearing reading glasses while walking down the street (e.g. unless looking at close subject matter), so having the eyeglass lenses extend upward above the frames is a key inventive step, because the frames look normal, yet the front piece of the frames runs straight in front of the eyes, only a little above eye level.
Alternatively, four lenses can be used, each being half lenses, so that reading lenses can hang below the display bar, and distance lenses can be mounted above it, and it can be made to look normal, and in fact it can be made to look classic, like the old bifocal lenses of bygone years in which four lenses were used, two for each eye. This look of the old bifocal eyeglasses, with the frame running down the middle between the two lenses of each eye, can be nostalgic, yet ultra modern with the wearer actually functioning as a videographic cyborg and yet still looking like a luddite.
The invention disclosed here facilitates the display of computer data, video, or still pictures right down a mid portion of one or both of the lenses of a pair of eyeglasses. Because the frames beautifully bisect the vision into two portions, there is provided an area of vision above the display zone, and another area of vision below the mediation zone. Typically, thus, the mediation zone is within a horizontal stripe that splits the wearer's visual field in half, or approximately half, e.g. perhaps one third above and two thirds below the mediation zone.
To others, the apparatus of the invention looks just like old fashioned reading glasses, but to the wearer of the glasses, there appears to be a computer screen or television visible only to the wearer of the glasses.
The kind of reading glasses referred to in this disclosure are normally worn in all aspects of life, not just reading. This is because they are not (nor do they look like) reading-only glasses. A lower portion of the lenses is worn in the lower half of the eye's field of view, and an upper portion is worn above.
The glasses don't need to have a prescription (they can simply be pseudo-intellectual reading glasses), or they can be actual glasses with two prescriptions if desired.
Thus the frame runs through the lens (if there is a frame around the lens) and will pass over the center of the wearer's eyes, so that the central field of vision is directly aimed at the eyeglass frames, or can be aimed there if desired.
Alternatively, if the glasses do not have frames, some other concealer can be used, such as a sticker that says "100% UV protection" or anything else we might want to put on the lens to hide the active display element behind it.
The lenses themselves may have infinite focal length (zero power, e.g. zero diopters), so that the reading glasses are simply for effect, e.g. something that looks normal in appearance yet provides a mounting point for the display directly in front of an eye of the wearer.
In some embodiments, the display borne by the glasses is connected to a computer, while in other embodiments it is connected to a camera. Preferably, when a camera is used together with the apparatus of the invention, the camera is also borne by the reading glasses. When the display shows the output of a camera, the invention facilitates a new form of visual art, in which the artist may capture, with relatively little effort, a visual experience as viewed from his or her own perspective.
With some practice, it is possible to develop a very steady body posture and mode of movement that best produces video of the genre pertaining to this embodiment of the invention.
Because the apparatus may be made lightweight and situated close to the eye, there is not the protrusion associated with carrying a hand-held camera, or using a traditional head mounted display or head worn television viewfinder.
The exit pupil can be very small and precise, since the frames can actually be quite close to the eye, especially if they come in behind the lenses.
Also because components of the proposed invention are mounted very close to the head, in a manner that balances the weight distribution, the apparatus does not restrict the wearer's head movement or encumber the wearer appreciably.
With known video or movie cameras, the best operators tend to be very large people who have trained for many years in the art of smooth control of the cumbersome video or motion picture film cameras used. In addition to requiring a very large person to optimally operate such cameras, various stabilization devices are often used, which make the apparatus even more cumbersome. The apparatus of the invention may be optimally operated by people of any size. Even young children can become quite proficient in the use of the lens-mid display system having lens-mid frames.
A typical embodiment of the invention comprises one or two spatial light modulators or other display means built into a pair of reading glasses.
In some embodiments, a beamsplitter or a mirror silvered on both sides is used to combine the image of the display with the apparent position of a camera also borne by the apparatus of the invention.
Accordingly the present invention in one aspect comprises a wearable display system using eyeglasses, in particular, using a portion of the eyeglass frames that passes directly in front of an eye of the wearer of the eyeglasses, where one or two eyeglass lenses extend both below the frame and above the frame. Preferably the eyeglasses are otherwise rimless, where instead of a rim around the lens, the lens is mid-supported, so that the frames will bisect the wearer's vision in such a way that the wearer can see foveally both above and below the frame.
In another embodiment, the eyeglasses also contain a camera. In this embodiment, preferably the display is responsive to an output of the camera.
Preferably this arrangement gives rise to a reality mediator, such as a night vision system.
Preferably, the camera is an EyeTap camera. Preferably the mediated reality environment provides functions of a seeing aid for the disabled, including a wearable face recognizer, wayfinder, visual memory prosthetic, and notice server for collecting evidence of a slip and fall. Preferably the device provides biofeedback and a body instrument panel, such as an electrocardiogram. Preferably the electrocardiogram (ECG) can be synchronized with an EVG (electrovisualgram, e.g. video capture from the EyeTap) for being digitally notarized and stored in an evidence file.
According to another aspect of the invention, there is provided a display means between two pairs of lenses, where the display means bisects the visual field of view, so that four lenses are used, two above (one for each eye) and two below (one for each eye) the display means.
According to another aspect of the invention, there is provided an optical channel in eyeglass frames, where the frames rest upon the nose and come across a main part of the wearer's visual field of view.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in more detail, by way of examples which in no way are meant to limit the scope of the invention, but, rather, these examples will serve to illustrate the invention with reference to the accompanying drawings, in which:
FIG. 1 shows an information appliance concealed in operationally transparent eyeglass frames.
FIG. 2 shows a closeup view of one embodiment of operational transparency.
FIG. 3 shows an upward-looking embodiment of the invention for an information look-up system.
FIG. 4 shows continuation concealment of eyeglass frame functionality.
FIG. 5 shows a lens-based frame reinforcer.
FIG. 6 shows a media device with a concealer having a front portion consisting of a sticker, for being worn in front of one eye of a wearer of the media device.
FIG. 7 shows a media device with a concealer for being worn in front of both eyes of a wearer of the media device, but where only one eye is for media.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
While the invention shall now be described with reference to the preferred em-bodiments shown in the drawings, it should be understood that the intention is not to limit the invention only to the particular embodiments shown but rather to cover all alterations, modifications and equivalent arrangements possible within the scope of appended claims.
In all aspects of the present invention, references to "viewscreen" shall not be limited to just miniaturized television monitors, displays, or the like, or computer monitors, displays, or the like, but shall also include computer data display means, as well as fixed display means, where such fixed display means include crosshairs, graticules, reticles, brackets, etc., and other static, photographic, optical, or video display devices, still picture display devices, ASCII text display devices, terminals, and systems that directly scan light onto the retina of the eye to form the perception of an image, whether or not there is an actual or aerial image formed outside the eye.
Similarly, references to "camera" mean any device or collection of devices capable of simultaneously determining a quantity of light arriving from a plurality of directions and/or at a plurality of locations, or determining some other attribute of light arriving from a plurality of directions and/or at a plurality of locations.
References to "processor" or "computer" shall include sequential instruction, parallel instruction, and special purpose architectures such as digital signal processing hardware, Field Programmable Gate Arrays (FPGAs), programmable logic devices, as well as analog signal processing devices.
When it is said that object "A" is "borne" by object "B", this shall include the possibilities that A is attached to B, that A is part of B, that A is built into B, or that A is B.
FIG. 1 shows an information appliance concealed in operationally transparent eyeglass frames 1F. These frames pass within view of a wearer of the eyeglasses.
When it is said that the frames pass within a field of view of the wearer, what is meant is that the foveal region of the wearer's eyesight can be directed through or at the frames. Thus, although many eyeglass frames are visible to the wearer while wearing the glasses, the frames are typically, by design, located out in the periphery of the wearer's field of view so that the wearer does not (and in fact typically cannot) look directly at the frames. However, within the scope of the present invention, a foveally viewable eyeglass frame is desired. The frame may be made foveally viewable by placing it directly in front of the eyes, or by placing it only slightly out of direct view, so that the wearer can simply look up or down slightly to be still looking directly at or through the frames. Thus the frames will interrupt a foveally directable portion of the field of view. This does not necessarily mean that the interrupted portion of the visual field of view always stays within the foveal portion of the view, but merely that it can be made to come within the wearer's foveal field of view, in the sense that the wearer is free to look in the direction of the frames and see through the frames. Though the frames will be out of focus at such a close distance, the wearer of the foveally interrupting frames will be able to have his or her central field of view interrupted by the frames through a choice of look direction that puts the frames at or near a central field of vision.
Left lens 1L and right lens 1R may be functional (with a prescription or with other embedded technological enhancements) or decorative, e.g. like the lenses in "pseudo-intellectual" eyeglasses that are fashionable but have a focal length of approximately infinity (0 diopter "blanks").
What is meant by operationally transparent is that a wearer of the eyeglasses can still see reasonably well, despite the fact that the eyeglass frames come into the wearer's field of vision. Operational transparency may be attained by having frame 1F be sufficiently narrow as to not appreciably block the wearer's vision. For example, eyeglass frames having a width of one or two millimeters can run across directly in front of the wearer's eyes, and because the size of the frames may be less than (or scarcely larger than) the diameter of the wearer's pupil, eye iris opening, or the like, the wearer can still see, as if seeing "through" the frames. Indeed, the frames will be so close to the eye as to appear out of focus, and thus not appreciably block vision.
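To make the geometric argument concrete, the following is a minimal sketch (not part of the disclosure; the pupil and frame widths are assumed, illustrative values). Because the frame element is narrower than the pupil and far inside the eye's near point, light from any distant point still reaches the retina through the uncovered part of the pupil, so the frame dims the view rather than blocking any direction of gaze.

```python
import math

def clear_pupil_fraction(pupil_diam_mm, strip_width_mm):
    """Fraction of the pupil area left uncovered by a straight frame element
    (a strip) passing across the centre of the pupil."""
    r = pupil_diam_mm / 2.0
    h = min(strip_width_mm / 2.0, r)  # half-width of the strip, clipped to the pupil
    # area of the band |y| <= h inside a circle of radius r
    occluded = 2.0 * (h * math.sqrt(r * r - h * h) + r * r * math.asin(h / r))
    return 1.0 - occluded / (math.pi * r * r)

# assumed, illustrative numbers: frame widths of 1 mm and 1.5 mm, pupils of 4 mm and 6 mm
for pupil, strip in [(4.0, 1.0), (4.0, 1.5), (6.0, 1.5)]:
    print(f"pupil {pupil} mm, frame {strip} mm: "
          f"{100 * clear_pupil_fraction(pupil, strip):.0f}% of the pupil remains clear")
```

Even in the worst of these assumed cases more than half of the pupil remains clear for every direction of gaze, which is the sense in which the frame is operationally transparent.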
Concealed within the operationally transparent frame, particularly within a viewable portion of the frame, is an information display device. This enables the wearer to see the information display device. Appropriate optics enable a computer screen, or the like, to exist in sharp focus, while the viewable section of the frames is out of focus and does not obstruct the wearer's eyes, or the view of the wearer's eyes by other people.
Thus the wearer can look normal while reading a computer screen, or the like, so that other people do not know that the wearer is viewing the screen.
In one embodiment, a diverter 1D diverts light from an aremac 1A. The aremac and laser aremac are devices that are well known in the art, as described in Intelligent Image Processing, published by John Wiley and Sons (S. Mann, November 2001).
Diverter 1D is preferably contained in an eye-blocking portion of the eyeglass frame 1F.
Additionally, in some embodiments a camera 1C may be included in the device.
Thus a display device of the invention may be used as a viewfinder. In this case, the operational transparency of the eyeglass frames may comprise, include, or consist of, an illusory transparency facilitated by seeing through the camera in at least a portion of the visual field of view of the eyeglasses.
In some embodiments camera 1C is an EyeTap camera, the concept of EyeTap cameras being well known in the art (See for example, Intelligent Image Processing, John Wiley and Sons).
An embodiment in which the operational transparency of the eyeglass frames comprises, includes, or consists of, an illusory transparency facilitated by seeing through a camera in at least a portion of the visual field of view, will be referred to as an illusory operational transparency, or partial illusory operational transparency.
Traditionally, with ordinary eyeglasses not embodying the invention disclosed herein, the wearer looks through the eyeglass lenses and the frames are merely decorative (as well as functional, of course, to hold the lenses in place). When using an at least partial illusory operational transparency of the present invention, there is an at least partial role reversal between the roles of frames and lenses. In this case, the frames may become, at least in part, the element that the wearer looks through, while the lenses become, at least in part, decorative elements.
In particular, merely having a metal bar, or hollow plastic bar, or other viewable framing element pass in front of the eyes without any lenses attached to it, would arouse curiosity, for it would seem strange to be wearing a metal band, plastic channel, or other device in front of the eyes.
However, hanging lenses from this device will make it appear to take on the role of a mere eyeglass frame. It is quite possible, within the scope of this invention, to design a fashionable eyeglass frame, such that the frame passes directly in front of the eyes, yet supports two dummy lenses, such as to look very normal in appearance.
Additionally, nose bridge elements 1N may be incorporated into the lenses themselves, so that a very fashionable "minimalist" design results. Cables, such as wires 1CW and 1AW, connect the camera and aremac, respectively, to a body worn computer. Preferably one cable on each side of the eyeglasses has an appearance similar to that of eyeglass safety straps, by virtue of hollow eyeglass temples, together with an imitation of the safety straps so familiar under the Croakies (TM) trade mark.
Fiber optics may be used in place of wires 1CW or 1AW. For example, camera 1C
may be a fiber optic camera with head-end located in the temple of the eyeglass frames, or in back of the frames, out of view of people who might be closely inspecting the wearer.
FIG. 2 shows a closeup view of one embodiment of operational transparency.
Hollow frames 1F conceal display elements behind a front surface 2F that is dark smoked material, so that the frames look like black plastic frames. A groove allows various optical elements such as display optics 20 to be slid back and forth in the groove. Preferably a friction fit allows elements to be adjusted to suit individual users or preferences. A satisfactory slot width is 7 mm, so that a 7 mm cube beamsplitter can be jammed into the slot and moved as desired. One or two thicknesses of paper can be used to attain a friction fit. A camera such as a Toshiba QN401, which has a diameter slightly larger than 7 mm, makes a nice friction fit in the slot, so that it tends to stay where put. It can be moved back and forth in the slot to adjust the EyeTap distance or to accommodate the particular interpupillary distance (IPD) of various wearers.
The modular nature of such a device allows the wearer to change to a night vision camera (infrared, etc.) as desired. In this case, frames 1F are preferably made of material that is dark in the visible (say, perhaps 10% transmissivity in the visible) and light in the infrared (say, perhaps 90% transmissivity in the infrared).
This allows the same eyeglass frames to be used for day vision and night vision with a simple change of camera, while also ensuring that night vision sensitivity is not lost in making the plastic frames appear dark to the eyes of persons other than the wearer.
In the visible portion of the spectrum the frames are like dark sunglasses: the wearer can see out through the frames, but others can't see into the frames.
In the infrared, the frames are like clear frames: optical instruments can see out, but because the transparency is below the visible spectrum, others can't see into the frames unless they are also wearing glasses of this sort equipped with infrared cameras or the like (e.g. one cyborg can see that another person is a cyborg but not otherwise be seen by non-cyborgs).
In an alternate embodiment, a deep red, burgundy, deep dark pink, or rose colored eyeglass frame is used, wherein the color of the frames is carefully chosen so that it is transparent in the infrared, dark in the green and blue, and has a transition band in the visible red. In this way the frames are still quite dark, since the deep red, burgundy, or dark pink color conceals the apparatus, yet they look a little less "dark and evil" than black plastic frames. Such a transition band plastic material therefore allows the wearer to see the world through eyeglasses with rose colored frames.
Clear eyeglass lenses may then be mounted on the inside, to cover the opening in the slot channel, and to protect the wearer from injury by broken pieces of optics, sharp edges, etc., in the event of impact.
If no camera is being used, the display can still be present in the channel, and the wearer can see through the eyeglass frames, which become operationally transparent as, for example, rose colored eyeglass frames, where the wearer's vision is normal in most of the field of view, but with a thin horizontal band of vision that is seen through the rose colored tint of the eyeglass frames.
FIG. 3 shows an upward-looking embodiment of the invention for an information look-up system. Eye 3E looks upward through eyeglass frames that are slightly above eye level. The frames look normal because they are not so low that they appear to cover the eyes, but they are low enough to be in view, in an operationally transparent manner. An upper frame surface 3UF is preferably made of metal, as is a lower frame surface 3LF. Sandwiched between these two pieces of metal is an operationally transparent piece, having front surface 3F and back surface 3B, in which are embedded optical elements such as diverter 1D and aremac 1A.
Surfaces 3UF and 3LF may be wavy or curved manifolds that present themselves edge-on to a wearer, so that they block only on a set of approximately zero measure.
Surfaces that thus do not appreciably block the view of the wearer are referred to as eyeward-edge manifolds. Whether the eyeward-edge manifold is a piece of straight metal, wavy metal, curved metal, or the like, having at least a portion that is seen edge-on as it waves, curves, or goes straight, or whether it is by virtue of a step change in refractive index (e.g. from air to plastic) that the eyeward-edge manifold produces minimal disruption of normal vision, such surfaces that block on approximately zero measure are useful. Moreover, transparently distortion free front and back surfaces 3F and 3B may be arranged to present no grading in refraction, and thus sustain an operational transparency.
In one embodiment, a camera provides the seeing, so that the surfaces 3B and 3F need not actually be transparent in the same place, so long as light can somehow get into the camera and out of the display. In such virtual or illusory transparency, there need not be a complete passive optical path through the frames. In other embodiments, the frames are actually physically transparent, so that the operational transparency is real transparency.
FIG. 4 shows continuation concealment of eyeglass frame functionality. A lower lens portion 4LL exists on some manifold that is at least continuous and differentiable with that of an upper lens portion 4UL. Thus, although there may be a break in the lens between these two portions, there is an appearance of continuation, that makes the eyeglass frames appear less functional, and more of a mere fashion statement.
Also, in some embodiments, it is preferable to angle the front surface 4F
downward, so that even though the wearer looks up to see through or into the frames, the frames themselves do not reflect environmental light back to other people. This overcomes a drawback of the design in Fig. 3 where the thick shiny black frames reflect specular highlights from ceiling lights into other people's eyes.
The downward angled design avoids specular highlights, while defining an operationally transparent wedge. The wedge is an important element of some embodiments of the invention because the wedge defines a near-zero transparency occlusion zone.
The upper surface 3UF and lower surface 3LF are both seen on-edge, so that they appear very thin to eye 3E. Surfaces 3UF and 3LF are preferably made of thin sheet metal, such as black anodized aluminum, titanium, or the like, so that they obstruct very little of the wearer's view. A wedge shaped operationally transparent section of the frame is thus designed, through Zernike polynomials, or the like, to have an appropriate curvature on both sides so as to be operationally transparent in at least a portion of the view around a display or mediation zone.
Thus a display that only blocks a small portion of the wearer's view is flanked to the left and right with real operational transparency zones.
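By way of illustration only, the following sketch shows what it means to describe such a wedge-like surface as a weighted sum of low-order Zernike polynomial terms over a unit disk; the particular terms, coefficients, and function names are assumptions chosen for the example and are not the actual frame prescription.

```python
import numpy as np

def zernike_terms(rho, theta):
    """A few low-order Zernike terms over the unit disk (rho <= 1)."""
    return {
        "piston":  np.ones_like(rho),
        "tilt_x":  rho * np.cos(theta),
        "tilt_y":  rho * np.sin(theta),
        "defocus": 2.0 * rho**2 - 1.0,
        "astig":   rho**2 * np.cos(2.0 * theta),
    }

def surface_sag(rho, theta, coeffs):
    """Surface height (sag) expressed as a weighted sum of Zernike terms."""
    terms = zernike_terms(rho, theta)
    return sum(coeffs.get(name, 0.0) * z for name, z in terms.items())

# a mostly tilted (wedge-like) surface with a small amount of curvature added
coeffs = {"tilt_x": 0.8, "defocus": 0.05}
rho, theta = np.meshgrid(np.linspace(0.0, 1.0, 5), np.linspace(0.0, 2.0 * np.pi, 5))
print(np.round(surface_sag(rho, theta, coeffs), 3))
```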
FIG. 5 details how an eyeglass lens itself may provide structural support of an eyeglass frame that has been weakened in a portion of the frame for being in front of an eye of the wearer. In some embodiments, it is desired to have an exit pupil, or entrance pupil, within the eyeglass frames. Accordingly, opening 50 exists with no frame support, because it comprises a transparency break in the eyeglass frame 1F.
The transparency break allows an optical element to be placed at the location of opening 50, to serve the purpose of operational transparency. This break in frame 1F
requires structural support, so the right lens is screwed on with four screws or bolts or other fasteners. Fasteners 51R and 52R hold the left side of right lens 1R, while fasteners 53R and 54R hold the right side of right lens 1R.
In actual practice, if only one eye is tapped or displayed to, the left lens 1L could be held with only two fasteners 51L and 52L. However, as this might give rise to an unusual appearance (due to asymmetry), four fasteners may also be used on the left lens 1L.
Of course both eyes can have openings in which case four fasteners may be used.
Other less visible mounting approaches may also be used. Alternatively, a thin front piece 5F may slide or snap over the entire device to shroud it from view. In this case, the front cover piece 5F may snap onto the exposed fasteners. The fasteners may also bear fiber optics or other devices fastened thereto, outside the lenses 1L and 1R, but behind the front cover 5F. Thus many optical elements may be concealed between front cover piece 5F and the eyeglass lenses (1L and 1R).
FIG. 6 details how a decorative or similar item may be affixed to a lens to conceal media. Here a sticker, 100UV, that says "100% UV", such as one commonly finds on new eyeglasses, is used. The concealer may be any other kind of object that will not appear unusual in blocking the vision of the wearer. The sticker appears backwards in the drawing because we are looking at the glasses from the wearer's perspective, and the sticker thus appears normal to other people.
Lens 1R extends both above and below the concealer, so that the concealer falls within a foveally viewable portion of the wearer's vision. The concealer is preferably, thus, within a central portion of the wearer's visual field, so that the wearer can look both above and below the concealer.
FIG. 7 shows another embodiment, with a stylish concealer that also is part of the frame 1F. The left part 7L of the concealer runs in front of the left eye of the wearer, whereas a right part 7R runs in front of the right eye. In between parts 7L
and 7R, a nose bridge 7N is fashioned from the concealer, so that the whole thing looks continuous and natural, as if part of a new kind of stylish eyeglasses.
Lenses 1L and 1R may be affixed directly to parts 7L and 7R respectively.
An opening 50 is available for either a display device, an eye camera, an eye tracker, or other similar form of eye media.
The eyeglasses may, if desired, include a camera that is not necessarily an eye camera or eyetap camera. In this case, opening 50 may be used to display an output of the camera, processed by a computer, possibly with a coordinate transformation to effect an eyetap perspective, if desired.
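As an illustration of the kind of coordinate transformation referred to here, the following sketch applies a planar homography (a 3x3 projective transformation) to remap a camera frame toward a different viewpoint before it is shown in opening 50. The matrix values, the nearest-neighbour resampling, and the function name are assumptions made for the example; they are not the disclosed EyeTap optics or processing.

```python
import numpy as np

def warp_homography(src, H, out_h, out_w):
    """Nearest-neighbour warp of image `src` under the 3x3 homography H,
    where H maps source pixel coordinates (x, y, 1) to output coordinates."""
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:out_h, 0:out_w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(xs.size)])  # output coords, homogeneous
    sx, sy, sw = Hinv @ pts                                     # map back into the source image
    sx = np.round(sx / sw).astype(int)
    sy = np.round(sy / sw).astype(int)
    out = np.zeros((out_h, out_w) + src.shape[2:], dtype=src.dtype)
    ok = (sx >= 0) & (sx < src.shape[1]) & (sy >= 0) & (sy < src.shape[0])
    out[ys.ravel()[ok], xs.ravel()[ok]] = src[sy[ok], sx[ok]]
    return out

# illustrative use: a mild perspective adjustment of a synthetic 120 x 160 frame
frame = np.random.randint(0, 256, (120, 160), dtype=np.uint8)
H = np.array([[1.0, 0.05,   3.0],
              [0.0, 1.00,   0.0],
              [0.0, 0.0005, 1.0]])
print(warp_homography(frame, H, 120, 160).shape)
```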
From the foregoing description, it will thus be evident that the present invention provides a design for a wearable display system built into a wearer's in-view portion of eyeglass frames. As various changes can be made in the above embodiments and operating methods without departing from the spirit or scope of the invention, it is intended that all matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense.
Variations or modifications to the design and construction of this invention, within the scope of the invention, may occur to those skilled in the art upon reviewing the disclosure herein. Such variations or modifications, if within the spirit of this invention, are intended to be encompassed within the scope of any claims to patent protection issuing upon this invention.
Stratton presented two important ideas: (1) the idea of constructing special eyeglasses to modify how he saw the world; and (2) the ecologically motivated approach to conducting his experiments within the domain of his own everyday personal life.
Stratton's seminal work has inspired a wide variety of devices that can be used to augment, deliberately diminish, or otherwise alter human visual perception.
Problems with the existing taxonomies (e.g. mixed reality) and distinctions (e.g. optical versus video see-through) arise when we consider reality-modifying devices. In this case, it is necessary to use the expanded taxonomy of mediated reality, and the notion of illusory transparency rather than video transparency (e.g. a more general framework is needed).
The need for various new designs for reality-modifying devices, especially those used to modify only a portion of the visual reality stream, suitable for use in everyday life, is becoming more and more evident.
In 1989, Jaron Lanier, CEO of VPL, coined the term "virtual reality" to bring a wide variety of virtual projects under a single rubric. Tom Caudell coined the term "augmented reality" at Boeing in the early 1990s, while working together with David Mizell, researching ways to superimpose diagrams and markings to guide workers on a factory floor. Augmented reality, involving superposition of computer graphics onto a view of the real world, was initially proposed and explored by Ivan Sutherland and his students at Harvard University and University of Utah, in the 1960s.
Augmented reality is generally defined as computer displays that add virtual information to a person's perception of the real world. One way to add virtual information to our visual field of view is to display the information on some kind of beamsplitter or other similar device, and thus use the human eye itself as the adder that achieves the superposition of the real and virtual information.
When the real world, as well as the virtual world, is to be experienced electronically (e.g. by seeing the real world through a camera), the virtual world and real world are blended together using well-known methods. Such superposition of video signals is commonly used in broadcast television, for example, to overlay virtual information (closed captioning, title credits, weather information, etc.) from a computer onto a broadcast video signal from a television camera. This superposition is typically achieved by way of a device called a "video mixer" that adds two (or more) video signals. In fact, broadcast television video signals are amplitude modulated (AM) because one of the original reasons for selecting AM (in addition to reduced spectral requirements, etc.) was to simplify the design of video mixers, so that simple adjustable voltage dividers (potentiometers) could be used to blend two or more video signals. AM broadcasts allow for easy crossfades between two received video signals.
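For illustration, the crossfade performed by such a mixer is simply a weighted sum of the two signals; the sketch below, with made-up frame values, is only meant to make that operation explicit.

```python
import numpy as np

def crossfade(real_frame, virtual_frame, alpha):
    """Weighted-sum mixer: alpha = 0 passes only the real signal,
    alpha = 1 only the virtual one, and values in between blend the two."""
    return (1.0 - alpha) * real_frame + alpha * virtual_frame

real = np.full((4, 4), 100.0)     # stand-in for a frame from the television camera
virtual = np.full((4, 4), 200.0)  # stand-in for the computer-generated overlay
print(crossfade(real, virtual, 0.25))  # 75% real + 25% virtual -> 125.0 everywhere
```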
The concept of illusory transparency continues to apply to systems that do not involve video. Examples of illusory transparency include computational see-through by way of laser EyeTap devices, as well as optical computing to produce a similar illusory transparency under computer program control. With optical computing, the concept of illusory transparency continues to provide a distinction between devices that can modify (not just add to) our visual perception and devices that cannot.
Reality-modifying devices that do not involve video therefore problematize the notion of video see-through.
Consider audio as a simple example of illusory transparency that is clearly not an example of video see-through. In a small classroom, a professor will often speak to the students and augment a lecture with some virtual (e.g. computer-generated) audiovisual materials, using small speakers connected to a laptop computer, so that the students hear the virtual world of the computer through the speakers while they hear the professor's narration directly. The summation of the real (narration) and the virtual takes place inside the ear. However, in a large auditorium, the professor may use a large public address system, in which case the professor's voice, as typically received by a microphone, is mixed with the audio output of the computer, by way of an audio mixer.
(An audio mixer is a device that adds together a variety of different audio inputs to produce a weighted sum of the inputs for use in a public address system, broadcast, or audio recording. Typically each input is multiplied by an adjustable constant prior to the addition, and these weighting constants are each adjusted by a separate volume control. Alternatively, mixers used for only two inputs (disco mixers used by disc jockeys, as well as many video mixers) often have a single slider to provide a crossfade from one input to the other, allowing the user to select any blend of the two inputs.) Such an example can also be applied to headphones that have built in microphones to allow the wearer to hear reality mixed in with some virtuality, using this concept of illusory transparency applied to audio signals.
Although addition is the correct operation for audio signals, it has been conclusively shown that the addition of video signals, in augmented reality, is fundamentally flawed, because cameras operate approximately logarithmically and displays (including most head mounted displays) operate anti-logarithmically. Thus video mixers effectively perform something that is closer to multiplication than to addition, because they operate on approximately the logarithms of the underlying true physical quantities of light. Accordingly, antihomomorphic filters have been proposed for augmented reality, wherein this undesirable effect is cancelled.
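The following sketch conveys only the idea behind such antihomomorphic processing, under the simplifying assumption of an idealized, purely logarithmic camera response; the actual technique estimates the real response function rather than assuming one, and the function names and values here are made up for illustration.

```python
import numpy as np

def to_photoquantity(pixels):
    """Undo the assumed idealized logarithmic camera response, recovering
    values proportional to the actual quantity of light."""
    return np.exp(pixels)

def to_pixels(quantity):
    """Re-apply the assumed logarithmic response for display."""
    return np.log(quantity)

def antihomomorphic_add(real_px, virtual_px, alpha=1.0):
    # add quantities of light, not pixel values, then return to the pixel domain
    return to_pixels(to_photoquantity(real_px) + alpha * to_photoquantity(virtual_px))

real = np.array([[4.0, 5.0], [5.0, 6.0]])   # log-domain pixel values (illustrative)
virtual = np.array([[3.0, 3.0], [3.0, 3.0]])
print(antihomomorphic_add(real, virtual))   # note: not the same as real + virtual
```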
In the theory of radio receivers, the concept of a mixer is typically extended to include multipliers, and other more general nonlinear elements such as signal diodes (detectors) that form the essence of the mixer stage of a superheterodyne radio receiver, giving rise to an additivity of frequency spectral components. Thus the term mixer is commonly used to describe signal adders as well as signal multipliers.
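The additivity of frequency components produced by a multiplying mixer is just the standard product-to-sum identity, restated here only as a reminder:

```latex
\cos(\omega_1 t)\cos(\omega_2 t)
  = \tfrac{1}{2}\left[\cos\bigl((\omega_1 - \omega_2)t\bigr)
  + \cos\bigl((\omega_1 + \omega_2)t\bigr)\right]
```

so multiplying two sinusoids yields components at the sum and difference frequencies.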
Mixed reality covers a continuum between graphics enhanced by video and video enhanced by graphics. However, there are important categories of visual information processors that do not fit within this taxonomy or continuum.
One of the earliest and most important such examples was the eyeglasses George Stratton built and wore. His eyewear, made from two lenses of equal focal length, spaced two focal lengths apart, was basically an inverting telescope with unity magnification, so that he saw the world upside-down.
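For reference, the unity magnification follows from the standard afocal (two-lens Keplerian) telescope relation; nothing here is specific to Stratton's device beyond the equal focal lengths and two-focal-length spacing described above:

```latex
M = -\frac{f_{\mathrm{objective}}}{f_{\mathrm{eyepiece}}}, \qquad
f_{\mathrm{objective}} = f_{\mathrm{eyepiece}} = f \;\Rightarrow\; M = -1, \qquad
\text{lens separation} = f_{\mathrm{objective}} + f_{\mathrm{eyepiece}} = 2f
```

where the negative sign expresses the inversion of the image.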
These eyeglasses, in many ways, actually diminished his perception of reality.
His deliberately diminished perception of reality was neither graphics enhanced by video, nor was it video enhanced by graphics, nor any linear combination of these two. (Moreover, it was an example of optical see-through that is not an example of registered illusory transparency, e.g. it problematizes the notion of the optical see-through concept because both the mediation zone, as well as the space around it, are examples of optical-only processing.) Others (Kohler, Dolezal, etc.) followed in Stratton's footsteps by living their day-to-day lives (eating, swimming, cycling, etc.) through left-right reversing eyeglasses, prisms, and other optics that were neither examples of augmented reality nor augmented virtuality (and for which the optical see-through versus video see-through distinction fails to properly address important similarities and differences among various such devices).
Thus a more general concept than that of mixed reality is needed.
Another class of devices that suggest a need for a more general concept are systems that provide synthetic synesthesia. Synesthesia is an empairment that affects many individuals naturally, but can be deliberately synthesized for anyone, using special devices that accept input from one (or more) senses and translate it to an output to one or more other senses.
Here the word "empairment" is used because it is not agreed upon whether synesthesia is an impairment or an empowerment. There are mixed feelings in the community as to whether naturally occurring synesthesia is an asset or a detriment.
Examples of devices for providing synthetic synesthesia include the vision substitution devices used by the visually impaired, such as Leslie Kay's SonicVision product, which converts sonar (reflected ultrasound) into audible sound, and Peter Meijer's "vOICe" program.
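The sketch below is a toy illustration of the general idea of sensory substitution (mapping image brightness to sound, with row position mapped to pitch and brightness to loudness); it is not the algorithm of either product named above, and all of its parameters are arbitrary assumptions.

```python
import numpy as np

def image_to_tones(gray, tone_s=0.02, sample_rate=8000, f_low=200.0, f_high=2000.0):
    """Scan a grayscale image (values 0..1) column by column; each row
    contributes a sine tone whose pitch depends on the row and whose
    loudness depends on the pixel brightness."""
    n_rows, n_cols = gray.shape
    t = np.arange(int(tone_s * sample_rate)) / sample_rate
    freqs = np.linspace(f_high, f_low, n_rows)       # top rows map to higher pitch
    audio = []
    for col in range(n_cols):
        tones = gray[:, col, None] * np.sin(2.0 * np.pi * freqs[:, None] * t)
        audio.append(tones.sum(axis=0) / n_rows)     # mix the rows of this column
        audio.append(np.zeros(len(t) // 4))          # short silence between columns
    return np.concatenate(audio)

samples = image_to_tones(np.random.rand(8, 8))        # an 8x8 random "image"
print(len(samples))                                   # audio samples ready for playback
```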
Because of the existence of a broad range of devices that modify human perception, mediated reality has been proposed: a more general framework that includes the reality-virtuality continuum, as well as devices for modifying, as well as mixing, these various aspects of reality and virtuality.
Mediated Reality, therefore, refers to a general framework for artificial modification of human perception by way of devices for augmenting, deliberately diminishing, and more generally, for otherwise altering sensory input.
SUMMARY OF THE INVENTION
The eyeglass concealer-as-intermediary devices of the invention use a concealer such as eyeglass frames, sticker, pattern, or the like, that has an external actual or apparent opacity, as a mediating or intermediary element for being within a field of view of the wearer, such that the concealer bisects the wearer's field of view to define areas that the wearer can see both above and below the concealer.
In one embodiment, the concealer contains at least one camera system or similar imaging system or sensor, for being effectively at an eye of a wearer. In some embodiments, the camera is connected to a display borne by the apparatus of the invention.
In another embodiment, a display is concealed in or behind the concealer, where the concealer is intermediate between at least one eye of the wearer, and subject matter that remains at least partially within a field of view of the wearer.
In some embodiments a camera, borne by the apparatus of the invention, supplies a video signal to the display.
In a third embodiment, the intermediary concealer contains both camera and display, to provide a reality mediator within the intermediary.
In any of the above five (three main) embodiments, the concealer may consist of eyeglass frames that pass into or near a central region of the wearer's visual field of view.
A new kind of reality mediator that uses the eyeglass frames themselves as the mediating element is thus possible. This design makes it impossible for those other than the wearer, even upon very close inspection, to determine whether or not the eyeglasses contain a reality mediator. Such designs make possible lifelong experiments and mediated reality experiences in personal everyday life, without the social stigma associated with obvious reality-modifying devices.
Since the 1970s the author has been exploring electronically mediated environments using body-borne computers. These explorations in Computer Mediated Reality were an attempt at creating a new way of experiencing the perceptual world, using a variety of different kinds of sensors, transducers, and other body-borne devices controlled by a wearable computer, as described in Intelligent Image Processing, published by John Wiley and Sons (S. Mann, November 2001).
Early on, the author recognized the utility of computer mediated perception, such as the ability to see in different spectral bands and to share this computer mediated vision with remote experts in real time.
Such devices can be used to modify the visual perception of reality within certain mediation zones (e.g. only one eye rather than both eyes, or only a portion of that eye), giving rise to partially mediated reality. Moreover, these devices can also be worn with prescription eyeglass lenses, or even have prescription eyeglass lenses incorporated into the design. Prescription eyeglasses are themselves partial reality mediators, having a peripheral zone, a transition zone (frames or lens edges), and one or more mediation zones (one or more lenses, or lens zones as in bifocal, trifocal, etc., lenses).
In this way the mediated reality framework can be used to describe many layers of mediation.
This general framework can, for example, describe computer mediated communications such as when a community of users modify each other's visual perception of reality as a way of communicating with each other (e.g. a spouse remotely scribbling a long nose on the face of a used car salesman, or an experience which the author actually had in a cafe, when a visitor to the author's web site modified the face of a police officer and sent the modified result back into the author's visual reality stream).
It can also, for example, describe how biofeedback-controlled night vision goggles are seen through the wearer's own corrective eyewear or contact lenses.
Mediated Reality has much of its origins in ecologically based practice (in the sense of taking place in the real world, within the context of day-to-day life), ecological validity being a special case of external validity.
For example, an important element of Stratton's work was that he wore the device in his ordinary everyday life. If performed on other subjects, such work might have far outstripped the ability of university ethics committees, the protocols required for "informed consent", and the tendency of many academics to work within labs, controlled spaces, and the existing literature.
Unlike traditional scientific experiments that take place in a controlled lab-like setting (and therefore do not always translate well into the real world), Stratton's approach required a continuous rather than intermittent commitment. For example, he would remove the eyewear only to bathe or sleep, and he even kept his eyes closed during bathing, to ensure that no un-mediated light from the outside world could get into his eyes directly. This work involved a commitment on his part, to devote his very existence - his personal life - to science.
Stratton captured a certain important human element in his broad seminal work, which laid the foundation for others to later do carefully controlled lab experiments within narrower academic disciplines. Moreover, his approach was one that broke down the boundaries between work and leisure activity, as well as the boundaries between the laboratory and the real world.
Similarly, it is desired to integrate computer mediated reality into daily life. The past 22 years of wearing computerized reality mediators in everyday life has provided the author with some insight into some of the sociological factors such as how others react to such devices. In particular, this has given rise to a desire to design and build reality mediators that do not have an unusual appearance.
Typical virtual reality headsets, and other cumbersome devices, are not well suited to ordinary everyday life because of their bulky, constrained, and tethered operation, as well as their unusual appearance.
Indeed, it is preferable that commonly used reality mediators, such as hearing aids and personal eyeglasses, have an unobtrusive (or hidden) appearance, or be designed to be sleek, slender, and fashionable.
EyeTap devices, such as an EyeTap infrared night vision computer system, and 1970s- or 1980s-style wearable computers with visible components, tend to have a frightening appearance. For example, the "glass eye" effect, in which optics projected to the center of projection of a lens of the tapped eye are visible to others, is often disturbing to some individuals when they look a cyborg in the eye.
1990s embodiments were typically covert, with systems built into dark glasses, which tends to mitigate this undesirable social effect. The eyeglass lenses are also transparent in the infrared, allowing the night vision portion of the apparatus to take over in low light, without loss of gain. In the visible portion of the spectrum, a 10 dB loss of sensitivity is typically incurred to conceal the color sensor elements and optics.
The author's wearable computer reality mediators have evolved from headsets of the 1970s, to eyeglasses with optics outside the glasses in the 1980s, to eyeglasses with the optics built inside the glasses in the 1990s.
Presently, however, in the context of the present invention, it is desired to have eyeglasses with mediation zones built into the frames, lens edges, or the cut lines of bifocal lenses (e.g. exit pupil and associated optics concealed by the transition regions between frames and surroundings, etc.).
Reality mediators that have the capability to measure and resynthesize electromagnetic energy that would otherwise pass through the center of projection of a lens of an eye of a user are referred to as EyeTap devices. These devices divert at least a portion of eyeward bound light into a measurement system that measures how much light would have entered the eye in the absence of the device. Some eyetap devices use a focus control to reconstruct light in a depth plane that moves to follow subject matter of interest. Others reconstruct light in a wide range of depth planes, in some cases having infinite or near infinite depth of field.
The present invention covers both EyeTaps as well as mere information displays that do not necessarily have or involve any means of analysis or measurement or responsiveness to light in the environment.
In embodiments of the present invention, a partial reality mediator is constructed within the transition zone of eyeglasses, eyeglass lenses, frames, fake stickers (like "UV 100%"), and the like.
In the preferred embodiment, transition elements are eyeglass frames that pass into a visually directable portion of the wearer's field of view (e.g. a portion to which the wearer can direct visual attention).
Typical EyeTap eyeglasses are custom made for each wearer, and therefore fit very accurately, making it possible to have a zero size, or near zero size exit pupil, so that an aremac functions nearly like a pinhole camera in reverse (e.g. having infinite depth of displayed focus). Especially in the context of lower resolution displays (e.g.
quarter VGA), such an arrangement has been shown to be useful.
The concept of the aremac is well known in the art (Intelligent Image Processing, Wiley 2001).
Moreover, with the aremac, depth of focus can be controlled or enhanced, either by the end-user (by adjusting an iris on the aremac), under computer program control (automatically adjusting the iris in response to information obtained by the camera's autofocus system and knowledge of the depth of field of the scene), or in manufacture (e.g. simply including a small enough aperture stop that it need not be adjusted). A
small aperture stop, installed during manufacture, can be so made that it provides depth of focus equal to the depth of focus of the eye (e.g. so that the aremac presents an image that appears sharp when the eye is focused anywhere from 4 inches to infinity, which is the normal range of a healthy eye).
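To put rough numbers on what "small enough" means, the standard hyperfocal-distance formula can be used; the focal length and circle of confusion below are assumed, illustrative values, not parameters taken from this disclosure.

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance: focusing here keeps everything from about half
    this distance out to infinity acceptably sharp."""
    return focal_mm**2 / (f_number * coc_mm) + focal_mm

# assumed values for a tiny camera or aremac: focal length 4 mm, circle of confusion 0.02 mm
for n in (2.0, 8.0, 22.0):
    h = hyperfocal_mm(4.0, n, 0.02)
    print(f"f/{n:g}: hyperfocal distance {h:.0f} mm, near limit about {h / 2:.0f} mm")
```

Stopping the aperture down thus extends the near limit of acceptable sharpness toward the eye, which is why a sufficiently small fixed stop can be installed at manufacture and never adjusted.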
Many modern miniature surveillance cameras have very small apertures, and thus provide essentially unlimited depth of focus. Pinhole cameras (which were traditionally only of historical or artistic interest for use with film) have made a comeback in the world of covert video surveillance. Pinhole cameras provide essentially infinite depth of focus. Because video resolution is so much lower than film resolution, the diffraction effects of the pinhole are less evident. Similarly, in applications where NTSC resolution is sufficient (e.g. for providing a low to moderate resolution reality mediator based on a small pinhole camera, for a computer mediated reality or partially mediated reality experience based on imagery that originated from NTSC
resolution image capture), the pinhole aremac is very well suited.
The pinhole aremac allows for ultra miniaturization of the devices, using solid state lasers, and miniature optics.
However, even a very small optical element, when placed within the open area of an eyeglass lens, looks unusual. Thus eyeglasses having display optics embedded in an eyeglass lens, such as those made by Microoptical (http://www.microopticalcorp.com), still appear unusual. In normal conversation, people tend to look one another right in the eye, and therefore will notice even the slightest speck of dust on an eyeglass lens.
Therefore, within the scope of the invention, any intermediary elements may be installed in an eyeglass lens in the transition zones, e.g. between peripheral vision and the eyeglass lens itself (e.g. in the frames), or between different portions of a multifocal (bifocal, trifocal, etc.) eyeglass lens.
Consider first the implementation of a reality mediator for installation in bifocal eyeglasses.
In the case in which a monocular version of the apparatus is being used, the apparatus is built into one lens, and a dummy version of the diverter portion of the apparatus is positioned in the other lens for visual symmetry. It was found that such an arrangement tended to call less attention to itself than when only one diverter was used for a monocular implementation.
These diverters may be integrated into the lenses in such a manner as to have the appearance of the lenses in ordinary bifocal eyeglasses.
Where the wearer does not require bifocal lenses, the cut line in the lens can still be made, such that the EyeTap defines a transition between two lenses of equal focal length.
Transition zone reality mediators: Reversal of the roles of eyeglass frames and eyeglass lenses
The size of the mediation zone that can be concealed in the cut-line of a bifocal eyeglass lens is somewhat limited.
The peripheral transfer function provides a more ideal location for a partial reality mediator, because the device can be concealed directly within the frames of the eyeglasses.
Reality mediators incorporated into the eyeglass frames, upon close inspection with the unaided eye, have a normal appearance. Even using a magnifier, zoom lens, or the like, for close inspection, other people cannot see the device.
A typical coherent fiber or fiber bundle can bring 30,000 array elements into a 1 mm wide eyeglass frame, and remain completely concealed, especially if hollow frames are used, together with a shrouding element, e.g. a front-shroud made of smoked acrylic or infrared-only transparent material.
In view of such a concealment opportunity, a new kind of eyeglass design in which the frames come right through the center of the visual field, emerges.
Photochromic prescription lenses are drilled in two places on the left eye, and four places on the right eye, to accommodate a break in the eyeglass frame along the right eye (the right lens being held on with two miniature bolts on either side of the break).
Fiber optic bundles concealed by the frames are then bonded in place to locate a camera and aremac in back of the device.
Although the resolution is poor, some research prototypes were built to prove the viability of using eyeglass frames as a mediating element. The frames, being slender enough (e.g. typically one or two millimeters wide), do not appreciably interfere with normal vision, being close enough to the eye to be out of focus.
But even if the frames are to be made wider, they can be made out of a see-through material, or they can be seen through by way of the illusory transparency afforded by the EyeTap principle described in the aforementioned John Wiley textbook. Therefore, there is definite merit in seeing the world through eyeglass frames.
In particular, a more sophisticated design uses a plastic coating to completely conceal all the elements, so that even when examined under a microscope, evidence of the EyeTap is not visible.
Eyeglasses having a transparent frame may be used to hide cameras, displays, and other materials, without appreciably blocking the wearer's vision. EyeTap devices consisting of optical rails concealed in eyeglass frames are also possible within the scope of the invention. In a preferred embodiment the frames are completely transparent in the infrared and dark (but still partially transparent) in the visible portion of the spectrum.
The eyeglasses have a normal appearance, with clear lenses in which others can see both of the wearer's eyes. When mediation is desired, the wearer simply looks through the eyeglass frames.
In one embodiment, inside the eyeglasses is an optical rail. The rail is preferably 7 mm wide, and allows for rapid prototyping with various 7 mm wide optical components that can be inserted and slid along the rail. A broadcast quality color camera (Elmo QN401) is inserted with an aremac, diverter, and magnifying lens. An infrared camera may also be inserted into the optical rail, when desired, because the system is completely modular and re-configurable in the field.
The rail contains a camera, diverter, condenser lens, and pinhole (laser) aremac for providing a near-zero exit pupil size. Optical elements are critically aligned using a small piece of cardboard or paper to wedge each optical element, which is then moved with fine tweezers. The near-zero exit pupil size also makes it necessary to custom-make the nosebridge to fit the wearer's face exactly.
In one embodiment it is desired that the material be completely transparent in the infrared and partially transparent in the visible portion of the spectrum.
In the visible portion of the spectrum the frames function like dark sunglasses, so that the wearer can see out, but others cannot see in. Additionally, a broadcast quality color camera (a satisfactory camera being the Elmo QN401) provides an EyeTap camera, together with optics that position the effective center of projection of the camera at the center of the lens of the right eye of the wearer, when the wearer looks slightly up.
A 7mm slot is milled into the inside of the frame, which is then polished, so that various optical elements can be slid back and forth in the hidden slot in the frame.
Because the frame is transparent in the infrared, the light loss is much lower when an infrared camera is inserted into the slot.
Clear lenses hang down from the frame which sits slightly above eye level, so that the wearer looks upward, or tips the head forward in order to bring the mediation zone directly in front of the field of view.
The lenses also rise up above the frame slot, so that the eyeglasses do not look unusual, as they might if the lenses only hung below the frames.
When the lenses only hang below the frames the eyeglasses look too much like reading glasses and people wonder why the wearer is wearing reading glasses all the time in regular daily life. It looks strange to be wearing reading glasses while walking down the street (e.g. unless looking at close subject matter), so having the eyeglass lenses extend upward above the frames is a key inventive step, because the frames look normal, yet the front piece of the frames runs straight in front of the eyes, only a little above eye level.
Alternatively, four lenses can be used, each being half lenses, so that reading lenses can hang below the display bar, and distance lenses can be mounted above it, and it can be made to look normal, and in fact it can be made to look classic, like the old bifocal lenses of bygone years in which four lenses were used, two for each eye. This look of the old bifocal eyeglasses, with the frame running down the middle between the two lenses of each eye, can be nostalgic, yet ultra modern with the wearer actually functioning as a videographic cyborg and yet still looking like a luddite.
The invention disclosed here facilitates the display of computer data, video, or still pictures right down a mid portion of one or both of the lenses of a pair of eyeglasses. Because the frames beautifully bisect the vision, into two portions, there is provided an area of vision above the display zone, and another area of vision below the mediation zone. Typically, thus, the mediation zone is within a horizontal stripe that splits the wearer's visual field in half, or approximately half, e.g.
maybe one third is above and two thirds is below the mediation zone.
To others, the apparatus of the invention looks just like old fashioned reading glasses, but to the wearer of the glasses, there appears to be a computer screen or television visible only to the wearer of the glasses.
The kind of reading glasses referred to in this disclosure are normally worn in all aspects of life, not just reading. This is because they are not (nor do they look like) reading-only glasses. A lower portion of the lenses is worn in the lower half of the eye's field of view, and an upper portion is worn above.
The glasses don't need to have a prescription (they can simply be pseudo-intellectual reading glasses), or they can be actual glasses with two prescriptions if desired.
Thus the frame runs through the lens (if there is a frame around the lens) and will pass over the center of the wearer's eyes, so that the central field of vision is directly aimed at the eyeglass frames, or can be aimed there if desired.
Alternatively, if the glasses do not have frames, some other concealer can be used, such as a sticker that says "100% UV protection" or anything else we might want to put on the lens to hide the active display element behind it.
The lenses themselves may have infinite focal length (zero power, e.g. zero diopters), so that the reading glasses are simply for effect, e.g. something that looks normal in appearance yet provides a mounting point for the display directly in front of an eye of the wearer.
In some embodiments, the display borne by the glasses is connected to a computer, while in other embodiments it is connected to a camera. Preferably, when a camera is used together with the apparatus of the invention, the camera is also borne by the reading glasses. When the display shows the output of a camera, the invention facilitates a new form of visual art, in which the artist may capture, with relatively little effort, a visual experience as viewed from his or her own perspective.
With some practice, it is possible to develop a very steady body posture and mode of movement that best produces video of the genre pertaining to this embodiment of the invention.
Because the apparatus may be made lightweight and situated close to the eye, there is not the protrusion associated with carrying a hand-held camera, or using a traditional head mounted display or head worn television viewfinder.
The exit pupil can be very small and precise, since the frames can actually be quite close to the eye, especially if they come in behind the lenses.
Also because components of the proposed invention are mounted very close to the head, in a manner that balances the weight distribution, the apparatus does not restrict the wearer's head movement or encumber the wearer appreciably.
With known video or movie cameras, the best operators tend to be very large people who have trained for many years in the art of smooth control of the cumbersome video or motion picture film cameras used. In addition to requiring a very large person to optimally operate such cameras, various stabilization devices are often used, which make the apparatus even more cumbersome. The apparatus of the invention may be optimally operated by people of any size. Even young children can become quite proficient in the use of the lens-mid display system having lens-mid frames.
A typical embodiment of the invention comprises one or two spatial light modu-lators or other display means built into a pair of reading glasses.
In some embodiments, a beamsplitter or a mirror silvered on both sides is used to combine the image of the display with the apparent position of a camera also borne by the apparatus of the invention.
Accordingly the present invention in one aspect comprises a wearable display system using eyeglasses, in particular, using a portion of the eyeglass frames that passes directly in front of an eye of the wearer of the eyeglasses, where one or two eyeglass lenses extend both below the frame and above the frame. Preferably the eyeglasses are otherwise rimless, where instead of a rim around the lens, the lens is mid-supported, so that the frames will bisect the wearer's vision in such a way that the wearer can see foveally both above and below the frame.
In another embodiment, the eyeglasses also contain a camera. In this embodi-ment, preferably the display is responsive to an output, of the camera.
Preferably this arrangement gives rise to a reality mediator, such as a night vision system.
Prefer-ably, the camera is an eyetap camera. Preferably the mediated reality environment provides functions of a seeing aid for the disabled, including a wearable face recog-nizer, wayfinder, visual memory prosthetic, and notice server for collecting evidence of slip and fall. Preferably the device provides biofeedback and a body instrument panel, such as electrocardiogram. Preferably the electrocardiogram (ECG) can be synchronized with an EVG (electrovisualgrarn, e.g. video capture from the EyeTap) for being digitally notarized and stored in an evidence file.
According to another aspect of the invention, there is provided a display means between two pairs of lenses, where the display means bisects the visual field of view, so that four lenses are used, two above (one for each eye) and two below (one for each eye) the display means.
According to another aspect of the invention, there is provided an optical channel in eyeglass frames, where the frames rest upon the nose and come across a main part of the wearer's visual field of view.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be described in more detail, by way of examples which in no way are meant to limit the scope of the invention, but, rather, these examples will serve to illustrate the invention with reference to the accompanying drawings, in which:
FIG. 1 shows an information appliance concealed in operationally transparent eyeglass frames.
FIG. 2 shows a closeup view of one embodiment of operational transparency.
FIG. 3 shows an upward-looking embodiment of the invention for an information look-up system.
FIG. 4 shows continuation concealment of eyeglass frame functionality.
FIG. 5 shows a lens-based frame reinforcer.
FIG. 6 shows a media device with a concealer having a front portion consisting of a sticker, for being worn in front of one eye of a wearer of the media device.
FIG. 7 shows a media device with a concealer for being worn in front of both eyes of a wearer of the media device, but where only one eye is for media.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
While the invention shall now be described with reference to the preferred em-bodiments shown in the drawings, it should be understood that the intention is not to limit the invention only to the particular embodiments shown but rather to cover all alterations, modifications and equivalent arrangements possible within the scope of appended claims.
In all aspects of the present invention, references to "viewscreen" shall not be limited to just miniaturized television monitors, displays, or the like, or computer monitors, displays, or the like, but shall also include computer data display means, as well as fixed display means, where such fixed display means include crosshairs, graticules, reticles, brackets, etc., and other static, photographic, optical, or video display devices, still picture display devices, ASCII text display devices, terminals, and systems that directly scan light onto the retina of the eye to form the perception of an image, whether or not there is an actual or aerial image formed outside the eye.
Similarly references to "camera" mean any device or collection of devices capable of simultaneously determining a quantity of light arriving from a plurality of directions and or at a plurality of locations, or determining some other attribute of light arriving from a plurality of directions and or at a plurality of locations.
References to ''processor" , or "computer" shall include sequential instruction, par-allel instruction, and special purpose architectures such as digital signal processing hardware, Field Programmable Gate Arrays (FPGAs), programmable logic devices, as well as analog signal processing devices.
When it is said that object "A" is "borne" by object "B", this shall include the possibilities that A is attached to B, that A is part of B, that A is built into B, or that A is B.
FIG. 1 shows an information appliance concealed in operationally transparent eyeglass frames 1F. These frames pass within view of a wearer of the eyeglasses.
When it is said that the frames pass within a field of view of the wearer, what is meant is that the foveal region of the wearer's eyesight can be directed through or at the frames. Thus, although many eyeglass frames are visible to the wearer while wearing the glasses, the frames are typically, by design, located out in the periphery of the wearer's field of view so that the wearer does not (and in fact typically cannot) look directly at the frames. However, within the the scope of the present invention, a foveally viewable eyeglass frame is desired. The frame may be made foveally viewable by placing it directly in front of the eyes, or by placing it only slightly out of direct view, so that the wearer can simply look up or down slightly to be still looking directly at or through the frames. Thus the frames will interrupt a foveally directable portion of the field of view. This does not mean necessarily that the interrupted portion of the visual field of view is always staying within the foveal portion of the view, but merely that it can be made to come within the wearer's foveal field of view, in the sense that the wearer is free to look in the direction of the frames and see through the frames. Though the frames will be out of focus at such close distance, the wearer of the foveally interrupting frames will be able to have his or her central field of view interrupted by the frames through a choice of look direction that puts the frames at or near a central field of vision.
Left lens 1L and right lens 1R may be functional (with a prescription or with other embedded technological enhancements) or decorative (e.g. like the lenses in "pseudo-intellectual" eyeglasses that are fashionable but have a focal length of approximately infinity, i.e. 0 diopter "blanks").
What is meant by operationally transparent is that a wearer of the eyeglasses can still see reasonably well, despite the fact that the eyeglass frames come into the wearer's field of vision. Operational transparency may be attained by having frame 1F be sufficiently narrow as to not appreciably block the wearer's vision. For example, eyeglass frames having a width of one or two millimeters can run directly across in front of the wearer's eyes, and because the size of the frames may be less than (or scarcely larger than) the diameter of the wearer's pupil, iris opening, or the like, the wearer can still see, as if seeing "through" the frames. Indeed, the frames will be so close to the eye as to appear out of focus, and thus not appreciably block vision.
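As an illustrative aside (not part of the patent text), the geometry behind this can be sketched numerically: a frame element narrower than (or comparable to) the pupil covers only a strip of the pupil, so each scene point is dimmed rather than occluded. The 1.5 mm bar width and 5 mm pupil diameter below are assumed example values.

```python
import math

def pupil_fraction_covered(bar_width_mm: float, pupil_diameter_mm: float) -> float:
    """Fraction of a circular pupil's area covered by a centred opaque strip.

    Illustrative geometry only: the frame element is modelled as a strip of
    width w lying across a pupil of diameter p, with w <= p.
    """
    r = pupil_diameter_mm / 2.0
    h = min(bar_width_mm / 2.0, r)
    # Area of the band |y| <= h inside a circle of radius r,
    # divided by the full pupil area pi * r**2.
    band_area = 2.0 * (h * math.sqrt(r * r - h * h) + r * r * math.asin(h / r))
    return band_area / (math.pi * r * r)

# A 1.5 mm frame element in front of a 5 mm pupil covers roughly 38 % of the
# pupil area, so the scene is dimmed but never fully blocked.
print(f"{pupil_fraction_covered(1.5, 5.0):.2f}")
```

Even when the covered fraction is substantial, light from each scene point still reaches the retina around the strip, which is the sense in which the wearer sees "through" the frames.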
Concealed within the operationally transparent frame, particularly within a viewable portion of the frame, is an information display device. This enables the wearer to see the information display device. Appropriate optics enable a computer screen, or the like, to exist in sharp focus, while the viewable section of the frames is out of focus and does not obstruct the wearer's eyes, or the view of the wearer's eyes by other people.
Thus the wearer can look normal while reading a computer screen, or the like, so that other people do not know that the wearer is viewing the screen.
In one embodiment, a diverter 1D diverts light from an aremac 1A. The aremac and laser aremac are devices that are well known in the art, as described in Intelligent Image Processing, published by John Wiley and Sons (S. Mann, November 2001).
Diverter 1D is preferably contained in an eye-blocking portion of the eyeglass frame 1F.
Additionally, in some embodiments a camera 1C may be included in the device.
Thus a display device of the invention may be used as a viewfinder. In this case, the operational transparency of the eyeglass frames may comprise, include, or consist of, an illusory transparency facilitated by seeing through the camera in at least a portion of the visual field of view of the eyeglasses.
In some embodiments camera 1C is an EyeTap camera, the concept of EyeTap cameras being well known in the art (See for example, Intelligent Image Processing, John Wiley and Sons).
An embodiment in which the operational transparency of the eyeglass frames comprises, includes, or consists of, an illusory transparency facilitated by seeing through a camera in at least a portion of the visual field of view, will be referred to as an illusory operational transparency, or partial illusory operational transparency.
Traditionally, with ordinary eyeglasses not embodying the invention disclosed herein, the wearer looks through the eyeglass lenses and the frames are merely decorative (as well as functional, of course, to hold the lenses in place). When using an at least partial illusory operational transparency of the present invention, there is an at least partial role reversal between the roles of frames and lenses. In this case, the frames may become, at least in part, the element that the wearer looks through, while the lenses become, at least in part, decorative elements.
In particular, merely having a metal bar, or hollow plastic bar, or other viewable framing element pass in front of the eyes without any lenses attached to it, would arouse curiosity, for it would seem strange to be wearing a metal band, plastic channel, or other device in front of the eyes.
However, hanging lenses from this device will make it appear to take on the role of a mere eyeglass frame. It is quite possible, within the scope of this invention, to design a fashionable eyeglass frame, such that the frame passes directly in front of the eyes, yet supports two dummy lenses, such as to look very normal in appearance.
Additionally, nose bridge elements 1N may be incorporated into the lenses themselves, so that a very fashionable "minimalist" design results. Cables, such as wires 1CW and 1AW, connect the camera and aremac, respectively, to a body-worn computer. Preferably one cable on each side of the eyeglasses has an appearance similar to that of eyeglass safety straps, by virtue of hollow eyeglass temples, together with an imitation of the safety straps so familiar under the Croakies (TM) trademark.
Fiber optics may be used in place of wires 1CW or 1AW. For example, camera 1C may be a fiber optic camera with its head-end located in the temple of the eyeglass frames, or in back of the frames, out of view of people who might be closely inspecting the wearer.
FIG. 2 shows a closeup view of one embodiment of operational transparency.
Hollow frames 1F conceal display elements behind a front surface 2F that is a dark smoked material, so that the frames look like black plastic frames. A groove allows various optical elements, such as display optics 20, to be slid back and forth in the groove. Preferably a friction fit allows elements to be adjusted to suit individual users or preferences. A satisfactory slot width is 7mm, so that a 7mm cube beam splitter can be jammed into the slot and moved as desired. One or two thicknesses of paper can be used to attain a friction fit. A camera such as a Toshiba C~N401, which has a diameter slightly larger than 7mm, makes a nice friction fit in the slot, so that it tends to stay where it is put. It can be moved back and forth in the slot to adjust the EyeTap distance or to accommodate the particular Inter-Pupillary Distance (IPD) of various wearers.
The modular nature of such a device allows the wearer to change to a night vision camera (infrared, etc.) as desired. In this case, frames 1F are preferably made of material that is dark in the visible (say, perhaps 10% transmissivity in the visible) and light in the infrared (say, perhaps 90% transmissivity in the infrared).
This allows the same eyeglass frames to be used for day vision and night vision with a simple change of camera, while also ensuring that night vision sensitivity is not lost in making the plastic frames appear dark to the eyes of persons other than the wearer.
In the visible portion of the spectrum the frames are like dark sunglasses -- the wearer can see out through the frames, but others can't see into the frames.
In the infrared, the frames are like clear frames -- optical instruments can see out, but because the transparency lies outside the visible spectrum, others can't see into the frames unless they are also wearing glasses of this sort equipped with infrared cameras or the like (e.g. one cyborg can see that another person is a cyborg but not otherwise be seen by non-cyborgs).
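As an illustrative aside (not from the patent text), the benefit of such a dark-in-visible, clear-in-infrared material can be sketched numerically. The band split, camera sensitivity weights, and the 10%/90% transmissivities are assumed example values taken loosely from the description above.

```python
# Illustrative only: sensitivity-weighted throughput of the frame material
# for a day (visible) camera versus a night (near-infrared) camera.
# All numbers are assumptions for the sake of the sketch.

FRAME_TRANSMISSION = {"visible": 0.10, "near_ir": 0.90}

CAMERA_SENSITIVITY = {
    "day_camera":   {"visible": 1.0, "near_ir": 0.1},
    "night_camera": {"visible": 0.2, "near_ir": 1.0},
}

def effective_throughput(camera: str) -> float:
    """Average frame transmission, weighted by the camera's band sensitivity."""
    sens = CAMERA_SENSITIVITY[camera]
    return sum(sens[b] * FRAME_TRANSMISSION[b] for b in sens) / sum(sens.values())

for cam in CAMERA_SENSITIVITY:
    # Prints roughly 0.17 for the day camera and 0.77 for the night camera:
    # the frames look dark to onlookers yet cost the night camera little light.
    print(cam, f"{effective_throughput(cam):.2f}")
```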
In an alternate embodiment, a deep red, burgundy, deep dark pink, or rose colored eyeglass frame is used, wherein the color of the frames is carefully chosen so that it is transparent in the infrared, dark in the green and blue, and has a transition band in the visible red. In this way the frames are still quite dark, since the deep red, burgundy, or dark pink color conceals the apparatus, yet they look a little less "dark and evil" than black plastic frames. Such a transition band plastic material therefore allows the wearer to see the world through eyeglasses with rose colored frames.
Clear eyeglass lenses may then be mounted on the inside, to cover the opening in the slot channel, and to protect the wearer from injury by broken pieces of optics, sharp edges, etc., in the event of impact.
If no camera is being used, the display can still be present in the channel, and the wearer can see through the eyeglass frames, which become operationally transparent as, for example, rose colored eyeglass frames, where the wearer's vision is normal in most of the field of view, but with a thin horizontal band of vision that is seen through the rose colored tint of the eyeglass frames.
FIG. 3 shows an upward-looking embodiment of the invention for an information look-up system. Eye 3E looks upward through eyeglass frames that are slightly above eye level. The frames look normal because they are not so low that they appear to cover the eyes, but they are low enough to be in view, in an operationally transparent manner. An upper frame surface 3UF is preferably made of metal, as is a lower frame surface 3LF. Sandwiched between these two pieces of metal is an operationally transparent piece, having front surface 3F and back surface 3B, in which are embedded optical elements such as diverter 1D and aremac 1A.
Surfaces 3UF and 3LF may be wavy or curved manifolds that present themselves edge-on to a wearer, so that they block only on a set of approximately zero measure.
Surfaces that thus do not appreciably block the view of the wearer are referred to as eyeward-edge manifolds. Whether the eyeward-edge manifold is a piece of straight metal, wavy metal, curved metal, or the like, having at least a portion that is seen edge-on as it waves, curves, or runs straight, or whether it is by virtue of a step change in refractive index (e.g. from air to plastic) that the eyeward-edge manifold produces minimal disruption of normal vision, such surfaces that block on a set of approximately zero measure are useful. Moreover, transparent, distortion-free front and back surfaces 3F and 3B may be arranged to present no gradient in refraction, and thus sustain an operational transparency.
In one embodiment, a camera provides the seeing, so that surfaces 3B and 3F need not actually be transparent in the same place, so long as light can somehow get into the camera and out of the display. In such virtual or illusory transparency, there need not be a complete passive optical path through the frames. In other embodiments, the frames are actually physically transparent, so that the operational transparency is real transparency.
FIG. 4 shows continuation concealment of eyeglass frame functionality. A lower lens portion 4LL exists on some manifold that is at least continuous and differentiable with that of an upper lens portion 4UL. Thus, although there may be a break in the lens between these two portions, there is an appearance of continuation, that makes the eyeglass frames appear less functional, and more of a mere fashion statement.
Also, in some embodiments, it is preferable to angle the front surface 4F downward, so that even though the wearer looks up to see through or into the frames, the frames themselves do not reflect environmental light back to other people. This overcomes a drawback of the design in Fig. 3, where the thick shiny black frames reflect specular highlights from ceiling lights into other people's eyes.
The downward angled design avoids specular highlights, while defining an operationally transparent wedge. The wedge is an important element of some embodiments of the invention because the wedge defines a near-zero transparency occlusion zone.
The upper surface 3UF and lower surface 3LF are both seen on-edge, so that they appear very thin to eye 3E. Surfaces 3UF and 3LF are preferably made of thin sheet metal, such as black anodized aluminum, titanium, or the like, so that they obstruct very little of the wearer's view. A wedge shaped operationally transparent section of the frame is thus designed, through Zernike polynomials or the like, to have an appropriate curvature on both sides so as to be operationally transparent in at least a portion of view around a display or mediation zone.
Thus a display that only blocks a small portion of the wearer's view is flanked to the left and right with real operational transparency zones.
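As an illustrative aside (not part of the patent text), one common way of expressing such a surface prescription is as a sum of low-order Zernike terms over a unit disk; the sketch below evaluates an assumed tilt-plus-defocus-plus-astigmatism prescription with arbitrary example coefficients.

```python
import numpy as np

def zernike_sag(rho: np.ndarray, theta: np.ndarray,
                tilt_x: float = 5.0, defocus: float = 1.0, astig: float = 0.5) -> np.ndarray:
    """Surface sag over the unit disk as a sum of low-order Zernike terms.

    tilt_x  weights Z(1, 1)  = rho * cos(theta)        (the wedge slope)
    defocus weights Z(2, 0)  = 2 * rho**2 - 1          (overall curvature)
    astig   weights Z(2, 2)  = rho**2 * cos(2 * theta) (cylinder-like term)
    Coefficients are arbitrary example values (e.g. in micrometres).
    """
    return (tilt_x * rho * np.cos(theta)
            + defocus * (2.0 * rho**2 - 1.0)
            + astig * rho**2 * np.cos(2.0 * theta))

# Sample the assumed prescription on a coarse polar grid.
rho, theta = np.meshgrid(np.linspace(0.0, 1.0, 5), np.linspace(0.0, 2.0 * np.pi, 8))
sag = zernike_sag(rho, theta)
print(sag.shape, float(sag.min()), float(sag.max()))
```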
FIG. 5 details how an eyeglass lens itself may provide structural support for an eyeglass frame that has been weakened in a portion of the frame for being in front of an eye of the wearer. In some embodiments, it is desired to have an exit pupil or entrance pupil within the eyeglass frames. Accordingly, opening 50 exists with no frame support, because it comprises a transparency break in the eyeglass frame 1F.
The transparency break allows an optical element to be placed at the location of opening 50, to serve the purpose of operational transparency. This break in frame 1F requires structural support, so the right lens is screwed on with four screws or bolts or other fasteners. Fasteners 51R and 52R hold the left side of right lens 1R, while fasteners 53R and 54R hold the right side of right lens 1R.
In actual practice, if only one eye is tapped or displayed to, the left lens 1L could be held with only two fasteners 51L and 52L. However, as this might give rise to an unusual appearance (due to asymmetry), four fasteners may also be used on the left lens 1L.
Of course both eyes can have openings in which case four fasteners may be used.
Other less visible mounting approaches may also be used. Alternatively, a thin front piece 5F may slide or snap over the entire device to shroud it from view. In this case, the front cover piece 5F may snap onto the exposed fasteners. The fasteners may also bear fiber optics or other devices fastened thereto, outside the lenses 1L and 1R, but behind the front cover 5F. Thus many optical elements may be concealed between front cover piece 5F and the eyeglass lenses (1L and 1R).
FIG. 6 details how a decorative or similar item may be affixed to a lens to conceal media. Here a sticker, 100UV, that says "100% UV", such as one commonly finds on new eyeglasses, is used. The concealer may be any other kind of object that does not appear unusual while blocking a portion of the wearer's vision. The sticker appears backwards in the drawing because we are looking at the glasses from the wearer's perspective; the sticker thus appears normal to other people.
Lens 1R extends both above and below the concealer, so that the concealer falls within a foveally viewable portion of the wearer's vision. The concealer is thus preferably within a central portion of the wearer's visual field, so that the wearer can look both above and below the concealer.
FIG. 7 shows another embodiment, with a stylish concealer that also is part of the frame 1F. The left part 7L of the concealer runs in front of the left eye of the wearer, whereas a right part 7R runs in front of the right eye. In between parts 7L and 7R, a nose bridge 7N is fashioned from the concealer, so that the whole thing looks continuous and natural, as if part of a new kind of stylish eyeglasses.
Lenses 1L and 1R may be affixed directly to parts 7L and 7R respectively.
An opening 50 is available for either a display device, an eye camera, an eye tracker, or other similar form of eye media.
The eyeglasses may, if desired, include a camera that is not necessarily an eye camera or eyetap camera. In this case, opening 50 may be used to display an output of the camera, processed by a computer, possibly with a coordinate transformation to effect an eyetap perspective, if desired.
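As an illustrative aside (not the patent's algorithm), the simplest such coordinate transformation is a planar homography applied to each camera frame before it is shown at opening 50. In the sketch below the matrix H is an arbitrary example; in practice it would be calibrated for the offset between the camera's centre of projection and the wearer's eye.

```python
import numpy as np
import cv2  # OpenCV, used here only for the perspective warp

# Assumed example homography; a real system would calibrate this.
H = np.array([[1.00, 0.02, -4.0],
              [0.01, 1.00, -2.0],
              [0.00, 0.00,  1.0]], dtype=np.float64)

def eyetap_view(frame: np.ndarray) -> np.ndarray:
    """Warp a camera frame toward the eye's viewpoint before display."""
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))

# Example: warp a synthetic test frame.
test = np.zeros((240, 320, 3), dtype=np.uint8)
cv2.rectangle(test, (80, 60), (240, 180), (255, 255, 255), 2)
print(eyetap_view(test).shape)
```

A single homography is only exact for a planar or distant scene; a fuller eyetap arrangement instead diverts the actual rays so that virtual light is collinear with real light, as described earlier.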
From the foregoing description, it will thus be evident that the present invention provides a design for a wearable display system built into a wearer's in-view portion of eyeglass frames. As various changes can be made in the above embodiments and operating methods without departing from the spirit or scope of the invention, it is intended that all matter contained in the above description or shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense.
Variations or modifications to the design and construction of this invention, within the scope of the invention, may occur to those skilled in the art upon reviewing the disclosure herein. Such variations or modifications, if within the spirit of this invention, are intended to be encompassed within the scope of any claims to patent protection issuing upon this invention.
Claims (32)
1. A wearable visual media device comprising:
• a concealer having a front portion for being worn in front of at least one eye of a wearer of said media device;
• at least a visible portion of said front portion of said concealer, said visible portion for being principally within a field of view of a wearer of said media device;
• a visual media interface borne by said concealer;
• at least one lens borne by said visual media device, said lens having a lens portion that extends below said front portion, for providing a view of subject matter below said visual media interface, and a lens portion that extends above said front portion, for providing a view of subject matter above said visual media interface.
2. The wearable visual media device of claim 1, in which a wearer of said device has a central field of view into said visual media interface, a lower field of view below said visual media interface, and an upper field of view, above said visual media interface, said lower and upper fields of view not appreciably affected by said visual media interface, said upper, lower, and central fields of view being at least partially viewable through said lens.
3. The wearable visual media device of claim 1 where said visual media interface is a camera.
4. The wearable visual media device of claim 1 where said visual media interface is a display.
5. The wearable visual media device of claim 1 where said visual media interface is a reality mediator comprised of both a camera and a display, said display responsive to an output of said camera.
6. The wearable visual media device of claim 1 said visual media device including at least one hollow temple side piece for making at least one connection between said visual media interface and at least one other device for being borne by a body of a wearer of said wearable visual media device.
7. The wearable visual media device of claim 1 where said concealer comprises at least a front portion of eyeglass frames, said at least one lens borne by said eyeglass frames.
8. The wearable visual media device of claim 7 where said frames become substantially thicker in a region where they pass in front of a space for an eye opening when worn.
9. The wearable visual media device of claim 7 where said frames include a nose bridge.
10. The wearable visual media device of claim 1 where said concealer comprises an at least partially opaque decorative label borne by said at least one lens, said decorative label for being visible to persons other than a wearer of said media device.
11. The wearable visual media device of any of claims 1 to 10, where said visual media interface is located slightly above a straight-ahead view direction when worn by a wearer of said wearable visual media device.
12. The wearable visual media device of any of claims 1 to 10, where said visual media interface is located within a frame having at least one semi-transparent wedge-shaped portion, said portion having a lower and upper surface, said lower and upper surface both being viewed edge-on by an eye of a wearer when said wearable visual media device is worn by said wearer.
13. The wearable visual media device of any of claims 1 to 10, where said visual media interface is located within a frame having at least a semi-transparent wedge-shaped portion, said portion having at least one surface, said surface being at least one of: an upper surface; a lower surface, said surface being an eyeward-edge manifold.
14. The wearable visual media device of any of claims 1 to 10, where said visual media interface is located at least partially within a frame having a semi-transparent wedge-shaped portion, said portion having at least one surface arranged to block vision only on a set of approximately zero measure of a wearer of said wearable visual media device.
15. The wearable visual media device of any of claims 1 to 10, where said front portion has at least one surface arranged to block vision only on a set of approximately zero measure of a wearer of said wearable visual media device.
16. The wearable visual media device of any of claims 1 to 10, where said front portion has a top and bottom surface, both arranged to block vision only on a set of approximately zero measure of a wearer of said wearable visual media device.
17. The wearable visual media device of claim 16, where said front portion includes a semi-transparent wedge-shaped material, having front and back surfaces arranged to provide negligible refractive distortion to said wearer, when said wearer is looking through said front portion of said frame.
18. A wearable visual media device comprising:
• a frame having a front portion for being worn in front of at least one eye of a wearer of said media device;
• at least a visible portion of said front portion of said frame, said visible portion for being principally within a field of view of a wearer of said media device;
• an informatic exit pupil facing an eyeward side of said visible portion of said frame;
• a left lens and right lens borne by said frame, said left lens and said right lens each having a lens portion that extends below said front portion of said frame, and a lens portion that extends above said front portion of said frame.
19. A wearable visual media device comprising:
• eyeglass frames, where said eyeglass frames include a front portion that rests in front of the eyes of a wearer of said eyeglass frames, said front portion for being at least partially within a foveally viewable region of said wearer's visual field of view;
• an optical cavity in said front portion of said eyeglass frames;
• a viewing means on a side of said optical cavity, said viewing means for facing an eye of said wearer of said eyeglass frames when said eyeglass frames are worn by said wearer;
• electronic display means viewable by said viewing means;
• at least one eyeglass lens having a portion that extends at least partially above said front portion of said eyeglass frames.
20. The visual media device of claim 19 where said optical cavity contains elements of a data information display.
21. The visual media device of claim 19 where said optical cavity contains portions of both a camera and display.
22. The visual media device of claim 19 where said optical cavity contains an op-tical rail, said optical rail containing at least some optical elements for being adjustable by an end user.
23. The visual media device of claim 19, where said media device further includes a camera, a light path from a scene into said camera passing through said optical cavity of said eyeglass frames, and in which there is a ray of light from said scene which is diverted in a direction along said optical cavity.
24. A wearable visual media device comprising:
• a slender horizontal front portion for being in front of the eyes of a wearer of said media device, said front portion for being at least partially within a foveally viewable region of said wearer's visual field of view;
• optical elements in said front portion of said media device;
• a viewer facing a wearer-eyeward side of said front portion;
• at least one eyeglass lens having a portion that extends at least partially above said front portion of said media device.
25. A wearable media device comprising:
• an operationally transparent slender horizontal front portion for being in front of the eyes of a wearer of said media device, said front portion for at least partially interrupting a foveally directable region of said wearer's visual field of view;
• optical elements in said front portion of said media device, said optical elements including at least one of:
- an optical input to a camera;
- an objective lens of a camera;
- an exit pupil of a display;
- an output of an aremac;
- a diverter;
• at least one eyeglass lens having a portion that extends at least partially above said front portion of said media device.
26. The media device in any of claims 1 to 25, further including:
• an electronic source of light borne by said media device;
• image forming means, where said image forming means directs light from said electronic source of light into an eye of the wearer of said media device.
27. A wearable reality mediator device including the features of any of claims 1 to 25 including means for generating a ray of virtual light approximately collinear with a corresponding ray of real light entering said reality mediator device.
28. A wearable mediating device including the features of any of claims 1 to 25 said mediating device further including a spatial light modulator responsive to an output of a body borne computer.
29. The device in any of claims 1 to 25 for displaying information to a wearer of said device when said wearer looks primarily straight ahead and slightly upward.
30. A wearable visual media device comprising:
• a concealer having a long thin slender front portion for being worn in front of at least one eye of a wearer of said media device;
• at least a visible portion of said front portion of said concealer, said visible portion for being principally within a field of view of a wearer of said media device;
• a fiber optic visual media interface borne by said concealer;
• at least one lens borne by said visual media device, said lens having a lens portion that extends below said front portion, and a lens portion that extends above said front portion.
31. The wearable visual media device of claim 30, said fiber optic visual media interface including a fiber optic camera.
32. The wearable visual media device of claim 30, said fiber optic visual media interface including a fiber optic light source.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA 2388766 CA2388766A1 (en) | 2002-06-17 | 2002-06-17 | Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA 2388766 CA2388766A1 (en) | 2002-06-17 | 2002-06-17 | Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2388766A1 true CA2388766A1 (en) | 2003-12-17 |
Family
ID=30005488
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA 2388766 Abandoned CA2388766A1 (en) | 2002-06-17 | 2002-06-17 | Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames |
Country Status (1)
Country | Link |
---|---|
CA (1) | CA2388766A1 (en) |
US12347448B2 (en) | 2018-06-21 | 2025-07-01 | Magic Leap, Inc. | Wearable system speech processing |
US11854566B2 (en) | 2018-06-21 | 2023-12-26 | Magic Leap, Inc. | Wearable system speech processing |
US11936733B2 (en) | 2018-07-24 | 2024-03-19 | Magic Leap, Inc. | Application sharing |
US11928784B2 (en) | 2018-09-25 | 2024-03-12 | Magic Leap, Inc. | Systems and methods for presenting perspective views of augmented reality virtual object |
US11651565B2 (en) | 2018-09-25 | 2023-05-16 | Magic Leap, Inc. | Systems and methods for presenting perspective views of augmented reality virtual object |
US11778411B2 (en) | 2018-10-05 | 2023-10-03 | Magic Leap, Inc. | Near-field audio rendering |
US11863965B2 (en) | 2018-10-05 | 2024-01-02 | Magic Leap, Inc. | Interaural time difference crossfader for binaural audio rendering |
US12342158B2 (en) | 2018-10-05 | 2025-06-24 | Magic Leap, Inc. | Near-field audio rendering |
US12063497B2 (en) | 2018-10-05 | 2024-08-13 | Magic Leap, Inc. | Near-field audio rendering |
US11696087B2 (en) | 2018-10-05 | 2023-07-04 | Magic Leap, Inc. | Emphasis for audio spatialization |
US11948256B2 (en) | 2018-10-09 | 2024-04-02 | Magic Leap, Inc. | Systems and methods for artificial intelligence-based virtual and augmented reality |
US12236543B2 (en) | 2018-10-09 | 2025-02-25 | Magic Leap, Inc. | Systems and methods for artificial intelligence-based virtual and augmented reality |
US11747856B2 (en) | 2018-10-24 | 2023-09-05 | Magic Leap, Inc. | Asynchronous ASIC |
US11619965B2 (en) | 2018-10-24 | 2023-04-04 | Magic Leap, Inc. | Asynchronous ASIC |
US12135580B2 (en) | 2018-10-24 | 2024-11-05 | Magic Leap, Inc. | Asynchronous ASIC |
US11886631B2 (en) | 2018-12-27 | 2024-01-30 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
US12164682B2 (en) | 2018-12-27 | 2024-12-10 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
US11854550B2 (en) | 2019-03-01 | 2023-12-26 | Magic Leap, Inc. | Determining input for speech processing engine |
US11587563B2 (en) | 2019-03-01 | 2023-02-21 | Magic Leap, Inc. | Determining input for speech processing engine |
US12243531B2 (en) | 2019-03-01 | 2025-03-04 | Magic Leap, Inc. | Determining input for speech processing engine |
US12245097B2 (en) | 2019-03-25 | 2025-03-04 | Magic Leap, Inc. | Systems and methods for virtual and augmented reality |
US12327573B2 (en) | 2019-04-19 | 2025-06-10 | Magic Leap, Inc. | Identifying input for speech recognition engine |
CN110007472B (en) * | 2019-05-24 | 2023-09-26 | 平顶山学院 | Automatically adjustable laser beam expander/contractor |
CN110007472A (en) * | 2019-05-24 | 2019-07-12 | 平顶山学院 | Auto-adjustable laser beam expander/reducer |
US11544888B2 (en) | 2019-06-06 | 2023-01-03 | Magic Leap, Inc. | Photoreal character configurations for spatial computing |
US11823316B2 (en) | 2019-06-06 | 2023-11-21 | Magic Leap, Inc. | Photoreal character configurations for spatial computing |
US11790935B2 (en) | 2019-08-07 | 2023-10-17 | Magic Leap, Inc. | Voice onset detection |
US11704874B2 (en) | 2019-08-07 | 2023-07-18 | Magic Leap, Inc. | Spatial instructions and guides in mixed reality |
US12020391B2 (en) | 2019-08-07 | 2024-06-25 | Magic Leap, Inc. | Spatial instructions and guides in mixed reality |
US12094489B2 (en) | 2019-08-07 | 2024-09-17 | Magic Leap, Inc. | Voice onset detection |
US11935180B2 (en) | 2019-10-18 | 2024-03-19 | Magic Leap, Inc. | Dual IMU SLAM |
US12249024B2 (en) | 2019-10-18 | 2025-03-11 | Magic Leap, Inc. | Dual IMU SLAM |
US12283013B2 (en) | 2019-10-25 | 2025-04-22 | Magic Leap, Inc. | Non-uniform stereo rendering |
US11778398B2 (en) | 2019-10-25 | 2023-10-03 | Magic Leap, Inc. | Reverberation fingerprint estimation |
US12149896B2 (en) | 2019-10-25 | 2024-11-19 | Magic Leap, Inc. | Reverberation fingerprint estimation |
US11961194B2 (en) | 2019-10-25 | 2024-04-16 | Magic Leap, Inc. | Non-uniform stereo rendering |
US11959997B2 (en) | 2019-11-22 | 2024-04-16 | Magic Leap, Inc. | System and method for tracking a wearable device |
US12137306B2 (en) | 2019-12-04 | 2024-11-05 | Magic Leap, Inc. | Variable-pitch color emitting display |
US11778148B2 (en) | 2019-12-04 | 2023-10-03 | Magic Leap, Inc. | Variable-pitch color emitting display |
US12309577B2 (en) | 2019-12-06 | 2025-05-20 | Magic Leap, Inc. | Environment acoustics persistence |
US11627430B2 (en) | 2019-12-06 | 2023-04-11 | Magic Leap, Inc. | Environment acoustics persistence |
US12135420B2 (en) | 2019-12-09 | 2024-11-05 | Magic Leap, Inc. | Systems and methods for operating a head-mounted display system based on user identity |
US11592665B2 (en) | 2019-12-09 | 2023-02-28 | Magic Leap, Inc. | Systems and methods for operating a head-mounted display system based on user identity |
US11789262B2 (en) | 2019-12-09 | 2023-10-17 | Magic Leap, Inc. | Systems and methods for operating a head-mounted display system based on user identity |
US11632646B2 (en) | 2019-12-20 | 2023-04-18 | Magic Leap, Inc. | Physics-based audio and haptic synthesis |
US12389187B2 (en) | 2019-12-20 | 2025-08-12 | Magic Leap, Inc. | Physics-based audio and haptic synthesis |
US12003953B2 (en) | 2019-12-20 | 2024-06-04 | Magic Leap, Inc. | Physics-based audio and haptic synthesis |
US12079938B2 (en) | 2020-02-10 | 2024-09-03 | Magic Leap, Inc. | Dynamic colocation of virtual content |
US11797720B2 (en) | 2020-02-14 | 2023-10-24 | Magic Leap, Inc. | Tool bridge |
US11861803B2 (en) | 2020-02-14 | 2024-01-02 | Magic Leap, Inc. | Session manager |
US12096204B2 (en) | 2020-02-14 | 2024-09-17 | Magic Leap, Inc. | Delayed audio following |
US11910183B2 (en) | 2020-02-14 | 2024-02-20 | Magic Leap, Inc. | Multi-application audio rendering |
US12315094B2 (en) | 2020-02-14 | 2025-05-27 | Magic Leap, Inc. | Session manager |
US11778410B2 (en) | 2020-02-14 | 2023-10-03 | Magic Leap, Inc. | Delayed audio following |
US11763559B2 (en) | 2020-02-14 | 2023-09-19 | Magic Leap, Inc. | 3D object annotation |
US12112098B2 (en) | 2020-02-14 | 2024-10-08 | Magic Leap, Inc. | Tool bridge |
US12100207B2 (en) | 2020-02-14 | 2024-09-24 | Magic Leap, Inc. | 3D object annotation |
US12185083B2 (en) | 2020-03-02 | 2024-12-31 | Magic Leap, Inc. | Immersive audio platform |
US11917384B2 (en) | 2020-03-27 | 2024-02-27 | Magic Leap, Inc. | Method of waking a device using spoken voice commands |
US12238496B2 (en) | 2020-03-27 | 2025-02-25 | Magic Leap, Inc. | Method of waking a device using spoken voice commands |
US12347415B2 (en) | 2020-05-29 | 2025-07-01 | Magic Leap, Inc. | Surface appropriate collisions |
US12333066B2 (en) | 2020-05-29 | 2025-06-17 | Magic Leap, Inc. | Determining angular acceleration |
US11636843B2 (en) | 2020-05-29 | 2023-04-25 | Magic Leap, Inc. | Surface appropriate collisions |
US11561613B2 (en) | 2020-05-29 | 2023-01-24 | Magic Leap, Inc. | Determining angular acceleration |
US11900912B2 (en) | 2020-05-29 | 2024-02-13 | Magic Leap, Inc. | Surface appropriate collisions |
US12056273B2 (en) | 2020-05-29 | 2024-08-06 | Magic Leap, Inc. | Determining angular acceleration |
US11963868B2 (en) | 2020-06-01 | 2024-04-23 | Ast Products, Inc. | Double-sided aspheric diffractive multifocal lens, manufacture, and uses thereof |
US12417766B2 (en) | 2020-09-30 | 2025-09-16 | Magic Leap, Inc. | Voice user interface using non-linguistic input |
US12306413B2 (en) | 2021-03-12 | 2025-05-20 | Magic Leap, Inc. | Athermalization concepts for polymer eyepieces used in augmented reality or mixed reality devices |
CN113109290A (en) * | 2021-04-08 | 2021-07-13 | 晨光生物科技集团股份有限公司 | Method for rapidly predicting attenuation speed of natural pigment |
US12430860B2 (en) | 2024-05-14 | 2025-09-30 | Magic Leap, Inc. | Spatial instructions and guides in mixed reality |
Similar Documents
Publication | Title |
---|---|
CA2388766A1 (en) | Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames |
Mann | Mediated reality with implementations for everyday life |
EP1625745B1 (en) | Mirror assembly with integrated display device |
CA2233047C (en) | Wearable camera system with viewfinder means |
Mann | 'WearCam' (The wearable camera): personal imaging systems for long-term use in wearable tetherless computer-mediated reality and personal photo/videographic memory prosthesis |
CA2362895A1 (en) | Smart sunglasses or computer information display built into eyewear having ordinary appearance, possibly with sight license |
AU689127B2 (en) | Image display device |
EP1064783B1 (en) | Wearable camera system with viewfinder means |
CA1164693A (en) | Vision enhancing system |
CA2316473A1 (en) | Covert headworn information display or data display or viewfinder |
US20090174946A1 (en) | Customizable head mounted display |
US20020085843A1 (en) | Wearable camera system with viewfinder means |
US20060007056A1 (en) | Head mounted display system having virtual keyboard and capable of adjusting focus of display screen and device installed the same |
CA2399698A1 (en) | Optical beam-splitter unit and binocular display device containing such a unit |
JPWO2008096719A1 (en) | Head mounted display with open peripheral vision |
US12038577B2 (en) | Variable focus and reflectivity mixed augmented reality system |
Mizrahi | Seeing through photographs: Photography as a transparent visual medium |
CA2297344A1 (en) | Look direction microphone system with visual aiming aid |
CA2248473C (en) | Eyetap camera or partial reality mediator having appearance of ordinary eyeglasses |
CA2247649C (en) | Covert camera viewfinder or display having appearance of ordinary eyeglasses |
Refai et al. | Diffraction-based glasses for visually impaired people |
CA2256920A1 (en) | Lenstop camera viewfinder or computer data display having appearance of ordinary reading glasses or half glasses |
EP1068553B1 (en) | Optical apparatus for image overlaying onto a real world scene |
CA2249976C (en) | Wearable camera system with viewfinder means |
JPH0426287A (en) | Spectacle type video display device |
Legal Events
Code | Title |
---|---|
FZDE | Dead |