
NL2006762C2 - Apparatus and method for displaying an image of an object on a visual display unit. - Google Patents

Apparatus and method for displaying an image of an object on a visual display unit.

Info

Publication number
NL2006762C2
NL2006762C2 NL2006762A
Authority
NL
Netherlands
Prior art keywords
display unit
visual display
image
orientation
eyes
Prior art date
Application number
NL2006762A
Other languages
Dutch (nl)
Inventor
Daniel Fontijne-Dijkman
Original Assignee
Euclid Vision Technologies B V
Priority date
Filing date
Publication date
Application filed by Euclid Vision Technologies B V filed Critical Euclid Vision Technologies B V
Priority to NL2006762A priority Critical patent/NL2006762C2/en
Priority to US13/467,644 priority patent/US20160232704A9/en
Application granted granted Critical
Publication of NL2006762C2 publication Critical patent/NL2006762C2/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1686 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F 1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F 1/1694 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects
    • G06T 15/20 Perspective computation
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F 2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F 2200/163 Indexing scheme relating to constructional details of the computer
    • G06F 2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2215/00 Indexing scheme for image rendering
    • G06T 2215/16 Using real world measurements to influence rendering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Description

Apparatus and method for displaying an image of an object on a visual display unit
The invention relates to an apparatus and method for displaying an image of an object on a visual display unit.
Such an apparatus and method are commonly known from the prior art and employed in the form of television sets, computer screens and similar devices. A problem with these known apparatuses and methods is that the impression rendered by the image or images of the object shown on the visual display unit rarely provides the real-life sensation that looking at the true object provides. This particularly applies when showing images of non-Lambertian surface materials of an object, whose appearance depends on the angle at which light impacts the surface as well as on the angle at which one looks at it.
It is therefore an object of the invention to provide a method and apparatus in which the image of the object that is shown on the visual display unit accurately matches looking at the real-life object directly.
To promote the object of the invention a method and apparatus are proposed in accordance with one or more of the appended claims.
In a first aspect of the invention the image of an object that is shown on the visual display unit depends on a parameter or parameters selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof. Surprisingly, it has proven possible to provide accurate images by taking account of the 3-D orientation of the visual display unit alone. Improved results are attainable when the position of a viewer's head or eyes in relation to the visual display unit is also taken into account, and the best results are achievable when, in addition, the position of a light source or light sources at the location of the visual display unit is taken into account.
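As a purely illustrative aid (not part of the patent text), the group of parameters named in this first aspect can be collected in a small data structure; the field names and the roll/pitch/yaw convention are assumptions used throughout the sketches below.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ViewParameters:
    """Parameters on which the displayed image may depend (first aspect)."""
    display_orientation: Vec3                   # 3-D orientation of the visual display unit (roll, pitch, yaw)
    viewer_eye_position: Optional[Vec3] = None  # viewer's head/eye position relative to the display, if tracked
    light_positions: Tuple[Vec3, ...] = ()      # estimated light-source positions at the display's location
```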
Whenever in this description mention is made of a light source or light sources, this includes image-based lighting, in which the entire environment is deemed to constitute a light source. Reflections from the environment also form part thereof.
There are several viable ways in which the method of the invention can be implemented. One preferred embodiment has the feature that the image of the object that is shown on the visual display unit is calculated from a representation of the object, the calculation taking into account a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
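A minimal sketch of this calculation-based embodiment is given below, reusing the ViewParameters structure introduced above. It only rotates the object with the display orientation, applies a single Lambertian term from the estimated light position, and projects the result towards the tracked eye position; a real implementation would use a proper off-axis projection and a rasteriser, so every step here is an assumption rather than the patented method itself.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Rotation that makes the rendered object follow the display's 3-D orientation."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def render_from_representation(vertices, normals, params):
    """Calculate an image from a representation of the object (vertices and normals of shape (N, 3))."""
    R = rotation_matrix(*params.display_orientation)
    eye = np.asarray(params.viewer_eye_position or (0.0, 0.0, 0.5))
    light = np.asarray(params.light_positions[0] if params.light_positions else (0.0, 1.0, 1.0))

    world_v = vertices @ R.T                      # object follows the device orientation
    world_n = normals @ R.T
    to_light = light - world_v                    # per-vertex direction towards the light source
    to_light /= np.linalg.norm(to_light, axis=1, keepdims=True)
    shade = np.clip(np.einsum('ij,ij->i', world_n, to_light), 0.0, 1.0)

    view = world_v - eye                          # view each vertex from the tracked eye position
    screen_xy = view[:, :2] / view[:, 2:3]        # simple pinhole projection (assumes the object lies in front of the eye)
    return screen_xy, shade                       # a rasteriser would turn these into the displayed image
```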
In yet another embodiment the image of the object that is shown on the visual display unit is selected from a database comprising a series of images of the object, wherein the selected image provides a best fit with seeing the object in real life, taking into account a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
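This database embodiment amounts to a nearest-neighbour lookup over stored images tagged with the parameters under which they were captured or rendered. The sketch below is one possible realisation; the flatten_parameters helper, the Euclidean distance measure and the fallback values are illustrative choices, not prescribed by the patent.

```python
import numpy as np

def flatten_parameters(params):
    """Concatenate the viewing parameters into one comparable vector (illustrative choice)."""
    eye = params.viewer_eye_position or (0.0, 0.0, 0.5)
    light = params.light_positions[0] if params.light_positions else (0.0, 1.0, 1.0)
    return np.concatenate([params.display_orientation, eye, light])

def select_best_fit(database, params):
    """Return the stored image whose recorded parameters lie closest to the current ones."""
    current = flatten_parameters(params)
    best_image, best_dist = None, float('inf')
    for stored_vector, image in database:          # database: iterable of (parameter vector, image) pairs
        dist = np.linalg.norm(np.asarray(stored_vector, dtype=float) - current)
        if dist < best_dist:
            best_image, best_dist = image, dist
    return best_image
```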
If in this embodiment one desires to limit the number of stored images, it is preferable that the image of the object that is shown on the visual display unit is calculated as an interpolation of images of the object that come closest to seeing the object in real life, taking into account a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
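A sketch of such an interpolation, reusing flatten_parameters from the previous example, is given below; inverse-distance weighting over the k closest images is an assumption, as the patent leaves the interpolation scheme open.

```python
import numpy as np

def interpolate_closest(database, params, k=2):
    """Blend the k stored images whose parameters are closest to the current viewing parameters."""
    current = flatten_parameters(params)
    nearest = sorted(database,
                     key=lambda entry: np.linalg.norm(np.asarray(entry[0], dtype=float) - current))[:k]
    weights = np.array([1.0 / (np.linalg.norm(np.asarray(vec, dtype=float) - current) + 1e-6)
                        for vec, _ in nearest])
    weights /= weights.sum()
    images = np.stack([np.asarray(img, dtype=float) for _, img in nearest])
    return np.tensordot(weights, images, axes=1)   # inverse-distance weighted average of the nearest images
```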
As mentioned above the invention is embodied in a method and in an apparatus that operates in accordance with said method. Such an apparatus for displaying an image of an object is known to comprise a handheld computer with an integrated visual display unit. It is also known from the prior art that such a computer may be provided with (first) means to detect its 3-D orientation.
In accordance with the invention such an apparatus is embodied in a way that the computer is loaded with software that cooperates with said (first) means for detecting the 3-D orientation of the visual display unit to arrange that the image of the object that is shown on the visual display unit depends on the 3-D orientation of the visual display unit.
Preferably the apparatus is provided with second means to establish a position of a viewer's head or eyes in relation to the visual display unit, and the software cooperates with said second means to arrange that the image of the object that is shown on the visual display unit depends on the established position of a viewer's head or eyes in relation to the visual display unit.
Still further preferably the apparatus is provided with third means to estimate a position of a light source or light sources at a location where the visual display unit is located, and the software cooperates with said third means to arrange that the image of the object that is shown on the visual display unit depends on the estimated position of the light source or light sources.
It has proven possible to provide smooth images of an object with a true-to-life experience when the software operates in a continuous loop at a frequency of approximately 30 Hz. Preferably the operating frequency is 60 Hz.
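Such a continuous loop can be sketched as follows; the update_image callback stands in for one full sense-and-redraw cycle and is an assumption, as is the simple sleep-based pacing.

```python
import time

def run_display_loop(update_image, target_hz=30.0):
    """Run the sense-and-redraw cycle in a continuous loop at roughly the target frequency."""
    period = 1.0 / target_hz          # use target_hz=60.0 for the preferred smoother operation
    while True:
        start = time.monotonic()
        update_image()                # one cycle: read sensors, determine the image, redraw the display
        elapsed = time.monotonic() - start
        if elapsed < period:
            time.sleep(period - elapsed)
```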
The invention will hereinafter be further elucidated with reference to the drawing of an exemplary embodiment of the invention, which does not limit the appended claims.
In the drawing:
- figure 1 shows a viewer looking at a tablet computer embodied with software in accordance with the invention; and
- figure 2 shows a flow diagram embodying the method of the invention that may be implemented in the software for the handheld computer.
With reference first to figure 1, the apparatus of the invention for displaying an image of an object is shown and indicated with reference 1. This apparatus is preferably embodied as a handheld computer 1 with an integrated visual display unit at which a viewer 3 may be looking in a manner that is known per se. The computer 1 is preferably provided with first means to detect its 3-D orientation, which means are symbolized with the part that is carrying reference 2. The handheld computer 1 is further loaded with software that cooperates with said first means 2 for detecting the 3-D orientation of the visual display unit that forms part of the computer 1, in order to arrange that the image of the object that is shown on the visual display unit will depend on the 3-D orientation of the visual display unit of the computer 1.
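As an illustration of what such first means 2 could deliver, the tilt of the device can be estimated from its accelerometer; this is standard sensor maths and an assumption here, since the patent does not specify the sensor type, and yaw would additionally require a magnetometer or gyroscope. How the raw readings are obtained is platform-specific.

```python
import math

def orientation_from_accelerometer(ax, ay, az):
    """Estimate the display's roll and pitch from the gravity vector measured by an accelerometer."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch
```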
Preferably the computer 1 is provided with second means 4 to establish a position of a viewer's 3 head or eyes in relation to the visual display unit of the computer 1, and the software cooperates with said second means 4 to arrange that the image of the object that is shown on the visual display unit depends on the established position of a viewer's 3 head or eyes in relation to the visual display unit of the computer 1.
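One way in which such second means 4 could be realised is face detection on the front camera's image; the sketch below uses OpenCV's bundled Haar cascade and returns the face centre in normalised image coordinates, which can then be mapped to a head position relative to the display. Camera access and that mapping are platform-specific and left out; the whole routine is an assumption, not a technique prescribed by the patent.

```python
import cv2  # assumes opencv-python is installed; obtaining the camera frame is platform-specific

_FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_head_position(frame):
    """Return the centre of the largest detected face in normalised image coordinates, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                                # caller falls back to an assumed fixed head position
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    height, width = gray.shape
    return ((x + w / 2) / width, (y + h / 2) / height)
```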
Still further preferably the computer 1 is provided with third means 5 to estimate a position of a light source 6 or light sources at a location where the computer's visual display unit is located, and the software cooperates with said third means 5 to arrange that the image of the object that is shown on the visual display unit depends on the estimated position of the light source 6 or light sources. This makes it possible to improve the lighting and shading effects in the image shown.
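A deliberately crude sketch of such third means 5 is to take the brightest region of a camera frame as an indication of where the dominant light source lies relative to the display; image-based lighting, as mentioned earlier, would capture the whole environment instead. The normalisation and the fixed depth guess below are assumptions.

```python
import numpy as np

def estimate_light_direction(gray_frame):
    """Estimate a rough light direction from the brightest pixel of a grayscale camera frame."""
    y, x = np.unravel_index(np.argmax(gray_frame), gray_frame.shape)
    h, w = gray_frame.shape
    # Normalised offsets from the image centre; z = 1.0 is a fixed guess that
    # the light sits in front of the display.
    return ((x - w / 2) / w, (h / 2 - y) / h, 1.0)
```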
Making reference now to figure 2, the method of the invention according to which the software preferably operates will now be elucidated.
In this method, the image that is shown on the visual display unit depends on a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
As a first step square 7 relates to the determination of the 3-D orientation of the computer 1 and its visual display unit making use of the first means 2 as elucidated with reference to figure 1. Optionally then in diamond 8 it is established whether it is also possible to keep track of the viewer's 3 head or eyes making use of the second detecting means 4 shown in figure 1. In the affirmative case the position of the viewer's head or eyes can be taken into account in square 9 when determining the relative position of the visual display unit in relation to the viewer 3. In the negative case a fixed head position is assumed.
As a further option diamond 10 concerns the question whether the third means 5 shown in figure 1 are enabled for establishing or estimating the position of a light source 6. If the third means 5 are not enabled then the software operates as if a predetermined fixed position of a virtual light source applies as indicated in square 11. If however the third means 5 are enabled, square 12 indicates that account is being taken of the position of this light source 6 in the displaying of the image of the object on the visual display unit of the computer 1.
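The branching of diamonds 8 and 10, including the fixed fallbacks of the assumed head position and the virtual light of square 11, can be expressed compactly as below. It reuses the ViewParameters sketch from earlier; the first_means/second_means/third_means objects and their method names merely stand in for the sensor interfaces and are assumptions.

```python
def gather_view_parameters(first_means, second_means=None, third_means=None):
    """Collect the viewing parameters following the flow of figure 2."""
    orientation = first_means.read_orientation()               # square 7: always determine the 3-D orientation

    if second_means is not None:                                # diamond 8: head/eye tracking available?
        eye_position = second_means.track_head_or_eyes()        # square 9
    else:
        eye_position = (0.0, 0.0, 0.5)                          # assumed fixed head position

    if third_means is not None:                                 # diamond 10: light estimation enabled?
        lights = third_means.estimate_light_positions()         # square 12
    else:
        lights = ((0.0, 1.0, 1.0),)                             # square 11: predetermined virtual light

    return ViewParameters(orientation, eye_position, lights)
```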
Diamond 13 deals with the selection of the operational method in which the image to be displayed on the visual display unit is determined.
Square 14 relates to the embodiment in which the image that is shown on the visual display unit is calculated from a representation of the object, the calculation taking into account a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
Square 15 relates to the embodiment in which the image that is shown on the visual display unit is selected from a database comprising a series of images of the object, wherein the selected image provides a best fit with seeing the object in real life, taking into account again a parameter or parameters that is/are selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.
Preferably in this embodiment the image of the object that is shown on the visual display unit is calculated as an interpolation of images of the object that come closest to seeing the object in real life.
Finally it is remarked that the loop is closed with line 16, which reflects that the software embodying the method of the invention preferably operates in a continuous loop at a frequency of approximately 30 Hz, and more preferably 60 Hz.

Claims (8)

1. Method for displaying an image of an object on a visual display unit, characterized in that the image of the object displayed on the visual display unit is dependent on a parameter or parameters selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.

2. Method according to claim 1, characterized in that the image of the object displayed on the visual display unit is calculated from a representation of the object, the calculation taking into account a parameter or parameters selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.

3. Method according to claim 1, characterized in that the image of the object displayed on the visual display unit is selected from a database comprising a series of images of the object, wherein the selected image of the object provides a best match with actually seeing the object, taking into account a parameter or parameters selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.

4. Method according to claim 3, characterized in that the image of the object displayed on the visual display unit is calculated as an interpolation of images of the object that come closest to actually seeing the object, taking into account a parameter or parameters selected from the group comprising the 3-D orientation of the visual display unit, a position of a viewer's head or eyes in relation to the visual display unit, and a position of a light source or light sources at a location where the visual display unit is located, or a combination thereof.

5. Device for displaying an image of an object, comprising a handheld computer with an integrated visual display unit, wherein said computer is provided with first means for detecting its 3-D orientation, characterized in that the computer is loaded with software that cooperates with said first means for detecting the 3-D orientation of the visual display unit so as to ensure that the image of the object displayed on the visual display unit depends on the 3-D orientation of the visual display unit.

6. Device according to claim 5, characterized in that it is provided with second means for determining a position of a viewer's head or eyes in relation to the visual display unit, and that the software cooperates with said second means so as to ensure that the image of the object displayed on the visual display unit depends on the determined position of a viewer's head or eyes in relation to the visual display unit.

7. Device according to claim 5 or 6, characterized in that it is provided with third means for estimating a position of a light source or light sources at a location where the visual display unit is located, and that the software cooperates with said third means so as to ensure that the image of the object displayed on the visual display unit depends on an estimated position of the light source or light sources.

8. Device according to any of claims 5-7, characterized in that the software operates in a continuous loop at a frequency of approximately 30 Hz.
NL2006762A 2011-05-11 2011-05-11 Apparatus and method for displaying an image of an object on a visual display unit. NL2006762C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NL2006762A NL2006762C2 (en) 2011-05-11 2011-05-11 Apparatus and method for displaying an image of an object on a visual display unit.
US13/467,644 US20160232704A9 (en) 2011-05-11 2012-05-09 Apparatus and method for displaying an image of an object on a visual display unit

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NL2006762 2011-05-11
NL2006762A NL2006762C2 (en) 2011-05-11 2011-05-11 Apparatus and method for displaying an image of an object on a visual display unit.

Publications (1)

Publication Number Publication Date
NL2006762C2 true NL2006762C2 (en) 2012-11-13

Family

ID=44640733

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2006762A NL2006762C2 (en) 2011-05-11 2011-05-11 Apparatus and method for displaying an image of an object on a visual display unit.

Country Status (1)

Country Link
NL (1) NL2006762C2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0680019A2 (en) * 1994-04-19 1995-11-02 Canon Kabushiki Kaisha Image processing method and apparatus
EP1950708A1 (en) * 2005-09-15 2008-07-30 Oleg Stanilasvovich Rurin Method and system for visualising virtual three-dimensional objects
US20090313584A1 (en) * 2008-06-17 2009-12-17 Apple Inc. Systems and methods for adjusting a display based on the user's position
US20100064259A1 (en) * 2008-09-11 2010-03-11 Lg Electronics Inc. Controlling method of three-dimensional user interface switchover and mobile terminal using the same
US20100103172A1 (en) * 2008-10-28 2010-04-29 Apple Inc. System and method for rendering ambient light affected appearing imagery based on sensed ambient lighting

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Affichage 3D sur iPhone/iPad: IIHM fait un buzz", 12 April 2011 (2011-04-12), XP055013604, Retrieved from the Internet <URL:http://www.liglab.fr/spip.php?article980> [retrieved on 20111130] *

Similar Documents

Publication Publication Date Title
US10255713B2 (en) System and method for dynamically adjusting rendering parameters based on user movements
EP3586165B1 (en) Single-frequency time-of-flight depth computation using stereoscopic disambiguation
US9595083B1 (en) Method and apparatus for image producing with predictions of future positions
JP2018125007A5 (en)
IL275447B1 (en) Methods and system for generating and displaying 3d videos in a virtual, augmented, or mixed reality environment
JP5762600B1 (en) Information processing apparatus and information processing method
CA2630238A1 (en) Method and apparatus for maintaining a visual appearance of at least one window when a resolution of the screen changes
CN108427504A (en) Display system and method
EP2816545A3 (en) Method and apparatus for protecting eyesight
JP2008040832A5 (en)
KR20140060365A (en) Method for creating a cover for an electronic device and electronic device
IN2014CN02389A (en)
KR20150041482A (en) Display apparatus and display method using the same
EP2284697A3 (en) Stereoscopic display device and display method
US20160227868A1 (en) Removable face shield for augmented reality device
JP2018526726A5 (en)
CN102693067A (en) System and method for adjusting font size
CN107018392A (en) Projecting apparatus optimization method and system
US20120105589A1 (en) Real time three-dimensional menu/icon shading
EP2672364A3 (en) Method and apparatus for providing graphical user interface
JP2014513317A (en) Method and device for displaying images
EP1871120A3 (en) Apparatus and method for projecting spatial image
NL2006762C2 (en) Apparatus and method for displaying an image of an object on a visual display unit.
CN108604132B (en) Method for operating virtual reality system and virtual reality system
US9704278B2 (en) Visualization device for displaying a rendered virtual object

Legal Events

Date Code Title Description
PD Change of ownership

Owner name: QUALCOMM TECHNOLOGIES, INC.; US

Free format text: DETAILS ASSIGNMENT: CHANGE OF OWNER(S), TRANSFER; FORMER OWNER NAME: EUCLID VISION TECHNOLOGIES B.V.

Effective date: 20160531

MM Lapsed because of non-payment of the annual fee

Effective date: 20170601