US20190007672A1 - Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections - Google Patents
- Publication number: US20190007672A1 (application US15/693,076)
- Authority: US (United States)
- Prior art keywords: environment, image, time, real, projection
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion)
Classifications
- H04N13/0488
- G06V20/00—Scenes; Scene-specific elements
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06T13/20—3D [Three Dimensional] animation
- G06T15/50—Lighting effects
- G06T19/006—Mixed reality
- G06T7/20—Analysis of motion
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
- H04N9/3179—Video signal processing for projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- G06T2207/10016—Video; Image sequence
Abstract
Methods and apparatus for generating dynamic real-time environment projections. In an exemplary embodiment, a method for generating a dynamic real-time 3D environment projection includes acquiring a real-time 2D image of an environment, and projecting the real-time 2D image of the environment onto a 3D shape to generate a 3D environment projection. In an exemplary embodiment, an apparatus that generates a dynamic real-time 3D environment projection includes an image receiver that acquires a real-time 2D image of an environment, and a projector that projects the real-time 2D image of the environment onto a 3D shape to generate a 3D environment projection.
Description
- This application claims the benefit of priority based upon U.S. Provisional Patent Application having Application No. 62/527,778, filed on Jun. 30, 2017, and entitled “GENERATION AND USE OF DYNAMIC REAL-TIME ENVIRONMENT MAPS,” which is hereby incorporated herein by reference in its entirety.
- The present invention relates to the operation of image processing systems. More specifically, the present invention relates to the processing of images derived from a surrounding environment.
- Software user interfaces have evolved significantly over the past forty years. They have progressed through command-driven terminals, mouse-driven 2D graphical user interfaces, and touch-driven 2D graphical user interfaces. In each generation of software user interface, the computer has displayed information via forms and effects crafted by a UI designer or developer. Regions of color, brightness, and contrast are crafted in such a way as to imply, for example, depth and lighting. Such effects help the user visually understand and organize large quantities of information. To date, such effects have been implemented statically in two dimensions. However, graphical user interfaces do not, for example, react to the lighting of the environment in which the computer is used.
- Therefore, it would be desirable to have a mechanism that allows characteristics of the surrounding environment to be utilized in visual displays to support a variety of display applications.
- In various exemplary embodiments, methods and apparatus are provided for generating dynamic real-time 3D environment projections that provide a way to utilize characteristics of the surrounding environment in visual displays to support a variety of display applications. For example, the projections establish congruity between a graphical user interface and the physical environment in which both a user and a device reside. Such congruity would reduce cognitive load, improve immersiveness, and generally diminish the boundary between man and machine, thereby making computing devices generally easier to use.
- In an exemplary embodiment, one or more (or a stream of) real-time 2D images are acquired. The images are acquired from one or more image capture devices and represent a wide view of a region surrounding the image capture device(s). The images are applied to a 3D shape, resulting in a dynamic real-time 3D projection that depicts the environment surrounding the image capture device. The dynamic real-time 3D environment projection can be used to support a variety of device applications and user interface controls, such as rendering transparent elements through which the environment behind the device may be seen, reflections, and/or image-based illumination.
- In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images, and projecting each real-time 2D image onto a 3D shape to generate a dynamic real-time 3D environment projection.
- In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images and projecting the 2D images into an image or set of images representing one or more of six faces of a cube map to generate a dynamic real-time 3D environment projection.
- In an exemplary embodiment, a method is provided for generating dynamic real-time 3D environment projections. The method includes acquiring a stream of real-time 2D images and using the 2D images as a lookup table to generate a dynamic real-time 3D environment projection.
- In an exemplary embodiment, an apparatus is provided that generates dynamic real-time 3D environment projections. The apparatus includes an image receiver that acquires a stream of real-time 2D images, and a projector that projects each real-time 2D image onto a 3D shape to generate a dynamic real-time 3D environment projection.
- Additional features and benefits of the exemplary embodiments of the present invention will become apparent from the detailed description, figures and claims set forth below.
- The exemplary embodiments of the present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
- FIG. 1 shows devices comprising exemplary embodiments of an environment projection system;
- FIG. 2 shows a device that includes an exemplary embodiment of the environment projection system;
- FIG. 3 shows a detailed exemplary embodiment of the environment projection system shown in FIG. 2;
- FIG. 4 shows a detailed exemplary embodiment of the image sensors and the image receiver shown in FIG. 3;
- FIG. 5 shows a detailed exemplary embodiment of the projector shown in FIG. 3;
- FIG. 6 shows a detailed exemplary embodiment of an image sensor;
- FIG. 7A shows exemplary embodiments of 3D shapes for use with the embodiments of the environment projection system;
- FIG. 7B shows an exemplary embodiment of a 3D shape represented as a 3D mesh for use with the embodiments of the environment projection system;
- FIG. 8 shows a diagram illustrating exemplary operation of an embodiment of the environment projection system;
- FIG. 9 shows a diagram illustrating exemplary operation of an embodiment of the environment projection system; and
- FIG. 10 shows an exemplary embodiment of a method for generating real-time 3D environment projections.
- The purpose of the following detailed description is to provide an understanding of one or more embodiments of the present invention. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure and/or description.
- In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be understood that in the development of any such actual implementation, numerous implementation-specific decisions may be made in order to achieve the developer's specific goals, such as compliance with application and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be understood that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of the embodiments of this disclosure.
- Various exemplary embodiments illustrated in the drawings may not be drawn to scale. Rather, the dimensions of the various features may be expanded or reduced for clarity. In addition, some of the drawings may be simplified for clarity. Thus, the drawings may not depict all of the components of a given apparatus (e.g., device) or method. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.
- FIG. 1 shows devices 100 comprising exemplary embodiments of an environment projection system (EPS). For example, the EPS operates to generate real-time 3D environment projections based on 2D images taken of the surrounding environment. For example, the devices shown include tablet computer 102, notebook computer 104, cell phone 106, and smart phone 108. It should be noted that embodiments of the EPS are suitable for use with virtually any type of device to generate real-time 3D environment projections and are not limited to the devices shown. For example, the EPS also is suitable for use in automobile dashboard systems, billboards, and stadium big screens.
- FIG. 2 shows a device 200 that includes an exemplary embodiment of an environment projection system (EPS) 202. For example, the EPS 202 includes a 3D projection (3DP) unit 204 and image sensors 206. The image sensors 206 operate to acquire real-time 2D images of the environment surrounding the device 200. The 3D projection unit 204 operates to receive the real-time 2D images and generate dynamic real-time 3D environment projections that can be utilized, stored, and displayed by the device 200. The dynamic real-time 3D environment projections provide a visualization of the changing environment around the device 200. Since the environment projections are derived from the real-time images acquired from the image sensors 206, changes in the orientation and/or position of the image sensors 206 (e.g., when the device 200 is moved) result in corresponding changes to the environment projections.
- FIG. 3 shows a detailed exemplary embodiment of an environment projection system 300. For example, the EPS 300 is suitable for use as the EPS 202 shown in FIG. 2. The EPS 300 comprises one or more image sensors 302 and a 3D projection unit 304. The image sensors 302 comprise one or more high-resolution image sensors that output real-time 2D images. For example, each image sensor can output a stream of real-time 2D image frames at 30 frames per second (fps) (or other suitable frame rate). The stream of 2D images output from the image sensors 302 is shown at 312.
- In an exemplary embodiment, the 3D projection unit 304 includes an image receiver 306, projector 308 and memory 310. The image receiver 306 receives one or more real-time images 312 from the image sensors 302 and processes these images into a real-time 2D image stream 314 that is passed to the projector 308. For example, if the image stream 312 comprises images from multiple image sensors, the image receiver 306 operates to combine these images into the real-time 2D image stream 314. For example, the image receiver 306 may stitch together multiple images to generate the real-time 2D image stream 314 that provides a 360-degree field of view around the image sensors 302.
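Purely as an illustration (not part of the original disclosure), the sketch below shows the simplest form such a combining step could take once the two sensor images have already been warped into 180° equirectangular halves; the function name, image sizes, and the absence of seam blending are assumptions made for the example.

```python
import numpy as np

def combine_halves(front: np.ndarray, back: np.ndarray) -> np.ndarray:
    """Join two 180-degree equirectangular halves (H x W x 3 each)
    into one 360-degree panorama (H x 2W x 3).

    Assumes both halves share the same height, exposure, and horizon;
    a production stitcher would also blend the seam between them.
    """
    if front.shape != back.shape:
        raise ValueError("sensor images must have matching dimensions")
    return np.concatenate([front, back], axis=1)

# Example: two synthetic 960x960 half-frames -> one 960x1920 panorama.
front = np.zeros((960, 960, 3), dtype=np.uint8)
back = np.full((960, 960, 3), 255, dtype=np.uint8)
panorama = combine_halves(front, back)
print(panorama.shape)  # (960, 1920, 3)
```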
- The projector 308 obtains a selected 3D shape 318 from a plurality of 3D shapes 316 stored in the memory 310. In an exemplary embodiment, the selected 3D shape is in the form of a 3D mesh. The projector 308 projects the received real-time 2D images 314 onto the selected 3D mesh to generate real-time 3D environment projections 320 that are stored in the memory 310 as indicated at 322. The projector 308 operates in real-time (e.g., at least equal to the frame rate of the 2D image stream) so that changes in images of the real-time 2D image stream 314 are immediately reflected in the generated real-time 3D environment projections 320. In an exemplary embodiment, the projector 308 also outputs the real-time 3D environment projections 324 for display or other purposes.
- FIG. 4 shows detailed exemplary embodiments of the image sensors 302 and the image receiver 306 shown in FIG. 3. In an exemplary embodiment, the image sensors 302 comprise one or more image sensors that capture images of the environment (or region) surrounding the device to which the image sensors 302 are mounted. In an exemplary embodiment, the image sensors 302 comprise one or more camera sensors that are arranged in such a way as to maximally cover the field of view (up to and even beyond 360°). For example, in one embodiment, the image sensors 302 comprise two opposing camera sensors, each with a 180° field of view, that cover a full sphere encompassing the device to which the image sensors 302 are mounted. In an exemplary embodiment, the implementation of two camera sensors, each with a 180° field of view, enables a bona fide 360° field of view to be obtained.
- In various exemplary embodiments, the image sensors may include but are not limited to high-resolution (HD) cameras, video cameras (e.g., outputting 30-60 fps), color or black and white cameras, and/or cameras having special lenses (e.g., wide angle or fish eye). If two cameras each having a 180° field of view are used, they may be placed in opposition to each other to obtain a 360° field of view. Other configurations include four cameras each with a 90° field of view to obtain a 360° field of view, or multiple cameras with asymmetrical fields of view that are combined to obtain a 360° field of view.
- In an exemplary embodiment, the image receiver 306 comprises an image sensor interface (I/F) 402, image controller 404, and image output I/F 406. The image sensor I/F 402 comprises logic, registers, storage elements, and/or discrete components that operate to receive image data from the image sensors 302 and to pass this image data to the image controller 404.
- In an exemplary embodiment, the image controller 404 comprises a processor, CPU, gate array, programmable logic, registers, logic, and/or discrete components that operate to receive real-time images from the image sensors 302 provided by the image sensor I/F 402. The image controller 404 operates to process those images into a real-time 2D image stream that is output to the image output interface 406. For example, the image sensors 302 may include multiple image sensors that each output real-time 2D images. The image controller 404 operates to combine these multiple real-time images into a real-time 2D image stream where each image provides a wide field of view around the image sensors 302. For example, each image may provide a 360° field of view around the image sensors 302. In an embodiment, the image controller 404 operates to stitch together multiple images received from the image sensors 302 to form the real-time 2D output image stream 410. In one embodiment, the image controller 404 includes a memory 408 to facilitate combining images from multiple image sensors.
- Once acquisition and processing of the image sensor data is complete, the image controller 404 outputs the real-time 2D image stream 410 to the image output I/F 406, which generates the real-time 2D image stream 314 output. For example, as shown in FIG. 3, the real-time 2D image stream 314 is output from the image receiver 306 to the projector 308.
- FIG. 5 shows a detailed exemplary embodiment of the projector 308 and memory 310 shown in FIG. 3. In an exemplary embodiment, the projector 308 comprises an image input I/F 502, projection processor 504, and a projection output I/F 506.
- In an exemplary embodiment, the image input I/F 502 comprises at least one of programmable logic, registers, memory, and/or discrete components that operate to receive real-time 2D images 314 from the image receiver 306 and pass these images to the projection processor 504. In an exemplary embodiment, the received images may be stored or buffered by the image input I/F 502.
- In an exemplary embodiment, the projection processor 504 comprises at least one of a processor, CPU, gate array, programmable logic, memory, registers, logic, and/or discrete components that operate to receive a stream of real-time 2D images from the image input I/F 502. The projection processor 504 projects those images into real-time 3D environment projections that are output to the projection output I/F 506.
- In an exemplary embodiment, the projection processor 504 retrieves a selected 3D shape 318 from the plurality of shapes 316 stored in the memory 310. The projection processor 504 operates to project the received 2D images onto the 3D shape to generate the real-time 3D environment projections.
- In an exemplary embodiment, the projection processor 504 performs a process known as UV mapping that utilizes the 2D images as a texture. The process of UV mapping is known in the art and will not be described in detail here. However, using the UV mapping process, the projection processor 504 projects selected portions of the texture into the polygons that form the 3D shape to generate the real-time 3D environment projections. This process is further described with respect to FIG. 7B.
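As a hedged sketch of what this texture lookup amounts to (the names and the nearest-neighbor sampling are illustrative assumptions, not the patent's implementation), per-vertex colors can be fetched from the current 2D frame by treating each vertex's (u, v) coordinate as a normalized image position:

```python
import numpy as np

def sample_texture(texture: np.ndarray, uv: np.ndarray) -> np.ndarray:
    """Nearest-neighbor texture lookup.

    texture: H x W x 3 image (the current real-time 2D frame).
    uv:      N x 2 array of per-vertex (u, v) coordinates in [0, 1].
    returns: N x 3 array of sampled colors, one per vertex.
    """
    h, w = texture.shape[:2]
    u = np.clip((uv[:, 0] * (w - 1)).astype(int), 0, w - 1)
    v = np.clip((uv[:, 1] * (h - 1)).astype(int), 0, h - 1)
    return texture[v, u]

# Example: look up colors for the three vertices of one mesh polygon.
frame = np.random.randint(0, 256, (480, 960, 3), dtype=np.uint8)
polygon_uvs = np.array([[0.10, 0.25], [0.12, 0.25], [0.11, 0.30]])
print(sample_texture(frame, polygon_uvs).shape)  # (3, 3)
```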
- In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections by generating one or more of the six faces of a cube map. This is achieved through a process of sampling the 2D environment image or images and re-expressing them using, for example, a 2D projection scheme such as spherical, cylindrical, conic, or azimuthal, as is appropriate for the particular lens configuration in use. For example, two image sensors placed both in opposition to one another and axially aligned may use a spherical projection in order to convert the captured images into components of a cube map. Once the cube map is generated, it may be provided to rendering applications to be used as a reflection source, sky box, or similar.
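The following sketch is offered only as an illustration, under the assumption that the combined frame is an equirectangular panorama; it builds one cube-map face by casting a direction for every face pixel and converting it with a spherical projection into panorama coordinates. The face-orientation convention is arbitrary and not taken from the patent.

```python
import numpy as np

def cube_face_from_equirect(env: np.ndarray, face: str, size: int = 256) -> np.ndarray:
    """Sample one face of a cube map from an equirectangular panorama.

    env:  H x W x 3 equirectangular environment image (360 x 180 degrees).
    face: one of '+x', '-x', '+y', '-y', '+z', '-z'.
    """
    h, w = env.shape[:2]
    # Pixel centers on the face, in [-1, 1].
    a, b = np.meshgrid(np.linspace(-1, 1, size), np.linspace(-1, 1, size))
    ones = np.ones_like(a)
    x, y, z = {
        '+x': (ones, -b, -a), '-x': (-ones, -b, a),
        '+y': (a, ones, b),   '-y': (a, -ones, -b),
        '+z': (a, -b, ones),  '-z': (-a, -b, -ones),
    }[face]
    norm = np.sqrt(x * x + y * y + z * z)
    x, y, z = x / norm, y / norm, z / norm
    # Spherical projection into the panorama.
    lon = np.arctan2(x, z)                # [-pi, pi]
    lat = np.arcsin(np.clip(y, -1, 1))    # [-pi/2, pi/2]
    u = ((lon / np.pi + 1.0) * 0.5 * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return env[np.clip(v, 0, h - 1), np.clip(u, 0, w - 1)]

env = np.random.randint(0, 256, (512, 1024, 3), dtype=np.uint8)
print(cube_face_from_equirect(env, '+z').shape)  # (256, 256, 3)
```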
- In an exemplary embodiment, the projection processor 504 generates the real-time 3D projections by using the acquired real-time 2D image or images of the environment as a lookup table whereby data from the environment images is used in subsequent rendering steps. For example, referencing data derived from the environment image(s) in the scene camera's rendering procedure could result in a sky being rendered that depicts the environment, but without first creating an intermediate texture, cube map, sky box, or other. Similar operations may be used to generate other effects, like reflections. Mips, convolutions, or other intermediate transformations of the environment image(s) may be used to create real-time 3D environment projections as well.
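A minimal sketch of the lookup-table idea, assuming an equirectangular environment image and a simple pinhole camera (neither specified by the patent), renders a sky view by sampling the environment image directly per pixel, with no intermediate cube map or sky box:

```python
import numpy as np

def render_sky(env: np.ndarray, width: int, height: int, fov_deg: float = 90.0) -> np.ndarray:
    """Render a sky view by using the environment image as a lookup table.

    For every output pixel a view ray is built from a pinhole camera
    looking down +z, converted to spherical coordinates, and used to
    index directly into the equirectangular environment image.
    """
    h, w = env.shape[:2]
    f = 0.5 * width / np.tan(np.radians(fov_deg) / 2.0)
    xs, ys = np.meshgrid(np.arange(width) - width / 2.0,
                         height / 2.0 - np.arange(height))
    zs = np.full_like(xs, f)
    norm = np.sqrt(xs ** 2 + ys ** 2 + zs ** 2)
    x, y, z = xs / norm, ys / norm, zs / norm
    lon = np.arctan2(x, z)
    lat = np.arcsin(np.clip(y, -1, 1))
    u = ((lon / np.pi + 1.0) * 0.5 * (w - 1)).astype(int)
    v = ((0.5 - lat / np.pi) * (h - 1)).astype(int)
    return env[v, u]

env = np.random.randint(0, 256, (512, 1024, 3), dtype=np.uint8)
print(render_sky(env, 320, 240).shape)  # (240, 320, 3)
```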
- In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections in the form of reflections of the environment on elements in a user interface. In this way, user interface elements may be expressed with particular degrees of “shininess,” where the notion of “shine” is achieved by rendering the environment upon the surface of the user interface element. Designers may control the degree of shininess, anywhere from a mirror-like finish to a matte finish, by, for example, convolving the environment image before it is rendered upon the surface of the user interface element.
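As an illustrative sketch only (the box-blur kernel, its scaling with roughness, and the function names are assumptions rather than the patent's method), the degree of shininess can be controlled by convolving the environment image before it is used as a reflection source:

```python
import numpy as np

def box_blur_1d(img: np.ndarray, radius: int, axis: int) -> np.ndarray:
    """Average each pixel with its neighbors along one axis (edge-clamped)."""
    if radius <= 0:
        return img
    acc = np.zeros_like(img, dtype=np.float64)
    for offset in range(-radius, radius + 1):
        idx = np.clip(np.arange(img.shape[axis]) + offset, 0, img.shape[axis] - 1)
        acc += np.take(img, idx, axis=axis)
    return acc / (2 * radius + 1)

def roughen_environment(env: np.ndarray, roughness: float) -> np.ndarray:
    """Pre-blur the environment image before using it as a reflection source.

    roughness = 0.0 keeps a mirror-like finish; larger values approach a
    matte finish by convolving the image with a wider kernel.
    """
    radius = int(roughness * 0.05 * max(env.shape[:2]))  # heuristic kernel scale
    blurred = box_blur_1d(env.astype(np.float64), radius, axis=0)
    blurred = box_blur_1d(blurred, radius, axis=1)
    return np.clip(blurred, 0, 255).astype(np.uint8)

env = np.random.randint(0, 256, (256, 512, 3), dtype=np.uint8)
glossy = roughen_environment(env, roughness=0.2)
matte = roughen_environment(env, roughness=1.0)
print(glossy.shape, matte.shape)
```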
- In an exemplary embodiment, the projection processor 504 generates the real-time 3D environment projections in the form of light-mapped, image-based illumination of a user interface. Because image-based lighting is a scheme whereby light may be rendered as coming from many directions, and each direction may take on the characteristic color of the light in the environment, the user interface is lit by a high-fidelity representation of the lighting of the environment.
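The sketch below, again only illustrative and assuming an equirectangular environment image, estimates the diffuse light arriving at a surface by cosine-weighting every environment pixel around the surface normal; this is one simple form of image-based lighting, not the patent's specific implementation.

```python
import numpy as np

def diffuse_irradiance(env: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Estimate the diffuse light arriving at a surface with the given normal.

    Every pixel of the equirectangular environment image is treated as a
    small light source; contributions are cosine-weighted and solid-angle
    weighted, so light from each direction takes on the characteristic
    color of the environment in that direction.
    """
    h, w = env.shape[:2]
    lat = (0.5 - (np.arange(h) + 0.5) / h) * np.pi          # per row
    lon = ((np.arange(w) + 0.5) / w * 2.0 - 1.0) * np.pi    # per column
    lat, lon = np.meshgrid(lat, lon, indexing="ij")
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)
    cosine = np.clip(dirs @ (normal / np.linalg.norm(normal)), 0.0, None)
    weight = cosine * np.cos(lat)                            # cos(lat) = solid-angle term
    rgb = (env.astype(np.float64) * weight[..., None]).sum(axis=(0, 1)) / weight.sum()
    return rgb

env = np.random.randint(0, 256, (128, 256, 3), dtype=np.uint8)
print(diffuse_irradiance(env, normal=np.array([0.0, 1.0, 0.0])))  # lit from above
```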
- As the real-time 3D environment projections are generated, they are stored in the memory 310 as indicated at 322. The projection output I/F 506 also outputs the real-time 3D environment projections 324 to other systems or device applications.
- FIG. 6 shows a detailed exemplary embodiment of an image sensor 600 for use with the environment projection system. For example, the image sensor 600 is suitable for use as part of the image sensors 302 shown in FIG. 3. The image sensor 600 comprises a sensor body 602 that houses an image sensor 606 that is covered by a lens 604. For example, the lens 604 may be a hemispherical dome lens. However, other factors such as cost and form factor may affect the choice of lens design for a given implementation.
- In this particular embodiment, the lens 604 operates to provide a wide field of view of the surrounding environment that is captured by the image sensor 606. In other embodiments, different sensor/lens combinations are used to acquire a wide field of view of the surrounding environment. Evaluation of a particular sensor/lens configuration should consider the accuracy of the system's ability to project the image onto the surface of the image sensor 606 with minimal aberration. For example, faceted lenses may not be suitable as they may introduce aberration inherent to the way they capture light.
- FIG. 7A shows exemplary 3D shapes 700 for use with the embodiments of the environment projection system. For example, the EPS is operable to utilize 3D shapes such as a hemispherical dome 702, dodecahedron 704, tetrahedron 706, and cube 708. It should be noted that the EPS is suitable for use with other types of 3D shapes and is not limited to utilizing only the shapes shown in FIG. 7A. For example, in an embodiment, the 3D shapes 700 may comprise a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.
- FIG. 7B shows an exemplary embodiment of the 3D shape 702 represented as a 3D mesh 710 for use with the embodiments of the environment projection system. For example, the 3D mesh 710 may be stored in the memory 310 for use by the projection processor 504 to generate the real-time 3D environment projections 320. In an exemplary embodiment, the 3D mesh 710 comprises polygon shapes (e.g., polygon 712) arranged and configured to form a hemispherical dome onto which real-time 2D images will be projected by the projection processor 504. For example, in an exemplary embodiment, the projection processor 504 performs a UV mapping process that utilizes the 2D images as a texture. Using the UV mapping process, the projection processor 504 projects a selected portion of the texture into each of the polygons of the 3D mesh 710 to generate the real-time 3D environment projections.
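For illustration only (the ring/segment layout and the UV convention are assumptions, not the patent's mesh 710), a hemispherical dome mesh with per-vertex UVs might be generated as follows:

```python
import numpy as np

def make_dome_mesh(rings: int = 8, segments: int = 16, radius: float = 1.0):
    """Build a hemispherical dome as triangles with per-vertex UVs.

    Vertices are laid out on rings of constant elevation; UVs map
    azimuth to u and elevation to v so a panoramic frame can be
    UV-mapped onto the dome polygon by polygon.
    """
    verts, uvs = [], []
    for r in range(rings + 1):                      # 0 = horizon, rings = zenith
        elev = (np.pi / 2.0) * r / rings
        for s in range(segments + 1):
            azim = 2.0 * np.pi * s / segments
            verts.append([radius * np.cos(elev) * np.cos(azim),
                          radius * np.sin(elev),
                          radius * np.cos(elev) * np.sin(azim)])
            uvs.append([s / segments, r / rings])
    faces = []
    stride = segments + 1
    for r in range(rings):
        for s in range(segments):
            i = r * stride + s
            faces.append([i, i + 1, i + stride])
            faces.append([i + 1, i + stride + 1, i + stride])
    return np.array(verts), np.array(uvs), np.array(faces)

v, uv, f = make_dome_mesh()
print(v.shape, uv.shape, f.shape)  # (153, 3) (153, 2) (256, 3)
```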
- FIG. 8 shows a diagram 800 illustrating exemplary operation of an embodiment of the environment projection system. For example, the operations illustrated in the diagram 800 are discussed with reference to FIG. 5.
- In a first operation, the projection processor 504 selects a 3D shape from the 3D shapes 316 stored in the memory 310. For example, a hemispherical shape 802 is selected. In an exemplary embodiment, the shape 802 is stored as a 3D mesh in the memory 310. In another embodiment, the projection processor 504 obtains a 3D shape from the memory 310 and converts this shape into a mesh representation. In a second operation, a real-time 2D image is acquired. For example, the image sensors 302 capture the image shown at 804. In a third operation, the projection processor 504 performs a projection to project the 2D image onto the selected 3D shape as shown at 806. In an exemplary embodiment, the projection processor 504 performs a UV mapping operation to generate a real-time 3D environment projection 808, which is stored in a memory (e.g., memory 310) in a fourth operation. The projection may also be output to other programs or devices using the projection output I/F 506. Since this is a real-time process, the next image in the stream of 2D images is acquired in a fifth operation and the process is repeated. Thus, real-time 3D environment projections are generated as a continuous stream corresponding to the captured real-time 2D images.
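A loose sketch of this repeating cycle, with stand-in callables in place of the real acquisition and projection steps (the names and placeholder logic are assumptions), might look like this:

```python
import numpy as np

def run_projection_loop(get_frame, project, store, frames: int = 3) -> None:
    """Repeat the acquire -> project -> store cycle for a stream of frames.

    get_frame: callable returning the next real-time 2D environment image.
    project:   callable mapping that image onto the selected 3D shape.
    store:     callable that keeps (or displays) the resulting projection.
    """
    for _ in range(frames):
        frame = get_frame()          # second operation: acquire a 2D image
        projection = project(frame)  # third operation: project onto the 3D shape
        store(projection)            # fourth operation: store/output, then repeat

# Example with stand-in callables.
run_projection_loop(
    get_frame=lambda: np.random.randint(0, 256, (480, 960, 3), dtype=np.uint8),
    project=lambda img: img.mean(),   # placeholder for the real projection step
    store=lambda proj: print("stored", round(float(proj), 1)),
)
```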
- FIG. 9 shows a diagram 900 illustrating exemplary operation of an embodiment of the environment projection system. The diagram 900 shows real-time 2D images 902 and corresponding real-time 3D environment projections 904. In an exemplary embodiment, the real-time images 902 are acquired by the image sensors 302, sent to the image receiver 306 and processed by the projector 308.
- At time T0, a first real-time 2D image 906 is acquired that captures an environment surrounding the image sensors 302. The 2D image 906 includes trees, a building, clouds, and the sun 912. The projector 308 projects this first image onto the 3D mesh 710 to generate a first 3D environment projection 906A, also shown at time T0. As illustrated in the environment projection 906A, the initial location of the sun is indicated at 912.
- At time Tx, another real-time 2D image 908 of the surroundings is acquired that includes trees, a building, clouds, and the sun 912. The projector 308 projects this image onto the 3D mesh 710 to generate a 3D environment projection 908A shown at time Tx. As illustrated in the 2D image 908 and environment projection 908A, the location of the sun is indicated at 912.
- At time Tn, another real-time 2D image 910 is acquired that includes trees, a building, clouds, and the sun 912. The projector 308 projects this image onto the 3D mesh 710 to generate a 3D environment projection 910A shown at time Tn. As illustrated in the 2D image 910 and environment projection 910A, the location of the sun is indicated at 912.
- Thus, the diagram 900 illustrates how movement of objects (e.g., the sun 912) in the real-time 2D image stream is reflected in movement of those same objects in the corresponding 3D environment projection.
- FIG. 10 shows an exemplary embodiment of a method 1000 for generating real-time 3D environment projections in accordance with exemplary embodiments of the present invention. For example, the method 1000 is suitable for use with the EPS 300 shown in FIG. 3.
- At block 1002, a 3D shape is selected. For example, the projector 308 selects a 3D shape from the 3D shapes 316 stored in the memory 310. In an exemplary embodiment, the 3D shapes include spherical, tetrahedral, box and/or any other suitable 3D shape. In an exemplary embodiment, the shape selection is performed during an initialization phase or configuration phase of the operation of the EPS 300. In another exemplary embodiment, the selection of the 3D shape is performed during runtime. In an exemplary embodiment, the 3D shapes 316 are stored in the memory 310 as 3D mesh shapes.
- At block 1004, a real-time 2D image is acquired. For example, in an exemplary embodiment, the 2D image is acquired from one or more image sensors 302. For example, the image sensors can be part of a camera system attached to a hand-held device. In one embodiment, the acquired image provides a 360° field of view of the region surrounding the location of the image sensors. In an exemplary embodiment, the image sensors 302 output images at a frame rate of 30 fps.
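Purely as an example of this acquisition step, and assuming OpenCV and an attached camera are available (the patent does not specify any capture API), frames could be read at roughly 30 fps like this:

```python
import cv2  # assumes OpenCV is installed and a camera is attached

def acquire_frames(device_index: int = 0, count: int = 30):
    """Read a short burst of real-time 2D frames from one image sensor.

    A device with two wide-angle sensors would open two captures and
    combine the frames, as in the combining step described below.
    """
    cap = cv2.VideoCapture(device_index)
    cap.set(cv2.CAP_PROP_FPS, 30)  # request ~30 fps; drivers may ignore this
    frames = []
    try:
        for _ in range(count):
            ok, frame = cap.read()
            if not ok:
                break
            frames.append(frame)
    finally:
        cap.release()
    return frames

frames = acquire_frames()
print(f"captured {len(frames)} frames")
```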
- At block 1006, an optional operation is performed to combine images from multiple sensors into the acquired real-time 2D image. For example, if several images are acquired by multiple image sensors, these images are combined into one image by connecting the images together or otherwise stitching the images to form one real-time 2D image. In an exemplary embodiment, the image controller 404 performs this operation.
- At block 1008, the real-time 2D image is projected onto the selected 3D shape to generate a real-time 3D environment projection. In an exemplary embodiment, the projection processor 504 performs a UV mapping operation to project the 2D images onto the 3D shape. In other embodiments, the projection processor 504 performs one or more of cube mapping, lookup table processing, environment reflections processing, or light-mapped illumination processing, as described above, to generate the real-time 3D environment projection.
- At block 1010, the real-time 3D environment projection is stored in a memory. For example, the projector 308 stores the real-time 3D environment projection into the memory 310.
- At block 1012, the environment projection is output. For example, the real-time 3D environment projection is output for display or is passed to another device or application by the projection output I/F 506. The method then proceeds to block 1004 to acquire the next real-time 2D image for processing.
- Thus, the method 1000 operates to generate a real-time 3D environment projection in accordance with exemplary embodiments of the present invention. It should be noted that although the method 1000 describes specific operations, these operations may be changed, modified, rearranged, added to, and subtracted from within the scope of the embodiments.
- While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from these exemplary embodiments of the present invention and their broader aspects. Therefore, the appended claims are intended to encompass within their scope all such changes and modifications as are within the true spirit and scope of these exemplary embodiments of the present invention.
Claims (24)
1. A method, comprising:
acquiring a real-time two-dimensional (2D) image of an environment; and
projecting the real-time 2D image of the environment onto a 3D shape to generate a real-time three-dimensional (3D) environment projection.
2. The method of claim 1, wherein the operation of projecting comprises UV-mapping the real-time 2D image of the environment onto the 3D shape.
3. The method of claim 1, wherein the operation of projecting comprises processing the real-time 2D image of the environment to generate one or more of six faces of a cube map.
4. The method of claim 1, wherein the operation of projecting comprises using the real-time 2D image of the environment as a lookup table to render the environment in a 3D scene.
5. The method of claim 1, wherein the operation of acquiring comprises acquiring the real-time 2D image of the environment from one or more image sensors.
6. The method of claim 5, wherein the operation of acquiring comprises acquiring the real-time 2D image of the environment to form a 360° field of view.
7. The method of claim 1, further comprising storing the real-time 2D image of the environment in a memory.
8. The method of claim 1, further comprising storing the 3D environment projection in a memory.
9. The method of claim 1, further comprising selecting the 3D shape from a plurality of 3D shapes.
10. The method of claim 9, wherein the 3D shape is selected from a set of 3D shapes comprising a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.
11. The method of claim 1, further comprising performing the method on at least one of a handheld device, desktop computer, and laptop computer.
12. The method of claim 11, further comprising displaying the environment projection on a device display.
13. An apparatus, comprising:
an image receiver that acquires a real-time two-dimensional (2D) image of an environment; and
a projector that projects the real-time 2D image of the environment onto a three-dimensional (3D) shape to generate a real-time 3D environment projection.
14. The apparatus of claim 13, wherein the projector performs UV-mapping to project the real-time 2D image of the environment onto the 3D shape to generate the 3D environment projection.
15. The apparatus of claim 13, wherein the image receiver acquires the real-time 2D image of the environment from one or more image sensors.
16. The apparatus of claim 15, wherein the one or more image sensors capture the real-time 2D image of the environment in a 360° field of view.
17. The apparatus of claim 13, further comprising a memory that stores the real-time 2D image of the environment.
18. The apparatus of claim 17, wherein the memory stores the 3D environment projection.
19. The apparatus of claim 13, wherein the projector selects the 3D shape from a plurality of 3D shapes.
20. The apparatus of claim 13, wherein the 3D shape is selected from a set of 3D shapes comprising a sphere, a box, a tetrahedron, a pyramid, a cone, and a capsule.
21. The apparatus of claim 13, wherein the apparatus is located in at least one of a handheld device, desktop computer, and laptop computer.
22. The apparatus of claim 21, wherein the projector outputs the 3D environment projection for display on a device display.
23. The apparatus of claim 13, wherein the projector processes the real-time 2D image of the environment to generate one or more of six faces of a cube map.
24. The apparatus of claim 13, wherein the projector uses the real-time 2D image of the environment as a lookup table to render the environment in a 3D scene.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/693,076 US20190007672A1 (en) | 2017-06-30 | 2017-08-31 | Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762527778P | 2017-06-30 | 2017-06-30 | |
US15/693,076 US20190007672A1 (en) | 2017-06-30 | 2017-08-31 | Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190007672A1 (en) | 2019-01-03
Family
ID=64738780
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/693,076 Abandoned US20190007672A1 (en) | 2017-06-30 | 2017-08-31 | Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections |
US15/801,095 Expired - Fee Related US10540809B2 (en) | 2017-06-30 | 2017-11-01 | Methods and apparatus for tracking a light source in an environment surrounding a device |
US16/163,305 Abandoned US20190066366A1 (en) | 2017-06-30 | 2018-10-17 | Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/801,095 Expired - Fee Related US10540809B2 (en) | 2017-06-30 | 2017-11-01 | Methods and apparatus for tracking a light source in an environment surrounding a device |
US16/163,305 Abandoned US20190066366A1 (en) | 2017-06-30 | 2018-10-17 | Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting |
Country Status (1)
Country | Link |
---|---|
US (3) | US20190007672A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10916062B1 (en) * | 2019-07-15 | 2021-02-09 | Google Llc | 6-DoF tracking using visual cues |
US20240404098A1 (en) * | 2023-06-02 | 2024-12-05 | Qualcomm Incorporated | Using light and shadow in vision-aided precise positioning |
Family Cites Families (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB8827952D0 (en) | 1988-11-30 | 1989-01-05 | Screen Form Inc | Display device |
US5638116A (en) | 1993-09-08 | 1997-06-10 | Sumitomo Electric Industries, Ltd. | Object recognition apparatus and method |
US6369830B1 (en) | 1999-05-10 | 2002-04-09 | Apple Computer, Inc. | Rendering translucent layers in a display system |
US6559853B1 (en) | 2000-02-16 | 2003-05-06 | Enroute, Inc. | Environment map creation using texture projections with polygonal curved surfaces |
US7742073B1 (en) | 2000-11-01 | 2010-06-22 | Koninklijke Philips Electronics N.V. | Method and apparatus for tracking an object of interest using a camera associated with a hand-held processing device |
US7009663B2 (en) | 2003-12-17 | 2006-03-07 | Planar Systems, Inc. | Integrated optical light sensitive active matrix liquid crystal display |
US7883415B2 (en) | 2003-09-15 | 2011-02-08 | Sony Computer Entertainment Inc. | Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion |
JP4532856B2 (en) * | 2003-07-08 | 2010-08-25 | キヤノン株式会社 | Position and orientation measurement method and apparatus |
US7694233B1 (en) | 2004-04-30 | 2010-04-06 | Apple Inc. | User interface presentation of information in reconfigured or overlapping containers |
KR101112735B1 (en) | 2005-04-08 | 2012-03-13 | 삼성전자주식회사 | 3D display apparatus using hybrid tracking system |
US7657849B2 (en) | 2005-12-23 | 2010-02-02 | Apple Inc. | Unlocking a device by performing gestures on an unlock image |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8180112B2 (en) | 2008-01-21 | 2012-05-15 | Eastman Kodak Company | Enabling persistent recognition of individuals in images |
US8291341B2 (en) | 2008-05-28 | 2012-10-16 | Google Inc. | Accelerated panning user interface interactions |
US8098894B2 (en) | 2008-06-20 | 2012-01-17 | Yahoo! Inc. | Mobile imaging device as navigator |
CN111522493A (en) | 2008-08-22 | 2020-08-11 | 谷歌有限责任公司 | Navigation in three-dimensional environment on mobile device |
US8788977B2 (en) | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
JP5087532B2 (en) | 2008-12-05 | 2012-12-05 | ソニーモバイルコミュニケーションズ株式会社 | Terminal device, display control method, and display control program |
US20100275122A1 (en) | 2009-04-27 | 2010-10-28 | Microsoft Corporation | Click-through controller for mobile interaction |
JP5393318B2 (en) | 2009-07-28 | 2014-01-22 | キヤノン株式会社 | Position and orientation measurement method and apparatus |
US8762846B2 (en) | 2009-11-16 | 2014-06-24 | Broadcom Corporation | Method and system for adaptive viewport for a mobile device based on viewing angle |
US8913004B1 (en) | 2010-03-05 | 2014-12-16 | Amazon Technologies, Inc. | Action based device control |
US8922480B1 (en) | 2010-03-05 | 2014-12-30 | Amazon Technologies, Inc. | Viewer-based device control |
US9218119B2 (en) * | 2010-03-25 | 2015-12-22 | Blackberry Limited | System and method for gesture detection and feedback |
US8581905B2 (en) | 2010-04-08 | 2013-11-12 | Disney Enterprises, Inc. | Interactive three dimensional displays on handheld devices |
US9411413B2 (en) * | 2010-08-04 | 2016-08-09 | Apple Inc. | Three dimensional user interface effects on a display |
JP5960796B2 (en) * | 2011-03-29 | 2016-08-02 | クアルコム,インコーポレイテッド | Modular mobile connected pico projector for local multi-user collaboration |
US20120314899A1 (en) | 2011-06-13 | 2012-12-13 | Microsoft Corporation | Natural user interfaces for mobile image viewing |
US9880640B2 (en) | 2011-10-06 | 2018-01-30 | Amazon Technologies, Inc. | Multi-dimensional interface |
US9324183B2 (en) | 2011-11-29 | 2016-04-26 | Apple Inc. | Dynamic graphical interface shadows |
US20160292924A1 (en) * | 2012-10-31 | 2016-10-06 | Sulon Technologies Inc. | System and method for augmented reality and virtual reality applications |
US20150370322A1 (en) * | 2014-06-18 | 2015-12-24 | Advanced Micro Devices, Inc. | Method and apparatus for bezel mitigation with head tracking |
US10311644B2 (en) * | 2016-12-14 | 2019-06-04 | II Jonathan M. Rodriguez | Systems and methods for creating and sharing a 3-dimensional augmented reality space |
US20170169617A1 (en) * | 2015-12-14 | 2017-06-15 | II Jonathan M. Rodriguez | Systems and Methods for Creating and Sharing a 3-Dimensional Augmented Reality Space |
US10095928B2 (en) * | 2015-12-22 | 2018-10-09 | WorldViz, Inc. | Methods and systems for marker identification |
US10065049B2 (en) * | 2016-01-25 | 2018-09-04 | Accuray Incorporated | Presenting a sequence of images associated with a motion model |
- 2017
  - 2017-08-31 US US15/693,076 patent/US20190007672A1/en not_active Abandoned
  - 2017-11-01 US US15/801,095 patent/US10540809B2/en not_active Expired - Fee Related
- 2018
  - 2018-10-17 US US16/163,305 patent/US20190066366A1/en not_active Abandoned
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3571954A (en) * | 1966-05-13 | 1971-03-23 | Planetaria Inc | Space transit simulator planetarium |
US4297723A (en) * | 1980-01-28 | 1981-10-27 | The Singer Company | Wide angle laser display system |
US4634384A (en) * | 1984-02-02 | 1987-01-06 | General Electric Company | Head and/or eye tracked optically blended display system |
US5642293A (en) * | 1996-06-03 | 1997-06-24 | Camsys, Inc. | Method and apparatus for determining surface profile and/or surface strain |
US20010056574A1 (en) * | 2000-06-26 | 2001-12-27 | Richards Angus Duncan | VTV system |
US20080024594A1 (en) * | 2004-05-19 | 2008-01-31 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
US20090021576A1 (en) * | 2007-07-18 | 2009-01-22 | Samsung Electronics Co., Ltd. | Panoramic image production |
US20090115916A1 (en) * | 2007-11-06 | 2009-05-07 | Satoshi Kondo | Projector and projection method |
US20110285810A1 (en) * | 2010-05-21 | 2011-11-24 | Qualcomm Incorporated | Visual Tracking Using Panoramas on Mobile Devices |
US20120262540A1 (en) * | 2011-04-18 | 2012-10-18 | Eyesee360, Inc. | Apparatus and Method for Panoramic Video Imaging with Mobile Computing Devices |
US9582741B2 (en) * | 2011-12-01 | 2017-02-28 | Xerox Corporation | System diagnostic tools for printmaking devices |
US20140267593A1 (en) * | 2013-03-14 | 2014-09-18 | Snu R&Db Foundation | Method for processing image and electronic device thereof |
US20160328824A1 (en) * | 2013-12-09 | 2016-11-10 | Cj Cgv Co., Ltd. | Method and system for generating multi-projection images |
US9582731B1 (en) * | 2014-04-15 | 2017-02-28 | Google Inc. | Detecting spherical images |
US20170026659A1 (en) * | 2015-10-13 | 2017-01-26 | Mediatek Inc. | Partial Decoding For Arbitrary View Angle And Line Buffer Reduction For Virtual Reality Video |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11127212B1 (en) * | 2017-08-24 | 2021-09-21 | Sean Asher Wilens | Method of projecting virtual reality imagery for augmenting real world objects and surfaces |
US20210321058A1 (en) * | 2018-06-22 | 2021-10-14 | Lg Electronics Inc. | Method for transmitting 360-degree video, method for providing a user interface for 360-degree video, apparatus for transmitting 360-degree video, and apparatus for providing a user interface for 360-degree video |
US11831855B2 (en) * | 2018-06-22 | 2023-11-28 | Lg Electronics Inc. | Method for transmitting 360-degree video, method for providing a user interface for 360-degree video, apparatus for transmitting 360-degree video, and apparatus for providing a user interface for 360-degree video |
Also Published As
Publication number | Publication date |
---|---|
US10540809B2 (en) | 2020-01-21 |
US20190066366A1 (en) | 2019-02-28 |
US20190005675A1 (en) | 2019-01-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110517355B (en) | Ambient composition for illuminating mixed reality objects | |
JP5676092B2 (en) | Panorama image generation method and panorama image generation program | |
US20200372604A1 (en) | Equatorial stitching of hemispherical images in a spherical image capture system | |
US10650592B2 (en) | Methods and apparatus for providing rotated spherical viewpoints | |
US20190007672A1 (en) | Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections | |
US10325391B2 (en) | Oriented image stitching for spherical image content | |
CN106558017B (en) | Spherical display image processing method and system | |
US20140085295A1 (en) | Direct environmental mapping method and system | |
WO2017029885A1 (en) | Image generating device and image display control device | |
US11727654B2 (en) | Ambient light based mixed reality object rendering | |
US10664947B2 (en) | Image processing apparatus and image processing method to represent part of spherical image in planar image using equidistant cylindrical projection | |
JPH11175762A (en) | Light environment measuring instrument and device and method for shading virtual image using same | |
US20230147474A1 (en) | Information processing apparatus to display image of illuminated target object, method of information processing, and storage medium | |
JP2019164782A (en) | Image processing apparatus, image capturing system, image processing method, and program | |
US11164366B2 (en) | Mixed reality object rendering based on environment lighting | |
JP2020173529A (en) | Information processing device, information processing method, and program | |
CN110192221B (en) | Image generating apparatus and image display control apparatus | |
JP7150460B2 (en) | Image processing device and image processing method | |
Schwandt et al. | Glossy reflections for mixed reality environments on mobile devices | |
EP4609360A1 (en) | Appearance capture | |
EP3322186A1 (en) | Method and device for transmitting data representative of an image | |
US11388336B2 (en) | Horizontal calibration method and system for panoramic image or video, and portable terminal | |
EP3330839A1 (en) | Method and device for adapting an immersive content to the field of view of a user | |
US20210192679A1 (en) | Electronic device and omni-directional image display method of electronic device | |
CN116012511A (en) | Texture display method and device and electronic terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
 | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
 | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED
 | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
 | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
 | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION