US20150193982A1 - Augmented reality overlays using position and orientation to facilitate interactions between electronic devices - Google Patents
- Publication number
- US20150193982A1 (U.S. Application No. 14/588,515)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- virtual object
- orientation
- augmented reality
- reality overlay
- Prior art date
- Legal status
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
-
- H04L67/38—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/029—Location-based management or tracking services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/33—Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/04—Architectural design, interior design
Definitions
- the present application generally relates to the use of augmented reality overlays in electronic devices and, more particularly, to the use of three-dimensional mapping to facilitate location-based augmented reality overlays in electronic devices.
- Computing-enabled cellular phones, tablet computers, and other portable electronic devices increasingly incorporate location-based services to provide enhanced user-machine interactions.
- However, these location-based services typically rely on a static and unrealistic representation of the local environment of the portable electronic device, and thus often lead to user dissatisfaction.
- FIG. 1 is a diagram illustrating an electronic device utilizing position and orientation information to facilitate location-based services in accordance with at least one embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating an example system implementation of the electronic device of FIG. 1 in accordance with at least one embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating provision of a density map of electronic devices in a facility or other location in accordance with at least one embodiment of the present disclosure.
- FIG. 4 is a diagram illustrating an example of an augmented reality interaction between two devices based on their physical orientation to each other in accordance with at least one embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating another example of an augmented reality interaction between two devices based on their physical orientation to each other in accordance with at least one embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which depictions of physical objects are replaced by depictions of virtual objects having matching geometry in accordance with at least one embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which navigation aids are depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 8A is a diagram illustrating an example of measuring a dimension of an object using a change in position of an electronic device in accordance with at least one embodiment of the present disclosure.
- FIG. 8B is a diagram illustrating an example of measuring a dimension of an object using an electronic device based on 3D mapping of the object in accordance with at least one embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which a virtual pet is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which instructional assistance is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which prior user path information or other visual pedometry information is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which a user incentivized mapping game is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 13 is a diagram illustrating an example of a virtual reality display at an electronic device in which a remote environment or virtual environment is depicted based on movement of the electronic device in a local environment in accordance with at least one embodiment of the present disclosure.
- FIGS. 1-13 illustrate various techniques for leveraging the position and orientation tracking capabilities of an electronic device, along with depth information and other 3D mapping information available to the electronic device, to provide enhanced location-based services.
- These location-based services may take the form of enhanced augmented reality (AR) functionality that formats or otherwise controls graphical information presented in an AR overlay so as to be visually compatible with the geometries of the objects within a local environment, or to effectively convey related position/orientation information.
- These location-based services further may take the form of virtual reality (VR) functionality that maps the position and orientation of an electronic device in the local environment to corresponding perspective views of a remote environment or virtual environment.
- FIG. 1 illustrates an electronic device 100 configured to support location-based functionality, such as simultaneous location and mapping (SLAM), visual odometry, augmented reality (AR), and virtual reality (VR) techniques, using image and non-image sensor data in accordance with at least one embodiment of the present disclosure.
- the electronic device 100 can include a portable user device, such as a tablet computer, computing-enabled cellular phone (e.g., a “smartphone”), a notebook computer, a personal digital assistant (PDA), a gaming system remote, a television remote, and the like.
- the electronic device 100 can include a fixture device, such as medical imaging equipment, a security imaging camera system, an industrial robot control system, a drone control system, and the like.
- the electronic device 100 is generally described herein in the example context of a portable user device, such as a tablet computer or a smartphone; however, the electronic device 100 is not limited to these example implementations.
- the electronic device 100 includes a housing 102 containing or otherwise incorporating a display 104 and a plurality of sensors to obtain information regarding a local environment of the electronic device 100 .
- These sensors can include image-based sensors, such as imaging cameras 106 , 108 to obtain visual information (imagery) for the local environment in the form of pictures or video.
- These sensors also may include non-image-based sensors, such as Global Positioning System (GPS) receivers, wireless interfaces (e.g., IEEE 802.11 transceivers or Bluetooth™ receivers), accelerometers, gyroscopes, magnetometers, ambient light sensors, and the like.
- the electronic device 100 uses the information generated by some or all of these sensors to determine one or both of a position and orientation of the electronic device 100 in support of various location-based services available via the electronic device 100 .
- This position and orientation (referred to herein in the alternative or collective as “position/orientation” or “pose”) may be an absolute position/orientation (e.g., a GPS coordinate and compass orientation) or a relative position/orientation (that is, relative to the local environment), or a combination thereof.
- a magnetometer, gyroscope, and accelerometer may be used to determine a relative position (or change in position) and orientation (relative to gravity), and a GPS sensor may be used to provide an absolute position.
- SLAM or other visual odometry methods may be used to provide a position and orientation relative to a specified location or within a specified facility or map.
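- To make the pose concept concrete, the following minimal Python sketch (not part of the patent; the data shapes and field names are illustrative assumptions) shows one way an absolute GPS fix and IMU-derived orientation might be merged into a single pose record.

```python
# Hypothetical sketch: combining an absolute GPS fix with IMU-derived orientation
# into a single pose record, as one possible representation of the
# position/orientation ("pose") discussed above.
from dataclasses import dataclass

@dataclass
class Pose:
    latitude: float      # absolute position (degrees), e.g., from a GPS receiver
    longitude: float
    altitude_m: float
    yaw_deg: float       # heading relative to magnetic north (magnetometer)
    pitch_deg: float     # orientation relative to gravity (accelerometer/gyroscope)
    roll_deg: float

def fuse_pose(gps_fix: dict, imu_orientation: dict) -> Pose:
    """Merge an absolute GPS fix with relative IMU orientation into one pose."""
    return Pose(
        latitude=gps_fix["lat"],
        longitude=gps_fix["lon"],
        altitude_m=gps_fix.get("alt", 0.0),
        yaw_deg=imu_orientation["yaw"],
        pitch_deg=imu_orientation["pitch"],
        roll_deg=imu_orientation["roll"],
    )

# Example usage with made-up readings:
pose = fuse_pose({"lat": 37.42, "lon": -122.08, "alt": 12.0},
                 {"yaw": 85.0, "pitch": -3.2, "roll": 0.4})
```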
- the electronic device 100 uses imagery from the local environment to support object recognition functionality.
- the electronic device 100 incorporates depth sensing via a depth sensor in order to determine its distance from various objects in the local environment, whereby this information may be used for visual telemetry purposes or for identifying the objects.
- this depth sensor may be implemented by positioning and orienting the imaging cameras 106 , 108 such that their fields of view overlap starting at a specified distance from the electronic device 100 , thereby enabling depth sensing of objects in the local environment that are positioned in the region of overlapping fields of view via multiview image analysis.
- the electronic device 100 implements the depth sensor in the form of a structured light projector (not shown in FIG. 1 ) that projects modulated light patterns into the local environment and one or both of the imaging cameras 106 , 108 , which capture the reflections of the modulated light patterns from objects in the local environment.
- the depth sensor then may calculate the depths of the objects, that is, the distances of the objects from the electronic device 100 , based on the analysis of the depth imagery.
- the resulting depth information obtained from the depth sensor may be used to calibrate or otherwise augment depth information obtained from multiview analysis (e.g., stereoscopic analysis) of the image data captured by the imaging cameras 106 , 108 .
- the depth information from the depth sensor may be used in place of depth information obtained from multiview analysis.
- multiview analysis typically is better suited for bright lighting conditions and when the objects are relatively distant, whereas modulated light-based depth sensing is better suited for lower light conditions or when the observed objects are relatively close (e.g., within 4-5 meters).
- Accordingly, when ambient lighting is bright and the objects of interest are relatively distant, the electronic device 100 may elect to use multiview analysis to determine object depths.
- Conversely, in lower light conditions or when the observed objects are relatively close, the electronic device 100 may switch to using modulated light-based depth sensing via the depth sensor.
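- The following Python sketch illustrates this mode-selection logic; the lux and range thresholds and function names are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of choosing between multiview (stereo) depth and
# modulated-light depth based on ambient light and approximate object range.
BRIGHT_LUX_THRESHOLD = 1000.0   # above this, treat the scene as brightly lit
CLOSE_RANGE_METERS = 4.5        # modulated light works best within roughly 4-5 m

def select_depth_mode(ambient_lux: float, estimated_range_m: float) -> str:
    """Pick a depth-sensing mode from ambient light and approximate object range."""
    if ambient_lux >= BRIGHT_LUX_THRESHOLD and estimated_range_m > CLOSE_RANGE_METERS:
        return "multiview_stereo"       # stereo analysis of the two imaging cameras
    return "modulated_light"            # modulated/structured light depth sensor

print(select_depth_mode(ambient_lux=1500.0, estimated_range_m=8.0))  # multiview_stereo
print(select_depth_mode(ambient_lux=120.0, estimated_range_m=2.0))   # modulated_light
```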
- Position/orientation information 122 and other sensor information 124 obtained by the electronic device 100 can be used to support any of a variety of location-based functionality, such as visual odometry or other SLAM functionality, augmented reality (AR) functionality, virtual reality (VR) functionality, and the like.
- the electronic device 100 can 3D map the local environment and then use this mapping to facilitate the user's navigation through the local environment, such as by displaying to the user a floor plan generated from the mapping information and an indicator of the user's current location relative to the floor plan as determined from the current relative position of the electronic device 100 .
- the relative position/orientation information 122 obtained by the electronic device 100 can be combined with supplemental information 126 to present an AR view of the local environment to a user via the display 104 of the electronic device 100 .
- This supplemental information 126 can include one or more AR databases locally stored at the electronic device 100 or remotely accessible by the electronic device 100 from a cloud service or other remote processing system via a wired or wireless network.
- the local environment is a museum of art
- the electronic device 100 is in communication with a remote server or other cloud service that maintains a database of information regarding the artwork currently on display in the museum (one example of supplemental information 126 ).
- This database identifies the corresponding artwork based on relative locations within the museum.
- the electronic device 100 can determine a relative orientation/position of the electronic device 100 as described above and herein, and from this relative orientation/position determine the identity of artwork 130 captured by the imaging cameras 106 , 108 based on the point of view of the imaging cameras 106 , 108 while in this orientation/position.
- the electronic device 100 can query the database to determine information regarding the artwork 130 , and provide a graphical representation of this information as a graphical overlay 132 associated with the depiction of the artwork 130 on the display 104 of the electronic device 100 .
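- As a hedged illustration of the museum example, the sketch below shows one plausible way a pose-keyed artwork lookup could drive the graphical overlay 132 ; the database schema, coordinates, and matching rule are invented for illustration and are not from the patent.

```python
# Hypothetical sketch: find the artwork the camera is most likely pointed at,
# given the device's position and heading within the museum floorplan.
import math

ARTWORK_DB = [
    {"pos": (12.0, 4.5), "title": "Water Lilies", "artist": "C. Monet", "year": 1906},
    {"pos": (20.3, 9.1), "title": "The Starry Night", "artist": "V. van Gogh", "year": 1889},
]

def lookup_artwork(device_pos, device_yaw_deg, max_dist=3.0, fov_deg=60.0):
    """Return the artwork entry within range and within the camera's field of view."""
    best = None
    for art in ARTWORK_DB:
        dx, dy = art["pos"][0] - device_pos[0], art["pos"][1] - device_pos[1]
        dist = math.hypot(dx, dy)
        bearing = math.degrees(math.atan2(dy, dx)) % 360
        off_axis = abs((bearing - device_yaw_deg + 180) % 360 - 180)
        if dist <= max_dist and off_axis <= fov_deg / 2:
            if best is None or dist < best[0]:
                best = (dist, art)
    return None if best is None else best[1]

art = lookup_artwork(device_pos=(11.0, 4.0), device_yaw_deg=25.0)
if art:
    overlay_text = f'{art["title"]} ({art["artist"]}, {art["year"]})'
```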
- the position/orientation of the electronic device 100 may be used to control the user's interactions with systems in the local environment.
- the user may speak the voice command “lower temperature” into the electronic device 100 while standing in the user's living room.
- the electronic device 100 sends its current position and the command “lower temperature” to a home automation system in communication with the electronic device 100 , and the home automation system, using the position information, lowers the temperature in the living room in response to the command.
- the user may tap the electronic device 100 against a lamp in the living room, and the home automation system may detect this motion, identify the lamp as being the closest object to the electronic device 100 , and thus turn on the lamp in response.
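- One possible realization of this position-aware command hand-off is sketched below; the room boundaries, endpoint URL, and payload format are purely illustrative assumptions rather than any actual home-automation API.

```python
# Hypothetical sketch: report the device's current position alongside a voice
# command so the home automation system can apply the command to the right room.
import json
import urllib.request

ROOM_BOUNDS = {
    "living_room": ((0.0, 0.0), (6.0, 5.0)),   # (min_xy, max_xy) in house coordinates
    "kitchen":     ((6.0, 0.0), (10.0, 5.0)),
}

def room_for_position(pos):
    for room, ((x0, y0), (x1, y1)) in ROOM_BOUNDS.items():
        if x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1:
            return room
    return None

def send_command(command: str, device_pos):
    # "home-automation.local" is a made-up endpoint for illustration only.
    payload = json.dumps({"command": command, "room": room_for_position(device_pos)})
    req = urllib.request.Request("http://home-automation.local/api/command",
                                 data=payload.encode(), method="POST",
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)  # the home automation system applies the command to that room

send_command("lower temperature", device_pos=(2.5, 3.0))
```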
- location-based services utilizing one or more of the position/orientation information 122 and other sensor information obtained by the various sensors of the electronic device 100 , in combination with supplemental information 126 , are described in detail below with reference to FIGS. 3-13 .
- FIG. 2 illustrates an example implementation of the electronic device 100 in accordance with at least one embodiment of the present disclosure.
- the electronic device 100 includes the imaging cameras 106 , 108 , at least one processor 202 (e.g., a central processing unit (CPU)), one or more non-transitory computer readable storage media, such as system memory 204 , flash memory 206 , or another storage device (e.g., optical or magnetic disc drive, solid state hard drive, etc.), a wireless network interface 208 (e.g., a wireless local area network (WLAN) interface, a cellular data network interface, etc.), a user interface (UI) 210 , a set 212 of non-image sensors, and a modulated light projector 214 interconnected via one or more busses 216 or other interconnects.
- the UI 210 includes, for example, the display 104 , a speaker or microphone 218 , a keypad or touchscreen 220 , as well as other input/output components to receive input from, or provide information to, a user.
- the set 212 of non-image sensors can include any of a variety and combination of non-image sensors used to provide non-image context or state of the electronic device 100 , such as a GPS receiver 222 , a gyroscope 224 , an accelerometer 226 , a magnetometer 228 , and an ambient light sensor 229 .
- the imaging camera 106 comprises a wide-angle imaging camera and the imaging camera 108 comprises a narrow-angle imaging camera.
- the imaging cameras 106 , 108 may be used to capture user-viewable imagery of the local environment, as well as to provide depth information using multiview analysis in certain conditions (e.g., high ambient light conditions). In other conditions (e.g., low ambient light conditions), depth information may be captured by a depth sensor 230 formed from a combination of the modulated light projector 214 and one or both of the imaging cameras 106 , 108 .
- the modulated light projector 214 projects modulated light patterns into the local environment, and uses one or both of imaging cameras 106 , 108 to capture reflections of the modulated light patterns as they reflect back from objects in the local environment.
- modulated light patterns can be either spatially-modulated light patterns or temporally-modulated light patterns.
- the projection of a modulated light pattern is referred to herein as a “modulated light flash” and the captured reflections of a modulated light flash are referred to herein as “depth images” or “depth imagery.”
- the depth sensor 230 then may calculate the depths of the objects, that is, the distances of the objects from the electronic device 100 , based on the analysis of the depth imagery.
- the one or more processors 202 execute one or more software applications 232 representing a set of executable instructions stored at a computer readable storage medium, such as the system memory 204 or flash memory 206 .
- the executable instructions of the software application 232 when executed, manipulate the one or more processors 202 to perform various software-based functionality to implement at least a portion of the techniques described herein, provide visual information via the display 104 , respond to user input via the UI 210 and the like.
- the software application 232 implements two modules: a position/orientation tracking module 234 ; and a location-based services module 236 .
- the position/orientation tracking module 234 operates to track the position/orientation of the electronic device 100 via SLAM or visual telemetry techniques using the depth sensor 230 and the imaging cameras 106 , 108 , via the set 212 of non-image sensors (e.g., GPS location tracking via the GPS receiver 222 and orientation tracking via the gyroscope 224 , accelerometer 226 , and magnetometer 228 ), or a combination thereof.
- the location-based services module 236 uses the position/orientation tracking information to provide one or more location-based services to the user via the electronic device 100 , examples of which are described in detail below. In providing these location-based services, the location-based services module 236 may make use of supplemental information 126 ( FIG. 1 ) obtained from one or more local or remote databases. To illustrate, the location-based services module 236 may maintain a local datastore of previous waypoints of the electronic device 100 and imagery captured at each waypoint in order to provide an AR overlay in the imagery (video) displayed at the display 104 .
- the electronic device 100 may access a cloud service or other remote processing system 240 via one or more networks 242 (e.g., a local area network, wide area network, or the internet) to obtain supplemental information for a provided service, such as AR overlay information, VR imagery, location information for other devices, and the like.
- FIG. 3 illustrates an example implementation of a location-based service to provide density mapping of electronic devices at a facility or other location in accordance with at least one embodiment.
- the locations of portable electronic devices such as smart phones, tablets, and smart watches, at a facility or other area are monitored by the devices themselves or by a remote system, and a density map is generated from the current locations of the portable electronic devices.
- the density map may take the form of individualized density map 302 , whereby the location of each monitored portable electronic device within a facility is separately indicated by a corresponding icon 304 within a floorplan 306 , map, or other spatial representation of the facility or area.
- the density map may take the form of non-individualized density map 308 , whereby an indication of a number of devices within a given area, or density of devices, is represented via shading 310 , coloring, or other visual indicia within the floorplan 306 or other spatial representation of the facility or area, and such that the location of any particular electronic device is not readily discerned from the non-individualized density map 308 .
- a filter may be implemented so that device densities below a certain threshold are omitted from inclusion in the density map.
- the individualized density map 302 may be utilized when the location of the portable electronic device is sufficiently granular.
- each participating portable electronic device within the facility may use a software application or operating system (OS) feature that tracks the position/orientation of the electronic device to within ten centimeters using SLAM, visual odometry, or readings from the GPS receiver 222 ( FIG. 2 ) and the gyroscope 224 ( FIG. 2 ), and that periodically reports the current position/orientation of the electronic device to a remote processing system via, e.g., a wireless connection. From this, the remote processing system may then collate the current positions/orientations of the reporting electronic devices and generate the density map 302 for the corresponding point in time.
- the non-individualized density map 308 may be utilized when there is insufficient location accuracy to accurately report the locations of the participating electronic devices, or in the event that privacy or security concerns preclude the identification of the specific location of any particular electronic device.
- the facility may provide wireless network access to the electronic devices via a set of access points distributed through the facility in known locations and with known or estimated coverage areas, and a cloud service may make use of the number or frequency (or “density”) of queries to an access point by the electronic devices as an indicator of the number, or density, of electronic devices within the coverage area of that access point. From this information, the cloud service can then construct the non-individualized density map 308 with each coverage area colored or shaded according to its current device query “density.”
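- The sketch below outlines, under illustrative assumptions about data shapes and thresholds, how both density-map forms might be assembled: an individualized map from reported device positions, and a non-individualized map from per-access-point query counts with a threshold filter.

```python
# Hypothetical sketch of the two density-map forms described above; coverage
# areas, bucket thresholds, and data shapes are assumptions for illustration.
from collections import Counter

def individualized_map(device_positions):
    """Individualized map: one icon per reported (x, y) device position."""
    return [{"icon": "device", "x": x, "y": y} for (x, y) in device_positions]

def non_individualized_map(ap_query_counts, min_count=5):
    """Non-individualized map: shade each access-point coverage area by its query
    count, omitting areas below a privacy/noise threshold."""
    shades = {}
    for ap_id, count in Counter(ap_query_counts).items():
        if count >= min_count:
            shades[ap_id] = "dark" if count > 50 else "medium" if count > 20 else "light"
    return shades

print(non_individualized_map({"ap_east_wing": 64, "ap_lobby": 12, "ap_annex": 2}))
# {'ap_east_wing': 'dark', 'ap_lobby': 'light'}  -- the sparse annex is filtered out
```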
- the density map illustrates the number, or density, of electronic devices within areas of a facility or other location
- the density map implies the popularity of each particular area among the device users, and thus can serve as a powerful analytical tool.
- the density map can be used to support retail analytics, such as determining where to post sales associates on the floor of a retail shop at any given time based on the density and distribution of customers depicted in the density map.
- Security operations likewise can identify where to focus security personnel within a facility based on the density map.
- a building manager can identify where to most optimally activate heating-ventilation-air-conditioning (HVAC) equipment within a building based on a current density map.
- changes in the density map over time, such as density flow within a given day, also may be analyzed to more fully optimize systems and services.
- a museum may observe the changes in the density map over the day to identify the displayed objects that are of highest interest to the museum patrons and the order in which the museum patrons visit the objects, and then more effectively position the objects within the museum facility based on this analysis.
- the density map is generated by a remote processing system and maintained for use by entities affiliated with the operator of the remote processing system, such as in the HVAC management example above.
- the density map may be transmitted to the electronic devices in the facility for display to the user.
- the density map may be used to highlight the current concentration of visitors to a music festival, and this density map may be provided to the electronic devices of the visitors so that they can use the density map to identify the most popular venues or other points of interest and plan their visit accordingly.
- the locations of the individual electronic devices within the facility can also be linked to other sensors available on the electronic devices so as to obtain more information about some aspect of the facility.
- the tight timing synchronization afforded by GPS signaling can be leveraged by a cloud service to direct a number of electronic devices within a specified area of the facility to concurrently or sequentially capture imagery of that area from their different perspectives. This imagery can then be stitched together by the cloud service to provide, for example, a 360-degree 3D view of an object in the specified area, or a “bullet time” rendition of an object or event assembled from the sequence of images captured from the different perspectives of the participating electronic devices.
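- A minimal sketch of such GPS-time-synchronized capture follows; the command format, camera interface, and stitching step are assumptions made for illustration and do not reflect any particular implementation.

```python
# Hypothetical sketch: a cloud service broadcasts a shared capture timestamp;
# each device captures a frame at that (GPS-synchronized) time, and the service
# orders the submitted frames by device bearing before stitching.
import time

CAPTURE_COMMAND = {"capture_at_unix_time": 1735689600.0, "target": "courtyard_statue"}

def capture_at(command, camera, clock=time.time):
    """Wait until the shared timestamp, then capture one frame.
    `camera` is a hypothetical object exposing bearing() and grab_frame()."""
    delay = command["capture_at_unix_time"] - clock()
    if delay > 0:
        time.sleep(delay)
    return {"device_bearing_deg": camera.bearing(), "frame": camera.grab_frame()}

def order_for_stitching(submissions):
    """Order frames by each device's bearing so adjacent perspectives are adjacent,
    as a first step toward a 360-degree or "bullet time" rendition."""
    return sorted(submissions, key=lambda s: s["device_bearing_deg"])
```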
- FIGS. 4 and 5 illustrate example implementations of a location-based service that controls interactions between two or more electronic devices in an augmented reality setting based on the physical orientation/position of the two or more electronic devices relative to each other in accordance with at least one embodiment.
- the electronic device 100 can provide AR functionality whereby video or other imagery captured by the electronic device 100 is displayed at the display 104 in combination with an AR overlay that includes additional information or graphical objects. Often this AR functionality extends over multiple electronic devices (e.g., through the use of a cloud service as an intermediary) such that interactions between the electronic devices are reflected in some manner in the AR overlays of the electronic devices.
- the electronic devices can use their current position/orientation information to determine their physical position/orientation relative to each other, and this physical position/orientation can control or impact how the AR interactions are depicted within the AR overlays.
- two electronic devices 402 , 404 are cooperating to provide a shared AR functionality whereby the AR overlay of each electronic device coordinates with respect to the local environment shared by both electronic devices 402 , 404 .
- This AR overlay includes a virtual object 406 , such as a graphical representation of a disc.
- the electronic devices 402 , 404 communicate their respective positions/orientations, and from this the AR functionality determines that the electronic devices 402 , 404 are physically oriented to be side-by-side with the electronic device 402 to the left of the electronic device 404 .
- the virtual object 406 is located in the AR overlay of the electronic device 402 , and the user uses a touchscreen to effect a left-to-right swiping motion 407 to initiate movement of the virtual object 406 to the right. Because the AR functionality has determined that the electronic device 404 is physically positioned to the right of the electronic device 402 , the virtual object 406 can be depicted as leaving the AR overlay of the electronic device 402 and entering the AR overlay of the electronic device 404 .
- a user of the electronic device 402 may be standing in a room with a television to the front of the electronic device 402 and a stereo system to the right of the electronic device 402 .
- the user may be listening to music using an audio application on the electronic device 402 , and desire to transfer the music playback to one of the television or stereo system.
- the user may make a forward swiping motion, in response to which the electronic device 402 determines that the television is oriented and positioned in front, and thus the electronic device 402 communicates with the television to arrange for a hand-off of the music playback to the television.
- If the user instead makes a rightward swiping motion, the electronic device 402 determines that the stereo system is oriented and positioned to the right, and thus the electronic device 402 communicates with the stereo system to arrange for a hand-off of the music playback to the stereo system.
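- The sketch below shows one way the swipe-to-device mapping described above could be computed from relative bearings between devices; the coordinate conventions and angular tolerance are illustrative assumptions.

```python
# Hypothetical sketch: map a swipe direction onto the nearby device whose
# relative bearing (from the swiping device's heading) best matches the swipe.
import math

def relative_bearing(own_pos, own_yaw_deg, other_pos):
    """Bearing to the other device, relative to the direction this device faces."""
    dx, dy = other_pos[0] - own_pos[0], other_pos[1] - own_pos[1]
    absolute = math.degrees(math.atan2(dx, dy)) % 360   # 0 deg = straight ahead (+y)
    return (absolute - own_yaw_deg) % 360

SWIPE_TO_BEARING = {"forward": 0.0, "right": 90.0, "backward": 180.0, "left": 270.0}

def pick_target(swipe, own_pos, own_yaw_deg, candidates, tolerance_deg=45.0):
    """Return the candidate device whose relative bearing best matches the swipe."""
    want = SWIPE_TO_BEARING[swipe]
    best = None
    for name, pos in candidates.items():
        off = abs((relative_bearing(own_pos, own_yaw_deg, pos) - want + 180) % 360 - 180)
        if off <= tolerance_deg and (best is None or off < best[0]):
            best = (off, name)
    return None if best is None else best[1]

target = pick_target("right", (0, 0), 0.0, {"television": (0, 3), "stereo": (3, 0)})
# target == "stereo", so playback would be handed off to the stereo system
```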
- FIG. 5 illustrates another example whereby the physical position/orientation between two electronic devices 502 , 504 is used to control how each electronic device views a virtual object 506 inserted into a local environment 508 observed by the electronic devices 502 , 504 .
- the user of the electronic device 502 interacts with the electronic device 502 to place the virtual object 506 as an AR overlay on the imagery of the local environment 508 captured by the imaging cameras of the electronic device 502 in its particular position/orientation.
- the AR software of the two electronic devices is in communication, so in response to the generation and placement of the virtual object 506 in the virtualized representation of the local environment 508 , the AR software of the electronic device 504 determines how its physical position/orientation relative to the virtual object 506 differs from that of the electronic device 502 . It then prepares a representation of the virtual object 506 in the AR overlay of the electronic device 504 that reflects this difference, so as to accurately depict how the virtual object 506 would appear given the particular position/orientation of the electronic device 504 .
- the adaptation of the AR display of the virtual object 506 in another electronic device based on physical position/orientation can be used for either concurrent viewing of the virtual object, or for time-shifted viewing.
- both the electronic device 502 and the electronic device 504 could be concurrently viewing the local environment 508 , and thus the generation, modification, or other manipulation of the virtual object 506 by one of the electronic devices is soon reflected in the AR overlay of the other.
- Either user may then interact with the virtual object 506 via the corresponding electronic device, and the manipulation of the virtual object 506 would then be reflected in the depiction of the virtual object 506 in the AR overlay of the other electronic device.
- the user of the electronic device 502 could place the virtual object into its indicated position at one time, and then when the user of the electronic device 504 comes into the local environment 508 at another time, the virtual object 506 is then made viewable in the AR overlay of the electronic device 504 in a manner appropriate to the point of view of the object considering the current position/orientation of the electronic device 504 .
- the persistence of the virtual object 506 may be maintained by prior storage of data representing the virtual object 506 and its current position/orientation at the electronic devices 502 , 504 , by using a cloud service to coordinate the presentation of the virtual object 506 with the proper perspective view at different times and different locations/orientations and for different devices, or a combination thereof.
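- As an illustration of perspective-correct presentation of a shared virtual object, the following sketch projects a world-space object into each device's image using a simplified yaw-only pinhole camera model; the camera parameters and coordinates are assumptions for illustration.

```python
# Hypothetical sketch: re-project a shared virtual object for each device's pose,
# so device 504 sees the object placed by device 502 from its own perspective.
import numpy as np

def world_to_screen(obj_world, cam_pos, cam_yaw_deg, focal_px=800, cx=640, cy=360):
    """Project a world-space point into a device's image, given its position and yaw."""
    yaw = np.radians(cam_yaw_deg)
    rel = np.array(obj_world, dtype=float) - np.array(cam_pos, dtype=float)
    # Rotate the world point into the camera frame (yaw-only for brevity).
    rot = np.array([[ np.cos(yaw), 0, -np.sin(yaw)],
                    [ 0,           1,  0          ],
                    [ np.sin(yaw), 0,  np.cos(yaw)]])
    x, y, z = rot @ rel
    if z <= 0:
        return None                      # object is behind this device's camera
    return (cx + focal_px * x / z, cy - focal_px * y / z)

obj = (1.0, 0.5, 4.0)                    # virtual object placed by device 502
print(world_to_screen(obj, cam_pos=(0, 0, 0), cam_yaw_deg=0))    # device 502's view
print(world_to_screen(obj, cam_pos=(2, 0, 1), cam_yaw_deg=-20))  # device 504's view
```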
- FIG. 6 illustrates an example implementation of a location-based service that facilitates the transformation of a local environment as viewed through the imaging cameras and display of an electronic device in accordance with at least one embodiment.
- An electronic device 600 can utilize its imaging cameras to capture imagery of the local environment 602 as the user moves through it, and provide an AR overlay 604 over this imagery so as to transform the local environment, as depicted in the display 104 of the electronic device 600 , into an alternate reality. In this alternate reality, depictions of physical objects in the local environment 602 within the captured and displayed imagery are “replaced” with corresponding objects in accordance with the theme of the transformation.
- the AR overlay 604 may implement a “jungle” theme whereby physical objects in the captured imagery are replaced with jungle-themed objects in the display of the imagery at the electronic device 600 .
- a couch 612 and a lamp 614 captured in the imagery of the local environment 602 could be replaced with imagery of a boulder 622 and a palm tree 624 , respectively, in the AR overlay 604 .
- walls in the local environment can be replaced with depictions of rows of trees or rows of bamboo, patches of grass can be replaced with depictions of quicksand, etc.
- this AR overlay 604 is intended as a 3D AR overlay such that as the user moves the electronic device 600 around the local environment 602 , the view of the physical objects changes in the captured imagery, and thus the electronic device 600 changes the view of the replacement virtual objects to match.
- the virtual objects may be implemented as 3D graphical representations such that as the electronic device 600 moves relative to the physical object represented by a virtual object in the AR overlay 604 , the view of the virtual object presented in the AR overlay 604 changes in response to the change in perspective of the electronic device 600 .
- the electronic device 600 may determine its current position/orientation, and from this determine how the virtual objects would appear in the current scene depicted by the electronic device 600 , and format the AR overlay 604 accordingly.
- the electronic device 600 first identifies physical objects in the captured imagery that are candidates for virtual object replacement.
- this process can include using object detection algorithms to detect objects within the images, as well as estimation of the dimensions of the objects using, for example, depth information from a depth sensor, or based on comparison of the size of the object in the imagery to some known scale or calibration tool.
- the canonical orientation or features of the object also may be identified to ensure more precise geometric matching between the appearance of the physical object and its replacement virtual object, such that the proxy virtual object substantially matches the geometric constraints of the physical object.
- the electronic device 600 may determine that the boulder 622 is a suitable proxy, and the depiction of the boulder 622 may be formatted such that the canonical orientation of the boulder 622 follows the corresponding canonical orientation of the couch 612 .
- the electronic device 600 may choose the palm tree 624 as a suitable proxy, with the depiction of the palm tree 624 scaled, oriented, and otherwise formatted accordingly.
- Other examples can include, for example, replacing walls with rows of bamboo or other trees, chairs with ruminants, curtains with vines, and the like. In this manner, the local environment 602 can be transformed into a jungle-themed alternate reality while maintaining the geometries of the local environment in the alternate reality, albeit in a different visual form.
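- The sketch below illustrates, under invented catalog dimensions and a simple aspect-ratio score, how a themed proxy object might be chosen and scaled to match a detected physical object's geometry; the catalog and scoring rule are assumptions, not the patent's method.

```python
# Hypothetical sketch: pick the themed asset whose shape best matches a detected
# physical object, then scale it so its height matches the physical object.
JUNGLE_CATALOG = {
    # nominal width x depth x height (meters) for each replacement asset
    "boulder":   (2.0, 1.0, 0.9),
    "palm_tree": (0.5, 0.5, 3.0),
    "bamboo":    (0.2, 0.2, 2.5),
}

def aspect(dims):
    w, d, h = dims
    return (w / h, d / h)

def choose_proxy(detected_dims):
    """Pick the catalog asset whose aspect ratios best match the detected object."""
    dw, dd = aspect(detected_dims)
    def score(asset_dims):
        aw, ad = aspect(asset_dims)
        return abs(aw - dw) + abs(ad - dd)
    name = min(JUNGLE_CATALOG, key=lambda k: score(JUNGLE_CATALOG[k]))
    scale = detected_dims[2] / JUNGLE_CATALOG[name][2]   # match the object's height
    return name, scale

print(choose_proxy((2.2, 0.9, 0.8)))   # couch-like dims -> ("boulder", ~0.89)
print(choose_proxy((0.4, 0.4, 1.6)))   # lamp-like dims  -> ("palm_tree", ~0.53)
```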
- FIG. 7 illustrates an example implementation of a location-based service that facilitates user navigation through a local environment through an AR overlay on captured imagery of the local environment in accordance with at least one embodiment.
- a user of a smartphone, tablet computer, or other electronic device 700 may specify a sought-after object or intended destination.
- the electronic device 700 may query a cloud service or a local mapping database to determine the position or location of the object or destination. From this, the electronic device 700 may determine a route to the object or destination, and then provide navigational assistance to the user in the form of graphical navigational aids displayed as an AR overlay on the displayed captured imagery of the local environment.
- a user of the electronic device 700 may program in a destination and start out walking along a path 704 .
- the user may hold up the electronic device 700 such that the fork 706 is displayed at the display 104 of the electronic device 700 .
- the electronic device 700 may determine its current position and the path between its current position and the intended destination, and then provide navigational assistance to the user by providing an AR overlay 708 that displays a right-bearing arrow 710 identifying the right branch of the fork 706 as the correct choice to reach the destination.
- the AR overlay 708 further can include additional information about the path or the trip thus far, such as the user's current speed, the distance to the destination along the current path, and the like.
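- One plausible way to pick the navigational arrow shown in the AR overlay 708 is sketched below; the waypoint coordinates and angular tolerance are illustrative assumptions.

```python
# Hypothetical sketch: compare the bearing to the next waypoint against the
# device's current heading and choose a left, right, or straight-ahead arrow.
import math

def bearing_deg(frm, to):
    dx, dy = to[0] - frm[0], to[1] - frm[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def arrow_for(current_pos, heading_deg, next_waypoint, straight_tol=20.0):
    """Return 'left', 'right', or 'straight' for the AR navigation arrow."""
    off = (bearing_deg(current_pos, next_waypoint) - heading_deg + 180) % 360 - 180
    if abs(off) <= straight_tol:
        return "straight"
    return "right" if off > 0 else "left"

# At the fork, the next waypoint lies up the right-hand branch:
print(arrow_for(current_pos=(0, 0), heading_deg=0.0, next_waypoint=(12, 30)))  # right
```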
- this navigational assistance feature can be used to facilitate user navigation in a retail setting.
- a user may enter a store intending to purchase an item, but the user is unaware of where the item is stocked within the aisles and shelves of the store.
- the store thus may maintain a database storing information regarding the layout of the store and the location of each item stocked by the store within this layout.
- the electronic device 700 may access this database from a cloud service to pinpoint the location of the item within the store, and given the current position/orientation of the electronic device 700 , determine a path to the location of the item and provide navigational assistance through navigational cues displayed in an AR overlay at the electronic device 700 .
- Such navigational assistance can include, for example, displaying directional cues (e.g., which direction to head, arrows pinpointing the position of the item on a shelf), aisle/shelf numbers, one or more images of the item or its packaging to facilitate identification of the item on the shelf, and the like.
- the AR overlay can provide navigation assistance in a form akin to “X-ray vision” whereby the AR overlay can depict the item in its location through one or more aisles or other barriers between the item in its location and the electronic device 700 .
- the database of the store supplies a three-dimensional mapping of the store interior so that the electronic device 700 can accurately render the intervening shelves and other barriers in transparent or semi-transparent form consistent with the view perspective of the shelves and other barriers to the electronic device 700 in its current position/orientation.
- the store navigational feature can be used in conjunction with retail analytics or advertisement functionality to provide additional information to the user via the AR overlay.
- a user may enter a store to obtain mint-flavored mouthwash, and a cloud service may use this information to determine whether any mint-flavored mouthwash is in the current inventory of the store, and if not, determine that the citrus-flavored mouthwash that is in stock is a viable alternative and thus provide navigational assistance to the location of the citrus-flavored mouthwash instead.
- the AR overlay may be used to display any coupons, discounts, or deals for the item sought, to display targeted advertising based on the sought-after item, or to provide reviews from one or more sources to help the user choose between one or more brands of the item in stock at the store.
- FIG. 8A illustrates an example implementation of a location-based service that facilitates object measurement in accordance with at least one embodiment.
- the ability of an electronic device 800 to accurately pinpoint its relative position using one or a combination of position-determining techniques such as visual odometry, GPS, inertial navigation, and the like, is leveraged to enable accurate measurement of one or more dimensions of a physical object. From these dimensions, other physical measurements of the physical object may be determined, such as an area of a feature of the object, a volume of the object, an angle of a plane or edge of the object relative to gravity or another edge or plane, and the like.
- the one or more dimensions of a physical object are determined by determining the distance between the position of the electronic device 800 at one end of the physical object and the position of the electronic device 800 at the other end of the physical object along a measurement axis.
- the electronic device 800 can be used to measure the diagonal dimension 802 of the top of a table 804 .
- the user positions the electronic device 800 at corner 806 and then signals the electronic device 800 to mark its current position 807 by, for example, pressing a button on the electronic device 800 or by gently tapping a corner of the electronic device 800 against the corner 806 .
- the user then positions the electronic device at the opposite corner 808 and again signals the electronic device 800 to mark its current position 809 .
- the electronic device 800 then may calculate the dimension 802 (that is, the distance between the two corners 806 , 808 ) as the distance 810 between the positions 807 and 809 of the electronic device 800 . This process may be repeated for each additional dimension of interest.
- the electronic device 800 may need to compensate for any deviations caused by inconsistent placement or orientation of the electronic device 800 by the user.
- the user may have held the electronic device 800 in a horizontal position when tapping the corner 806 while holding the electronic device 800 in a vertical position when tapping the corner 808 .
- the electronic device 800 also may note its orientation at each of positions 807 , 809 and compensate for any inconsistent orientations between the positions by adjusting distance 810 by an offset or scaling factor based on the orientation deviations.
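- The following sketch illustrates this two-position measurement with a simple orientation-offset correction; the device geometry constant and pose format are assumptions made for illustration, not part of the patent.

```python
# Hypothetical sketch: compute a dimension as the distance between the device's
# tracked positions at two taps, correcting for how the device was held.
import math

DEVICE_LENGTH_M = 0.15   # assumed offset from the device's tracked origin to the
                         # housing corner tapped against the object

def corner_point(tracked_pos, yaw_deg, pitch_deg):
    """Approximate the housing corner that touched the object, from the tracked pose."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    dx = DEVICE_LENGTH_M * math.cos(pitch) * math.sin(yaw)
    dy = DEVICE_LENGTH_M * math.cos(pitch) * math.cos(yaw)
    dz = DEVICE_LENGTH_M * math.sin(pitch)
    return (tracked_pos[0] + dx, tracked_pos[1] + dy, tracked_pos[2] + dz)

def measured_dimension(pose_a, pose_b):
    return math.dist(corner_point(*pose_a), corner_point(*pose_b))

# pose = (tracked (x, y, z) in meters, yaw, pitch); the values below are made up
dim = measured_dimension(((0.0, 0.0, 0.75), 45.0, 0.0),
                         ((1.2, 0.9, 0.75), 45.0, 90.0))
print(f"{dim:.2f} m")
```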
- This dimension-measuring technique may be combined with object-recognition functionality so as to particularly identify the physical object, and then query a database to obtain additional information of the identified object, including, for example, additional dimensional data for the physical object.
- the user may enter a large furniture store and spot a couch. Interested in the couch but concerned that the couch may not fit into the user's living room, the user can use the dimension-measurement technique described above to determine a major dimension of the couch.
- the user also then uses the electronic device 800 to capture one or more images of the couch, in addition to capturing depth information for the images using a depth sensor as described above.
- the dimension measurement, images, and depth information are then transmitted to a cloud service that analyzes the images to identify the particular model of the couch.
- the measured major dimension of the couch is used by the cloud service to limit its search to only those models of couches that fit that measured dimension.
- Because the depth information is substantially independent of lighting and background, the geometry information revealed by the depth information can be used to further constrain the scope of the search, thereby facilitating a more efficient object recognition search.
- the cloud service can access its databases to determine additional information on the model of couch and forward this information to the electronic device 800 for display to the user.
- This additional information can include, for example, additional dimensional information, warranty information, product description, user reviews, prices offered by various retailers, advertisements for associated goods and services, and the like.
- one or more dimensions of the physical object are determined based on 3D mapping of the physical object through depth information from a depth sensor or other information, and the user's interaction with imagery of the physical object on the display of the electronic device 800 is then used to provide dimensional information.
- a user captures imagery of the table 804 , which is displayed at the display of the electronic device 800 .
- the electronic device 800 determines a 3D mapping of the salient edges of the imagery containing the table 804 , such as by using depth information obtained from a depth sensor of electronic device 800 or by obtaining a CAD model of the table 804 from a previously-defined 3D mapping of the table 804 (and identified using, for example, the position and orientation of the electronic device 800 at the time of capture of the imagery).
- the user may then interact with this imagery to define two or more points of interest on the depicted imagery of the table 804 , such as by tapping the touchscreen of the electronic device 800 at points 820 and 822 , which coincide with the two top corners of the front face of the table 804 , or by moving a mouse or other cursor to these two points and marking them as points of interest.
- the electronic device 800 or cloud service may utilize a 3D model of the table 804 to render a 2D view of the table 804 from some perspective (e.g., a top-down view or a front view) of the table 804 , and the user may select the points of interest from this 2D view.
- the electronic device 800 or associated cloud service may then calculate one or more dimensional measurements from the points of interest using a 3D model of the table 804 determined from depth information or a priori modeling. For example, with the points 820 and 822 selected, the electronic device 800 may measure the distance 824 between these two points based on a CAD model or other 3D mapping of the table 804 , and thus determine the width of the table 804 because the two points 820 and 822 pinpoint the opposite corners of the table 804 . In the event that three points are selected, the electronic device 800 or cloud service can calculate the distances between the three points, the angles between the lines connecting the three points, and the area defined by the three points.
- In the event that four or more points are selected, the electronic device 800 also can determine the volume of the space defined by the four or more points.
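- The sketch below shows, with illustrative values, how distance, angle, area, and volume could be computed from points of interest selected on a 3D mapping of an object; it is a minimal sketch, not the patent's own computation.

```python
# Hypothetical sketch: point-based measurements (distance, angle, area, volume)
# computed from 3D points picked off a depth map or CAD model.
import numpy as np

def distance(p1, p2):
    return float(np.linalg.norm(np.subtract(p2, p1)))

def angle_deg(vertex, p1, p2):
    """Angle at 'vertex' between the lines vertex->p1 and vertex->p2."""
    v1, v2 = np.subtract(p1, vertex), np.subtract(p2, vertex)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

def triangle_area(p1, p2, p3):
    return 0.5 * float(np.linalg.norm(np.cross(np.subtract(p2, p1), np.subtract(p3, p1))))

def tetrahedron_volume(p1, p2, p3, p4):
    m = np.array([np.subtract(p2, p1), np.subtract(p3, p1), np.subtract(p4, p1)])
    return abs(float(np.linalg.det(m))) / 6.0

# Points 820 and 822: opposite top corners of the table's front face (made-up values)
print(distance((0.0, 0.0, 0.76), (1.1, 0.0, 0.76)))   # table width in meters
```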
- These dimensional evaluation techniques thus can be used to analyze the relationships between objects (e.g., the angle of intersection of two walls, whether a picture or table top is level) or envision an imaged object in another space. For example, a user could capture an image of a doorway leading into the user's bedroom to determine its dimensions, and then when at a furniture store, the user can take a picture of a dresser cabinet to determine the cabinet's dimensions, and from this determine whether the cabinet would fit through the doorway if the user chose to purchase the cabinet.
- the image-based dimensional measurement technique also can be automated, rather than responding to user input.
- the electronic device 800 may perform 3D mapping analysis to identify the salient dimensions of the physical objects captured in the imagery. This identification may be performed using 3D modeling based on depth information, or by identifying the position and orientation of the electronic device 800 and then accessing a 3D model database to identify the objects present in the device's view based on the position/orientation information.
- the electronic device 800 may provide information on one or more of the dimensions, volume, area, pertinent angles, and the like, using, for example, an AR overlay.
- FIGS. 9-11 illustrate example implementations of location-based services that facilitate display of an AR overlay on captured imagery of a local environment such that graphical content of the AR overlay is integrated with the physical objects of the local environment in the captured imagery based on the pose (position/orientation) of the electronic device relative to the objects in accordance with at least one embodiment.
- With the pose of the electronic device in the 3D space representing the local environment, the electronic device can present or format 3D graphical content or metadata in the same 3D space presented by the displayed imagery so that the graphical content is geometrically consistent or compliant with the geometries of the physical objects in the local environment.
- the location-based services may provide a virtual pinball game whereby a pinball AR overlay is displayed over video imagery of the side of a building captured by the electronic device.
- the portable electronic device can determine the dimensions and geometry of the side of the building, and from this information and from its current pose, the electronic device can format an AR overlay such that the pinball game elements are viewed from a perspective that matches the dimensions and geometry of the side of the building.
- the pinball AR overlay is thus formatted to follow the contours of the building using the perspective represented by the pose of the portable electronic device, and elements of the building facade, such as windows or cornices, may be incorporated as barriers or gaming features in the pinball game.
- depth information available from the depth sensor implemented at the electronic device can be used to incorporate user actions into the material presented by the AR overlay.
- a user may use the electronic device to capture imagery of a neighborhood park, and with this imagery and a mapping of the area obtained either from the depth sensor or from separate 3D mapping information for the park (e.g., as provided by a cloud service), the electronic device can implement an AR overlay on the live feed of video imagery that depicts a virtual ball present in the local environment. The depth sensor can detect the user kicking at this virtual ball (by detecting the user's foot as it enters the field of view of the depth sensor) and, from the velocity and direction of the user's foot, the electronic device can simulate the virtual ball traveling through the imagery of the local environment as though it had in fact been kicked.
- Because the electronic device is aware of its position and orientation with respect to the local environment, changes in either the position or the orientation can be accurately reflected by changing the perspective of the AR overlay to match the changed position/orientation.
- If the virtual ball in the example above is travelling across a grass lawn, the user may run behind the virtual ball, and the AR overlay updates the view of the virtual ball to reflect the changing perspective.
- the electronic device also can use its knowledge of the geometries of physical objects in the local environment to format the AR overlay such that virtual objects realistically interact with physical objects.
- the AR overlay can determine the trajectory of the virtual ball within the local environment, detect the presence of a wall along this trajectory, and thus depict the virtual ball as bouncing off of the physical wall in a manner that emulates the behavior of an actual physical ball traveling along the same trajectory.
- the electronic device may detect two different physical objects at different distances from the electronic device and when the trajectory of the virtual ball is plotted to take the virtual ball between these two physical objects, the AR overlay can be formatted to depict the virtual ball as disappearing behind the closer physical object while remaining in front of the farther physical object.
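- The depth-based occlusion decision described above can be sketched as follows; the array shapes and depth values are illustrative assumptions, and the compositing is reduced to a per-pixel depth comparison for clarity.

```python
# Hypothetical sketch: hide the virtual ball wherever the physical scene is
# closer to the camera than the ball, and draw it where the ball is closer.
import numpy as np

def composite_ball(frame_rgb, scene_depth_m, ball_mask, ball_depth_m, ball_rgb):
    """Draw the ball only on pixels where it is nearer than the physical scene."""
    visible = ball_mask & (ball_depth_m < scene_depth_m)
    out = frame_rgb.copy()
    out[visible] = ball_rgb
    return out

h, w = 4, 6
frame = np.zeros((h, w, 3), dtype=np.uint8)
scene_depth = np.full((h, w), 5.0)       # far background
scene_depth[:, :2] = 2.0                 # a close physical object on the left
mask = np.zeros((h, w), dtype=bool)
mask[1:3, 0:4] = True                    # footprint of the ball in the image
composited = composite_ball(frame, scene_depth, mask, ball_depth_m=3.0, ball_rgb=(255, 0, 0))
# The ball appears in columns 2-3 (in front of the far background) but is occluded
# in columns 0-1, where the closer physical object lies between it and the camera.
```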
- FIG. 9 illustrates an example of this AR overlay technique in the context of a virtual pet game provided by an electronic device 900 .
- the electronic device 900 provides an AR overlay that depicts a pseudo-autonomous virtual pet 902 in a local environment 904 (e.g., the pet “owner's” house).
- the AR overlay can be formatted to provide the appearance of the virtual pet 902 interacting with the local environment 904 and the user of the electronic device 900 .
- the virtual pet 902 can be depicted as scampering across the floor, hiding behind objects, traveling in and out of rooms, and interacting with the user.
- the virtual pet 902 may be depicted as skittish, such that if the user gets “too close” to the virtual pet 902 , the virtual pet 902 is depicted as heading for a closet to hide.
- this virtual pet AR overlay can utilize the geometries of physical objects within the local environment 904 to render the physical behavior of the virtual pet 902 more realistic, such as by realistically depicting a trajectory of the virtual pet 902 so that it disappears behind physical objects that are closer to the electronic device.
- external data may be utilized to modify or control the depicted behavior of the virtual pet 902 .
- the electronic device 900 can determine its current longitude and latitude and the current time of day, and from this information determine the position of the sun in the sky by accessing the appropriate database from a cloud service. With the position of the sun so determined, and with knowledge of the placement and orientation of windows in the local environment, the electronic device can determine the angle and presence of a sunbeam anticipated to be streaming in through a window in the local environment 904 and depict a virtual cat (an example of the virtual pet 902 ) as moving to and then napping in the location of the sunbeam.
- the virtual pet AR overlay utilizes the current pose of the electronic device 900 to inform the content displayed in the AR overlay.
- the electronic device 900 may simulate the actions of the virtual pet 902 such that it appears to hide behind a physical object 906 in the local environment 904 after being “scared” by the user. Accordingly, when the user holds the electronic device 900 in position A depicted in FIG. 9 , the AR overlay omits presentation of the virtual pet 902 so as to give the appearance of the virtual pet 902 hiding behind the physical object 906 . However, when the user moves the electronic device 900 to position B depicted in FIG. 9 , the new pose of the electronic device 900 permits viewing behind the physical object 906 , and thus the AR overlay depicts the virtual pet 902 behind the physical object 906 .
- FIG. 10 illustrates an example of the AR overlay technique in the context of instructional content provided by an electronic device 1000 .
- the AR overlay can provide instructional information intended to guide a user's interaction with a physical object in the local environment.
- the electronic device 1000 captures and displays imagery of the physical object.
- the electronic device 1000 determines its position and orientation relative to the physical object (as currently depicted in the imagery), and from this formats an AR overlay so that the instructional information presented in the AR overlay is consistent with the geometry of the physical object as presented in the electronic device's view of the physical object.
- a circuit breaker has tripped in the user's house, and the user thus needs to throw the appropriate circuit breaker to reset the circuit.
- an AR overlay can provide the “X-ray” feature described above so as to depict the circuit breaker panel 1002 through one or more walls or other obstacles in the house.
- the user travels to the circuit breaker panel 1002 and positions the electronic device 1000 to face the circuit breaker panel 1002 having a plurality of circuit breakers 1004 for various circuits within the user's abode.
- the electronic device 1000 can scan a QR code on the circuit breaker panel 1002 to determine the model of the circuit breaker panel, and with the model so identified, access a database compiled by the builder of the home that identifies the pairings between circuit breaker and circuit for the given model.
- the electronic device 1000 then can provide an AR overlay 1006 that provides graphical icons identifying the circuit for each circuit breaker 1004 present in the displayed imagery of the circuit breaker panel 1002 .
- the electronic device 1000 can access the wiring information from the CAD model of the house to determine the position of the wiring of various circuits, and then provide an AR overlay depicting where the wiring is located in relation to the walls, ceiling, floor, and other spaces depicted in the captured imagery displayed by the electronic device 1000.
- the user can use the imaging camera of the electronic device 1000 to capture video of the user throwing a baseball.
- the electronic device 1000 may detect the geometry of the user as the user throws the baseball, using the depth sensor, and compare this to an idealized throwing model that is scaled to the user's particular physiology.
- the electronic device 1000 then may overlay this idealized throwing model on top of the playback of the video of the user throwing the baseball, thereby allowing the user to identify any deviations between the user's throwing mechanics and the ideal throwing mechanics.
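- A simple way to realize this comparison is to scale a reference joint trajectory to the user's limb length and report per-joint deviations; the joint names and reference values in this sketch are illustrative placeholders rather than data from the example above:

```python
import numpy as np

# Hypothetical reference: per-frame 3D joint positions of an idealized throw,
# normalized to a 1.0 m arm length.
REFERENCE_THROW = {
    "shoulder": np.array([[0.0, 1.5, 0.0], [0.0, 1.5, 0.0], [0.0, 1.5, 0.0]]),
    "elbow":    np.array([[0.3, 1.5, 0.0], [0.2, 1.6, 0.1], [0.0, 1.7, 0.3]]),
    "wrist":    np.array([[0.6, 1.5, 0.0], [0.5, 1.7, 0.2], [0.1, 1.9, 0.6]]),
}

def scale_reference(reference, user_arm_length, reference_arm_length=1.0):
    """Scale the idealized model to the user's physiology (here: arm length)."""
    s = user_arm_length / reference_arm_length
    return {joint: frames * s for joint, frames in reference.items()}

def throw_deviations(observed, reference):
    """Mean per-joint distance (metres) between the observed joint track,
    e.g. extracted from depth frames, and the scaled idealized track."""
    return {joint: float(np.mean(np.linalg.norm(observed[joint] - reference[joint], axis=1)))
            for joint in reference}

ideal = scale_reference(REFERENCE_THROW, user_arm_length=0.95)
# Observed track (same shape as the reference), here synthesized for the demo.
observed = {joint: frames * 0.97 + 0.02 for joint, frames in ideal.items()}
for joint, err in throw_deviations(observed, ideal).items():
    print(f"{joint}: mean deviation {err:.3f} m")
```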
- a user may need to clear a paper jam in a photocopier.
- the user can focus the imaging camera of the electronic device 1000 on the photocopier, in response to which the electronic device 1000 uses a QR code, bar code, or other visual indicia to identify the model of the photocopier.
- the electronic device 1000 then can access from a database the visual instructions for clearing a paper jam for that particular model, and then overlay graphical representations of these instructions over the captured imagery of the photocopier displayed at the electronic device 1000.
- the electronic device 1000 can determine its perspective view of the photocopier based on the current pose of the electronic device 1000 or based on depth information collected from the depth sensor, and then format the graphical representations so as to match the perspective view of the photocopier as displayed in the captured imagery.
- the AR overlay can locate a graphical icon identifying a toner release lever such that it is positioned over the location of the toner release lever in the displayed imagery of the photocopier.
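- Anchoring such an icon amounts to projecting the feature's 3D location, taken from the data retrieved for the identified model, through the device's current pose into screen coordinates. A minimal pinhole-projection sketch with assumed feature positions and intrinsics:

```python
import numpy as np

def make_pose(rotation, translation):
    """4x4 camera-from-world transform built from the tracked pose."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def project_feature(feature_world, cam_from_world, fx, fy, cx, cy):
    """Project a 3D feature location (e.g., the toner release lever from the
    identified model's data) into pixel coordinates of the displayed imagery."""
    p = cam_from_world @ np.append(feature_world, 1.0)
    if p[2] <= 0:
        return None
    return (fx * p[0] / p[2] + cx, fy * p[1] / p[2] + cy)

# Hypothetical feature locations for the identified copier model, expressed in
# metres in the world frame of the 3D mapping.
features = {"toner_release_lever": np.array([0.12, -0.30, 1.10]),
            "paper_tray_2":        np.array([0.00, -0.55, 1.10])}

cam_from_world = make_pose(np.eye(3), np.array([0.0, 0.4, 0.0]))   # current pose estimate
for name, pos in features.items():
    pix = project_feature(pos, cam_from_world, 525.0, 525.0, 320.0, 240.0)
    if pix is not None:
        print(f"draw icon '{name}' at pixel ({pix[0]:.0f}, {pix[1]:.0f})")
```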
- FIG. 11 illustrates an example of the AR overlay technique in the context of virtual pedometry provided by an electronic device 1100 .
- the electronic device 1100 can monitor and record a path of the user as the user traverses through a local environment 1102 .
- the user's prior path thus may be reflected in an AR overlay 1104 of the electronic device 1100 whereby the user's prior path is represented in displayed imagery of the local environment 1102 using some form of a graphical representation, such as barefoot icons 1106 that are oriented in the direction of travel at each point.
- the user may identify where the user has and has not been within the local environment 1102 by viewing the local environment 1102 through the display 104 ( FIG. 1 ) of the electronic device 1100 with the AR overlay 1104 present.
- the paths of the users may be made available to each of the multiple electronic devices via a cloud service, and thus the AR overlay 1104 at one of the electronic devices 1100 may depict the prior paths or locations of some or all of the other users.
- This pedometry AR overlay further may incorporate sound effects to enhance the experience.
- the electronic device 1100 may use an accelerometer to detect the footfall pattern of the user, and as the user's foot strikes the ground (or is anticipated to strike the ground), the electronic device 1100 may play a footfall sound synchronized to this foot strike, and also may display the next footfall icon in the AR overlay 1104 to coincide with this foot strike.
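- A footfall can be picked out of the accelerometer stream as a peak in acceleration magnitude separated by a refractory interval, with each detected strike triggering the synchronized sound and the next footprint icon. The thresholds and callback names in this sketch are assumptions:

```python
import math

def detect_footfalls(samples, rate_hz=50.0, threshold=11.5, min_gap_s=0.35):
    """Return times (s) of footfalls from (ax, ay, az) accelerometer samples in
    m/s^2: a simple magnitude-peak detector with a refractory gap."""
    footfalls, last = [], -min_gap_s
    for i, (ax, ay, az) in enumerate(samples):
        t = i / rate_hz
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and (t - last) >= min_gap_s:
            footfalls.append(t)
            last = t
    return footfalls

def on_footfall(path_tracker, overlay, audio):
    """Illustrative hook: drop a footprint icon oriented along the direction of
    travel and play the synchronized footfall sound."""
    position = path_tracker.current_position()
    heading = path_tracker.current_heading()
    overlay.add_footprint_icon(position, heading)   # e.g., the next barefoot icon 1106
    audio.play("footfall.wav")

# Synthetic trace: mostly ~9.8 m/s^2 with spikes at each step.
trace = [(0.0, 0.0, 9.8)] * 200
for idx in (25, 55, 90, 130):
    trace[idx] = (0.0, 0.0, 14.0)
print(detect_footfalls(trace))   # -> [0.5, 1.1, 1.8, 2.6]
```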
- This functionality may be utilized as a game, such as an AR version of the light-cycle game played in the Tron movie.
- the user also may use this functionality to identify areas of the local environment 1102 that the user has already visited, or has not yet visited.
- the tracking of the path of the user can be used by the electronic device 1100 to measure non-linear distances, such as the length of a path walked by the user or a distance a user walks between two buildings.
- FIG. 12 illustrates an example implementation of a location-based service that facilitates user-initiated mapping of previously unmapped spaces in accordance with at least one embodiment.
- the 3D mapping database for a facility or location may be incomplete, in that one or more areas of the facility or location have not yet been mapped and incorporated into the 3D database of the facility or location.
- the entity managing the 3D mapping database typically addresses this lack of information either by accepting that the area will remain unmapped or, if mapping the area is sufficiently important, by committing the time and expense to send a mapping crew to the facility or location specifically to complete the mapping.
- the AR functionality and the position/orientation detection functionality of the electronic device can be leveraged to induce users of the electronic device to enter the unmapped areas and capture imagery, 3D mapping information, or both, via their electronic devices while in the unmapped areas. This captured information then may be supplied back to the entity for incorporation into its 3D mapping database.
- the user is induced to provide this service through the form of a game presented to the user via an AR overlay, in which the game displays some form of graphical information that induces the user to move into the unmapped area and manipulate the electronic device into different positions and orientations while in the unmapped area so that various sensor information of the unmapped area, such as imagery, depth information, ambient light, temperature, and the like, can be obtained for the unmapped area from different perspectives and locations.
- FIG. 12 illustrates an example whereby a cloud service maintains a map 1202 of a facility, with the map 1202 indicating which areas of the facility have been mapped and which have not yet been mapped (with unmapped areas indicated by the hash-lined shading in the map 1202 ).
- the cloud service detects a position 1201 of an electronic device 1200 as being in proximity to an unmapped area (e.g., through GPS coordinates provided by the electronic device 1200), and thus the cloud service invites the user of the electronic device 1200 to participate in a game.
- the user is challenged to track a virtual gecko 1204 that is depicted in an AR overlay for the imagery of the local environment captured and displayed by the electronic device 1200 , whereby the virtual gecko 1204 is formatted so as to appear to travel over the walls, floors, and ceiling of the facility as viewed through the electronic device 1200 .
- the cloud service determines the present orientation and position of the electronic device 1200 and determines the direction the user needs to travel to enter an unmapped area, and then controls the behavior of the virtual gecko 1204 so that the virtual gecko 1204 is depicted as travelling in the direction of the unmapped area and such that the user is induced to “follow” the virtual gecko 1204 toward the unmapped area.
- the virtual gecko 1204 may be controlled to travel to yet-unmapped locations, with the user and electronic device 1200 following, until the cloud service has obtained sufficient imagery and other sensor data from the electronic device 1200 to 3D map the unmapped area.
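- The steering behavior can be approximated by keeping a grid of mapped and unmapped cells and repeatedly stepping the virtual gecko toward the nearest unmapped cell relative to the device's tracked position; the grid layout and step size below are illustrative assumptions:

```python
import math

def nearest_unmapped_cell(grid, from_cell):
    """grid[r][c] is True when cell (r, c) has already been 3D mapped."""
    best, best_d = None, float("inf")
    for r, row in enumerate(grid):
        for c, mapped in enumerate(row):
            if not mapped:
                d = math.hypot(r - from_cell[0], c - from_cell[1])
                if d < best_d:
                    best, best_d = (r, c), d
    return best

def step_gecko(gecko_pos, device_cell, grid, step=1.0):
    """Move the virtual gecko one step toward the nearest unmapped cell so the
    user is induced to follow it into that area."""
    target = nearest_unmapped_cell(grid, device_cell)
    if target is None:
        return gecko_pos                         # facility fully mapped: idle or wander
    dr, dc = target[0] - gecko_pos[0], target[1] - gecko_pos[1]
    dist = math.hypot(dr, dc) or 1.0
    return (gecko_pos[0] + step * dr / dist, gecko_pos[1] + step * dc / dist)

# 5x5 facility grid; the hash-marked (unmapped) corner cells are False.
grid = [[True] * 5 for _ in range(5)]
grid[4][4] = grid[4][3] = grid[3][4] = False
pos = (0.0, 0.0)
for _ in range(4):
    pos = step_gecko(pos, (0, 0), grid)
print(pos)   # the gecko has crawled toward the unmapped corner
```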
- any of a variety of techniques may be used to incentivize the user's participation in this game.
- the challenge of appearing on a scoring leaderboard may sufficiently incentivize the user, with the score based on, for example, the percentage of time the user is able to maintain the virtual gecko 1204 within the view of the electronic device.
- financial incentives may be employed, such as by proportionally reducing the user's data plan bill based on the score attained by the user or the number of minutes the user plays the game, by offering coupons or other payment, and the like.
- the entity maintaining the 3D database for the facility can efficiently and inexpensively complete the 3D mapping of the facility through the cooperation of users of electronic devices.
- the electronic device can provide location-based services that utilize virtual reality techniques. For these techniques, the position/orientation of the electronic device within a local environment may be equated to a corresponding position/orientation within a virtual environment or a remote environment, and thus the view of this other environment displayed at the electronic device may be controlled by the position/orientation of the electronic device within the local environment. Thus, the manner in which the user travels through a local environment and handles the electronic device in the local environment is reflected in the views of the remote/virtual environment displayed by the electronic device 100 to the viewer.
- FIG. 13 depicts an example VR-based service provided by an electronic device 1300 whereby a user can “walk” through a virtual/remote environment 1302 (as viewed through the electronic device 1300 ) by manipulating the position and orientation of the electronic device 1300 in a local environment 1304 , whereby the imagery of the virtual/remote environment 1302 changes to reflect a changing viewpoint caused by changes in position/orientation of the electronic device 1300 in the local environment 1304 .
- the virtual environment 1302 includes a box 1305 upon a table 1306 . The user may view the box 1305 and table 1306 from various angles in the virtual environment 1302 by moving around the local environment 1304 in a corresponding manner.
- the electronic device 1300 determines the view of the box 1305 and table 1306 that would be presented to the user in the corresponding position/orientation 1311 in the virtual/remote environment 1302 , and displays this view of the box 1305 and table 1306 in the manner depicted as “Viewpoint A” in FIG. 13 .
- the electronic device 1300 determines the view of the box 1305 and table 1306 that would be presented to the user in the corresponding position/orientation 1313 in the virtual/remote environment 1302 , and displays this view in the manner depicted as “Viewpoint B” in FIG. 13 .
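- The underlying mapping is a fixed rigid transform between the local frame and the virtual/remote frame, applied to each tracked pose to obtain the viewpoint to render; a minimal sketch, with the anchor transform chosen arbitrarily for illustration:

```python
import numpy as np

def pose_matrix(rotation, position):
    """Homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

def local_pose_to_virtual(local_pose, virtual_from_local):
    """Map a device pose tracked in the local environment to the corresponding
    viewpoint pose in the virtual/remote environment."""
    return virtual_from_local @ local_pose

def yaw(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Assumed anchor transform chosen when the session starts: the virtual scene is
# rotated 90 degrees and offset 5 m from the local frame.
virtual_from_local = pose_matrix(yaw(90), np.array([5.0, 0.0, 0.0]))

pose_a = pose_matrix(np.eye(3), np.array([1.0, 0.0, 0.0]))   # device pose at position A
pose_b = pose_matrix(yaw(45),   np.array([2.0, 1.0, 0.0]))   # device pose at position B
for name, p in (("Viewpoint A", pose_a), ("Viewpoint B", pose_b)):
    v = local_pose_to_virtual(p, virtual_from_local)
    print(name, "virtual position:", np.round(v[:3, 3], 2))
```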
- this technique can be used to, for example, permit a user to walk through a movie scene recorded in 3D from multiple viewpoints, a sporting event likewise recorded in 3D from multiple viewpoints, walk through a simulation of a building or other facility, and the like.
- a restaurant can be 3D mapped and the resulting 3D map stored with a cloud service, and a patron of the restaurant wishing to make a reservation can access the 3D mapping and “walk around” the restaurant using this technique to ascertain the view of the restaurant from each table of the restaurant. The patron can then select the table that best suits the patron's taste, and contact the restaurant to reserve that table.
- relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
- the terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- An element preceded by “comprises . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
- A “program” is defined as a sequence of instructions designed for execution on a computer system.
- a “program”, or “computer program”, may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
Abstract
A system includes a first electronic device and a second electronic device. The first electronic device is to display a first augmented reality overlay on imagery of a local environment captured by the first electronic device, the first augmented reality overlay including a depiction of a virtual object from a first perspective that is based on a position and orientation of the first electronic device using a three-dimensional mapping of the local environment. The second electronic device is to display a second augmented reality overlay on imagery of the local environment captured by the second electronic device, the second augmented reality overlay including a depiction of the virtual object from a perspective based on a position and orientation of the second electronic device using the three-dimensional mapping.
Description
- The present application claims priority to U.S. Patent Application Ser. No. 61/923,629 (Attorney Docket No. 1200-CS42345), entitled “Electronic Device Using Position and Orientation to Facilitate Location-Based Services” and filed on 3 Jan. 2014, the entirety of which is incorporated by reference herein.
- The present application generally relates to the use of augmented reality overlays in electronic devices and, more particularly, to the use of three-dimensional mapping to facilitate location-based augmented reality overlays in electronic devices.
- Computing-enabled cellular phones, tablet computers, and other portable electronic devices increasingly incorporate location-based services to provide enhanced user-machine interactions. Conventionally, these location-based services rely on a static and unrealistic representation of the local environment of the portable electronic device, and thus often lead to user dissatisfaction.
- The present disclosure may be better understood by, and its numerous features and advantages made apparent to, those skilled in the art by referencing the accompanying drawings. The use of the same reference symbols in different drawings indicates similar or identical items.
- FIG. 1 is a diagram illustrating an electronic device utilizing position and orientation information to facilitate location-based services in accordance with at least one embodiment of the present disclosure.
- FIG. 2 is a diagram illustrating an example system implementation of the electronic device of FIG. 1 in accordance with at least one embodiment of the present disclosure.
- FIG. 3 is a diagram illustrating provision of a density map of electronic devices in a facility or other location in accordance with at least one embodiment of the present disclosure.
- FIG. 4 is a diagram illustrating an example of an augmented reality interaction between two devices based on their physical orientation to each other in accordance with at least one embodiment of the present disclosure.
- FIG. 5 is a diagram illustrating another example of an augmented reality interaction between two devices based on their physical orientation to each other in accordance with at least one embodiment of the present disclosure.
- FIG. 6 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which depictions of physical objects are replaced by depictions of virtual objects having matching geometry in accordance with at least one embodiment of the present disclosure.
- FIG. 7 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which navigation aids are depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 8A is a diagram illustrating an example of measuring a dimension of an object using a change in position of an electronic device in accordance with at least one embodiment of the present disclosure.
- FIG. 8B is a diagram illustrating an example of measuring a dimension of an object using an electronic device based on 3D mapping of the object in accordance with at least one embodiment of the present disclosure.
- FIG. 9 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which a virtual pet is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 10 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which instructional assistance is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which prior user path information or other visual pedometry information is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 12 is a diagram illustrating an example of an augmented reality overlay at an electronic device in which a user incentivized mapping game is depicted in accordance with at least one embodiment of the present disclosure.
- FIG. 13 is a diagram illustrating an example of a virtual reality display at an electronic device in which a remote environment or virtual environment is depicted based on movement of the electronic device in a local environment in accordance with at least one embodiment of the present disclosure.
- FIGS. 1-13 illustrate various techniques for leveraging the position and orientation tracking capabilities of an electronic device, along with depth information and other 3D mapping information available to the electronic device, to provide enhanced location-based services. These location-based services may take the form of enhanced augmented reality (AR) functionality that formats or otherwise controls graphical information presented in an AR overlay so as to be visually compatible with the geometries of the objects within a local environment, or to effectively convey related position/orientation information. These location-based services further may take the form of virtual reality (VR) functionality that maps the position and orientation of an electronic device in the local environment to corresponding perspective views of a remote environment or virtual environment. The capture of imagery of the local environment by the electronic device in support of this functionality may utilize any of a variety of techniques, examples of which are disclosed in U.S. patent application Ser. No. 14/040,918 (Client Ref. No. CS41630), filed on Sep. 30, 2013 and entitled “3D Feature Descriptors with Camera Pose Information”, and U.S. patent application Ser. No. 14/086,427 (Client Ref. No. CS41623), filed on Nov. 21, 2013 and entitled “Electronic Device with Modulated Light Flash Operation for Rolling Shutter Image Sensor,” the entireties of which are incorporated by reference herein.
- FIG. 1 illustrates an electronic device 100 configured to support location-based functionality, such as simultaneous location and mapping (SLAM), visual odometry, augmented reality (AR), and virtual reality (VR) techniques, using image and non-image sensor data in accordance with at least one embodiment of the present disclosure. The electronic device 100 can include a portable user device, such as a tablet computer, computing-enabled cellular phone (e.g., a “smartphone”), a notebook computer, a personal digital assistant (PDA), a gaming system remote, a television remote, and the like. In other embodiments, the electronic device 100 can include a fixture device, such as medical imaging equipment, a security imaging camera system, an industrial robot control system, a drone control system, and the like. For ease of illustration, the electronic device 100 is generally described herein in the example context of a portable user device, such as a tablet computer or a smartphone; however, the electronic device 100 is not limited to these example implementations.
- The electronic device 100 includes a housing 102 containing or otherwise incorporating a display 104 and a plurality of sensors to obtain information regarding a local environment of the electronic device 100. These sensors can include image-based sensors, such as the imaging cameras 106 and 108.
- In at least one embodiment, the electronic device 100 uses the information generated by some or all of these sensors to determine one or both of a position and orientation of the electronic device 100 in support of various location-based services available via the electronic device 100. This position and orientation (referred to herein in the alternative or collective as “position/orientation” or “pose”) may be an absolute position/orientation (e.g., a GPS coordinate and compass orientation) or a relative position/orientation (that is, relative to the local environment), or a combination thereof. To illustrate, a magnetometer, gyroscope, and accelerometer may be used to determine a relative position (or change in position) and orientation (relative to gravity), and a GPS sensor may be used to provide an absolute position. Similarly, SLAM or other visual odometry methods may be used to provide a position and orientation relative to a specified location or within a specified facility or map. Moreover, in at least one embodiment, the electronic device 100 uses imagery from the local environment to support object recognition functionality.
- To illustrate, in some embodiments, the electronic device 100 incorporates depth sensing via a depth sensor in order to determine its distance from various objects in the local environment, whereby this information may be used for visual telemetry purposes or for identifying the objects. In some instances, this depth sensor may be implemented by positioning and orienting the imaging cameras 106 and 108 so that their fields of view overlap starting at a specified distance from the electronic device 100, thereby enabling depth sensing of objects in the local environment that are positioned in the region of overlapping fields of view via multiview image analysis. In other implementations, the electronic device 100 implements the depth sensor in the form of a structured light projector (not shown in FIG. 1) that projects modulated light patterns into the local environment, with one or both of the imaging cameras 106 and 108 capturing reflections of the modulated light patterns from objects in the local environment. The resulting depth information obtained from the depth sensor may be used to calibrate or otherwise augment depth information obtained from multiview analysis (e.g., stereoscopic analysis) of the image data captured by the imaging cameras 106 and 108.
- Alternatively, the depth information from the depth sensor may be used in place of depth information obtained from multiview analysis. To illustrate, multiview analysis typically is better suited for bright lighting conditions and when the objects are relatively distant, whereas modulated light-based depth sensing is better suited for lower light conditions or when the observed objects are relatively close (e.g., within 4-5 meters).
Thus, when the electronic device 100 senses that it is outdoors or otherwise in relatively good lighting conditions, the electronic device 100 may elect to use multiview analysis to determine object depths. Conversely, when the electronic device 100 senses that it is indoors or otherwise in relatively poor lighting conditions, the electronic device 100 may switch to using modulated light-based depth sensing via the depth sensor.
- Position/orientation information 122 and other sensor information 124 obtained by the electronic device 100 can be used to support any of a variety of location-based functionality, such as visual odometry or other SLAM functionality, augmented reality (AR) functionality, virtual reality (VR) functionality, and the like. As an example, the electronic device 100 can 3D map the local environment and then use this mapping to facilitate the user's navigation through the local environment, such as by displaying to the user a floor plan generated from the mapping information and an indicator of the user's current location relative to the floor plan as determined from the current relative position of the electronic device 100. Moreover, the relative position/orientation information 122 obtained by the electronic device 100 can be combined with supplemental information 126 to present an AR view of the local environment to a user via the display 104 of the electronic device 100. This supplemental information 126 can include one or more AR databases locally stored at the electronic device 100 or remotely accessible by the electronic device 100 from a cloud service or other remote processing system via a wired or wireless network.
- To illustrate, in the depicted example of FIG. 1, the local environment is a museum of art, and the electronic device 100 is in communication with a remote server or other cloud service that maintains a database of information regarding the artwork currently on display in the museum (one example of supplemental information 126). This database identifies the corresponding artwork based on relative locations within the museum. Accordingly, the electronic device 100 can determine a relative orientation/position of the electronic device 100 as described above and herein, and from this relative orientation/position determine the identity of artwork 130 captured by the imaging cameras 106 and 108. With the artwork 130 so identified, the electronic device 100 can query the database to determine information regarding the artwork 130, and provide a graphical representation of this information as a graphical overlay 132 associated with the depiction of the artwork 130 on the display 104 of the electronic device 100.
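- Resolving which artwork the device is facing can be done by comparing the device's position and heading against the artwork locations stored in the database; the records, tolerances, and field names in this sketch are illustrative stand-ins for the supplemental information 126:

```python
import math

# Hypothetical excerpt of the museum database keyed by gallery position (metres).
ARTWORK_DB = [
    {"id": "artwork-130", "title": "Seaside Study", "pos": (4.0, 7.5), "blurb": "Oil on canvas, 1902."},
    {"id": "artwork-131", "title": "Red Interior",  "pos": (9.0, 2.0), "blurb": "Acrylic, 1968."},
]

def facing_artwork(device_pos, heading_deg, db, max_angle_deg=15.0, max_range=10.0):
    """Return the database record whose location best matches the device's
    current position/orientation, or None if nothing is in view."""
    best, best_angle = None, max_angle_deg
    for rec in db:
        dx, dy = rec["pos"][0] - device_pos[0], rec["pos"][1] - device_pos[1]
        rng = math.hypot(dx, dy)
        if rng > max_range:
            continue
        bearing = math.degrees(math.atan2(dx, dy))       # clockwise from the +y axis
        off = abs((bearing - heading_deg + 180) % 360 - 180)
        if off < best_angle:
            best, best_angle = rec, off
    return best

rec = facing_artwork(device_pos=(4.0, 2.0), heading_deg=0.0, db=ARTWORK_DB)
if rec:
    print(f"overlay: {rec['title']} - {rec['blurb']}")
```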
- As another example, the position/orientation of the electronic device 100 may be used to control the user's interactions with systems in the local environment. To illustrate, the user may speak the voice command “lower temperature” into the electronic device 100 while standing in the user's living room. In response, the electronic device 100 sends its current position and the command “lower temperature” to a home automation system in communication with the electronic device 100, and the home automation system, using the position information, lowers the temperature in the living room in response to the command. In a similar vein, the user may tap the electronic device 100 against a lamp in the living room, and the home automation system may detect this motion, identify the lamp as being the closest object to the electronic device 100, and thus turn on the lamp in response.
- Additional examples of location-based services utilizing one or more of the position/orientation information 122 and other sensor information obtained by the various sensors of the electronic device 100, in combination with supplemental information 126, are described in detail below with reference to FIGS. 3-13.
- FIG. 2 illustrates an example implementation of the electronic device 100 in accordance with at least one embodiment of the present disclosure. In the depicted example, the electronic device 100 includes the imaging cameras 106 and 108, one or more processors 202, system memory 204, flash memory 206, or another storage device (e.g., optical or magnetic disc drive, solid state hard drive, etc.), a wireless network interface 208 (e.g., a wireless local area network (WLAN) interface, a cellular data network interface, etc.), a user interface (UI) 210, a set 212 of non-image sensors, and a structured light projector 214 interconnected via one or more busses 216 or other interconnects. The UI 210 includes, for example, the display 104, a speaker or microphone 218, a keypad or touchscreen 220, as well as other input/output components to receive input from, or provide information to, a user. The set 212 of non-image sensors can include any of a variety and combination of non-image sensors used to provide non-image context or state of the electronic device 100, such as a GPS receiver 222, a gyroscope 224, an accelerometer 226, a magnetometer 228, and an ambient light sensor 229.
- In the depicted example, the imaging camera 106 comprises a wide-angle imaging camera and the imaging camera 108 comprises a narrow-angle imaging camera. Thus, the imaging cameras 106 and 108 may be used to capture imagery of the local environment from their respective fields of view. The electronic device 100 further includes a depth sensor 230 formed from a combination of the modulated light projector 214 and one or both of the imaging cameras 106 and 108. The depth sensor 230 uses the modulated light projector 214 to project modulated light patterns into the local environment, and uses one or both of the imaging cameras 106 and 108 to capture reflections of the modulated light patterns from objects in the local environment. The depth sensor 230 then may calculate the depths of the objects, that is, the distances of the objects from the electronic device 100, based on the analysis of the depth imagery.
- In operation, the one or more processors 202 execute one or more software applications 232 representing a set of executable instructions stored at a computer readable storage medium, such as the system memory 204 or flash memory 206. The executable instructions of the software application 232, when executed, manipulate the one or more processors 202 to perform various software-based functionality to implement at least a portion of the techniques described herein, provide visual information via the display 104, respond to user input via the UI 210, and the like.
- In the illustrated example, the software application 232 implements two modules: a position/orientation tracking module 234 and a location-based services module 236. The position/orientation tracking module 234 operates to track the position/orientation of the electronic device 100 via SLAM or visual telemetry techniques using the depth sensor 230 and the imaging cameras 106 and 108, via non-image sensors (e.g., position tracking via the GPS receiver 222 and orientation tracking via the gyroscope 224, accelerometer 226, and magnetometer 228), or a combination thereof. The location-based services module 236 uses the position/orientation tracking information to provide one or more location-based services to the user via the electronic device 100, examples of which are described in detail below. In providing these location-based services, the location-based services module 236 may make use of supplemental information 126 (FIG. 1) obtained from one or more local or remote databases. To illustrate, the location-based services module 236 may maintain a local datastore of previous waypoints of the electronic device 100 and imagery captured at each waypoint in order to provide an AR overlay in the imagery (video) displayed at the display 104. Further, the electronic device 100 may access a cloud service or other remote processing system 240 via one or more networks 242 (e.g., a local area network, wide area network, or the internet) to obtain supplemental information for a provided service, such as AR overlay information, VR imagery, location information for other devices, and the like.
- FIG. 3 illustrates an example implementation of a location-based service to provide density mapping of electronic devices at a facility or other location in accordance with at least one embodiment. In this implementation, the locations of portable electronic devices, such as smart phones, tablets, and smart watches, at a facility or other area are monitored by the devices themselves or by a remote system, and a density map is generated from the current locations of the portable electronic devices. In some embodiments, the density map may take the form of an individualized density map 302, whereby the location of each monitored portable electronic device within a facility is separately indicated by a corresponding icon 304 within a floorplan 306, map, or other spatial representation of the facility or area. In other embodiments, the density map may take the form of a non-individualized density map 308, whereby an indication of a number of devices within a given area, or density of devices, is represented via shading 310, coloring, or other visual indicia within the floorplan 306 or other spatial representation of the facility or area, such that the location of any particular electronic device is not readily discerned from the non-individualized density map 308. As depicted by the non-individualized density map 308, a filter may be implemented so that device densities below a certain threshold are omitted from inclusion in the density map.
- The individualized density map 302 may be utilized when the location of the portable electronic device is sufficiently granular. To illustrate, each participating portable electronic device within the facility may use a software application or operating system (OS) feature that tracks the position/orientation of the electronic device to within ten centimeters using SLAM, visual odometry, or readings from the GPS receiver 222 (FIG. 2) and the gyroscope 224 (FIG. 2), and periodically reports the current position/orientation of the electronic device to a remote processing system via, e.g., a wireless connection. From this, the remote processing system may then collate the current positions/orientations of the reporting electronic devices and generate the density map 302 for the corresponding point in time.
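- Collating the reported positions into the two forms of density map is essentially a binning exercise; a sketch assuming each device reports an identifier, a position in facility coordinates, and a timestamp:

```python
from collections import defaultdict

def latest_positions(reports):
    """Keep the most recent (x, y) report per device for an individualized map."""
    latest = {}
    for device_id, x, y, t in reports:
        if device_id not in latest or t > latest[device_id][0]:
            latest[device_id] = (t, x, y)
    return {d: (x, y) for d, (_, x, y) in latest.items()}

def density_grid(positions, cell_size=5.0, min_count=2):
    """Bin positions into cells for a non-individualized map, dropping cells
    whose count falls below the display threshold."""
    counts = defaultdict(int)
    for x, y in positions.values():
        counts[(int(x // cell_size), int(y // cell_size))] += 1
    return {cell: n for cell, n in counts.items() if n >= min_count}

reports = [
    ("dev-a", 2.0, 3.0, 100), ("dev-a", 2.5, 3.1, 160),
    ("dev-b", 3.0, 4.0, 120), ("dev-c", 21.0, 8.0, 130),
]
positions = latest_positions(reports)
print(positions)                  # one icon per device for the individualized map
print(density_grid(positions))    # shaded cells for the non-individualized map
```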
- The non-individualized density map 308 may be utilized when there is insufficient location accuracy to accurately report the locations of the participating electronic devices, or in the event that privacy or security concerns preclude the identification of the specific location of any particular electronic device. To illustrate, the facility may provide wireless network access to the electronic devices via a set of access points distributed through the facility in known locations and with known or estimated coverage areas, and a cloud service may make use of the number or frequency (or “density”) of queries to an access point by the electronic devices as an indicator of the number, or density, of electronic devices within the coverage area of that access point. From this information, the cloud service can then construct the non-individualized density map 308 with each coverage area colored or shaded according to its current device query “density.”
- In some implementations, the density map is generated by a remote processing system and maintained for use by entities affiliated with the operator of the remote processing system, such as in the HVAC management example above. In other implementations, the density map may be transmitted to the electronic devices in the facility for display to the user. To illustrate, the density map may be used to highlight the current concentration of visitors to a music festival, and this density map may be provided to the electronic devices of the visitors so that they can use the density map to identify the most popular venues or other points of interest and plan their visit accordingly.
- With the
individualized density map 302, the locations of the individual electronic devices within the facility can also be linked to other sensor available by the electronic device so as to obtain more information about some aspect of the facility. To illustrate, the tight timing synchronization afforded by GPS signaling can be leveraged by a cloud service so as to direct a number of electronic devices within a specified area of the facility to concurrently or sequentially capture imagery of the specified area from their different perspectives, and then this imagery can be stitched together by the cloud service so as to provide, for example, a 360-degree 3D view of an object in the specified area, or to provide a “bullet time” rendition of an object or event by stitching together a sequence of images captured of an area or object from the different perspectives of the participating electronic devices. -
- FIGS. 4 and 5 illustrate example implementations of a location-based service that controls interactions between two or more electronic devices in an augmented reality setting based on the physical orientation/position of the two or more electronic devices relative to each other in accordance with at least one embodiment. As noted above, the electronic device 100 can provide AR functionality whereby video or other imagery captured by the electronic device 100 is displayed at the display 104 in combination with an AR overlay that includes additional information or graphical objects. Often this AR functionality extends over multiple electronic devices (e.g., through the use of a cloud service as an intermediary) such that interactions between the electronic devices are reflected in some manner in the AR overlays of the electronic devices. In at least one embodiment, the electronic devices can use their current position/orientation information to determine their physical position/orientation relative to each other, and this physical position/orientation can control or impact how the AR interactions are depicted within the AR overlays.
- To illustrate, in FIG. 4 two electronic devices 402 and 404 execute AR software that supports a shared virtual object 406, such as a graphical representation of a disc. The electronic devices 402 and 404 use their position/orientation information to determine their physical positioning relative to each other; in the depicted example, the electronic device 402 is positioned to the left of the electronic device 404. At time T=T0, the virtual object 406 is located in the AR overlay of the electronic device 402, and the user uses a touchscreen to effect a left-to-right swiping motion 407 to initiate movement of the virtual object 406 to the right. In response to this input, and in response to the physical position/orientation of the electronic devices 402 and 404 relative to each other, the AR software coordinates a transfer of the virtual object 406 from the AR overlay of the electronic device 402 to the AR overlay of the electronic device 404, such that at time T=T1 the virtual object 406 is depicted as half in the AR overlay of the electronic device 402 and half in the AR overlay of the electronic device 404, and at time T=T2 the virtual object 406 is depicted as fully transferred to the AR overlay of the electronic device 404. In similar manner, if there were another electronic device physically oriented to the left of the electronic device 402, a right-to-left swiping motion on the touchscreen of the electronic device 402 would result in a transfer of the virtual object 406 to the AR overlay of this other electronic device.
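- Selecting the device that should receive the virtual object can be done by converting the swipe direction into a world bearing using the sending device's orientation and picking the candidate whose bearing best matches; an illustrative sketch with assumed names and coordinates:

```python
import math

def world_bearing(device_heading_deg, swipe_dir):
    """Convert a screen-space swipe ('left', 'right', 'forward', 'back') into a
    world bearing, given the sending device's compass heading in degrees."""
    offsets = {"forward": 0.0, "right": 90.0, "left": -90.0, "back": 180.0}
    return (device_heading_deg + offsets[swipe_dir]) % 360.0

def pick_target(sender_pos, swipe_bearing, candidates, max_off_deg=45.0):
    """Return the candidate whose bearing from the sender is closest to the
    swipe direction, within a tolerance."""
    best, best_off = None, max_off_deg
    for name, (x, y) in candidates.items():
        bearing = math.degrees(math.atan2(x - sender_pos[0], y - sender_pos[1])) % 360.0
        off = abs((bearing - swipe_bearing + 180) % 360 - 180)
        if off < best_off:
            best, best_off = name, off
    return best

candidates = {"device-404": (2.0, 0.0), "other-device": (-2.0, 0.0)}
bearing = world_bearing(device_heading_deg=0.0, swipe_dir="right")
print(pick_target((0.0, 0.0), bearing, candidates))   # -> device-404
```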
- As another example of this technique, a user of the electronic device 402 may be standing in a room with a television to the front of the electronic device 402 and a stereo system to the right of the electronic device 402. The user may be listening to music using an audio application on the electronic device 402, and desire to transfer the music playback to one of the television or stereo system. To transfer the music playback to the television, the user may make a forward swiping motion, in response to which the electronic device 402 determines that the television is oriented and positioned in front, and thus the electronic device 402 communicates with the television to arrange for a hand-off of the music playback to the television. Alternatively, if the user makes a left-to-right swiping motion, the electronic device 402 determines that the stereo system is oriented and positioned to the right, and thus the electronic device 402 communicates with the stereo system to arrange for a hand-off of the music playback to the stereo system.
- FIG. 5 illustrates another example whereby the physical position/orientation between two electronic devices 502 and 504 is used to coordinate the depiction of a virtual object 506 inserted into a local environment 508 observed by the electronic devices 502 and 504. In this example, a user of the electronic device 502 interacts with the electronic device 502 to place the virtual object 506 as an AR overlay on the imagery of the local environment 508 captured by the imaging cameras of the electronic device 502 in its particular position/orientation. The AR software of the two electronic devices is in communication, so in response to the generation and placement of the virtual object 506 in the virtualized representation of the local environment 508, the AR software of the electronic device 504 determines how its physical position/orientation relative to the virtual object 506 differs compared to the electronic device 502, and thus prepares a representation of the virtual object 506 in the AR overlay of the electronic device 504 to reflect this difference so as to accurately depict how the virtual object 506 would appear given the particular position/orientation of the electronic device 504.
- The adaptation of the AR display of the virtual object 506 in another electronic device based on physical position/orientation can be used for either concurrent viewing of the virtual object or for time-shifted viewing. To illustrate, both the electronic device 502 and the electronic device 504 could be concurrently viewing the local environment 508, and thus the generation, modification, or other manipulation of the virtual object 506 by one of the electronic devices is soon reflected in the AR overlay of the other. Either user may then interact with the virtual object 506 via the corresponding electronic device, and the manipulation of the virtual object 506 would then be reflected in the depiction of the virtual object 506 in the AR overlay of the other electronic device.
- Alternatively, the user of the electronic device 502 could place the virtual object into its indicated position at one time, and then when the user of the electronic device 504 comes into the local environment 508 at another time, the virtual object 506 is then made viewable in the AR overlay of the electronic device 504 in a manner appropriate to the point of view of the object considering the current position/orientation of the electronic device 504. Under this approach, the persistence of the virtual object 506 may be maintained by prior storage of data representing the virtual object 506 and its current position/orientation at the electronic devices 502 and 504, by storage of such data at a cloud service that supplies the virtual object 506 with the proper perspective view at different times and different locations/orientations and for different devices, or a combination thereof.
- FIG. 6 illustrates an example implementation of a location-based service that facilitates the transformation of a local environment as viewed through the imaging cameras and display of an electronic device in accordance with at least one embodiment. An electronic device 600 can utilize its imaging cameras to capture imagery of the local environment 602 as the user moves through the local environment 602, and provide an AR overlay 604 over this imagery so as to transform the local environment as depicted in the display 104 of the electronic device 600 into an alternate reality whereby depictions of physical objects in the local environment 602 within the captured and displayed imagery are “replaced” with corresponding objects in accordance with the theme of the transformation. To illustrate, the AR overlay 604 may implement a “jungle” theme whereby physical objects in the captured imagery are replaced with jungle-themed objects in the display of the imagery at the electronic device 600. Thus, a couch 612 and a lamp 614 captured in the imagery of the local environment 602 could be replaced with imagery of a boulder 622 and a palm tree 624, respectively, in the AR overlay 604. Likewise, walls in the local environment can be replaced with depictions of rows of trees or rows of bamboo, patches of grass can be replaced with depictions of quicksand, etc.
- In some embodiments, this AR overlay 604 is intended as a 3D AR overlay such that as the user moves the electronic device 600 around the local environment 602, the view of the physical objects changes in the captured imagery, and thus the electronic device 600 changes the view of the replacement virtual objects to match. That is, the virtual objects may be implemented as 3D graphical representations such that as the electronic device 600 moves relative to the physical object represented by a virtual object in the AR overlay 604, the view of the virtual object presented in the AR overlay 604 changes in response to the change in perspective of the electronic device 600. To this end, the electronic device 600 may determine its current position/orientation, and from this determine how the virtual objects would appear in the current scene depicted by the electronic device 600, and format the AR overlay 604 accordingly.
- To perform this replacement, the electronic device 600 first identifies physical objects in the captured imagery that are candidates for virtual object replacement. In some embodiments, this process can include using object detection algorithms to detect objects within the images, as well as estimation of the dimensions of the objects using, for example, depth information from a depth sensor, or based on comparison of the size of the object in the imagery to some known scale or calibration tool. Moreover, the canonical orientation or features of the object also may be identified to ensure more precise geometric matching between the appearance of the physical object and its replacement virtual object, such that the proxy virtual object substantially matches the geometric constraints of the physical object. To illustrate, based on the size of the couch 612 and based on the couch 612 having a primarily horizontal canonical orientation, the electronic device 600 may determine that the boulder 622 is a suitable proxy, and the depiction of the boulder 622 may be formatted such that the canonical orientation of the boulder 622 follows the corresponding canonical orientation of the couch 612. Conversely, based on the size of the lamp 614 and based on the lamp 614 being primarily vertical in appearance, the electronic device 600 may choose the palm tree 624 as a suitable proxy, with the depiction of the palm tree 624 scaled, oriented, and otherwise formatted accordingly. Other examples can include, for example, replacing walls with rows of bamboo or other trees, chairs with ruminants, curtains with vines, and the like. In this manner, the local environment 602 can be transformed into a jungle-themed alternate reality while maintaining the geometries of the local environment in the alternate reality, albeit in a different visual form.
- FIG. 7 illustrates an example implementation of a location-based service that facilitates user navigation through a local environment via an AR overlay on captured imagery of the local environment in accordance with at least one embodiment. In this implementation, a user of a smartphone, tablet computer, or other electronic device 700 may specify a sought-after object or intended destination. In response, the electronic device 700 may query a cloud service or a local mapping database to determine the position or location of the object or destination. From this, the electronic device 700 may determine a route to the object or destination, and then provide navigational assistance to the user in the form of graphical navigational aids displayed as an AR overlay on the displayed captured imagery of the local environment.
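- The choice of navigational cue to render, such as the arrow shown at a fork in the path, follows from the signed angle between the device's heading and the bearing to the next waypoint on the computed route; a small sketch with assumed names and coordinates:

```python
import math

def bearing_to(frm, to):
    """Compass bearing in degrees (clockwise from north) from point frm to point to."""
    return math.degrees(math.atan2(to[0] - frm[0], to[1] - frm[1])) % 360.0

def turn_indicator(device_pos, device_heading_deg, next_waypoint, straight_tol=15.0):
    """Pick the AR overlay cue ('bear right', 'bear left', or 'continue straight')
    plus the remaining distance to the next waypoint."""
    desired = bearing_to(device_pos, next_waypoint)
    delta = (desired - device_heading_deg + 180) % 360 - 180    # signed turn angle
    if abs(delta) <= straight_tol:
        cue = "continue straight"
    elif delta > 0:
        cue = "bear right"       # e.g., render a right-bearing arrow at the fork
    else:
        cue = "bear left"
    distance = math.hypot(next_waypoint[0] - device_pos[0], next_waypoint[1] - device_pos[1])
    return cue, distance

cue, dist = turn_indicator(device_pos=(0.0, 0.0), device_heading_deg=0.0,
                           next_waypoint=(30.0, 40.0))
print(f"{cue}, {dist:.0f} m to the next waypoint")
```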
- To illustrate using the example depicted in FIG. 7, a user of the electronic device 700 may program in a destination and start out walking along a path 704. As the user approaches a fork 706 in the path 704, the user may hold up the electronic device 700 such that the fork 706 is displayed at the display 104 of the electronic device 700. In response, the electronic device 700 may determine its current position and the path between its current position and the intended destination, and then provide navigational assistance to the user by providing an AR overlay 708 that displays a right-bearing arrow 710 identifying the right branch of the fork 706 as the correct choice to reach the destination. The AR overlay 708 further can include additional information about the path or the trip thus far, such as the user's current speed, the distance to the destination along the current path, and the like.
- As another example, this navigational assistance feature can be used to facilitate user navigation in a retail setting. To illustrate, a user may enter a store intending to purchase an item, but the user is unaware of where the item is stocked within the aisles and shelves of the store. The store thus may maintain a database storing information regarding the layout of the store and the location of each item stocked by the store within this layout.
The electronic device 700 may access this database from a cloud service to pinpoint the location of the item within the store, and given the current position/orientation of the electronic device 700, determine a path to the location of the item and provide navigational assistance through navigational cues displayed in an AR overlay at the electronic device 700. Such navigational assistance can include, for example, displaying directional cues (e.g., which direction to head, arrows pinpointing the position of the item on a shelf), aisle/shelf numbers, one or more images of the item or its packaging to facilitate identification of the item on the shelf, and the like.
electronic device 700. In support of this feature, the database of the store supplies a three-dimensional mapping of the store interior so that theelectronic device 700 can accurately render the intervening shelves and other barriers in transparent or semi-transparent form consistent with the view perspective of the shelves and other barriers to theelectronic device 700 in its current position/orientation. - The store navigational feature can be used in conjunction with retail analytics or advertisement functionality to provide additional information to the user via the AR overlay. To illustrate, a user may enter a store to obtain mint-flavored mouthwash, and a cloud service may use this information to determine whether any corn starch is in the current inventory of the store, and if not, determine that the citrus-flavored mouthwash that is in stock is a viable alternative and thus provide navigational assistance to the location of the citrus-flavored mouthwash instead. Further, the AR overlay may be used to display any coupons, discounts, or deals for the item sought, to display targeted advertising based on the sought-after item, or to provide reviews from one or more sources to help the user choose between one or more brands of the item in stock at the store.
-
- FIG. 8A illustrates an example implementation of a location-based service that facilitates object measurement in accordance with at least one embodiment. In this implementation, the ability of an electronic device 800 to accurately pinpoint its relative position using one or a combination of position-determining techniques, such as visual odometry, GPS, inertial navigation, and the like, is leveraged to enable accurate measurement of one or more dimensions of a physical object. From these dimensions, other physical measurements of the physical object may be determined, such as an area of a feature of the object, a volume of the object, an angle of a plane or edge of the object relative to gravity or another edge or plane, and the like.
- In one embodiment, the one or more dimensions of a physical object are determined by determining the distance between the position of the electronic device 800 at one end of the physical object and the position of the electronic device 800 at the other end of the physical object along a measurement axis. To illustrate, in the example of FIG. 8A, the electronic device 800 can be used to measure the diagonal dimension 802 of the top of a table 804. To measure the diagonal dimension 802, the user positions the electronic device 800 at corner 806 and then signals the electronic device 800 to mark its current position 807 by, for example, pressing a button on the electronic device 800 or by gently tapping a corner of the electronic device 800 against the corner 806. The user then positions the electronic device at the opposite corner 808 and again signals the electronic device 800 to mark its current position 809. The electronic device 800 then may calculate the dimension 802 (that is, the distance between the two corners 806, 808) as the distance 810 between the positions 807 and 809 marked by the electronic device 800. This process may be repeated for each additional dimension of interest.
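- The measurement itself is the straight-line distance between the two marked device positions; the sketch below also shows the mark-and-measure flow, with the position values standing in for whatever pose tracker is in use:

```python
import math

class DimensionMeasurer:
    """Collects positions marked by the user (button press or corner tap) and
    reports the straight-line distance between consecutive marks."""

    def __init__(self):
        self.marks = []

    def mark(self, tracked_position):
        """Record the tracker's current position at the moment of the mark."""
        self.marks.append(tuple(tracked_position))

    def last_dimension(self):
        if len(self.marks) < 2:
            return None
        (x1, y1, z1), (x2, y2, z2) = self.marks[-2], self.marks[-1]
        return math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2 + (z2 - z1) ** 2)

m = DimensionMeasurer()
m.mark((0.02, 0.01, 0.76))     # device tapped against the first corner
m.mark((1.42, 0.91, 0.77))     # device tapped against the opposite corner
print(f"measured dimension: {m.last_dimension():.2f} m")
```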
- As part of this measurement process, the electronic device 800 may need to compensate for any deviations caused by inconsistent placement or orientation of the electronic device 800 by the user. To illustrate, the user may have held the electronic device 800 in a horizontal position when tapping the corner 806 while holding the electronic device 800 in a vertical position when tapping the corner 808. Thus, the electronic device 800 also may note its orientation at each of positions 807 and 809, and adjust the measured distance 810 by an offset or scaling factor based on the orientation deviations.
- This dimension-measuring technique may be combined with object-recognition functionality so as to particularly identify the physical object, and then query a database to obtain additional information of the identified object, including, for example, additional dimensional data for the physical object. To illustrate, the user may enter a large furniture store and spot a couch. Interested in the couch but concerned that the couch may not fit into the user's living room, the user can use the dimension-measurement technique described above to determine a major dimension of the couch.
The user also then uses the electronic device 800 to capture one or more images of the couch, in addition to capturing depth information for the images using a depth sensor as described above. The dimension measurement, images, and depth information are then transmitted to a cloud service that analyzes the images to identify the particular model of the couch. In this example, the measured major dimension of the couch is used by the cloud service to limit its search to only those models of couches that fit that measured dimension. Likewise, because the depth information is substantially independent of lighting and background, the geometry information revealed by the depth information can be used to further constrain the scope of the search, thereby facilitating a more efficient object recognition search. When the model of the couch is so identified, the cloud service can access its databases to determine additional information on the model of couch and forward this information to the electronic device 800 for display to the user. This additional information can include, for example, additional dimensional information, warranty information, product description, user reviews, prices offered by various retailers, advertisements for associated goods and services, and the like.
electronic device 800 is then used to provide dimensional information. To illustrate, in the example of FIG. 8B, a user captures imagery of the table 804, which is displayed at the display of the electronic device 800. The electronic device 800, or a cloud service in communication with the electronic device 800, determines a 3D mapping of the salient edges of the imagery containing the table 804, such as by using depth information obtained from a depth sensor of the electronic device 800 or by obtaining a CAD model of the table 804 from a previously-defined 3D mapping of the table 804 (and identified using, for example, the position and orientation of the electronic device 800 at the time of capture of the imagery). The user may then interact with this imagery to define two or more points of interest on the depicted imagery of the table 804, such as by tapping the touchscreen of the electronic device 800 at the points of interest. Alternatively, the electronic device 800 or cloud service may utilize a 3D model of the table 804 to render a 2D view of the table 804 from some perspective (e.g., a top-down view or a front view), and the user may select the points of interest from this 2D view.
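A minimal sketch of how two tapped screen points could be converted into a metric measurement under a pinhole-camera assumption is shown below; the intrinsics, array shapes, and function names are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def unproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with metric depth into a 3D camera-frame point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

def distance_between_taps(tap_a, tap_b, depth_map, intrinsics):
    """Metric distance between two tapped pixels, using the depth map."""
    fx, fy, cx, cy = intrinsics
    pa = unproject(*tap_a, depth_map[tap_a[1], tap_a[0]], fx, fy, cx, cy)
    pb = unproject(*tap_b, depth_map[tap_b[1], tap_b[0]], fx, fy, cx, cy)
    return float(np.linalg.norm(pb - pa))

# Example with a synthetic depth map and nominal intrinsics.
depth = np.full((480, 640), 1.5)          # 1.5 m everywhere (placeholder data)
K = (525.0, 525.0, 320.0, 240.0)          # fx, fy, cx, cy (assumed values)
print(distance_between_taps((100, 200), (500, 220), depth, K))
```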
- The electronic device 800 or associated cloud service may then calculate one or more dimensional measurements from the points of interest using a 3D model of the table 804 determined from depth information or a priori modeling. For example, with two points of interest so selected, the electronic device 800 may measure the distance 824 between these two points. In the event that three points are selected, the electronic device 800 or cloud service can calculate the distances between the three points, the angles between the lines connecting the three points, and the area defined by the three points. In the event that more than three points are selected, the electronic device 800 also can determine the volume of the space defined by the four or more points. These dimensional evaluation techniques thus can be used to analyze the relationships between objects (e.g., the angle of intersection of two walls, or whether a picture or table top is level) or to envision an imaged object in another space. For example, a user could capture an image of a doorway leading into the user's bedroom to determine its dimensions, and then, when at a furniture store, the user can take a picture of a dresser cabinet to determine the cabinet's dimensions, and from this determine whether the cabinet would fit through the doorway if the user chose to purchase the cabinet.
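The measurements listed above reduce to elementary vector geometry once the selected points are expressed in a common 3D frame. The following sketch is illustrative only; an axis-aligned bounding box is used as one simple interpretation of the volume defined by four or more points, though a convex hull could equally be used:

```python
import numpy as np
from itertools import combinations

def pairwise_distances(points):
    """All point-to-point distances, e.g. between three selected points."""
    return {(i, j): float(np.linalg.norm(points[j] - points[i]))
            for i, j in combinations(range(len(points)), 2)}

def angle_at(vertex, a, b):
    """Angle (degrees) at 'vertex' between the lines vertex->a and vertex->b."""
    u, v = a - vertex, b - vertex
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def triangle_area(p0, p1, p2):
    """Area of the triangle defined by three selected points."""
    return float(0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0)))

def bounding_volume(points):
    """Axis-aligned bounding-box volume for four or more selected points."""
    extent = np.max(points, axis=0) - np.min(points, axis=0)
    return float(np.prod(extent))

pts = np.array([[0.0, 0.0, 0.0], [1.9, 0.0, 0.0], [1.9, 1.1, 0.0], [0.0, 1.1, 0.7]])
print(pairwise_distances(pts[:3]), angle_at(pts[0], pts[1], pts[2]),
      triangle_area(*pts[:3]), bounding_volume(pts))
```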
- The image-based dimensional measurement technique also can be automated, rather than responding to user input. To illustrate, while the user is operating the imaging cameras of the electronic device to capture still picture or video imagery, in the background the electronic device 800 may perform 3D mapping analysis to identify the salient dimensions of the physical objects captured in the imagery. This identification may be performed using 3D modeling based on depth information, or by identifying the position and orientation of the electronic device 800 and then accessing a 3D model database to identify the objects present in the device's view based on the position/orientation information. Thus, when a user indicates interest in one of the physical objects, the electronic device 800 may provide information on one or more of the dimensions, volume, area, pertinent angles, and the like, using, for example, an AR overlay. -
FIGS. 9-11 illustrates example implementations of location-based services that facilitate display of an AR overlay on captured imagery of a local environment such that graphical content of the AR overlay is integrated with the physical objects of the local environment in the captured imagery based on the pose (position/orientation) of the electronic device relative to the objects in accordance with at least one embodiment. With the pose of the electronic device in the 3D space representing the local environment, the electronic device can present or format 3D graphical content or metadata in the same 3D space presented by the displayed imagery so that the graphical content is geometrically consistent or compliant with the geometries of the physical objects in the local environment. To illustrate, the location-based services may provide a virtual pinball game whereby a pinball AR overlay is displayed over video imagery of the side of a building captured by the electronic device. Using 3D mapping techniques, the portable electronic device can determine the dimensions and geometry of the side of the building, and from this information and from its current pose, the electronic device can format an AR overlay such that the pinball game elements are viewed from a perspective that matches the dimensions and geometry of the side of the building. The pinball AR overlay is thus formatted to follow the contours of the building using the perspective represented by the pose of the portable electronic device, and elements of the building facade, such as windows or cornices, may be incorporated as barriers or gaming features in the pinball game. - Further, depth information available from the depth sensor implemented at the electronic device can be used to incorporate user actions into the material presented by the AR overlay. To illustrate, a user may use the electronic device to capture imagery of a neighborhood park, and with this imagery and a mapping of the area obtained either from the depth sensor or separate 3D mapping information for the park (e.g., as provided by a cloud service), the electronic device can implement an AR overlay on the live feed of video imagery in a manner that depicts a virtual ball present in the local environment, and whereby the depth sensor can detect the user kicking at this virtual ball (by detecting the user's foot as it enters the field of view of the depth sensor), and from the velocity and direction of the user's foot, simulate the virtual ball traveling through the imagery of the local environment as though it was in fact kicked. Because the electronic device is aware of its position and orientation with respect to the local environment, changes in either of the position and orientation can be accurately reflected by changing the perspective of the AR overlay to match the changed position/orientation. To illustrate, as the virtual ball in the example above is travelling across a grass lawn, the user may run behind the virtual ball, and the AR overlay updates the view of the virtual ball to reflect the changing perspective.
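One hedged sketch of the underlying geometry is given below: given the device pose and camera intrinsics (all values assumed for the example, and not drawn from the disclosure), a world-space point of a virtual object is projected into pixel coordinates so that overlay elements can be drawn with a perspective that matches the captured imagery:

```python
import numpy as np

def project_to_screen(point_world, device_pose, intrinsics):
    """Project a world-space point of a virtual object into pixel coordinates,
    given the device pose (3x3 rotation R, translation t of the camera in the world)."""
    R_wc, t_wc = device_pose                               # camera-to-world rotation/translation
    p_cam = R_wc.T @ (np.asarray(point_world) - t_wc)      # world -> camera frame
    if p_cam[2] <= 0:                                      # behind the camera: not visible
        return None
    fx, fy, cx, cy = intrinsics
    u = fx * p_cam[0] / p_cam[2] + cx
    v = fy * p_cam[1] / p_cam[2] + cy
    return (u, v), p_cam[2]                                # pixel plus depth for occlusion tests

# Example: a virtual game element anchored to a point on a building facade.
R_wc = np.eye(3)                                           # device looking down +Z (assumed)
t_wc = np.array([0.0, 1.6, 0.0])                           # device held 1.6 m above the ground
print(project_to_screen([0.5, 1.2, 4.0], (R_wc, t_wc), (525.0, 525.0, 320.0, 240.0)))
```

Re-running this projection whenever the pose changes is what keeps the overlay aligned with the physical geometry as the user moves.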
- Additionally, because the depth sensor or a priori 3D mapping information informs the electronic device of the geometries of the physical objects and their distances from the electronic device, the electronic device can use this information to format the AR overlay such that virtual objects realistically interact with physical objects. For example, the AR overlay can determine the trajectory of the virtual ball within the local environment, detect the presence of a wall along this trajectory, and thus depict the virtual ball as bouncing off of the physical wall in a manner that emulates the behavior of an actual physical ball traveling along the same trajectory. As another example, the electronic device may detect two different physical objects at different distances from the electronic device and when the trajectory of the virtual ball is plotted to take the virtual ball between these two physical objects, the AR overlay can be formatted to depict the virtual ball as disappearing behind the closer physical object while remaining in front of the farther physical object.
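By way of illustration only, the following sketch shows one simple way such interactions could be approximated: the virtual ball's velocity is reflected about the normal of a detected wall plane, and a depth-map comparison decides whether the ball should be hidden behind a closer physical object. The plane representation, constants, and names are assumptions for the example:

```python
import numpy as np

def step_ball(position, velocity, dt, wall_point, wall_normal, restitution=0.8):
    """Advance the virtual ball and bounce it off a detected planar wall."""
    position = position + velocity * dt
    n = wall_normal / np.linalg.norm(wall_normal)
    penetration = np.dot(position - wall_point, n)
    if penetration < 0:                                    # crossed the wall plane
        position -= penetration * n                        # push back onto the wall
        velocity = velocity - 2 * np.dot(velocity, n) * n  # reflect off the surface
        velocity *= restitution                            # lose some energy, like a real ball
    return position, velocity

def ball_occluded(ball_depth_m, pixel, depth_map):
    """Hide the ball when a physical surface is closer to the camera at its pixel."""
    u, v = pixel
    return depth_map[v, u] < ball_depth_m

# Example: ball flying toward a wall whose plane passes through x = 3 m.
p, v = np.array([0.0, 1.0, 2.0]), np.array([4.0, 0.0, 0.0])
for _ in range(30):
    p, v = step_ball(p, v, 0.033, wall_point=np.array([3.0, 0.0, 0.0]),
                     wall_normal=np.array([-1.0, 0.0, 0.0]))
print(p, v, ball_occluded(2.5, (320, 240), np.full((480, 640), 3.0)))
```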
-
FIG. 9 illustrates an example of this AR overlay technique in the context of a virtual pet game provided by an electronic device 900. In this virtual pet game, the electronic device 900 provides an AR overlay that depicts a pseudo-autonomous virtual pet 902 in a local environment 904 (e.g., the pet "owner's" house). With knowledge of the geometry of the local environment 904 determined using one or more 3D mapping techniques, the AR overlay can be formatted to provide the appearance of the virtual pet 902 interacting with the local environment 904 and the user of the electronic device 900. The virtual pet 902 can be depicted as scampering across the floor, hiding behind objects, traveling in and out of rooms, and interacting with the user. To illustrate, the virtual pet 902 may be depicted as skittish, such that if the user gets "too close" to the virtual pet 902, the virtual pet 902 is depicted as heading for a closet to hide. As with the examples above, this virtual pet AR overlay can utilize the geometries of physical objects within the local environment 904 to render the physical behavior of the virtual pet 902 more realistic, such as by realistically depicting a trajectory of the virtual pet 902 so that it disappears behind physical objects that are closer to the electronic device. Moreover, external data may be utilized to modify or control the depicted behavior of the virtual pet 902. To illustrate, knowing a cat's fondness for napping in sunbeams, the electronic device 900 can determine its current longitude and latitude and the current time of day, and from this information determine the position of the sun in the sky by accessing the appropriate database from a cloud service. With the position of the sun so determined, and with knowledge of the placement and orientation of windows in the local environment, the electronic device can determine the angle and presence of a sunbeam anticipated to be streaming in through a window in the local environment 904 and depict a virtual cat (an example of the virtual pet 902) as moving to and then napping in the location of the sunbeam. - As with the other examples of AR overlays described above, the virtual pet AR overlay utilizes the current pose of the
electronic device 900 to inform the content displayed in the AR overlay. To illustrate, theelectronic device 900 may simulate the actions of thevirtual pet 902 such that it appears to hide behind aphysical object 906 in thelocal environment 904 after being “scared” by the user. Accordingly, when the user holds theelectronic device 900 in position A depicted inFIG. 9 , the AR overlay omits presentation of thevirtual pet 902 so as to give the appearance of thevirtual pet 902 hiding behind thephysical object 906. However, when the user moves theelectronic device 900 to position B depicted inFIG. 9 , the new pose of theelectronic device 900 permits viewing behind thephysical object 906, and thus the AR overlay depicts thevirtual pet 902 behind thephysical object 906. -
FIG. 10 illustrates an example of the AR overlay technique in the context of instructional content provided by anelectronic device 1000. In this approach, the AR overlay can provide instructional information intended to guide a user's interaction with a physical object in the local environment. To this end, theelectronic device 1000 captures and displays imagery of the physical object. Theelectronic device 1000 determines its position and orientation relative to the physical object (as currently depicted in the imagery), and from this formats an AR overlay so that the instructional information presented in the AR overlay is consistent with the geometry of the physical object as presented in the electronic device's view of the physical object. In the example ofFIG. 10 , a circuit has blown in the user's house, and the user thus needs to throw the appropriate circuit breaker to reset the circuit. The user is unaware of the location of thecircuit breaker panel 1002 for the house, and thus the user interacts with anelectronic device 1000 to bring up a CAD model of the house, which pinpoints the location of thecircuit breaker panel 1002. Further, an AR overlay can provide the “X-ray” feature described above so as to depict thecircuit breaker panel 1002 through one or more walls or other obstacles in the house. - With this information, the user travels to the
circuit breaker panel 1002 and positions the electronic device 1000 to face the circuit breaker panel 1002 having a plurality of circuit breakers 1004 for various circuits within the user's abode. To enable the user to identify which circuit breaker 1004 belongs to which circuit, the electronic device 1000 can scan a QR code on the circuit breaker panel 1002 to determine the model of the circuit breaker panel, and with the model so identified, access a database compiled by the builder of the home that identifies the pairings between circuit breaker and circuit for the given model. The electronic device 1000 then can provide an AR overlay 1006 that provides graphical icons identifying the circuit for each circuit breaker 1004 present in the displayed imagery of the circuit breaker panel 1002. Furthermore, the electronic device 1000 can access the wiring information from the CAD model of the house to determine the position of the wiring of various circuits, and then provide an AR overlay depicting where the wiring is located in relation to the walls, ceiling, floor, and other spaces depicted in the captured imagery displayed by the electronic device 1000.
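A hypothetical sketch of the breaker-labeling step is shown below; the database schema, model identifier, and on-screen slot positions are invented for illustration and are not part of the disclosure:

```python
# Map a scanned panel model to breaker labels and build the label list for the
# AR overlay 1006. The database content and keys here are illustrative only.
BREAKER_DB = {
    "PANEL-MODEL-X100": {1: "Kitchen", 2: "Living room", 3: "Garage", 4: "HVAC"},
}

def labels_for_panel(qr_payload):
    """Return {breaker_slot: circuit_name} for the scanned panel model."""
    model = qr_payload.strip()                 # e.g. decoded from the QR code
    return BREAKER_DB.get(model, {})

def build_overlay_icons(qr_payload, slot_pixel_positions):
    """Pair each breaker's circuit label with its on-screen position."""
    labels = labels_for_panel(qr_payload)
    return [{"slot": slot, "text": labels.get(slot, "unknown"), "pixel": px}
            for slot, px in slot_pixel_positions.items()]

icons = build_overlay_icons("PANEL-MODEL-X100",
                            {1: (120, 80), 2: (120, 130), 3: (120, 180)})
print(icons)
```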
- As another example of an instructional AR overlay, the user can use the imaging camera of the electronic device 1000 to capture video of the user throwing a baseball. The electronic device 1000 then may detect the geometry of the user as the user throws the baseball, using the depth sensor, and compare this to an idealized throwing model that is scaled to the user's particular physiology. The electronic device 1000 then may overlay this idealized throwing model on top of the playback of the video of the user throwing the baseball, thereby allowing the user to identify any deviations between the user's throwing mechanics and the ideal throwing mechanics. - As yet another example of an instructional AR overlay, a user may need to clear a paper jam in a photocopier. Thus, the user can focus the imaging camera of the
electronic device 1000 on the photocopier, in response to which the electronic device 1000 uses a QR code, bar code, or other visual indicia to identify the model of the photocopier. The electronic device 1000 then can access from a database the visual instructions for clearing a paper jam for that particular model, and then overlay graphical representations of these instructions over the captured imagery of the photocopier displayed at the electronic device 1000. To this end, the electronic device 1000 can determine its perspective view of the photocopier based on the current pose of the electronic device 1000 or based on depth information collected from the depth sensor, and then format the graphical representations so as to match the perspective view of the photocopier as displayed in the captured imagery. To illustrate, based on the identified perspective view, the AR overlay can locate a graphical icon identifying a toner release lever such that it is positioned over the location of the toner release lever in the displayed imagery of the photocopier. -
FIG. 11 illustrates an example of the AR overlay technique in the context of virtual pedometry provided by an electronic device 1100. In this approach, the electronic device 1100 can monitor and record a path of the user as the user traverses through a local environment 1102. The user's prior path thus may be reflected in an AR overlay 1104 of the electronic device 1100 whereby the user's prior path is represented in displayed imagery of the local environment 1102 using some form of a graphical representation, such as barefoot icons 1106 that are oriented in the direction of travel at each point. Accordingly, the user may identify where the user has and has not been within the local environment 1102 by viewing the local environment 1102 through the display 104 (FIG. 1) of the electronic device 1100 with the AR overlay 1104 present. Moreover, if there are multiple electronic devices 1100 cooperating in this regard, the paths of the users may be made available to each of the multiple electronic devices via a cloud service, and thus the AR overlay 1104 at one of the electronic devices 1100 may depict the prior paths or locations of some or all of the other users. This pedometry AR overlay further may incorporate sound effects to enhance the experience. To illustrate, the electronic device 1100 may use an accelerometer to detect the footfall pattern of the user, and as the user's foot strikes the ground (or is anticipated to strike the ground), the electronic device 1100 may play a footfall sound synchronized to this foot strike, and also may display the next footfall icon in the AR overlay 1104 to coincide with this foot strike. This functionality may be utilized as a game, such as an AR version of the light-cycle game from the Tron movie. The user also may use this functionality to identify areas of the local environment 1102 that the user has already visited, or has not yet visited. Moreover, the tracking of the path of the user can be used by the electronic device 1100 to measure non-linear distances, such as the length of a path walked by the user or a distance a user walks between two buildings. -
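To illustrate the two measurements involved, path length over the recorded positions and accelerometer-based footfall detection, the following sketch is provided; the threshold, sampling-rate handling, and names are assumptions for the example, not part of the disclosure:

```python
import numpy as np

def path_length(positions):
    """Non-linear distance covered along a sampled device path (N x 3 array)."""
    deltas = np.diff(np.asarray(positions), axis=0)
    return float(np.sum(np.linalg.norm(deltas, axis=1)))

def detect_footfalls(accel_magnitude, rate_hz, threshold=12.0, refractory_s=0.3):
    """Indices of samples where a foot strike is likely, from |accel| peaks."""
    min_gap = int(refractory_s * rate_hz)
    strikes, last = [], -min_gap
    for i, a in enumerate(accel_magnitude):
        if a > threshold and (i - last) >= min_gap:
            strikes.append(i)      # play the footfall sound / drop the next icon here
            last = i
    return strikes

walk = [[0, 0, 0], [0.7, 0.1, 0], [1.3, 0.6, 0], [1.3, 1.5, 0]]
accel = np.array([9.8, 9.9, 14.2, 10.0, 9.8, 13.7, 9.9])
print(path_length(walk))                     # meters walked along the recorded path
print(detect_footfalls(accel, rate_hz=50))   # sample indices of likely foot strikes
```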
FIG. 12 illustrates an example implementation of a location-based service that facilitates user-initiated mapping of previously unmapped spaces in accordance with at least one embodiment. Oftentimes, the 3D mapping database for a facility or location may be incomplete, in that one or more areas of the facility or location have not yet been mapped and incorporated into the 3D database of the facility or location. The entity managing the 3D mapping database typically addresses this lack of information by either accepting that the area will remain unmapped, or if it is sufficiently important to map the unmapped areas, commit the time and expense to send a mapping crew to the facility or location specifically to complete the mapping of the facility or location. However, rather than rely on a specialized mapping crew, the AR functionality and the position/orientation detection functionality of the electronic device can be leveraged to induce users of the electronic device to enter the unmapped areas and capture imagery, 3D mapping information, or both, via their electronic devices while in the unmapped areas. This captured information then may be supplied back to the entity for incorporation into its 3D mapping database. - In at least one embodiment, the user is induced to provide this service through the form of a game presented to the user via an AR overlay, in which the game displays some form of graphical information that induces the user to move into the unmapped area and manipulate the electronic device into different positions and orientations while in the unmapped area so that various sensor information of the unmapped area, such as imagery, depth information, ambient light, temperature, and the like, can be obtained for the unmapped area from different perspectives and locations.
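One way the service could decide where to lead the user is sketched below, assuming (purely for illustration) that coverage is tracked as a 2D grid of mapped and unmapped cells; the real mapping database could of course use a very different representation:

```python
import numpy as np

def nearest_unmapped_heading(coverage, device_cell, cell_size_m=1.0):
    """Given a 2D coverage grid (True = already mapped), return the direction
    (unit vector) and distance from the device's cell to the closest unmapped
    cell, i.e. where the virtual character should lead the user next."""
    unmapped = np.argwhere(~coverage)                 # (row, col) of unmapped cells
    if unmapped.size == 0:
        return None                                   # facility fully mapped
    deltas = (unmapped - np.asarray(device_cell)) * cell_size_m
    dists = np.linalg.norm(deltas, axis=1)
    best = np.argmin(dists)
    direction = deltas[best] / (dists[best] or 1.0)
    return direction, float(dists[best])

coverage = np.ones((6, 6), dtype=bool)
coverage[4:, 3:] = False                              # an unmapped corner of the facility
print(nearest_unmapped_heading(coverage, device_cell=(1, 1)))
```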
- To illustrate,
FIG. 12 illustrates an example whereby a cloud service maintains a map 1202 of a facility, with the map 1202 indicating which areas of the facility have been mapped and which have not yet been mapped (with unmapped areas indicated by the hash-lined shading in the map 1202). The cloud service detects a position 1201 of an electronic device 1200 as being in proximity to an unmapped area (e.g., through GPS coordinates provided by the electronic device 1200), and thus the cloud service invites the user of the electronic device 1200 to participate in a game. In this game, the user is challenged to track a virtual gecko 1204 that is depicted in an AR overlay for the imagery of the local environment captured and displayed by the electronic device 1200, whereby the virtual gecko 1204 is formatted so as to appear to travel over the walls, floors, and ceiling of the facility as viewed through the electronic device 1200. The cloud service determines the present orientation and position of the electronic device 1200 and determines the direction the user needs to travel to enter an unmapped area, and then controls the behavior of the virtual gecko 1204 so that the virtual gecko 1204 is depicted as travelling in the direction of the unmapped area and such that the user is induced to "follow" the virtual gecko 1204 toward the unmapped area. Once at the unmapped area, the virtual gecko 1204 may be controlled to travel to yet-unmapped locations, with the user and electronic device 1200 following, until the cloud service has obtained sufficient imagery and other sensor data from the electronic device 1200 to 3D map the unmapped area. - Any of a variety of techniques may be used to incentivize the user's participation in this game. For example, the challenge of appearing on a scoring leaderboard may sufficiently incentivize the user, with scoring handled by, for example, the percentage of time a user is able to maintain the
virtual gecko 1204 within the view of the electronic device. Alternatively, financial incentives may be employed, such as by proportionally reducing the user's data plan bill based on the score attained by the user or the number of minutes the user plays the game, by offering coupons or other payment, and the like. In this manner, the entity maintaining the 3D database for the facility can efficiently and inexpensively complete the 3D mapping of the facility through the cooperation of users of electronic devices. - In addition to providing AR-based services that rely on the precise position/orientation information maintained by the electronic device using visual odometry, SLAM, and other techniques, the electronic device can provide location-based services that utilize virtual reality techniques. For these techniques, the position/orientation of the electronic device within a local environment may be equated to a corresponding position/orientation within a virtual environment or a remote environment, and thus the view of this other environment displayed at the electronic device may be controlled by the position/orientation of the electronic device within the local environment. Thus, the manner in which the user travels through a local environment and handles the electronic device in the local environment is reflected in the views of the remote/virtual environment displayed by the
electronic device 100 to the viewer. - To illustrate,
FIG. 13 depicts an example VR-based service provided by anelectronic device 1300 whereby a user can “walk” through a virtual/remote environment 1302 (as viewed through the electronic device 1300) by manipulating the position and orientation of theelectronic device 1300 in a local environment 1304, whereby the imagery of the virtual/remote environment 1302 changes to reflect a changing viewpoint caused by changes in position/orientation of theelectronic device 1300 in the local environment 1304. In this example, the virtual environment 1302 includes abox 1305 upon a table 1306. The user may view thebox 1305 and table 1306 from various angles in the virtual environment 1302 by moving around the local environment 1304 in a corresponding manner. To illustrate, when theelectronic device 1300 is at position/orientation 1310 in the local environment 1304, theelectronic device 1300 determines the view of thebox 1305 and table 1306 that would be presented to the user in the corresponding position/orientation 1311 in the virtual/remote environment 1302, and displays this view of thebox 1305 and table 1306 in the manner depicted as “Viewpoint A” inFIG. 13 . When the user moves so that theelectronic device 1300 is at position/orientation 1312 in the local environment 1304, theelectronic device 1300 determines the view of thebox 1305 and table 1306 that would be presented to the user in the corresponding position/orientation 1313 in the virtual/remote environment 1302, and displays this view in the manner depicted as “Viewpoint B” inFIG. 13 . - Using this technique, a user can traverse through a virtual/remote environment by performing corresponding actions in the local environment. Thus, this technique can be used to, for example, permit a user to walk through a movie scene recorded in 3D from multiple viewpoints, a sporting event likewise recorded in 3D from multiple viewpoints, walk through a simulation of a building or other facility, and the like. As another example, a restaurant can be 3D mapped and the resulting 3D map stored with a cloud service, and a patron of the restaurant wishing to make a reservation can access the 3D mapping and “walk around” the restaurant using this technique to ascertain the view of the restaurant from each table of the restaurant. The patron can then select the table that best suits the patron's taste, and contact the restaurant to reserve that table.
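A minimal sketch of the pose correspondence is shown below, assuming the local and virtual environments are related by a fixed rotation and translation chosen at the start of the session; the alignment values and names are illustrative only and do not appear in the disclosure:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical fixed alignment between the local environment 1304 and the
# virtual/remote environment 1302: a rotation plus a translation selected when
# the session starts (a scale factor could be added in the same way).
ALIGN_ROT = R.from_euler("y", 90, degrees=True)      # assumed alignment rotation
ALIGN_OFFSET = np.array([10.0, 0.0, -4.0])           # assumed alignment offset (m)

def local_to_virtual(local_position, local_orientation):
    """Map the device pose tracked in the local environment to the pose used to
    render the view of the virtual/remote environment (e.g. 1310 -> 1311)."""
    virtual_position = ALIGN_ROT.apply(local_position) + ALIGN_OFFSET
    virtual_orientation = ALIGN_ROT * local_orientation
    return virtual_position, virtual_orientation

pos_v, orient_v = local_to_virtual(np.array([1.2, 1.5, 0.3]),
                                   R.from_quat([0, 0, 0, 1]))
print(pos_v, orient_v.as_quat())
```

Because the mapping is rigid, walking one meter in the local environment moves the viewpoint one meter in the virtual environment, which is what makes the "walk-through" behavior described above feel consistent.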
- Much of the inventive functionality and many of the inventive principles described above are well suited for implementation with or in software programs or instructions and integrated circuits (ICs) such as application specific ICs (ASICs). It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation. Therefore, in the interest of brevity and minimization of any risk of obscuring the principles and concepts according to the present disclosure, further discussion of such software and ICs, if any, will be limited to the essentials with respect to the principles and concepts within the preferred embodiments.
- In this document, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element. The term “another”, as used herein, is defined as at least a second or more. The terms “including” and/or “having”, as used herein, are defined as comprising. The term “coupled”, as used herein with reference to electro-optical technology, is defined as connected, although not necessarily directly, and not necessarily mechanically. The term “program”, as used herein, is defined as a sequence of instructions designed for execution on a computer system. A “program”, or “computer program”, may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
- The specification and drawings should be considered as examples only, and the scope of the disclosure is accordingly intended to be limited only by the following claims and equivalents thereof. Note that not all of the activities or elements described above in the general description are required, that a portion of a specific activity or device may not be required, and that one or more further activities may be performed, or elements included, in addition to those described. Still further, the order in which activities are listed is not necessarily the order in which they are performed. The steps of the flowcharts depicted above can be in any order unless specified otherwise, and steps may be eliminated, repeated, and/or added, depending on the implementation. Also, the concepts have been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure.
- Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any feature(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature of any or all the claims.
Claims (30)
1. A method comprising:
determining a position and orientation of a first electronic device relative to a second electronic device based on a three-dimensional mapping of an environment of the first and second electronic devices; and
controlling an interaction between the first and second electronic devices based on the position and orientation.
2. The method of claim 1 , wherein controlling an interaction between the first and second electronic devices comprises:
displaying a first augmented reality overlay at the first electronic device, the first augmented reality overlay displaying a first perspective of a virtual object in the environment based on a position and orientation of the first electronic device; and
displaying a second augmented reality overlay at the second electronic device, the second augmented reality overlay displaying a second perspective of the virtual object based on a position and orientation of the second electronic device.
3. The method of claim 2 , wherein controlling an interaction between the first and second electronic devices further comprises:
receiving, at the first electronic device, user input representing a manipulation of the virtual object; and
modifying a depiction of the virtual object in the second augmented reality overlay at the second electronic device in response to the manipulation of the virtual object at the first electronic device.
4. The method of claim 3 , wherein:
the manipulation of the virtual object comprises an insertion of the virtual object into the first augmented reality overlay at the first electronic device; and
modifying the depiction of the virtual object in the second augmented reality overlay comprises initiating display of the virtual object in the second augmented reality overlay.
5. The method of claim 3 , further comprising:
communicating first data representing the manipulation of the virtual object from the first electronic device to a remote processing system; and
communicating a representation of the first data from the remote processing system to the second electronic device;
wherein modifying the depiction of the virtual object in the second augmented reality overlay comprises modifying the depiction of the virtual object based on the representation of the first data.
6. The method of claim 1 , wherein controlling an interaction between the first and second electronic devices comprises:
receiving user input at the first electronic device, the user input representing initiation of motion of a virtual object displayed in a first augmented reality overlay at the first electronic device; and
in response to the user input:
identifying the second electronic device based on the motion of the virtual object and the position and orientation of the first electronic device relative to the second electronic device; and
controlling the first augmented reality overlay at the first electronic device and a second augmented reality overlay at the second electronic device to depict a transfer of the virtual object from the first electronic device to the second electronic device.
7. The method of claim 1 , wherein controlling an interaction between the first and second electronic devices comprises:
receiving user input at the first electronic device, the user input representing a manipulation of a virtual object displayed in an augmented reality overlay at the first electronic device; and
communicating data representative of the manipulation of the virtual object to a remote processing device for distribution to the second electronic device.
8. The method of claim 1 , wherein controlling an interaction between the first and second electronic devices comprises:
receiving, at the first electronic device, data representing a manipulation of a virtual object displayed in an augmented reality overlay at the second electronic device; and
displaying, at the first electronic device, a representation of the virtual object in an augmented reality overlay based on the manipulation of the virtual object and based on the position and orientation of the first electronic device relative to the second electronic device.
9. The method of claim 1 , wherein determining the position and orientation of the first electronic device relative to the second electronic device comprises:
determining at least one of an orientation and a position of the first electronic device in the environment based on one of: depth information from a depth sensor; and multiview analysis of imagery from a plurality of imaging sensors.
10. The method of claim 9 , wherein determining the position and orientation of the first electronic device relative to the second electronic device further comprises:
determining at least one of an orientation or a position of the second electronic device in the environment based on one of: depth information from a depth sensor; and multiview analysis of imagery from a plurality of imaging sensors.
11. A method comprising:
displaying, at a first time at a first electronic device, a first augmented reality overlay on imagery of a local environment captured by the first electronic device, the first augmented reality overlay including a depiction of a virtual object from a first perspective that is based on a position and orientation of the first electronic device using a three-dimensional mapping of the local environment; and
displaying, at a second time at a second electronic device, a second augmented reality overlay on imagery of the local environment captured by the second electronic device, the second augmented reality overlay including a depiction of the virtual object from a perspective based on a position and orientation of the second electronic device using the three-dimensional mapping.
12. The method of claim 11 , wherein the first time and the second time are concurrent.
13. The method of claim 12 , further comprising:
transmitting first data from the first electronic device to a remote processing system, the first data representing a user manipulation of the virtual object at the first electronic device; and
receiving second data from the remote processing system at the second electronic device, the second data representative of the user manipulation of the virtual object; and
wherein the second augmented reality overlay comprises a depiction of the user manipulation of the virtual object from the perspective based on the position and orientation of the second electronic device.
14. The method of claim 11 , wherein the second time is offset from the first time.
15. The method of claim 14 , further comprising:
transmitting first data from the first electronic device to a remote processing system at the first time, the first data representing a user manipulation of the virtual object at the first electronic device;
storing the first data at the remote processing system;
subsequently accessing, at the second electronic device, a representation of the first data from the remote processing system at the second time; and
wherein the second augmented reality overlay comprises a depiction of the user manipulation of the virtual object from the perspective based on the position and orientation of the second electronic device.
16. The method of claim 11 , further comprising:
separately determining the three-dimensional mapping of the local environment at each of the first electronic device and the second electronic device using visual telemetry.
17. The method of claim 11 , further comprising:
receiving, at the first electronic device, data representative of the three-dimensional mapping of the local environment; and
determining at least one of the position and the orientation of the first electronic device in the local environment based on the three-dimensional mapping and based on visual telemetry at the first electronic device.
18. The method of claim 17 , wherein the visual telemetry uses at least one of: depth sensing using a depth sensor; and multiview analysis using at least two imaging sensors.
19. An electronic device comprising:
a position/orientation tracking module to determine a position and orientation of the electronic device within a local environment based on a three-dimensional mapping of the local environment; and
a location-based services module coupled to the position/orientation tracking module, the location-based services module to provide an augmented reality overlay on imagery of the local environment captured by the electronic device, the augmented reality overlay including a depiction of a virtual object from a perspective that is based on the position and orientation of the electronic device.
20. The electronic device of claim 19 , wherein the position/orientation tracking module is to determine at least one of the position and the orientation of the electronic device based on visual telemetry information.
21. The electronic device of claim 20 , wherein the electronic device comprises:
a depth sensor coupled to the position/orientation tracking module, wherein the position/orientation tracking module is to determine the visual telemetry information using depth information from the depth sensor.
22. The electronic device of claim 20 , wherein the electronic device comprises:
a plurality of imaging sensors coupled to the position/orientation tracking module, wherein the position/orientation tracking module is to determine the visual telemetry information based on multiview analysis of image information from the plurality of imaging sensors.
23. The electronic device of claim 19 , further comprising:
an interface connected to a remote processing system; and
wherein the location-based services module is to receive data representing a manipulation of the virtual object by a user of another electronic device and to modify the depiction of the virtual object by the augmented reality overlay based on the manipulation of the virtual object.
24. The electronic device of claim 19 , further comprising:
a first interface to couple to a remote processing system;
an interface to receive input representative of a manipulation of the virtual object by a user of the electronic device; and
wherein the location-based services module is to provide to the first interface data representative of the manipulation of the virtual object for transmission to the remote processing system for distribution to at least one other electronic device.
25. A system comprising:
a first electronic device to display a first augmented reality overlay on imagery of a local environment captured by the first electronic device, the first augmented reality overlay including a depiction of a virtual object from a first perspective that is based on a position and orientation of the first electronic device using a three-dimensional mapping of the local environment; and
a second electronic device to display a second augmented reality overlay on imagery of the local environment captured by the second electronic device, the second augmented reality overlay including a depiction of the virtual object from a perspective based on a position and orientation of the second electronic device using the three-dimensional mapping.
26. The system of claim 25 , further comprising:
a remote processing system coupled to the first and second electronic devices, the remote processing system to communicate data representative of a manipulation of the virtual object by a user of the first electronic device from the first electronic device to the second electronic device; and
wherein the second electronic device is to modify the depiction of the virtual object in the second augmented reality overlay based on the data representative of the manipulation of the virtual object.
27. The system of claim 25 , wherein at least one of the first electronic device and the second electronic device is to determine at least one of a position and an orientation using visual telemetry.
28. The system of claim 27 , wherein the visual telemetry uses at least one of: depth sensing using a depth sensor; and multiview analysis using at least two imaging sensors.
29. The system of claim 25 , wherein the first and second augmented reality overlays are displayed concurrently.
30. The system of claim 25 , wherein the first and second augmented reality overlays are displayed at separate times.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/588,515 US20150193982A1 (en) | 2014-01-03 | 2015-01-02 | Augmented reality overlays using position and orientation to facilitate interactions between electronic devices |
US15/664,754 US10275945B2 (en) | 2014-01-03 | 2017-07-31 | Measuring dimension of object through visual odometry |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461923629P | 2014-01-03 | 2014-01-03 | |
US14/588,515 US20150193982A1 (en) | 2014-01-03 | 2015-01-02 | Augmented reality overlays using position and orientation to facilitate interactions between electronic devices |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/664,754 Continuation US10275945B2 (en) | 2014-01-03 | 2017-07-31 | Measuring dimension of object through visual odometry |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150193982A1 true US20150193982A1 (en) | 2015-07-09 |
Family
ID=53495613
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/588,515 Abandoned US20150193982A1 (en) | 2014-01-03 | 2015-01-02 | Augmented reality overlays using position and orientation to facilitate interactions between electronic devices |
US15/664,754 Active US10275945B2 (en) | 2014-01-03 | 2017-07-31 | Measuring dimension of object through visual odometry |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/664,754 Active US10275945B2 (en) | 2014-01-03 | 2017-07-31 | Measuring dimension of object through visual odometry |
Country Status (1)
Country | Link |
---|---|
US (2) | US20150193982A1 (en) |
Cited By (120)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150356786A1 (en) * | 2014-06-09 | 2015-12-10 | Huntington Ingalls, Inc. | System and Method for Augmented Reality Display of Electrical System Information |
US20160063764A1 (en) * | 2014-08-27 | 2016-03-03 | Ricoh Company, Ltd. | Image processing apparatus, image processing method, and computer program product |
US20160148434A1 (en) * | 2014-11-20 | 2016-05-26 | Thomson Licensing | Device and method for processing visual data, and related computer program product |
US20160188763A1 (en) * | 2014-12-30 | 2016-06-30 | Energybox Ltd. | Visualization of Electrical Loads |
US20160284128A1 (en) * | 2015-03-27 | 2016-09-29 | Rockwell Automation Technologies, Inc. | Systems and methods for presenting an augmented reality |
CN106078747A (en) * | 2016-08-11 | 2016-11-09 | 贵州翰凯斯智能技术有限公司 | A kind of time delay industrial operation control system based on virtual reality |
WO2017031033A1 (en) * | 2015-08-19 | 2017-02-23 | Honeywell International Inc. | Augmented reality-based wiring, commissioning and monitoring of controllers |
US20170301140A1 (en) * | 2016-04-18 | 2017-10-19 | Disney Enterprises, Inc. | System and method for linking and interacting between augmented reality and virtual reality environments |
US20170307747A1 (en) * | 2016-04-22 | 2017-10-26 | ZhongGuang PAN | Position acquistion method and apparatus |
US20170315629A1 (en) * | 2016-04-29 | 2017-11-02 | International Business Machines Corporation | Laser pointer emulation via a mobile device |
US20170318318A1 (en) * | 2014-11-12 | 2017-11-02 | Sony Corporation | Method and system for providing coupon |
US20170359570A1 (en) * | 2015-07-15 | 2017-12-14 | Fyusion, Inc. | Multi-View Interactive Digital Media Representation Lock Screen |
US20180012330A1 (en) * | 2015-07-15 | 2018-01-11 | Fyusion, Inc | Dynamic Multi-View Interactive Digital Media Representation Lock Screen |
US20180039076A1 (en) * | 2016-08-04 | 2018-02-08 | International Business Machines Corporation | Facilitation of communication using shared visual cue |
WO2018031621A1 (en) * | 2016-08-11 | 2018-02-15 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
US20180063205A1 (en) * | 2016-08-30 | 2018-03-01 | Augre Mixed Reality Technologies, Llc | Mixed reality collaboration |
EP3301544A1 (en) * | 2016-09-30 | 2018-04-04 | Alcatel Lucent | System and method for controlling an altered-reality application |
US9947138B2 (en) | 2014-04-15 | 2018-04-17 | Huntington Ingalls Incorporated | System and method for augmented reality display of dynamic environment information |
US20180114065A1 (en) * | 2016-10-26 | 2018-04-26 | Alibaba Group Holding Limited | User location determination based on augmented reality |
US9995815B2 (en) | 2014-12-30 | 2018-06-12 | Energybox Ltd. | Energy metering system and method for its calibration |
US9996551B2 (en) | 2013-03-15 | 2018-06-12 | Huntington Ingalls, Incorporated | System and method for determining and maintaining object location and status |
WO2018128526A1 (en) * | 2017-01-09 | 2018-07-12 | Samsung Electronics Co., Ltd. | System and method for augmented reality control |
WO2018136946A1 (en) * | 2017-01-23 | 2018-07-26 | Magic Leap, Inc. | Localization determination for mixed reality systems |
US10048753B1 (en) * | 2017-04-20 | 2018-08-14 | Robert C. Brooks | Perspective or gaze based visual identification and location system |
EP3358444A4 (en) * | 2015-09-30 | 2018-08-22 | Shenzhen Dlodlo Technologies Co., Ltd. | Method and device for determining position of virtual object in virtual space |
WO2018165154A1 (en) * | 2017-03-06 | 2018-09-13 | Snap Inc. | Virtual vision system |
US20180288295A1 (en) * | 2015-02-02 | 2018-10-04 | Apple Inc. | Focusing lighting module |
US10147210B1 (en) * | 2015-03-13 | 2018-12-04 | Amazon Technologies, Inc. | Data visualization system |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US10168857B2 (en) | 2016-10-26 | 2019-01-01 | International Business Machines Corporation | Virtual reality for cognitive messaging |
WO2019006026A1 (en) * | 2017-06-28 | 2019-01-03 | Commscope Technologies Llc | Systems and methods for managed connectivity wall outlets using low energy wireless communication |
CN109254666A (en) * | 2018-09-21 | 2019-01-22 | 上海曼恒数字技术股份有限公司 | Virtual reality device positioning synchronous method, apparatus, equipment and medium |
US10198863B2 (en) | 2017-02-23 | 2019-02-05 | OPTO Interactive, LLC | Method of managing proxy objects |
WO2019027515A1 (en) * | 2017-07-31 | 2019-02-07 | Google Llc | Virtual reality environment boundaries using depth sensors |
CN109587188A (en) * | 2017-09-28 | 2019-04-05 | 阿里巴巴集团控股有限公司 | Determine the method, apparatus and electronic equipment of relative positional relationship between terminal device |
US10271013B2 (en) * | 2015-09-08 | 2019-04-23 | Tencent Technology (Shenzhen) Company Limited | Display control method and apparatus |
US20190128676A1 (en) * | 2017-11-02 | 2019-05-02 | Sony Corporation | Augmented reality based electronic device to provide location tagging assistance in an indoor or outdoor area |
US10290149B2 (en) * | 2016-04-08 | 2019-05-14 | Maxx Media Group, LLC | System, method and software for interacting with virtual three dimensional images that appear to project forward of or above an electronic display |
US10319149B1 (en) * | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US10324584B2 (en) * | 2015-12-10 | 2019-06-18 | Whirlpool Corporation | Touch screen display having an external physical element for association with screen icons |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US10408624B2 (en) | 2017-04-18 | 2019-09-10 | Microsoft Technology Licensing, Llc | Providing familiarizing directional information |
US10430997B2 (en) | 2017-02-23 | 2019-10-01 | OPTO Interactive, LLC | Method of managing proxy objects |
CN110310175A (en) * | 2018-06-27 | 2019-10-08 | 北京京东尚科信息技术有限公司 | System and method for mobile augmented reality |
US10504294B2 (en) | 2014-06-09 | 2019-12-10 | Huntington Ingalls Incorporated | System and method for augmented reality discrepancy determination and reporting |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
US10511881B1 (en) * | 2018-05-31 | 2019-12-17 | Titan Health & Security Technologies, Inc. | Communication exchange system for remotely communicating instructions |
US20200005540A1 (en) * | 2018-06-29 | 2020-01-02 | The Travelers Indemnity Company | Systems, methods, and apparatus for managing augmented reality environments |
WO2020001039A1 (en) * | 2018-06-27 | 2020-01-02 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for multi-user augmented reality shopping |
US20200026806A1 (en) * | 2017-02-23 | 2020-01-23 | OPTO Interactive, LLC | Method of managing proxy objects |
US10571145B2 (en) * | 2015-04-07 | 2020-02-25 | Mitsubishi Electric Corporation | Maintenance support system for air conditioners |
US10636221B2 (en) * | 2016-12-21 | 2020-04-28 | Tencent Technology (Shenzhen) Company Limited | Interaction method between user terminals, terminal, server, system, and storage medium |
US20200177928A1 (en) * | 2018-11-30 | 2020-06-04 | Kt Corporation | Providing time slice video |
US10681183B2 (en) | 2014-05-28 | 2020-06-09 | Alexander Hertel | Platform for constructing and consuming realm and object featured clouds |
CN111295234A (en) * | 2017-09-08 | 2020-06-16 | 奈安蒂克公司 | Method and system for generating detailed data sets of an environment via game play |
US10733661B1 (en) * | 2015-05-22 | 2020-08-04 | Walgreen Co. | Automatic mapping of store layout using soft object recognition |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
US10748302B1 (en) | 2019-05-02 | 2020-08-18 | Apple Inc. | Multiple user simultaneous localization and mapping (SLAM) |
EP3696841A1 (en) * | 2019-02-14 | 2020-08-19 | ABB S.p.A. | Method for guiding installation of internal accessory devices in low voltage switches |
JP2020125951A (en) * | 2019-02-04 | 2020-08-20 | 旭化成ホームズ株式会社 | Position specifying system, portable terminal, dwelling house, method for utilizing position specifying system, and position specifying method and program |
WO2020190387A1 (en) * | 2019-03-19 | 2020-09-24 | Microsoft Technology Licensing, Llc | Relative spatial localization of mobile devices |
US10803663B2 (en) | 2017-08-02 | 2020-10-13 | Google Llc | Depth sensor aided estimation of virtual reality environment boundaries |
WO2020247399A1 (en) * | 2019-06-04 | 2020-12-10 | Metcalfarchaeological Consultants, Inc. | Spherical image based registration and self-localization for onsite and offsite viewing |
WO2020263838A1 (en) * | 2019-06-24 | 2020-12-30 | Magic Leap, Inc. | Virtual location selection for virtual content |
US10915754B2 (en) | 2014-06-09 | 2021-02-09 | Huntington Ingalls Incorporated | System and method for use of augmented reality in outfitting a dynamic structural space |
US10915781B2 (en) * | 2018-03-01 | 2021-02-09 | Htc Corporation | Scene reconstructing system, scene reconstructing method and non-transitory computer-readable medium |
US10940387B2 (en) * | 2019-03-15 | 2021-03-09 | Disney Enterprises, Inc. | Synchronized augmented reality gameplay across multiple gaming environments |
WO2021081068A1 (en) * | 2019-10-21 | 2021-04-29 | Wormhole Labs, Inc. | Multi-instance multi-user augmented reality environment |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
US11036048B2 (en) * | 2018-10-03 | 2021-06-15 | Project Whitecard Digital Inc. | Virtual reality system and method for displaying on a real-world display a viewable portion of a source file projected on an inverse spherical virtual screen |
WO2021129514A1 (en) * | 2019-12-24 | 2021-07-01 | Oppo广东移动通信有限公司 | Augmented reality processing method, apparatus and system, and storage medium, and electronic device |
US20210208551A1 (en) * | 2018-07-04 | 2021-07-08 | Carrier Corporation | Building management system and positioning method in building |
WO2021168338A1 (en) * | 2020-02-20 | 2021-08-26 | Magic Leap, Inc. | Cross reality system with wifi/gps based map merge |
US11120632B2 (en) * | 2018-10-16 | 2021-09-14 | Sony Interactive Entertainment Inc. | Image generating apparatus, image generating system, image generating method, and program |
US11164379B2 (en) * | 2017-06-08 | 2021-11-02 | Baidu Online Network Technology (Beijing) Co., Ltd. | Augmented reality positioning method and apparatus for location-based service LBS |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US20210377365A1 (en) * | 2020-05-26 | 2021-12-02 | Inter Ikea Systems B.V. | System, method, device and computer program product for connecting users to a persistent ar environment |
US20220076286A1 (en) * | 2020-09-04 | 2022-03-10 | International Business Machines Corporation | Context aware gamification in retail environments |
US20220157032A1 (en) * | 2020-02-24 | 2022-05-19 | SpotMap, Inc. | Multi-modality localization of users |
US11373340B2 (en) * | 2018-11-23 | 2022-06-28 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method thereof |
US20220261723A1 (en) * | 2021-02-13 | 2022-08-18 | Applied Software Technology, Inc. | Labor Tracking Beacon for Visualizing Project Status in Computer-aided Design |
US11442473B2 (en) * | 2014-10-31 | 2022-09-13 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with a visual marker |
US20220292633A1 (en) * | 2021-03-15 | 2022-09-15 | International Business Machines Corporation | Image stitching for high-resolution scans |
US11461820B2 (en) * | 2016-08-16 | 2022-10-04 | Adobe Inc. | Navigation and rewards involving physical goods and services |
US20230004900A1 (en) * | 2021-03-31 | 2023-01-05 | F3Systems Limited | System and method for 3 dimensional visualization and interaction with project management tickets |
US11602841B2 (en) * | 2016-11-28 | 2023-03-14 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
US20230087202A1 (en) * | 2021-09-17 | 2023-03-23 | Ford Global Technologies, Llc | Augmented Reality And Touch-Based User Engagement Parking Assist |
US11614621B2 (en) * | 2017-12-19 | 2023-03-28 | Datalogic IP Tech, S.r.l. | User-wearable systems and methods to collect data and provide information |
US11654552B2 (en) * | 2019-07-29 | 2023-05-23 | TruPhysics GmbH | Backup control based continuous training of robots |
US20230215104A1 (en) * | 2021-12-30 | 2023-07-06 | Snap Inc. | Ar position and orientation along a plane |
US11775134B2 (en) * | 2017-11-13 | 2023-10-03 | Snap Inc. | Interface to display animated icon |
US11830209B2 (en) | 2017-05-26 | 2023-11-28 | Snap Inc. | Neural network-based image stream modification |
US11887260B2 (en) | 2021-12-30 | 2024-01-30 | Snap Inc. | AR position indicator |
US20240062478A1 (en) * | 2022-08-15 | 2024-02-22 | Middle Chart, LLC | Spatial navigation to digital content |
US11922588B2 (en) | 2017-09-29 | 2024-03-05 | Apple Inc. | Cooperative augmented reality map interface |
US11954762B2 (en) | 2022-01-19 | 2024-04-09 | Snap Inc. | Object replacement system |
US11972529B2 (en) | 2019-02-01 | 2024-04-30 | Snap Inc. | Augmented reality system |
US20240144617A1 (en) * | 2020-05-28 | 2024-05-02 | Comcast Cable Communications, Llc | Methods and systems for anchoring objects in augmented or virtual reality |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
WO2024147194A1 (en) * | 2023-01-06 | 2024-07-11 | Maxell, Ltd. | Information processing device and information processing method |
US12086507B2 (en) | 2017-02-22 | 2024-09-10 | Middle Chart, LLC | Method and apparatus for construction and operation of connected infrastructure |
US20240323340A1 (en) * | 2023-03-20 | 2024-09-26 | Apple Inc. | Systems and methods for specifying configurations of an electronic device |
WO2024196938A1 (en) * | 2023-03-20 | 2024-09-26 | Adeia Guides Inc. | Systems and methods for enabling an enhanced extended reality experience |
US20240346785A1 (en) * | 2022-01-13 | 2024-10-17 | Naver Labs Corporation | Method and device for providing augmented content through augmented reality view on basis of preset unit space |
US20240361833A1 (en) * | 2023-04-25 | 2024-10-31 | Apple Inc. | System and method of representations of user interfaces of an electronic device |
US12154232B2 (en) | 2022-09-30 | 2024-11-26 | Snap Inc. | 9-DoF object tracking |
US12182325B2 (en) | 2023-04-25 | 2024-12-31 | Apple Inc. | System and method of representations of user interfaces of an electronic device |
US12182209B1 (en) * | 2019-07-31 | 2024-12-31 | Cisco Technology, Inc. | Techniques for placing content in and applying layers in an extended reality environment |
US12223234B2 (en) | 2017-02-22 | 2025-02-11 | Middle Chart, LLC | Apparatus for provision of digital content associated with a radio target area |
US12248737B2 (en) | 2017-02-22 | 2025-03-11 | Middle Chart, LLC | Agent supportable device indicating an item of interest in a wireless communication area |
US12299340B2 (en) | 2020-04-17 | 2025-05-13 | Apple Inc. | Multi-device continuity for use with extended reality systems |
US12307614B2 (en) | 2021-12-23 | 2025-05-20 | Apple Inc. | Methods for sharing content and interacting with physical devices in a three-dimensional environment |
US12314638B2 (en) | 2017-02-22 | 2025-05-27 | Middle Chart, LLC | Methods and apparatus for secure persistent location based digital content associated with a three-dimensional reference |
US12327277B2 (en) | 2021-04-12 | 2025-06-10 | Snap Inc. | Home based augmented reality shopping |
US12400048B2 (en) | 2020-01-28 | 2025-08-26 | Middle Chart, LLC | Methods and apparatus for two dimensional location based digital content |
US12406454B2 (en) | 2016-03-31 | 2025-09-02 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-dof controllers |
US12406451B2 (en) | 2019-09-27 | 2025-09-02 | Apple Inc. | Systems, methods, and graphical user interfaces for modeling, measuring, and drawing using augmented reality |
US12412205B2 (en) | 2021-12-30 | 2025-09-09 | Snap Inc. | Method, system, and medium for augmented reality product recommendations |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10107767B1 (en) * | 2017-06-14 | 2018-10-23 | The Boeing Company | Aircraft inspection system with visualization and recording |
US10951434B2 (en) * | 2018-03-23 | 2021-03-16 | Apple Inc. | Modular wall unit system |
DK201870351A1 (en) | 2018-05-07 | 2020-01-13 | Apple Inc. | Devices and Methods for Measuring Using Augmented Reality |
CN110599432B (en) * | 2018-06-12 | 2023-02-24 | Lite-On Electronics (Guangzhou) Co., Ltd. | Image processing system and image processing method |
US10593119B2 (en) | 2018-06-25 | 2020-03-17 | Intel Corporation | Projected augmented reality to obscure physical objects |
US11722985B2 (en) | 2018-08-09 | 2023-08-08 | Apple Inc. | Object tracking and authentication using modular wall units |
US10785413B2 (en) | 2018-09-29 | 2020-09-22 | Apple Inc. | Devices, methods, and graphical user interfaces for depth-based annotation |
CN110515461A (en) * | 2019-08-23 | 2019-11-29 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Interaction method, head mounted device, interaction system and storage medium |
WO2021058191A1 (en) | 2019-09-25 | 2021-04-01 | Osram Gmbh | Methods of illuminating an artwork |
US11480425B2 (en) * | 2019-10-22 | 2022-10-25 | Zebra Technologies Corporation | Method, system and apparatus for mobile dimensioning |
US11138771B2 (en) | 2020-02-03 | 2021-10-05 | Apple Inc. | Systems, methods, and graphical user interfaces for annotating, measuring, and modeling environments |
US12307066B2 (en) | 2020-03-16 | 2025-05-20 | Apple Inc. | Devices, methods, and graphical user interfaces for providing computer-generated experiences |
US11727650B2 (en) | 2020-03-17 | 2023-08-15 | Apple Inc. | Systems, methods, and graphical user interfaces for displaying and manipulating virtual objects in augmented reality environments |
US11615595B2 (en) | 2020-09-24 | 2023-03-28 | Apple Inc. | Systems, methods, and graphical user interfaces for sharing augmented reality environments |
US12003806B2 (en) * | 2021-03-11 | 2024-06-04 | Quintar, Inc. | Augmented reality system for viewing an event with multiple coordinate systems and automatically generated model |
US11941764B2 (en) | 2021-04-18 | 2024-03-26 | Apple Inc. | Systems, methods, and graphical user interfaces for adding effects in augmented reality environments |
US12361602B2 (en) | 2023-09-28 | 2025-07-15 | International Business Machines Corporation | Augmented reality overlay of feature and versioning improvement |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090244097A1 (en) * | 2008-03-25 | 2009-10-01 | Leonardo William Estevez | System and Method for Providing Augmented Reality |
US20130307874A1 (en) * | 2011-04-13 | 2013-11-21 | Longsand Limited | Methods and systems for generating and joining shared experience |
US20140240469A1 (en) * | 2013-02-28 | 2014-08-28 | Motorola Mobility Llc | Electronic Device with Multiview Image Capture and Depth Sensing |
US9170318B1 (en) * | 2010-09-29 | 2015-10-27 | Amazon Technologies, Inc. | Inter-device location determinations |
US20150356788A1 (en) * | 2013-02-01 | 2015-12-10 | Sony Corporation | Information processing device, client device, information processing method, and program |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5477622A (en) * | 1994-08-30 | 1995-12-26 | Skalnik; Dennis A. | Electronic hand-held measuring device for obtaining the dimensional weight of a shipment of freight |
US6519550B1 (en) * | 2000-09-11 | 2003-02-11 | Intel Corporation ( A Delaware Corporation) | Object scanner |
US20140023996A1 (en) * | 2012-07-18 | 2014-01-23 | F3 & Associates, Inc. | Three Dimensional Model Objects |
US9129404B1 (en) * | 2012-09-13 | 2015-09-08 | Amazon Technologies, Inc. | Measuring physical objects and presenting virtual articles |
US9939259B2 (en) * | 2012-10-04 | 2018-04-10 | Hand Held Products, Inc. | Measuring object dimensions using mobile computer |
- 2015-01-02: US application US14/588,515, published as US20150193982A1 (status: abandoned)
- 2017-07-31: US application US15/664,754, published as US10275945B2 (status: active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090244097A1 (en) * | 2008-03-25 | 2009-10-01 | Leonardo William Estevez | System and Method for Providing Augmented Reality |
US9170318B1 (en) * | 2010-09-29 | 2015-10-27 | Amazon Technologies, Inc. | Inter-device location determinations |
US20130307874A1 (en) * | 2011-04-13 | 2013-11-21 | Longsand Limited | Methods and systems for generating and joining shared experience |
US20150356788A1 (en) * | 2013-02-01 | 2015-12-10 | Sony Corporation | Information processing device, client device, information processing method, and program |
US20140240469A1 (en) * | 2013-02-28 | 2014-08-28 | Motorola Mobility Llc | Electronic Device with Multiview Image Capture and Depth Sensing |
Cited By (226)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9996551B2 (en) | 2013-03-15 | 2018-06-12 | Huntington Ingalls, Incorporated | System and method for determining and maintaining object location and status |
US9947138B2 (en) | 2014-04-15 | 2018-04-17 | Huntington Ingalls Incorporated | System and method for augmented reality display of dynamic environment information |
US11729245B2 (en) | 2014-05-28 | 2023-08-15 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US11368557B2 (en) | 2014-05-28 | 2022-06-21 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US12101371B2 (en) | 2014-05-28 | 2024-09-24 | Alexander Hertel | Platform for constructing and consuming realm and object feature clouds |
US10681183B2 (en) | 2014-05-28 | 2020-06-09 | Alexander Hertel | Platform for constructing and consuming realm and object featured clouds |
US10504294B2 (en) | 2014-06-09 | 2019-12-10 | Huntington Ingalls Incorporated | System and method for augmented reality discrepancy determination and reporting |
US10147234B2 (en) * | 2014-06-09 | 2018-12-04 | Huntington Ingalls Incorporated | System and method for augmented reality display of electrical system information |
US10915754B2 (en) | 2014-06-09 | 2021-02-09 | Huntington Ingalls Incorporated | System and method for use of augmented reality in outfitting a dynamic structural space |
US20150356786A1 (en) * | 2014-06-09 | 2015-12-10 | Huntington Ingalls, Inc. | System and Method for Augmented Reality Display of Electrical System Information |
US20160063764A1 (en) * | 2014-08-27 | 2016-03-03 | Ricoh Company, Ltd. | Image processing apparatus, image processing method, and computer program product |
US11442473B2 (en) * | 2014-10-31 | 2022-09-13 | SZ DJI Technology Co., Ltd. | Systems and methods for surveillance with a visual marker |
US10798428B2 (en) * | 2014-11-12 | 2020-10-06 | Sony Corporation | Method and system for providing coupon |
US20170318318A1 (en) * | 2014-11-12 | 2017-11-02 | Sony Corporation | Method and system for providing coupon |
US20160148434A1 (en) * | 2014-11-20 | 2016-05-26 | Thomson Licensing | Device and method for processing visual data, and related computer program product |
US9995815B2 (en) | 2014-12-30 | 2018-06-12 | Energybox Ltd. | Energy metering system and method for its calibration |
US20160188763A1 (en) * | 2014-12-30 | 2016-06-30 | Energybox Ltd. | Visualization of Electrical Loads |
US10467354B2 (en) * | 2014-12-30 | 2019-11-05 | Energybox Ltd. | Visualization of electrical loads |
US12056182B2 (en) | 2015-01-09 | 2024-08-06 | Snap Inc. | Object recognition based image overlays |
US10380720B1 (en) * | 2015-01-09 | 2019-08-13 | Snap Inc. | Location-based image filters |
US11301960B2 (en) | 2015-01-09 | 2022-04-12 | Snap Inc. | Object recognition based image filters |
US11734342B2 (en) | 2015-01-09 | 2023-08-22 | Snap Inc. | Object recognition based image overlays |
US10157449B1 (en) | 2015-01-09 | 2018-12-18 | Snap Inc. | Geo-location-based image filters |
US11588961B2 (en) | 2015-02-02 | 2023-02-21 | Apple Inc. | Focusing lighting module |
US11122193B2 (en) * | 2015-02-02 | 2021-09-14 | Apple Inc. | Focusing lighting module |
US20180288295A1 (en) * | 2015-02-02 | 2018-10-04 | Apple Inc. | Focusing lighting module |
US10147210B1 (en) * | 2015-03-13 | 2018-12-04 | Amazon Technologies, Inc. | Data visualization system |
US11263795B1 (en) | 2015-03-13 | 2022-03-01 | Amazon Technologies, Inc. | Visualization system for sensor data and facility data |
US20160284128A1 (en) * | 2015-03-27 | 2016-09-29 | Rockwell Automation Technologies, Inc. | Systems and methods for presenting an augmented reality |
US10950051B2 (en) * | 2015-03-27 | 2021-03-16 | Rockwell Automation Technologies, Inc. | Systems and methods for presenting an augmented reality |
US10571145B2 (en) * | 2015-04-07 | 2020-02-25 | Mitsubishi Electric Corporation | Maintenance support system for air conditioners |
US10733661B1 (en) * | 2015-05-22 | 2020-08-04 | Walgreen Co. | Automatic mapping of store layout using soft object recognition |
US10991036B1 (en) | 2015-05-22 | 2021-04-27 | Walgreen Co. | Automatic mapping of store layout using soft object recognition |
US10748313B2 (en) * | 2015-07-15 | 2020-08-18 | Fyusion, Inc. | Dynamic multi-view interactive digital media representation lock screen |
US10750161B2 (en) * | 2015-07-15 | 2020-08-18 | Fyusion, Inc. | Multi-view interactive digital media representation lock screen |
US20170359570A1 (en) * | 2015-07-15 | 2017-12-14 | Fyusion, Inc. | Multi-View Interactive Digital Media Representation Lock Screen |
US20180012330A1 (en) * | 2015-07-15 | 2018-01-11 | Fyusion, Inc | Dynamic Multi-View Interactive Digital Media Representation Lock Screen |
WO2017031033A1 (en) * | 2015-08-19 | 2017-02-23 | Honeywell International Inc. | Augmented reality-based wiring, commissioning and monitoring of controllers |
US11064009B2 (en) * | 2015-08-19 | 2021-07-13 | Honeywell International Inc. | Augmented reality-based wiring, commissioning and monitoring of controllers |
US20170053441A1 (en) * | 2015-08-19 | 2017-02-23 | Honeywell International Inc. | Augmented reality-based wiring, commissioning and monitoring of controllers |
US10271013B2 (en) * | 2015-09-08 | 2019-04-23 | Tencent Technology (Shenzhen) Company Limited | Display control method and apparatus |
US10957065B2 (en) | 2015-09-30 | 2021-03-23 | Shenzhen Dlodlo Technologies Co., Ltd. | Method and device for determining position of virtual object in virtual space |
EP3358444A4 (en) * | 2015-09-30 | 2018-08-22 | Shenzhen Dlodlo Technologies Co., Ltd. | Method and device for determining position of virtual object in virtual space |
US10366543B1 (en) | 2015-10-30 | 2019-07-30 | Snap Inc. | Image based tracking in augmented reality systems |
US11769307B2 (en) | 2015-10-30 | 2023-09-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10733802B2 (en) | 2015-10-30 | 2020-08-04 | Snap Inc. | Image based tracking in augmented reality systems |
US11315331B2 (en) | 2015-10-30 | 2022-04-26 | Snap Inc. | Image based tracking in augmented reality systems |
US10997783B2 (en) | 2015-11-30 | 2021-05-04 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US12079931B2 (en) | 2015-11-30 | 2024-09-03 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US11380051B2 (en) | 2015-11-30 | 2022-07-05 | Snap Inc. | Image and point cloud based tracking and in augmented reality systems |
US10324584B2 (en) * | 2015-12-10 | 2019-06-18 | Whirlpool Corporation | Touch screen display having an external physical element for association with screen icons |
US12406454B2 (en) | 2016-03-31 | 2025-09-02 | Magic Leap, Inc. | Interactions with 3D virtual objects using poses and multiple-dof controllers |
US10290149B2 (en) * | 2016-04-08 | 2019-05-14 | Maxx Media Group, LLC | System, method and software for interacting with virtual three dimensional images that appear to project forward of or above an electronic display |
US20170301140A1 (en) * | 2016-04-18 | 2017-10-19 | Disney Enterprises, Inc. | System and method for linking and interacting between augmented reality and virtual reality environments |
US10380800B2 (en) * | 2016-04-18 | 2019-08-13 | Disney Enterprises, Inc. | System and method for linking and interacting between augmented reality and virtual reality environments |
US20170307747A1 (en) * | 2016-04-22 | 2017-10-26 | ZhongGuang PAN | Position acquisition method and apparatus |
US10466348B2 (en) * | 2016-04-22 | 2019-11-05 | Shang Hai Pan Shi Tou Zi Guan Li You Xian Gong Si | Position acquisition method and apparatus |
US10216289B2 (en) * | 2016-04-29 | 2019-02-26 | International Business Machines Corporation | Laser pointer emulation via a mobile device |
US20170315629A1 (en) * | 2016-04-29 | 2017-11-02 | International Business Machines Corporation | Laser pointer emulation via a mobile device |
US10466474B2 (en) * | 2016-08-04 | 2019-11-05 | International Business Machines Corporation | Facilitation of communication using shared visual cue |
US20180039076A1 (en) * | 2016-08-04 | 2018-02-08 | International Business Machines Corporation | Facilitation of communication using shared visual cue |
US11808944B2 (en) | 2016-08-11 | 2023-11-07 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
US10921599B2 (en) | 2016-08-11 | 2021-02-16 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
US11287659B2 (en) | 2016-08-11 | 2022-03-29 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
CN106078747A (en) * | 2016-08-11 | 2016-11-09 | Guizhou Hankaisi Intelligent Technology Co., Ltd. | Time-delay industrial operation control system based on virtual reality |
US10627625B2 (en) | 2016-08-11 | 2020-04-21 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
WO2018031621A1 (en) * | 2016-08-11 | 2018-02-15 | Magic Leap, Inc. | Automatic placement of a virtual object in a three-dimensional space |
US12354149B2 (en) | 2016-08-16 | 2025-07-08 | Adobe Inc. | Navigation and rewards involving physical goods and services |
US11461820B2 (en) * | 2016-08-16 | 2022-10-04 | Adobe Inc. | Navigation and rewards involving physical goods and services |
US20180063205A1 (en) * | 2016-08-30 | 2018-03-01 | Augre Mixed Reality Technologies, Llc | Mixed reality collaboration |
EP3301544A1 (en) * | 2016-09-30 | 2018-04-04 | Alcatel Lucent | System and method for controlling an altered-reality application |
US20180114065A1 (en) * | 2016-10-26 | 2018-04-26 | Alibaba Group Holding Limited | User location determination based on augmented reality |
US10168857B2 (en) | 2016-10-26 | 2019-01-01 | International Business Machines Corporation | Virtual reality for cognitive messaging |
US10235569B2 (en) * | 2016-10-26 | 2019-03-19 | Alibaba Group Holding Limited | User location determination based on augmented reality |
US10552681B2 (en) | 2016-10-26 | 2020-02-04 | Alibaba Group Holding Limited | User location determination based on augmented reality |
TWI675351B (en) * | 2016-10-26 | 2019-10-21 | Alibaba Group Services Limited | User location determination method and device based on augmented reality |
US11602841B2 (en) * | 2016-11-28 | 2023-03-14 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
US10636221B2 (en) * | 2016-12-21 | 2020-04-28 | Tencent Technology (Shenzhen) Company Limited | Interaction method between user terminals, terminal, server, system, and storage medium |
US10499997B2 (en) | 2017-01-03 | 2019-12-10 | Mako Surgical Corp. | Systems and methods for surgical navigation |
US12383347B2 (en) | 2017-01-03 | 2025-08-12 | Mako Surgical Corp. | Systems and methods for surgical navigation |
US11707330B2 (en) | 2017-01-03 | 2023-07-25 | Mako Surgical Corp. | Systems and methods for surgical navigation |
WO2018128526A1 (en) * | 2017-01-09 | 2018-07-12 | Samsung Electronics Co., Ltd. | System and method for augmented reality control |
US10410422B2 (en) | 2017-01-09 | 2019-09-10 | Samsung Electronics Co., Ltd. | System and method for augmented reality control |
WO2018136946A1 (en) * | 2017-01-23 | 2018-07-26 | Magic Leap, Inc. | Localization determination for mixed reality systems |
US11861795B1 (en) | 2017-02-17 | 2024-01-02 | Snap Inc. | Augmented reality anamorphosis system |
US12340475B2 (en) | 2017-02-17 | 2025-06-24 | Snap Inc. | Augmented reality anamorphosis system |
US10319149B1 (en) * | 2017-02-17 | 2019-06-11 | Snap Inc. | Augmented reality anamorphosis system |
US11189299B1 (en) | 2017-02-20 | 2021-11-30 | Snap Inc. | Augmented reality speech balloon system |
US12197884B2 (en) | 2017-02-20 | 2025-01-14 | Snap Inc. | Augmented reality speech balloon system |
US11748579B2 (en) | 2017-02-20 | 2023-09-05 | Snap Inc. | Augmented reality speech balloon system |
US12248737B2 (en) | 2017-02-22 | 2025-03-11 | Middle Chart, LLC | Agent supportable device indicating an item of interest in a wireless communication area |
US12086507B2 (en) | 2017-02-22 | 2024-09-10 | Middle Chart, LLC | Method and apparatus for construction and operation of connected infrastructure |
US12314638B2 (en) | 2017-02-22 | 2025-05-27 | Middle Chart, LLC | Methods and apparatus for secure persistent location based digital content associated with a three-dimensional reference |
US12223234B2 (en) | 2017-02-22 | 2025-02-11 | Middle Chart, LLC | Apparatus for provision of digital content associated with a radio target area |
US11314903B2 (en) | 2017-02-23 | 2022-04-26 | Mitek Holdings, Inc. | Method of managing proxy objects |
US10878138B2 (en) * | 2017-02-23 | 2020-12-29 | Mitek Holdings, Inc. | Method of managing proxy objects |
US11687684B2 (en) | 2017-02-23 | 2023-06-27 | Mitek Holdings, Inc. | Method of managing proxy objects |
US12079545B2 (en) | 2017-02-23 | 2024-09-03 | Mitek Holdings, Inc. | Method of managing proxy objects |
US10635841B2 (en) * | 2017-02-23 | 2020-04-28 | OPTO Interactive, LLC | Method of managing proxy objects |
US20200210630A1 (en) * | 2017-02-23 | 2020-07-02 | OPTO Interactive, LLC | Method of managing proxy objects |
US10430997B2 (en) | 2017-02-23 | 2019-10-01 | OPTO Interactive, LLC | Method of managing proxy objects |
US20200026806A1 (en) * | 2017-02-23 | 2020-01-23 | OPTO Interactive, LLC | Method of managing proxy objects |
US10198863B2 (en) | 2017-02-23 | 2019-02-05 | OPTO Interactive, LLC | Method of managing proxy objects |
US11961196B2 (en) | 2017-03-06 | 2024-04-16 | Snap Inc. | Virtual vision system |
US11037372B2 (en) | 2017-03-06 | 2021-06-15 | Snap Inc. | Virtual vision system |
CN110383344A (en) * | 2017-03-06 | 2019-10-25 | Snap Inc. | Virtual vision system |
EP4443848A3 (en) * | 2017-03-06 | 2025-01-01 | Snap Inc. | Virtual vision system |
US11670057B2 (en) | 2017-03-06 | 2023-06-06 | Snap Inc. | Virtual vision system |
WO2018165154A1 (en) * | 2017-03-06 | 2018-09-13 | Snap Inc. | Virtual vision system |
US12333666B2 (en) | 2017-03-06 | 2025-06-17 | Snap Inc. | Virtual vision system |
US10565795B2 (en) | 2017-03-06 | 2020-02-18 | Snap Inc. | Virtual vision system |
US10408624B2 (en) | 2017-04-18 | 2019-09-10 | Microsoft Technology Licensing, Llc | Providing familiarizing directional information |
US11294456B2 (en) * | 2017-04-20 | 2022-04-05 | Robert C. Brooks | Perspective or gaze based visual identification and location system |
US11195018B1 (en) | 2017-04-20 | 2021-12-07 | Snap Inc. | Augmented reality typography personalization system |
US10048753B1 (en) * | 2017-04-20 | 2018-08-14 | Robert C. Brooks | Perspective or gaze based visual identification and location system |
US12033253B2 (en) | 2017-04-20 | 2024-07-09 | Snap Inc. | Augmented reality typography personalization system |
US12394127B2 (en) | 2017-04-20 | 2025-08-19 | Snap Inc. | Augmented reality typography personalization system |
US10387730B1 (en) | 2017-04-20 | 2019-08-20 | Snap Inc. | Augmented reality typography personalization system |
US11830209B2 (en) | 2017-05-26 | 2023-11-28 | Snap Inc. | Neural network-based image stream modification |
US11164379B2 (en) * | 2017-06-08 | 2021-11-02 | Baidu Online Network Technology (Beijing) Co., Ltd. | Augmented reality positioning method and apparatus for location-based service LBS |
CN110800260A (en) * | 2017-06-28 | 2020-02-14 | CommScope Technologies LLC | System and method for managed connectivity wall outlet with low energy wireless communication |
US20220272155A1 (en) * | 2017-06-28 | 2022-08-25 | Commscope Technologies Llc | Systems and methods for managed connectivity wall outlets using low energy wireless communication |
US11388240B2 (en) * | 2017-06-28 | 2022-07-12 | Commscope Technologies Llc | Systems and methods for managed connectivity wall outlets using low energy wireless communication |
US11641402B2 (en) * | 2017-06-28 | 2023-05-02 | Commscope Technologies Llc | Systems and methods for managed connectivity wall outlets using low energy wireless communication |
WO2019006026A1 (en) * | 2017-06-28 | 2019-01-03 | Commscope Technologies Llc | Systems and methods for managed connectivity wall outlets using low energy wireless communication |
WO2019027515A1 (en) * | 2017-07-31 | 2019-02-07 | Google Llc | Virtual reality environment boundaries using depth sensors |
US10803663B2 (en) | 2017-08-02 | 2020-10-13 | Google Llc | Depth sensor aided estimation of virtual reality environment boundaries |
KR102357265B1 (en) | 2017-09-08 | 2022-02-08 | 나이앤틱, 인크. | Method and systems for generating detailed datasets of an environment via gameplay |
US11110343B2 (en) | 2017-09-08 | 2021-09-07 | Niantic, Inc. | Methods and systems for generating detailed datasets of an environment via gameplay |
CN111295234A (en) * | 2017-09-08 | 2020-06-16 | 奈安蒂克公司 | Method and system for generating detailed data sets of an environment via game play |
EP3678750A4 (en) * | 2017-09-08 | 2020-12-02 | Niantic, Inc. | METHODS AND SYSTEMS FOR GENERATING DETAILED DATA RECORDS OF AN ENVIRONMENT VIA GAMEPLAY |
KR20200117035A (en) * | 2017-09-08 | 2020-10-13 | 나이앤틱, 인크. | Method and systems for generating detailed datasets of an environment via gameplay |
US12266062B2 (en) | 2017-09-15 | 2025-04-01 | Snap Inc. | Augmented reality system |
US11335067B2 (en) | 2017-09-15 | 2022-05-17 | Snap Inc. | Augmented reality system |
US11721080B2 (en) | 2017-09-15 | 2023-08-08 | Snap Inc. | Augmented reality system |
US10740974B1 (en) | 2017-09-15 | 2020-08-11 | Snap Inc. | Augmented reality system |
CN109587188A (en) * | 2017-09-28 | 2019-04-05 | Alibaba Group Holding Limited | Method, apparatus and electronic device for determining the relative positional relationship between terminal devices |
US11922588B2 (en) | 2017-09-29 | 2024-03-05 | Apple Inc. | Cooperative augmented reality map interface |
US10921127B2 (en) * | 2017-11-02 | 2021-02-16 | Sony Corporation | Augmented reality based electronic device to provide location tagging assistance in an indoor or outdoor area |
US20190128676A1 (en) * | 2017-11-02 | 2019-05-02 | Sony Corporation | Augmented reality based electronic device to provide location tagging assistance in an indoor or outdoor area |
US11775134B2 (en) * | 2017-11-13 | 2023-10-03 | Snap Inc. | Interface to display animated icon |
US12260065B2 (en) | 2017-11-13 | 2025-03-25 | Snap Inc. | Interface to display animated icon |
US11614621B2 (en) * | 2017-12-19 | 2023-03-28 | Datalogic IP Tech, S.r.l. | User-wearable systems and methods to collect data and provide information |
US10915781B2 (en) * | 2018-03-01 | 2021-02-09 | Htc Corporation | Scene reconstructing system, scene reconstructing method and non-transitory computer-readable medium |
US10511881B1 (en) * | 2018-05-31 | 2019-12-17 | Titan Health & Security Technologies, Inc. | Communication exchange system for remotely communicating instructions |
CN110310175A (en) * | 2018-06-27 | 2019-10-08 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for mobile augmented reality |
EP3815051A4 (en) * | 2018-06-27 | 2022-03-23 | Beijing Jingdong Shangke Information Technology Co., Ltd. | SYSTEM AND PROCEDURES FOR MULTI-USER SHOPPING WITH AUGMENTED REALITY |
WO2020001039A1 (en) * | 2018-06-27 | 2020-01-02 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for multi-user augmented reality shopping |
CN112154486A (en) * | 2018-06-27 | 2020-12-29 | Beijing Jingdong Shangke Information Technology Co., Ltd. | System and method for multi-user augmented reality shopping |
US10937241B2 (en) * | 2018-06-29 | 2021-03-02 | The Travelers Indemnity Company | Systems, methods, and apparatus for identifying an augmented reality service |
US20200005540A1 (en) * | 2018-06-29 | 2020-01-02 | The Travelers Indemnity Company | Systems, methods, and apparatus for managing augmented reality environments |
US20210208551A1 (en) * | 2018-07-04 | 2021-07-08 | Carrier Corporation | Building management system and positioning method in building |
US11450050B2 (en) | 2018-08-31 | 2022-09-20 | Snap Inc. | Augmented reality anthropomorphization system |
US11676319B2 (en) | 2018-08-31 | 2023-06-13 | Snap Inc. | Augmented reality anthropomorphization system |
US10997760B2 (en) | 2018-08-31 | 2021-05-04 | Snap Inc. | Augmented reality anthropomorphization system |
CN109254666A (en) * | 2018-09-21 | 2019-01-22 | Shanghai ManHeng Digital Technology Co., Ltd. | Virtual reality device positioning synchronization method, apparatus, equipment and medium |
US11036048B2 (en) * | 2018-10-03 | 2021-06-15 | Project Whitecard Digital Inc. | Virtual reality system and method for displaying on a real-world display a viewable portion of a source file projected on an inverse spherical virtual screen |
US11120632B2 (en) * | 2018-10-16 | 2021-09-14 | Sony Interactive Entertainment Inc. | Image generating apparatus, image generating system, image generating method, and program |
US11373340B2 (en) * | 2018-11-23 | 2022-06-28 | Samsung Electronics Co., Ltd. | Display apparatus and controlling method thereof |
US20200177928A1 (en) * | 2018-11-30 | 2020-06-04 | Kt Corporation | Providing time slice video |
US11228790B2 (en) * | 2018-11-30 | 2022-01-18 | Kt Corporation | Providing time slice video |
US11972529B2 (en) | 2019-02-01 | 2024-04-30 | Snap Inc. | Augmented reality system |
JP2020125951A (en) * | 2019-02-04 | 2020-08-20 | 旭化成ホームズ株式会社 | Position specifying system, portable terminal, dwelling house, method for utilizing position specifying system, and position specifying method and program |
JP7289664B2 (en) | 2019-02-04 | 2023-06-12 | 旭化成ホームズ株式会社 | Location identification system, mobile terminal, usage of location identification system, location identification method and program |
JP7518948B2 (en) | 2019-02-04 | 2024-07-18 | 旭化成ホームズ株式会社 | Location identification system, mobile terminal, method for using location identification system, location identification method, and program |
US11727822B2 (en) | 2019-02-14 | 2023-08-15 | Abb S.P.A. | Method for guiding installation of internal accessory devices in low voltage switches |
EP3696841A1 (en) * | 2019-02-14 | 2020-08-19 | ABB S.p.A. | Method for guiding installation of internal accessory devices in low voltage switches |
US10940387B2 (en) * | 2019-03-15 | 2021-03-09 | Disney Enterprises, Inc. | Synchronized augmented reality gameplay across multiple gaming environments |
WO2020190387A1 (en) * | 2019-03-19 | 2020-09-24 | Microsoft Technology Licensing, Llc | Relative spatial localization of mobile devices |
AU2020241282B2 (en) * | 2019-03-19 | 2025-06-12 | Microsoft Technology Licensing, Llc | Relative spatial localization of mobile devices |
KR20210141510A (en) * | 2019-03-19 | 2021-11-23 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Identify the relative spatial location of mobile devices |
KR102795487B1 (en) | 2019-03-19 | 2025-04-11 | 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 | Determining the relative spatial position of mobile devices |
US11190904B2 (en) | 2019-03-19 | 2021-11-30 | Microsoft Technology Licensing, Llc | Relative spatial localization of mobile devices |
CN111880644A (en) * | 2019-05-02 | 2020-11-03 | Apple Inc. | Multi-user simultaneous localization and mapping (SLAM) |
US11127161B2 (en) | 2019-05-02 | 2021-09-21 | Apple Inc. | Multiple user simultaneous localization and mapping (SLAM) |
US10748302B1 (en) | 2019-05-02 | 2020-08-18 | Apple Inc. | Multiple user simultaneous localization and mapping (SLAM) |
WO2020247399A1 (en) * | 2019-06-04 | 2020-12-10 | Metcalfarchaeological Consultants, Inc. | Spherical image based registration and self-localization for onsite and offsite viewing |
US11418716B2 (en) | 2019-06-04 | 2022-08-16 | Nathaniel Boyless | Spherical image based registration and self-localization for onsite and offsite viewing |
WO2020263838A1 (en) * | 2019-06-24 | 2020-12-30 | Magic Leap, Inc. | Virtual location selection for virtual content |
US11094133B2 (en) * | 2019-06-24 | 2021-08-17 | Magic Leap, Inc. | Virtual location selection for virtual content |
CN114072752A (en) * | 2019-06-24 | 2022-02-18 | Magic Leap, Inc. | Virtual location selection for virtual content |
US11861796B2 (en) | 2019-06-24 | 2024-01-02 | Magic Leap, Inc. | Virtual location selection for virtual content |
US12118687B2 (en) | 2019-06-24 | 2024-10-15 | Magic Leap, Inc. | Virtual location selection for virtual content |
US11654552B2 (en) * | 2019-07-29 | 2023-05-23 | TruPhysics GmbH | Backup control based continuous training of robots |
US12182209B1 (en) * | 2019-07-31 | 2024-12-31 | Cisco Technology, Inc. | Techniques for placing content in and applying layers in an extended reality environment |
US12406451B2 (en) | 2019-09-27 | 2025-09-02 | Apple Inc. | Systems, methods, and graphical user interfaces for modeling, measuring, and drawing using augmented reality |
US11475637B2 (en) | 2019-10-21 | 2022-10-18 | Wormhole Labs, Inc. | Multi-instance multi-user augmented reality environment |
WO2021081068A1 (en) * | 2019-10-21 | 2021-04-29 | Wormhole Labs, Inc. | Multi-instance multi-user augmented reality environment |
US12020385B2 (en) | 2019-12-24 | 2024-06-25 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Augmented reality processing method, storage medium, and electronic device |
WO2021129514A1 (en) * | 2019-12-24 | 2021-07-01 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Augmented reality processing method, apparatus and system, and storage medium, and electronic device |
US12400048B2 (en) | 2020-01-28 | 2025-08-26 | Middle Chart, LLC | Methods and apparatus for two dimensional location based digital content |
WO2021168338A1 (en) * | 2020-02-20 | 2021-08-26 | Magic Leap, Inc. | Cross reality system with wifi/gps based map merge |
US11532124B2 (en) | 2020-02-20 | 2022-12-20 | Magic Leap, Inc. | Cross reality system with WIFI/GPS based map merge |
US12322041B2 (en) | 2020-02-20 | 2025-06-03 | Magic Leap, Inc. | Cross reality system with WiFi/GPS based map merge |
US20220157032A1 (en) * | 2020-02-24 | 2022-05-19 | SpotMap, Inc. | Multi-modality localization of users |
US12299340B2 (en) | 2020-04-17 | 2025-05-13 | Apple Inc. | Multi-device continuity for use with extended reality systems |
US20210377365A1 (en) * | 2020-05-26 | 2021-12-02 | Inter Ikea Systems B.V. | System, method, device and computer program product for connecting users to a persistent ar environment |
US20240144617A1 (en) * | 2020-05-28 | 2024-05-02 | Comcast Cable Communications, Llc | Methods and systems for anchoring objects in augmented or virtual reality |
US11494796B2 (en) * | 2020-09-04 | 2022-11-08 | International Business Machines Corporation | Context aware gamification in retail environments |
US20220076286A1 (en) * | 2020-09-04 | 2022-03-10 | International Business Machines Corporation | Context aware gamification in retail environments |
US20220261723A1 (en) * | 2021-02-13 | 2022-08-18 | Applied Software Technology, Inc. | Labor Tracking Beacon for Visualizing Project Status in Computer-aided Design |
US11615356B2 (en) * | 2021-02-13 | 2023-03-28 | Evolve Mep, Llc | Labor tracking beacon for visualizing project status in computer-aided design |
US20220292633A1 (en) * | 2021-03-15 | 2022-09-15 | International Business Machines Corporation | Image stitching for high-resolution scans |
US12100117B2 (en) * | 2021-03-15 | 2024-09-24 | International Business Machines Corporation | Image stitching for high-resolution scans |
US20230004900A1 (en) * | 2021-03-31 | 2023-01-05 | F3Systems Limited | System and method for 3 dimensional visualization and interaction with project management tickets |
US12327277B2 (en) | 2021-04-12 | 2025-06-10 | Snap Inc. | Home based augmented reality shopping |
US20230087202A1 (en) * | 2021-09-17 | 2023-03-23 | Ford Global Technologies, Llc | Augmented Reality And Touch-Based User Engagement Parking Assist |
US12307614B2 (en) | 2021-12-23 | 2025-05-20 | Apple Inc. | Methods for sharing content and interacting with physical devices in a three-dimensional environment |
US12299832B2 (en) | 2021-12-30 | 2025-05-13 | Snap Inc. | AR position and orientation along a plane |
US11928783B2 (en) * | 2021-12-30 | 2024-03-12 | Snap Inc. | AR position and orientation along a plane |
US11887260B2 (en) | 2021-12-30 | 2024-01-30 | Snap Inc. | AR position indicator |
US12412205B2 (en) | 2021-12-30 | 2025-09-09 | Snap Inc. | Method, system, and medium for augmented reality product recommendations |
US20230215104A1 (en) * | 2021-12-30 | 2023-07-06 | Snap Inc. | Ar position and orientation along a plane |
US12260505B2 (en) * | 2022-01-13 | 2025-03-25 | Naver Corporation | Method and device for providing augmented content through augmented reality view on basis of preset unit space |
US20240346785A1 (en) * | 2022-01-13 | 2024-10-17 | Naver Labs Corporation | Method and device for providing augmented content through augmented reality view on basis of preset unit space |
US11954762B2 (en) | 2022-01-19 | 2024-04-09 | Snap Inc. | Object replacement system |
US12020386B2 (en) | 2022-06-23 | 2024-06-25 | Snap Inc. | Applying pregenerated virtual experiences in new location |
US12086943B2 (en) * | 2022-08-15 | 2024-09-10 | Middle Chart, LLC | Spatial navigation to digital content |
US20240062478A1 (en) * | 2022-08-15 | 2024-02-22 | Middle Chart, LLC | Spatial navigation to digital content |
US12154232B2 (en) | 2022-09-30 | 2024-11-26 | Snap Inc. | 9-DoF object tracking |
WO2024147194A1 (en) * | 2023-01-06 | 2024-07-11 | Maxell, Ltd. | Information processing device and information processing method |
WO2024196938A1 (en) * | 2023-03-20 | 2024-09-26 | Adeia Guides Inc. | Systems and methods for enabling an enhanced extended reality experience |
US20240323340A1 (en) * | 2023-03-20 | 2024-09-26 | Apple Inc. | Systems and methods for specifying configurations of an electronic device |
US12321515B2 (en) * | 2023-04-25 | 2025-06-03 | Apple Inc. | System and method of representations of user interfaces of an electronic device |
US12182325B2 (en) | 2023-04-25 | 2024-12-31 | Apple Inc. | System and method of representations of user interfaces of an electronic device |
US20240361833A1 (en) * | 2023-04-25 | 2024-10-31 | Apple Inc. | System and method of representations of user interfaces of an electronic device |
Also Published As
Publication number | Publication date |
---|---|
US10275945B2 (en) | 2019-04-30 |
US20170358142A1 (en) | 2017-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10275945B2 (en) | Measuring dimension of object through visual odometry | |
US11645781B2 (en) | Automated determination of acquisition locations of acquired building images based on determined surrounding room data | |
US11252329B1 (en) | Automated determination of image acquisition locations in building interiors using multiple data capture devices | |
US11632602B2 (en) | Automated determination of image acquisition locations in building interiors using multiple data capture devices | |
CA3097164C (en) | Generating floor maps for buildings from automated analysis of visual data from the buildings' interiors | |
CA3058602C (en) | Automated mapping information generation from inter-connected images | |
CN105408938B (en) | System for the processing of 2D/3D space characteristics | |
CN102681661B (en) | Using a three-dimensional environment model in gameplay | |
US10636185B2 (en) | Information processing apparatus and information processing method for guiding a user to a vicinity of a viewpoint | |
US10462406B2 (en) | Information processing apparatus and information processing method | |
TWI567659B (en) | Theme-based augmentation of photorepresentative view | |
US10767975B2 (en) | Data capture system for texture and geometry acquisition | |
JPWO2018131238A1 (en) | Information processing apparatus, information processing method, and program | |
WO2018113759A1 (en) | Detection system and detection method based on positioning system and ar/mr | |
WO2019016820A1 (en) | A METHOD FOR PLACING, TRACKING AND PRESENTING IMMERSIVE REALITY-VIRTUALITY CONTINUUM-BASED ENVIRONMENT WITH IoT AND/OR OTHER SENSORS INSTEAD OF CAMERA OR VISUAL PROCCESING AND METHODS THEREOF | |
EP4432228A1 (en) | Automated generation of building floor plans having associated absolute locations using multiple data capture devices | |
WO2022259253A1 (en) | System and method for providing interactive multi-user parallel real and virtual 3d environments | |
EP4358026A1 (en) | Automated determination of acquisition locations of acquired building images based on identified surrounding objects | |
Piérard et al. | I-see-3D! An interactive and immersive system that dynamically adapts 2D projections to the location of a user's eyes | |
WO2022129646A1 (en) | Virtual reality environment | |
Renius | A Technical Evaluation of the WebXR Device API for Developing Augmented Reality Web Applications | |
US20250104126A1 (en) | Intelligent Spatial Mapping and Navigation System for Item Location | |
Wither | Annotation at a distance in augmented reality | |
Khare et al. | Amalgam Version of Itinerant Augmented Reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE TECHNOLOGY HOLDINGS LLC, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MIHELICH, PATRICK;LEE, JOHNNY;SIGNING DATES FROM 20150224 TO 20150302;REEL/FRAME:035069/0593 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |