
US20140128161A1 - Cross-platform augmented reality experience - Google Patents

Cross-platform augmented reality experience

Info

Publication number
US20140128161A1
US20140128161A1 (application US13/670,258)
Authority
US
United States
Prior art keywords
display, see, augmented reality, user, computing system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/670,258
Inventor
Stephen Latta
Daniel McCulloch
Jason Scott
Kevin Geisner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US13/670,258
Priority to PCT/US2013/068351
Assigned to MICROSOFT CORPORATION. Assignors: GEISNER, KEVIN; SCOTT, JASON; LATTA, STEPHEN; MCCULLOCH, DANIEL
Publication of US20140128161A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. Assignor: MICROSOFT CORPORATION
Legal status: Abandoned


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 9/00: Games not otherwise provided for
    • A63F 9/24: Electric games; Games using electronic circuits not otherwise provided for
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/23: Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/8082: Virtual reality

Definitions

  • Massively multiplayer online games are frequently configured to operate on a single platform. Users typically participate in a massively multiplayer online game by selecting a server and viewing a virtual representation of the game on a stationary display, such as a high definition television.
  • Embodiments are disclosed herein for providing a cross-platform, augmented reality online experience in a computing system.
  • A server system may host a plurality of multiplayer gaming sessions and join computing devices to the multiplayer gaming sessions.
  • The server system may provide augmentation information to see-through display devices and/or experience information to computing systems operating on different platforms from the see-through display devices.
  • The see-through display devices may provide an augmented reality experience with the augmentation information.
  • The computing systems operating on different platforms may provide a cross-platform representation of the augmented reality experience with the experience information.
  • FIG. 1 schematically shows an example communication diagram for an augmented reality massively multiplayer online gaming system in accordance with an embodiment of the present disclosure.
  • FIG. 2 shows a method of hosting a plurality of multiplayer game sessions in accordance with an embodiment of the present disclosure.
  • FIG. 3 illustrates an unaltered view of a physical environment including participants of a multiplayer game session in accordance with an embodiment of the present disclosure.
  • FIG. 4 illustrates an example first-person view of an augmented reality experience for the physical environment depicted in FIG. 3 .
  • FIG. 5 illustrates an example cross-platform representation of the augmented reality experience depicted in FIG. 4 .
  • FIG. 6 shows a method of hosting a plurality of multiplayer game sessions in accordance with an embodiment of the present disclosure.
  • FIG. 7 schematically shows an example head-mounted display in accordance with an embodiment of the present disclosure.
  • FIG. 8 is an example computing system in accordance with an embodiment of the present disclosure.
  • Massively multiplayer online games are often implemented as two- or three-dimensional virtual environments on a single platform, such as a gaming console, a personal computer, or a mobile computing device.
  • Previous cross-platform massively multiplayer online games only utilize platforms with the same type of game display (e.g., two- or three-dimensional graphics and/or animations rendered and displayed on a conventional display, such as a television, computer monitor, and/or mobile phone screen). Accordingly, players are only able to participate in games with other players whose platforms utilize the same types of displays, thereby limiting the number and variety of players encountered in such multiplayer online games.
  • The disclosed embodiments are directed to a cross-platform, massively multiplayer online experience that allows users with augmented reality, see-through displays to participate in an augmented reality game.
  • Users of a see-through display device may participate in an augmented reality experience.
  • Users of a computing device operating on a different platform may participate in a cross-platform representation of the augmented reality experience.
  • The cross-platform representation may thereby bring the appearance of the augmented reality experience to a user of a computing device that is typically incapable of providing such an augmented reality experience.
  • FIG. 1 schematically shows an example communication diagram of a cross-platform, augmented reality massively multiplayer online gaming system 100 including a plurality of multiplayer gaming sessions 101 hosted by a game server 102 .
  • Cross-platform, augmented reality massively multiplayer online gaming system 100 may accommodate a variety of cross-platform computing devices, such as see-through display devices, gaming systems connected to a stationary display device, personal computing devices connected to an external display device, mobile computing devices such as laptops, smart phones, tablets, etc.
  • Augmented reality massively multiplayer online gaming system 100 may include see-through display devices 104 , 106 , and 108 and home entertainment system 110 including gaming system 112 connected to stationary display device 114 .
  • Each computing device may participate in a massively multiplayer online game by connecting to game server 102 through network 116 (e.g., the Internet).
  • Game server 102 of FIG. 1 may perform the steps of method 200 to host a plurality of cross-platform multiplayer gaming sessions.
  • Game server 102 joins a first computing device to a first multiplayer gaming session.
  • The first computing device may include a see-through display.
  • For example, the first computing device may correspond to see-through display device 104 of FIG. 1 .
  • The game server sends augmentation information to the first computing device.
  • The augmentation information may correspond to a virtual environment of the multiplayer gaming session. Augmentation information may include image data corresponding to the virtual environment or game space of the multiplayer gaming session and location information for placement of the image data.
  • The location information may be provided relative to other image data. For example, an augmentation for a tree may be described in relation to an augmentation for a building. In some embodiments, the location information may be provided relative to physical locations. For example, an augmentation for a tree may be described in relation to a location of the tree in the physical world.
  • The see-through display device may display this augmentation at a particular location of the see-through display based on a known or detected location of the tree in the physical world.
  • The see-through display of the first computing device may utilize the augmentation information to present an augmented reality experience by representing the multiplayer gaming session as an augmentation of the physical environment viewable through the see-through display.
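To make the placement scheme described above concrete, the following sketch models augmentation information as image data anchored either to a physical-world location or relative to another augmentation (the tree-beside-a-building example). This is an illustration only; the names `Augmentation` and `resolve_position` are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class Augmentation:
    """One piece of augmentation information: image data plus placement."""
    image_id: str
    # Placement anchored to a known physical-world location (x, y in meters).
    world_anchor: Optional[Tuple[float, float]] = None
    # Or placement relative to another augmentation, by its image_id.
    relative_to: Optional[str] = None
    offset_m: Tuple[float, float] = (0.0, 0.0)

def resolve_position(aug: Augmentation,
                     augmentations: Dict[str, Augmentation]) -> Tuple[float, float]:
    """Resolve an augmentation's display position, following relative placements."""
    if aug.world_anchor is not None:
        return aug.world_anchor
    if aug.relative_to is not None:
        base = resolve_position(augmentations[aug.relative_to], augmentations)
        return (base[0] + aug.offset_m[0], base[1] + aug.offset_m[1])
    raise ValueError(f"augmentation {aug.image_id} has no placement")
```

A see-through display could then draw each resolved augmentation at the screen location corresponding to that physical position.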
  • An illustration of an example augmented reality experience is shown in FIG. 3 and FIG. 4 .
  • Physical environment 300 may include a plurality of real-world objects 302 .
  • Real-world objects may include virtually any object that exists in the real world, including, but not limited to, trees, buildings, landmarks, people, vehicles, etc.
  • Physical environment 300 may also include a plurality of computing devices participating in a multiplayer gaming session, such as see-through display devices 304 and 306 controlled by a first user 308 and a second user 310 , respectively.
  • See-through display devices 304 and 306 may correspond to see-through display devices 104 and 106 of FIG. 1 , and physical environment 300 may correspond to Location 1 of FIG. 1 .
  • FIG. 4 illustrates an example augmented reality experience for a multiplayer gaming session.
  • Augmented view 400 of physical environment 300 is depicted as seen through see-through display device 304 .
  • Augmented view 400 may provide an augmented reality experience to user 308 by presenting an altered appearance of one or more objects in the real world when the objects are viewed through a see-through display.
  • See-through display device 304 may display virtual objects 402 as an augmentation of real-world objects 302 in order to provide an augmented reality environment corresponding to a multiplayer gaming session.
  • An appearance of user 310 may also be augmented to provide an augmented reality version of a character controlled by user 310 .
  • See-through display device 304 may display virtual character-based objects 404 , such as armor and weaponry, in a location corresponding to relevant portions of user 310 .
  • For example, a virtual sword 404 a may be displayed in a location corresponding to a right hand of user 310 .
  • Character-based items may include any suitable virtual item associated with a character and positioned on a see-through display to augment an appearance of a user.
  • Character information 406 may also be displayed near a location of a character presented in the augmented view.
  • Character information may include virtually any information about a character and/or a user, such as character name, user name, stats, gear, contact information, etc.
  • Character information may be displayed in any suitable format, such as text, icons, images, etc.
  • Virtual items such as pets or companions belonging to a character of a user may be displayed near a location of the user.
  • Character information 406 a includes a user name and a character name corresponding to user 310 , displayed as a popup above user 310 .
  • Augmentations of real-world objects may be configured to hide associated real-world objects. In some embodiments, this may be achieved by obscuring an object with one or more images that depict a background. For example, a tree that appears in front of a portion of sky when viewed through a see-through display may be augmented by displaying an image of the sky in a location of the see-through display corresponding to the location of the tree. In this example, the augmented tree appears as sky when viewed through the see-through display, giving the illusion that the tree has disappeared.
  • An augmented view of a physical environment may also include virtual objects that do not directly correspond to a real-world object in the physical environment.
  • Virtual character 408 may be displayed to represent a user of a remotely located computing device participating in the multiplayer gaming session.
  • Character information 406 may also be displayed for virtual characters of remote users.
  • Character information 406 b is shown as a popup including a user name and a character name corresponding to a user represented by character 408 .
  • A second computing device is joined to the first multiplayer gaming session.
  • The second computing device may be remotely located and/or may operate on a different platform than the first computing device.
  • The game server sends experience information to the second computing device.
  • The experience information may provide a cross-platform representation of the augmented reality experience provided to the first computing device.
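The split between augmentation information (for see-through displays) and experience information (for other platforms) might be sketched as a simple server-side dispatch. The helper name `payload_for` and the dictionary game-space model are hypothetical, used only to illustrate the branching described above.

```python
def payload_for(device_platform, game_space):
    """Choose what the server sends a device: augmentation information for
    see-through displays, experience information for other platforms."""
    if device_platform == "see_through":
        # Only the virtual overlay; the device supplies the physical view
        # through its transparent display.
        return {"type": "augmentation", "overlay": game_space["virtual"]}
    # A full rendering description: aspects of the physical environment
    # combined with the virtual content, for a conventional display.
    return {"type": "experience",
            "scene": {**game_space["physical"], **game_space["virtual"]}}
```

A console joined to the same session would thus receive a complete scene description, while a see-through device receives only the overlay.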
  • FIG. 5 illustrates an example cross-platform representation of the augmented reality experience depicted in FIG. 4 .
  • Gaming system 502 may be joined to a multiplayer gaming session based on virtually any suitable criteria.
  • For example, gaming system 502 may be joined to a multiplayer gaming session in response to a user 506 opting into and/or selecting the multiplayer gaming session.
  • As another example, gaming system 502 may be joined to a multiplayer gaming session automatically based on criteria such as location, session population, technical specifications of the gaming system, preference settings, and/or any other suitable criteria.
  • The multiplayer gaming session may be presented to user 506 by displaying experience 508 on display device 504 as a cross-platform representation of an augmented reality experience.
  • User 506 may see a virtual representation of the game space in which one or more other players with see-through display devices are participating in the multiplayer gaming session.
  • FIG. 5 depicts a first-person mode, in which display device 504 displays a virtual representation of users 308 and 310 from a first-person perspective of virtual character 408 shown in FIG. 4 .
  • A third-person perspective may alternatively be used.
  • The perspective may also be dynamically altered, by user selection or automatically, to show experience 508 in a third-person mode and/or from any other suitable perspective.
  • Experience 508 may be a cross-platform representation of the augmented reality experience that is configured for visual presentation via display device 504 .
  • Experience 508 may be presented in response to receiving experience information from a game server, such as game server 102 .
  • Experience information may include aspects of a physical environment corresponding to the multiplayer gaming session.
  • Experience information may additionally or alternatively include augmentation information or a representation of augmentation information corresponding to a virtual environment of the multiplayer gaming session.
  • Aspects of the physical environment may be provided in any suitable manner and may include any information about the physical environment, such as topographical features, depth information of objects in the physical environment, landmark locations, transportation routes, time of day, weather conditions, etc.
  • The aspects of the physical environment may be used to configure, alter, and/or enhance gameplay by incorporating the aspects and/or elements related to the aspects into the multiplayer gaming session. For example, gameplay may be different at night versus during the day.
  • The game server may store aspects of the physical environment, receive information about the physical environment from a third-party database, and/or receive information from participants of a multiplayer gaming session.
  • For example, a see-through display device may detect aspects of the physical environment via one or more sensors of the see-through display device and send the detected aspects to the server and/or other computing devices participating in a multiplayer gaming session.
  • The game server may identify all information relating to a game space for a multiplayer game session and configure the information for a particular platform.
  • Game space information may describe a multiplayer gaming session by including information such as physical environment information and corresponding virtual environment information.
  • The game server may send experience information to a computing device, such as gaming system 502 , in the form of a two- or three-dimensional representation of the game space, so that it may be displayed on display device 504 .
  • The cross-platform representation of an augmented reality experience may provide an experience corresponding to a multiplayer gaming session to a user having a different computing device than other users in the multiplayer gaming session.
  • The augmented reality experience and the cross-platform representation of the augmented reality experience may thus allow a plurality of users of a plurality of computing devices operating on different platforms to participate in a multiplayer gaming session of a massively multiplayer online game.
  • A game space for a multiplayer gaming session may be provided to a see-through display device in a different manner than to a gaming system.
  • For example, augmentation information may be provided to a see-through display device, such as see-through display device 304 of FIG. 3 , while experience information may be provided to a computing device operating on a different platform, such as gaming system 502 of FIG. 5 .
  • Augmentation information may use a different data format, include different data structures, draw on different data sources, and/or differ in any suitable manner from experience information in order to configure the experience for presentation on a display device of a respective platform.
  • The computing devices may also perform cross-platform communication to provide a social aspect to the game.
  • Communication information may be sent between a first and second computing device, such as see-through display device 104 and gaming system 112 of FIG. 1 .
  • Communication information may include text and/or speech input from the users of the computing devices.
  • The communication information may be sent directly between the computing devices, through a game server, and/or through any third-party computing device.
  • The computing devices and/or the game server may include text-to-speech and/or speech-to-text conversion capabilities.
  • Speech output may be derived from text input, and text output may be derived from speech input.
  • For example, user 308 of FIG. 3 may provide a speech input to see-through display device 304 to communicate with user 506 of FIG. 5 .
  • The see-through display device 304 , game server 102 , and/or gaming system 502 may convert the speech input to text output to be displayed on display device 504 depicted in FIG. 5 .
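The modality conversion described above can be sketched as a relay that converts a message only when the sender's modality differs from the recipient's preference. The converter functions here are placeholders standing in for real speech recognition and synthesis on the device or game server; all names are hypothetical.

```python
def convert_speech_to_text(audio):
    # Placeholder: a real system would run speech recognition here.
    return f"[transcript of {audio}]"

def convert_text_to_speech(text):
    # Placeholder: a real system would synthesize audio here.
    return f"[audio of {text}]"

def relay_message(message, recipient_prefers):
    """Relay communication information between devices, converting
    speech to text (or vice versa) when the recipient's platform needs it."""
    kind = message["kind"]
    if kind == recipient_prefers:
        return message  # No conversion needed.
    if kind == "speech" and recipient_prefers == "text":
        return {"kind": "text",
                "body": convert_speech_to_text(message["body"])}
    return {"kind": "speech",
            "body": convert_text_to_speech(message["body"])}
```

So a speech input captured by a see-through display could be delivered as displayable text on a conventional display, and vice versa.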
  • Communication may be provided between any group of computing devices, including a private conversation between selected computing devices, across an entire game session, across all game sessions, etc.
  • A user may provide user input to control an associated user character and/or to interact with the experience. For example, a user may provide user input in order to create, delete, and/or otherwise alter one or more elements within a multiplayer gaming session.
  • User input may be detected by any suitable user input device or combination of input devices, such as gesture detection devices, microphones, inertial measurement units, etc.
  • For example, user 308 of FIG. 3 may point at a target, such as virtual character 408 depicted in FIG. 4 , and perform a gesture corresponding to an attack in order to control an associated character to attack the target.
  • The gesture may be captured by one or more sensors of see-through display device 304 , such as an image sensor, and matched to a corresponding command in order to cause the character associated with user 308 to perform the requested attack.
  • A user may also combine forms of user input, for example, by pointing at a target and speaking an attack command in order to perform the associated attack on the target.
  • For example, an inertial measurement unit connected to see-through display device 304 of FIG. 3 may detect a gesture from user 308 while a microphone of see-through display device 304 detects a voice command from user 308 .
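Matching combined gesture and voice input to a command, as described above, might look like the following sketch. The gesture/voice vocabulary and the function name `interpret_input` are invented for illustration and are not taken from the disclosure.

```python
def interpret_input(gesture, voice, target):
    """Match detected sensor input to a character command.
    A (gesture, voice) pair takes priority; a gesture alone may
    also map to a command (hypothetical vocabulary)."""
    commands = {
        ("point", "attack"): "attack",  # combined gesture + voice command
        ("wave", None): "greet",        # gesture-only command
    }
    action = commands.get((gesture, voice)) or commands.get((gesture, None))
    if action is None:
        return None  # Unrecognized input; no command issued.
    return {"action": action, "target": target}
```

A pointing gesture plus a spoken "attack" thus resolves to an attack on the pointed-at target, while a pointing gesture alone issues no command.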
  • User input may also allow a user to interact with menu items of an experience.
  • For example, user 308 of FIG. 3 may point at a second user, such as user 506 of FIG. 5 , or a virtual representation of a remote user, such as virtual character 408 of FIG. 4 , to display additional information about the corresponding user and/or character.
  • An experience may include a map of a multiplayer gaming session as an overlay and/or augmentation of a map of the physical environment associated with the multiplayer gaming session.
  • The map may provide a location of a representation of a user of a computing system that is dynamically updated to correspond to a current location of the computing system. Put another way, real-world travel may be automatically reflected in the experience.
  • The map may provide a location of virtually any item of the experience and/or real world, such as locations of other users, quests, virtual objects, real-world objects, etc.
  • The map may be customizable by a user. For example, the map may be filtered to show only friends of the user, enemies, selected quests, etc.
  • The map may be viewable when logged out of a multiplayer gaming session and/or game server. For example, a user may view the map to identify an in-game status of a physical location of the user.
  • A multiplayer gaming session may have features that are location-dependent.
  • For example, a multiplayer gaming session may correspond to a particular region or regions of the real world.
  • Quests or other in-game events may be tied to a real-world landmark, such that the in-game event is activated when a user is within a proximity threshold of the real-world landmark.
  • In-game events may also be triggered when the number of users in a particular real-world location and/or virtual location within a multiplayer gaming session exceeds a threshold.
  • In-game events may be created, deleted, and/or otherwise altered by a user. For example, a user may create a virtual object to be placed within a game space of the multiplayer gaming session. In response to the creation of the virtual object, another user may receive a quest related to the virtual object. The created virtual object may remain and be viewed as part of the related quest.
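The two location-dependent triggers described above (proximity to a landmark, and a user-count threshold at a location) can be sketched together; the event-record fields and the helper name `active_events` are hypothetical.

```python
import math

def active_events(events, user_positions):
    """Return names of events whose trigger conditions are met:
    at least min_users users within radius_m meters of the event's
    landmark position (min_users defaults to 1 for simple proximity)."""
    triggered = []
    for ev in events:
        near = [u for u, (x, y) in user_positions.items()
                if math.hypot(x - ev["x"], y - ev["y"]) <= ev["radius_m"]]
        if len(near) >= ev.get("min_users", 1):
            triggered.append(ev["name"])
    return triggered
```

One nearby user activates a landmark quest, while a raid-style event stays inactive until enough users gather.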
  • FIG. 6 shows an example method 600 of joining two computing devices to multiplayer gaming sessions based on the locations of the two computing devices.
  • A game server, such as game server 102 of FIG. 1 , receives a location of a first computing device.
  • The first computing device may include a first see-through display, such as see-through display device 104 of FIG. 1 .
  • The location of a computing device may be determined in any suitable manner. For example, the location of the first computing device may be determined by detecting aspects of the physical environment via one or more sensors of the first computing device.
  • The first computing device is joined to a first multiplayer gaming session.
  • The first multiplayer gaming session may be selected based on the location of the first computing device. Virtually any suitable location-related criteria may direct the selection of a multiplayer gaming session, as described at 608 .
  • For example, a plurality of multiplayer gaming sessions may each correspond to a different real-world region.
  • A computing device may be placed in a multiplayer gaming session corresponding to the real-world region in which the computing device is located.
  • Location may be one of a plurality of criteria for selecting a multiplayer gaming session for a computing device. For example, a computing device may be joined to a multiplayer gaming session corresponding to the nearest real-world region that has a population above or below a population threshold. As another example, a computing device may be joined to a multiplayer gaming session corresponding to the nearest real-world region having one or more friends of a user of the computing device.
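The multi-criteria session selection just described (nearest region, population cap, friend preference) might be combined as in this sketch. The session-record fields and the helper name `select_session` are hypothetical.

```python
import math

def select_session(sessions, location, friends, max_population):
    """Pick a gaming session for a device: among sessions under the
    population cap, prefer regions containing a friend, then take
    the nearest region to the device's location."""
    def dist(s):
        return math.hypot(s["x"] - location[0], s["y"] - location[1])
    eligible = [s for s in sessions if s["population"] < max_population]
    with_friends = [s for s in eligible if friends & set(s["players"])]
    pool = with_friends or eligible
    return min(pool, key=dist)["name"] if pool else None
```

With a friend online in a farther region, that region wins; otherwise the nearest eligible region is chosen.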
  • Augmentation information for the first multiplayer gaming session may be sent to the first computing device at 610 .
  • Such augmentation information may be used by the see-through display to augment the reality of the physical environment viewed through the see-through display.
  • The game server receives a location of a second computing device.
  • The second computing device may include a second see-through display, such as see-through display device 106 or 108 of FIG. 1 .
  • The game server determines whether the location of the second computing device is within a proximity threshold of the location of the first computing device.
  • The proximity threshold may define a maximum and/or minimum distance between the first and second computing devices.
  • The proximity threshold may define a game boundary (e.g., downtown Seattle).
  • The proximity threshold may be one of a plurality of criteria for joining a second computing device to a multiplayer gaming session.
  • If the location of the second computing device is within the proximity threshold, the game server joins the second computing device to the first multiplayer gaming session at 618 .
  • In this case, the second computing device may correspond to see-through display device 106 of FIG. 1 .
  • The dashed box representing Location 1 may illustrate that see-through display devices 104 and 106 are within a proximity threshold of one another. Therefore, these devices may be joined to the same multiplayer gaming session.
  • If the location of the second computing device is not within the proximity threshold, the game server joins the second computing device to a second multiplayer gaming session, different from the first multiplayer gaming session, at 620 .
  • In this case, the second computing device may correspond to see-through display device 108 , which is at Location 2 in FIG. 1 .
  • See-through display device 108 may be joined to a different multiplayer gaming session than see-through display device 104 .
  • For example, a first multiplayer gaming session may correspond to a first location (e.g., Seattle) and include quests tied to landmarks of the first location (e.g., taking pictures of specific Seattle landmarks).
  • A second multiplayer gaming session may correspond to a second location (e.g., Portland) and include quests tied to landmarks of the second location (e.g., a scavenger hunt for virtual objects hidden near specific Portland landmarks).
  • The second computing device may be joined to the first multiplayer gaming session if the second computing device moves to a location within the proximity threshold of the location of the first computing device. For example, the second computing device may log out of the game server and/or the second multiplayer gaming session and move to a new physical location within the proximity threshold of the first computing device. Upon logging back into the game server and/or the second multiplayer gaming session, the second computing device may be joined to the first multiplayer gaming session. In another example, the second computing device may move within the proximity threshold of the location of the first computing device without disconnecting from the multiplayer gaming session. In response, the second computing device may be automatically joined to the first multiplayer gaming session.
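The proximity-threshold decision at the heart of method 600 reduces to a distance check. The sketch below assumes planar coordinates in meters and a hypothetical helper name `assign_session`; a real system would use geodesic distance on latitude/longitude.

```python
import math

def assign_session(first_loc, second_loc, threshold_m,
                   first_session, other_session):
    """Join the second device to the first device's session when the two
    devices are within the proximity threshold; otherwise assign a
    different session. Re-running this after the second device moves
    implements the automatic re-join behavior described above."""
    dx = second_loc[0] - first_loc[0]
    dy = second_loc[1] - first_loc[1]
    within = math.hypot(dx, dy) <= threshold_m
    return first_session if within else other_session
```

Calling the same function again with an updated location is enough to model the device moving into range and being joined to the first session.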
  • FIG. 7 shows a nonlimiting example of a see-through display device 104 including a see-through display 702 .
  • See-through display device 104 may be a head-mounted see-through display device.
  • A see-through display device such as see-through display device 104 may be integrated as depicted in FIG. 7 .
  • The see-through display device may alternatively be a modular computing system.
  • The computing system may include a see-through display and one or more other components communicatively coupled to the see-through display.
  • See-through display 702 is at least partially transparent, thus allowing light to pass through the see-through display to the eyes of a user.
  • The see-through display is configured to visually augment an appearance of a physical space to a user viewing the physical space through the see-through display.
  • For example, the see-through display may display virtual objects that the user can see when the user looks through the see-through display. As such, the user is able to view virtual objects that do not exist within the physical space at the same time that the user views the physical space. This creates the illusion that the virtual objects are part of the physical space.
  • See-through display device 104 also includes a virtual reality engine 704 .
  • The virtual reality engine 704 may be configured to cause the see-through display to visually present one or more virtual objects as an augmentation of real-world objects.
  • The virtual objects can simulate the appearance of real-world objects.
  • The virtual objects appear to be integrated with the physical space and/or with real-world objects.
  • The virtual objects and/or other images displayed via the see-through display may be positioned relative to the eyes of a user such that the displayed virtual objects and/or images appear, to the user, to occupy particular locations within the physical space. In this way, the user is able to view objects that are not actually present in the physical space.
  • The virtual reality engine may include software, hardware, firmware, or any combination thereof.
  • See-through display device 104 may include a speaker subsystem 706 and a sensor subsystem 708 .
  • The sensor subsystem may include a variety of different sensors in different embodiments.
  • For example, a sensor subsystem may include a microphone 710 , one or more forward-facing (away from user) infrared and/or visible light cameras 712 , and/or one or more rearward-facing (towards user) infrared and/or visible light cameras 714 .
  • The forward-facing camera(s) may include one or more depth cameras, and/or the rearward-facing cameras may include one or more eye-tracking cameras.
  • An onboard sensor subsystem may communicate with one or more off-board sensors that send observation information to the onboard sensor subsystem.
  • For example, a depth camera used by a gaming console may send depth maps and/or modeled virtual skeletons to the sensor subsystem of the head-mounted display.
  • See-through display device 104 may also include one or more features that allow the see-through display device to be worn on the head of a user.
  • see-through display device 104 takes the form of eye glasses and includes a nose rest 716 and ear rests 718 a and 718 b .
  • a head-mounted display may include a hat or helmet with an in-front-of-the-face see-through visor.
  • the concepts described herein may be applied to see-through displays that are not head mounted (e.g., a windshield) and to displays that are not see-through (e.g., an opaque display that renders real objects observed by a camera together with virtual objects that are not within the camera's field of view).
  • See-through display device 104 may also include a communication subsystem 720 .
  • Communication subsystem 720 may be configured to communicate with one or more off-board computing devices.
  • the communication subsystem may be configured to wirelessly receive a video stream, audio stream, coordinate information, virtual object descriptions, and/or other information to render augmentation information as an augmented reality experience.
  • the methods and processes described above may be tied to a computing system of one or more computing devices.
  • such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
  • FIG. 8 schematically shows a non-limiting embodiment of a computing system 800 that can enact one or more of the methods and processes described above.
  • Computing system 800 is shown in simplified form. It will be understood that virtually any computer architecture may be used without departing from the scope of this disclosure.
  • computing system 800 may take the form of a head-mounted see-through display device (e.g., see-through display device 104 ), gaming device (e.g., gaming system 502 ), mobile computing device, mobile communication device (e.g., smart phone), desktop computer, laptop computer, tablet computer, home-entertainment computer, network computing device, mainframe computer, server computer, etc.
  • Computing system 800 includes a logic subsystem 802 and a storage subsystem 804 .
  • Computing system 800 may optionally include a display subsystem 806 (e.g., a see-through display), input subsystem 808 , communication subsystem 810 , and/or other components not shown in FIG. 8 .
  • Logic subsystem 802 includes one or more physical devices configured to execute instructions.
  • the logic subsystem may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
  • the logic subsystem may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions.
  • the processors of the logic subsystem may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing.
  • the logic subsystem may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
  • Storage subsystem 804 includes one or more physical, non-transitory, devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed—e.g., to hold different data.
  • Storage subsystem 804 may include removable media and/or built-in devices.
  • Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
  • Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • storage subsystem 804 includes one or more physical, non-transitory devices.
  • aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
  • data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
  • aspects of logic subsystem 802 and of storage subsystem 804 may be integrated together into one or more hardware-logic components through which the functionality described herein may be enacted.
  • hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC) systems, and complex programmable logic devices (CPLDs), for example.
  • The terms "program" and "engine" may be used to describe an aspect of computing system 800 implemented to perform a particular function.
  • A program or engine may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804 . It will be understood that different programs and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc.
  • The terms "program" and "engine" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
  • a “service”, as used herein, is an application program executable across multiple user sessions.
  • a service may be available to one or more system components, programs, and/or other services.
  • a service may run on one or more server-computing devices.
  • display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804 .
  • This visual representation may take the form of images that appear to augment a physical space, thus creating the illusion of an augmented reality.
  • the state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data.
  • Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure (e.g., a head-mounted display), or such display devices may be peripheral display devices.
  • input subsystem 808 may comprise or interface with one or more user-input devices such as a game controller, gesture input detection device, voice recognizer, inertial measurement unit, keyboard, mouse, or touch screen.
  • the input subsystem may comprise or interface with selected natural user input (NUI) componentry.
  • Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board.
  • Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
  • communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices.
  • Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols.
  • the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network.
  • the communication subsystem may allow computing system 800 to send and/or receive messages to and/or from other devices via a network such as the Internet.


Abstract

A plurality of game sessions are hosted at a server system. A first computing device of a first user is joined to a first multiplayer gaming session, the first computing device including a see-through display. Augmentation information is sent to the first computing device for the first multiplayer gaming session to provide an augmented reality experience to the first user. A second computing device of a second user is joined to the first multiplayer gaming session. Experience information is sent to the second computing device for the first multiplayer gaming session to provide a cross-platform representation of the augmented reality experience to the second user.

Description

    BACKGROUND
  • Massively multiplayer online games are frequently configured to operate on a single platform. Users typically participate in a massively multiplayer online game by selecting a server and viewing a virtual representation of the game on a stationary display, such as a high definition television.
  • SUMMARY
  • Embodiments are disclosed herein for providing a cross-platform, augmented reality online experience in a computing system. For example, a server system may host a plurality of multiplayer gaming sessions and join computing devices to the multiplayer gaming sessions. The server system may provide augmentation information to see-through display devices and/or experience information to computing systems operating on different platforms from the see-through display devices. The see-through display devices may provide an augmented reality experience with the augmentation information. The computing systems operating on different platforms may provide a cross-platform representation of the augmented reality experience with the experience information.
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 schematically shows an example communication diagram for an augmented reality massively multiplayer online gaming system in accordance with an embodiment of the present disclosure.
  • FIG. 2 shows a method of hosting a plurality of multiplayer game sessions in accordance with an embodiment of the present disclosure.
  • FIG. 3 illustrates an unaltered view of a physical environment including participants of a multiplayer game session in accordance with an embodiment of the present disclosure.
  • FIG. 4 illustrates an example first-person view of an augmented reality experience for the physical environment depicted in FIG. 3.
  • FIG. 5 illustrates an example cross-platform representation of the augmented reality experience depicted in FIG. 4.
  • FIG. 6 shows a method of hosting a plurality of multiplayer game sessions in accordance with an embodiment of the present disclosure.
  • FIG. 7 schematically shows an example head-mounted display in accordance with an embodiment of the present disclosure.
  • FIG. 8 is an example computing system in accordance with an embodiment of the present disclosure.
  • DETAILED DESCRIPTION
  • Massively multiplayer online games are often implemented as two- or three-dimensional virtual environments on a single platform, such as a gaming console, a personal computer, or a mobile computing device. Previous cross-platform massively multiplayer online games utilize only platforms with the same type of game display (e.g., two- or three-dimensional graphics and/or animations rendered and displayed on a conventional display, such as a television, computer monitor, and/or mobile phone screen). Accordingly, players are only able to participate in games with other players having platforms that utilize the same types of displays, thereby limiting the number and variety of players encountered in such multiplayer online games. Thus, the disclosed embodiments are directed to a cross-platform, massively multiplayer online experience that allows users with augmented reality, see-through displays to participate in an augmented reality game. For example, as described in more detail below, users of a see-through display device may participate in an augmented reality experience. Users of a computing device operating on a different platform may participate in a cross-platform representation of the augmented reality experience. The cross-platform representation may thereby bring the appearance of the augmented reality experience to a user of a computing device that is typically incapable of providing such an augmented reality experience.
  • FIG. 1 schematically shows an example communication diagram of a cross-platform, augmented reality massively multiplayer online gaming system 100 including a plurality of multiplayer gaming sessions 101 hosted by a game server 102. Cross-platform, augmented reality massively multiplayer online gaming system 100 may accommodate a variety of cross-platform computing devices, such as see-through display devices, gaming systems connected to a stationary display device, personal computing devices connected to an external display device, mobile computing devices such as laptops, smart phones, tablets, etc. As shown in FIG. 1, augmented reality massively multiplayer online gaming system 100 may include see-through display devices 104, 106, and 108 and home entertainment system 110 including gaming system 112 connected to stationary display device 114. Each computing device may participate in a massively multiplayer online game by connecting to game server 102 through network 116 (e.g., the Internet).
  • Turning now to FIG. 2, a method 200 of hosting a plurality of multiplayer gaming sessions is shown. For example, game server 102 of FIG. 1 may perform the steps of method 200 to host a plurality of cross-platform multiplayer gaming sessions. At 202, game server 102 joins a first computing device to a first multiplayer gaming session. In some embodiments, as shown at 204, the first computing device may include a see-through display. For example, the first computing device may correspond to see-through display device 104 of FIG. 1.
  • At 206, the game server sends augmentation information to the first computing device. In some embodiments, as shown at 208, the augmentation information may correspond to a virtual environment of the multiplayer gaming session. Augmentation information may include image data corresponding to the virtual environment or game space of the multiplayer gaming session and location information for placement of the image data. In some embodiments, the location information may be provided relative to other image data. For example, an augmentation for a tree may be described in relation to an augmentation for a building. In some embodiments, the location information may be provided relative to physical locations. For example, an augmentation for a tree may be described in relation to a location of the tree in the physical world. The see-through display device may display this augmentation at a particular location of the see-through display based on a known or detected location of the tree in the physical world. Thus, the see-through display of the first computing device may utilize the augmentation information to present an augmented reality experience by representing the multiplayer gaming session as an augmentation of the physical environment viewable through the see-through display. An illustration of an example augmented reality experience is shown in FIG. 3 and FIG. 4.
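  • The placement scheme described above can be sketched as follows. The `AugmentationInfo` record, its field names, and the `placement` helper are illustrative assumptions for this sketch, not structures defined by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class AugmentationInfo:
    """Hypothetical record a game server might send to a see-through display."""
    image_id: str     # identifies the image data for the virtual object
    reference: str    # a physical landmark, or the id of another augmentation
    offset: tuple     # (x, y, z) offset in meters from the reference

def placement(aug, known_locations):
    """Resolve an augmentation to a world-space position using the known
    (or detected) location of its reference."""
    ref = known_locations[aug.reference]
    return tuple(r + o for r, o in zip(ref, aug.offset))

# A tree augmentation described relative to a building at a known location:
tree = AugmentationInfo("tree_01", "building_02", (5.0, 0.0, 0.0))
locations = {"building_02": (100.0, 0.0, 20.0)}
print(placement(tree, locations))  # -> (105.0, 0.0, 20.0)
```

The same record supports both placement modes described above: the reference may name another augmentation or a detected real-world object.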
  • Turning first to FIG. 3, an example unaltered view of a physical environment 300 is shown. For example, physical environment 300 may include a plurality of real-world objects 302. Real-world objects may include virtually any object that exists in the real world, including, but not limited to, trees, buildings, landmarks, people, vehicles, etc. Physical environment 300 may also include a plurality of computing devices participating in a multiplayer gaming session, such as see-through display devices 304 and 306 controlled by a first user 308 and a second user 310, respectively. For example, see-through display devices 304 and 306 may correspond to see-through display devices 104 and 106 of FIG. 1 and physical environment 300 may correspond to Location 1 of FIG. 1.
  • FIG. 4 illustrates an example augmented reality experience for a multiplayer gaming session. Augmented view 400 of physical environment 300 is depicted as seen through see-through display device 304. Augmented view 400 may provide an augmented reality experience to user 308 by presenting an altered appearance of one or more objects in the real world when the objects are viewed through a see-through display. For example, see-through display device 304 may display virtual objects 402 as an augmentation of real-world objects 302 in order to provide an augmented reality environment corresponding to a multiplayer gaming session.
  • An appearance of user 310 may also be augmented to provide an augmented reality version of a character controlled by user 310. For example, see-through display device 304 may display virtual character-based objects 404, such as armor and weaponry, in a location corresponding to relevant portions of user 310. As depicted, a virtual sword 404 a may be displayed in a location corresponding to a right hand of user 310. Character-based items may include any suitable virtual item associated with a character and positioned on a see-through display to augment an appearance of a user.
  • As shown at 210 of FIG. 2, character information 406 may also be displayed near a location of a character presented in the augmented view. Character information may include virtually any information about a character and/or a user, such as character name, user name, stats, gear, contact information, etc. Character information may be displayed in any suitable format, such as text, icons, images, etc. In some embodiments, virtual items such as pets or companions belonging to a character of a user may be displayed near a location of the user. For example, character information 406 a includes a user name and a character name corresponding to user 310 displayed as a popup above user 310.
  • Augmentations of real-world objects may be configured to hide associated real-world objects. In some embodiments, this may be achieved by obscuring an object with one or more images that depict a background. For example, a tree that appears in front of a portion of sky when viewed through a see-through display may be augmented by displaying an image of the sky in a location of the see-through display corresponding to the location of the tree. In this example, the augmented tree appears as sky when viewed through the see-through display, giving the illusion that the tree has disappeared.
  • An augmented view of a physical environment may also include virtual objects that do not directly correspond to a real-world object in the physical environment. For example, virtual character 408 may be displayed to represent a user of a remotely located computing device participating in the multiplayer gaming session. Character information 406 may also be displayed for virtual characters of remote users. For example, character information 406 b is shown as a popup including a user name and a character name corresponding to a user represented by character 408.
  • Turning back to FIG. 2, at 212 a second computing device is joined to the first multiplayer gaming session. In some embodiments, the second computing device may be remotely located and/or may operate on a different platform than the first computing device. At 214, the game server sends experience information to the second computing device. As shown at 216, the experience information may provide a cross-platform representation of the augmented reality experience provided to the first computing device. FIG. 5 illustrates an example cross-platform representation of the augmented reality experience depicted in FIG. 4.
  • Indoor environment 500 includes a gaming system 502 connected to a stationary external display device 504. For example, gaming system 502 and display device 504 may correspond to gaming system 112 and display device 114 of FIG. 1. Gaming system 502 may be joined to a multiplayer gaming session based on virtually any suitable criteria. In some embodiments, gaming system 502 may be joined to a multiplayer gaming session in response to a user 506 opting into and/or selecting the multiplayer gaming session. In other embodiments, gaming system 502 may be joined to a multiplayer gaming session automatically based on criteria such as location, session population, technical specifications of the gaming system, preference settings, and/or any other suitable criteria.
  • The multiplayer gaming session may be presented to user 506 by displaying experience 508 on display device 504 as a cross-platform representation of an augmented reality experience. For example, user 506 may see a virtual representation of the game space in which one or more other players with see-through display devices are participating in the multiplayer gaming session. FIG. 5 depicts a first person mode, in which display device 504 displays a virtual representation of users 308 and 310 from a first-person perspective of virtual character 408 shown in FIG. 4. In some embodiments, a third-person perspective may be used. For example, perspective may be dynamically altered, by selection or automatically, to show experience 508 in a third-person mode and/or from any other suitable perspective.
  • Experience 508 may be a cross-platform representation of the augmented reality experience that is configured for visual presentation via display device 504. For example, experience 508 may be presented in response to receiving experience information from a game server, such as game server 102. As shown at 218 of FIG. 2, experience information may include aspects of a physical environment corresponding to the multiplayer gaming session. Experience information may additionally or alternatively include augmentation information or a representation of augmentation information corresponding to a virtual environment of the multiplayer gaming session.
  • Aspects of the physical environment may be provided in any suitable manner and may include any information about the physical environment, such as topographical features, depth information of objects in the physical environment, landmark locations, transportation routes, time of day, weather conditions, etc. The aspects of the physical environment may be used to configure, alter, and/or enhance gameplay by incorporating the aspects and/or elements related to the aspects into the multiplayer gaming session. For example, gameplay may be different at night versus during the day. In some embodiments, the game server may store aspects of the physical environment, receive information about the physical environment from a third-party database, and/or receive information from participants of a multiplayer gaming session. For example, a see-through display device may detect aspects of the physical environment via one or more sensors of the see-through display device and send the detected aspects to the server and/or other computing devices participating in a multiplayer gaming session.
  • In some embodiments, the game server may identify all information relating to a game space for a multiplayer game session and configure the information for a particular platform. Game space information may describe a multiplayer gaming session by including information such as physical environment information and corresponding virtual environment information. For example, the game server may send experience information to a computing device, such as gaming system 502, in the form of a two- or three-dimensional representation of the game space, so that it may be displayed on display device 504. Thus, the cross-platform representation of an augmented reality experience may provide an experience corresponding to a multiplayer gaming session to a user having a different computing device than other users in the multiplayer gaming session.
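  • One way to sketch the server-side configuration described above is a simple dispatch on the client's platform; the payload keys and platform names below are hypothetical:

```python
def game_space_payload(device_platform, game_space):
    """Hypothetical server-side dispatch: the same game space is packaged
    differently depending on the client's platform."""
    if device_platform == "see_through_display":
        # Augmentation information: virtual objects anchored to the
        # physical world the wearer already sees directly.
        return {"type": "augmentation", "objects": game_space["virtual_objects"]}
    # Experience information: a rendered representation of the whole game
    # space, including the physical-environment aspects other players see.
    return {"type": "experience",
            "objects": game_space["virtual_objects"],
            "environment": game_space["physical_aspects"]}

space = {"virtual_objects": ["castle", "sword"],
         "physical_aspects": ["tree", "building"]}
print(game_space_payload("gaming_system", space)["type"])       # -> experience
print(game_space_payload("see_through_display", space)["type"]) # -> augmentation
```

The see-through payload omits the physical environment because the wearer views it directly, while the experience payload must carry it so a conventional display can reproduce the scene.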
  • The augmented reality experience and cross-platform representation of the augmented reality experience may allow a plurality of users of a plurality of computing devices operating on different platforms to participate in a multiplayer gaming session of a massively multiplayer online game. For example, a game space for a multiplayer gaming session may be provided to a see-through display device in a different manner than a gaming system. As discussed above, augmentation information may be provided to a see-through display device, such as see-through display device 304 of FIG. 3, while experience information may be provided to a computing device operating on a different platform, such as gaming system 502 of FIG. 5. Accordingly, augmentation information may be a different data format, include different data structures, have different data sources, and/or differ in any suitable manner from experience information in order to configure the experience for presentation on a display device of a respective platform.
  • The computing devices may also perform cross-platform communication to provide a social aspect to the game. For example, communication information may be sent between a first and second computing device, such as see-through display device 104 and gaming system 112 of FIG. 1. Communication information may include text and/or speech input from the users of the computing devices. In some embodiments, the communication information may be sent directly between the computing devices, through a game server, and/or through any third-party computing device. The computing devices and/or the game server may include text-to-speech and/or speech-to-text conversion capabilities. Thus, speech output may be derived from text input and text output may be derived from speech input. For example, user 308 of FIG. 3 may provide a speech input to see-through display device 304 to communicate with user 506 of FIG. 5. In response, the see-through display device 304, game server 102, and/or gaming system 502 may convert the speech input to text output to be displayed on display device 504 depicted in FIG. 5. Communication may be provided between any group of computing devices, including a private conversation between selected computing devices, across an entire game session, across all game sessions, etc.
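  • The conversion-on-route behavior described above can be sketched as a relay that adapts each message to the recipient's platform. The `speech_to_text` and `text_to_speech` helpers here are placeholders standing in for real speech engines, not components named by the disclosure:

```python
def route_message(message, recipient_platform):
    """Hypothetical relay: convert between speech and text depending on
    what the recipient's platform can present."""
    def speech_to_text(audio):  # placeholder for a speech-recognition engine
        return f"[transcript of {audio}]"
    def text_to_speech(text):   # placeholder for a speech-synthesis engine
        return f"[audio of {text}]"

    if message["kind"] == "speech" and recipient_platform == "text_display":
        return {"kind": "text", "body": speech_to_text(message["body"])}
    if message["kind"] == "text" and recipient_platform == "audio_only":
        return {"kind": "speech", "body": text_to_speech(message["body"])}
    return message  # formats already match; no conversion needed

out = route_message({"kind": "speech", "body": "attack now"}, "text_display")
print(out["kind"])  # -> text
```

As noted above, this conversion could run on the sending device, the game server, or the receiving device.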
  • A user may provide user input to control an associated user character and/or to interact with the experience. For example, a user may provide user input in order to create, delete, and/or otherwise alter one or more elements within a multiplayer gaming session. User input may be detected by any suitable user input device or combination of input devices, such as gesture detection devices, microphones, inertial measurement units, etc. For example, user 308 of FIG. 3 may point at a target, such as virtual character 408 depicted in FIG. 4, and perform a gesture corresponding to an attack in order to control an associated character to attack the target. The gesture may be captured by one or more sensors of see-through display device 304, such as an image sensor, and matched to a corresponding command in order to cause the character associated with user 308 to perform the requested attack.
  • A user may also combine user input, for example, by pointing at a target and speaking an attack command in order to perform the associated attack on the target. For example, an inertial measurement unit connected to see-through display device 304 of FIG. 3 may detect a gesture from user 308 while a microphone of see-through display device 304 detects a voice command from user 308. User input may also allow a user to interact with menu items of an experience. For example, user 308 of FIG. 3 may point at a second user, such as user 506 of FIG. 5, or a virtual representation of a remote user, such as virtual character 408 of FIG. 4, to display additional information about the corresponding user and/or character.
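  • The combined gesture-plus-voice input described above amounts to fusing two input channels into one command. A minimal sketch, with an illustrative command table of our own choosing:

```python
def resolve_command(gesture, voice, target):
    """Hypothetical input fusion: a pointing gesture supplies the target
    and a spoken word supplies the action to perform on it."""
    commands = {"attack": "perform_attack", "inspect": "show_character_info"}
    if gesture == "point" and voice in commands:
        return (commands[voice], target)
    return None  # unrecognized combination; no command issued

print(resolve_command("point", "attack", "virtual_character_408"))
# -> ('perform_attack', 'virtual_character_408')
```

Pointing at a character while saying "inspect" would likewise resolve to displaying that character's information, matching the menu-interaction example above.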
  • In some embodiments, an experience may include a map of a multiplayer gaming session as an overlay and/or augmentation of a map of the physical environment associated with the multiplayer gaming session. For example, the map may provide a location of a representation of a user of a computing system that is dynamically updated to correspond to a current location of the computing system. Put another way, real world travel may be automatically updated in the experience. The map may provide a location of virtually any item of the experience and/or real world, such as locations of other users, quests, virtual objects, real-world objects, etc. In some embodiments, the map may be customizable by a user. For example, the map may be filtered to show only friends of the users, enemies, selected quests, etc. The map may be viewable when logged out of a multiplayer gaming session and/or game server. For example, a user may view the map to identify an in-game status of a physical location of the user.
  • A multiplayer gaming session may have features that are location-dependent. For example, a multiplayer gaming session may correspond to a particular region or regions of the real world. In some embodiments, quests or other in-game events may be tied to a real-world landmark, such that the in-game event is activated when a user is within a proximity threshold of the real-world landmark. In-game events may also be triggered when a number of users in a particular real-world location and/or virtual location within a multiplayer gaming session exceeds a threshold. In some embodiments, in-game events may be created, deleted, and/or otherwise altered by a user. For example, a user may create a virtual object to be placed within a gamespace of the multiplayer gaming session. In response to the creation of the virtual object, another user may receive a quest related to the virtual object. The created virtual object may remain and be viewed as part of the related quest.
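  • The two triggering conditions described above (proximity to a landmark, and a user count exceeding a threshold) can be sketched as a single check; the event records, flat-plane coordinates, and default thresholds are illustrative assumptions:

```python
import math

def active_events(events, user_location, user_counts,
                  proximity_m=50.0, crowd_threshold=10):
    """Hypothetical trigger check: an in-game event activates when the user
    is within a proximity threshold of its landmark, or when enough users
    have gathered at it. Coordinates are flat (x, y) positions in meters."""
    triggered = []
    for ev in events:
        dx = ev["landmark"][0] - user_location[0]
        dy = ev["landmark"][1] - user_location[1]
        near = math.hypot(dx, dy) <= proximity_m
        crowded = user_counts.get(ev["name"], 0) >= crowd_threshold
        if near or crowded:
            triggered.append(ev["name"])
    return triggered

events = [{"name": "quest_space_needle", "landmark": (0.0, 0.0)},
          {"name": "quest_market", "landmark": (500.0, 0.0)}]
# The first quest triggers by proximity, the second by crowd size:
print(active_events(events, (10.0, 10.0), {"quest_market": 12}))
# -> ['quest_space_needle', 'quest_market']
```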
  • FIG. 6 shows an example method 600 of joining two computing devices to multiplayer gaming sessions based on the locations of the two computing devices. At 602, a game server, such as game server 102 of FIG. 1, may receive a location of a first computing device. As shown at 604, the first computing device may include a first see-through display, such as see-through display device 104 of FIG. 1. The location of a computing device may be determined in any suitable manner. For example, the location of the first computing device may be determined by detecting aspects of the physical environment via one or more sensors of the first computing device.
  • At 606, the first computing device is joined to a first multiplayer gaming session. As shown at 608, the first multiplayer gaming session may be selected based on the location of the first computing device. Virtually any suitable criteria related to location may direct the selection of a multiplayer gaming session as described at 608. For example, a plurality of multiplayer gaming sessions may each correspond to a different real-world region.
  • In some embodiments, a computing device may be placed in a multiplayer gaming session corresponding to a real-world region in which the computing device is located. In additional or alternative embodiments, location may be one of a plurality of criteria for selecting a multiplayer gaming session for a computing device. For example, a computing device may be joined to a multiplayer gaming session corresponding to the nearest real-world region that has a population above or below a population threshold. As another example, a computing device may be joined to a multiplayer gaming session corresponding to the nearest real-world region having one or more friends of a user of the computing device.
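The selection criteria above (nearest region, a population threshold, presence of friends) can be combined in a simple ranking routine. This sketch is illustrative, not from the disclosure; the session record fields, the planar distance, and the `population_cap` default are assumptions:

```python
# Illustrative sketch of location-based session selection: prefer sessions
# containing friends, exclude full sessions, then pick the nearest region.
# All record fields and thresholds are assumptions, not from the disclosure.

def select_session(user_pos, sessions, friends=(), population_cap=64):
    """Pick a session by distance, preferring ones that contain friends."""
    def dist(s):
        (x1, y1), (x2, y2) = user_pos, s["region_center"]
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5

    eligible = [s for s in sessions if len(s["players"]) < population_cap]
    with_friends = [s for s in eligible if set(s["players"]) & set(friends)]
    candidates = with_friends or eligible
    return min(candidates, key=dist) if candidates else None


sessions = [
    {"name": "seattle", "region_center": (0, 0), "players": ["ann", "bo"]},
    {"name": "portland", "region_center": (5, 0), "players": ["cy"]},
]
```

With no friends online the nearest region wins; if a friend is in a farther region, that region is preferred, matching the alternative criteria described above.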
  • After joining the first multiplayer gaming session, augmentation information for the first multiplayer gaming session may be sent to the first computing device at 610. Such augmentation information may be used by the see-through display to augment the reality of the physical environment viewed through the see-through display.
  • At 612, the game server receives a location of a second computing device. As shown at 614, the second computing device may include a second see-through display, such as see-through display device 106 or 108 of FIG. 1. At 616, the game server determines whether the location of the second computing device is within a proximity threshold of the location of the first computing device. For example, the proximity threshold may define a maximum and/or minimum distance between the first and second computing devices. As another example, the proximity threshold may define a game boundary (e.g., downtown Seattle). In some embodiments, the proximity threshold may be one of a plurality of criteria for joining a second computing device to a multiplayer gaming session.
  • If the location of the second computing device is within a proximity threshold of the location of the first computing device, the game server joins the second computing device to the first multiplayer gaming session at 618. For example, the second computing device may correspond to see-through display device 106 of FIG. 1. The dashed box representing Location 1 may illustrate that see-through display devices 104 and 106 are within a proximity threshold of one another. Therefore, these devices may be joined to the same multiplayer gaming session.
  • Alternatively, if the location of the second computing device is not within a proximity threshold of the location of the first computing device, the game server joins the second computing device to a second multiplayer gaming session, different from the first multiplayer gaming session, at 620. In this example, the second computing device may correspond to see-through display device 108, which is at Location 2 in FIG. 1. As Location 2 may be considered to be outside of a proximity threshold of see-through display device 104, see-through display device 108 may be joined to a different multiplayer gaming session than see-through display device 104. For example, a first multiplayer gaming session may correspond to a first location (e.g., Seattle) and include quests tied to landmarks of the first location (e.g., taking pictures of specific Seattle landmarks). A second multiplayer gaming session may correspond to a second location (e.g., Portland) and include quests tied to landmarks of the second location (e.g., a scavenger hunt for virtual objects hidden near specific Portland landmarks).
  • In some embodiments, the second computing device may be joined to the first multiplayer gaming session if the second computing device moves to a location within the proximity threshold of the location of the first computing device. For example, the second computing device may log out of the game server and/or the second multiplayer gaming session and move to a new physical location within the proximity threshold of the first computing device. Upon logging back into the game server and/or the second multiplayer gaming session, the second computing device may be joined to the first multiplayer gaming session. In another example, the second computing device may move within the proximity threshold of the location of the first computing device without disconnecting from the multiplayer gaming session. In response, the second computing device may be automatically joined to the first multiplayer gaming session.
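The join logic of method 600 (steps 616-620) can be sketched as a small server routine that reuses a nearby session or creates a new one. This is an illustrative simplification, not the disclosure's implementation; the single-anchor proximity test, the planar distance, and the device identifiers are assumptions:

```python
# Minimal sketch of method 600: a game server joins a second device to the
# first device's session only when the two are within a proximity threshold.
# The session bookkeeping here is a simplification of the disclosure.

class GameServer:
    def __init__(self, proximity_threshold=1000.0):
        self.proximity_threshold = proximity_threshold
        self.sessions = {}  # session id -> list of (device_id, location)

    @staticmethod
    def _dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def join(self, device_id, location):
        """Join an existing nearby session, or start a new one (steps 616-620)."""
        for session_id, members in self.sessions.items():
            anchor_location = members[0][1]  # first device anchors the session
            if self._dist(location, anchor_location) <= self.proximity_threshold:
                members.append((device_id, location))
                return session_id
        session_id = len(self.sessions) + 1
        self.sessions[session_id] = [(device_id, location)]
        return session_id


server = GameServer(proximity_threshold=1000.0)
s1 = server.join("hmd-104", (0.0, 0.0))      # first device: new session
s2 = server.join("hmd-106", (100.0, 0.0))    # nearby: joins the same session
s3 = server.join("hmd-108", (50000.0, 0.0))  # far away: placed in a new session
```

Re-invoking `join` after a device moves would model the automatic re-joining behavior described above.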
  • FIG. 7 shows a nonlimiting example of a see-through display device 104 including a see-through display 702. For example, see-through display device 104 may be a head-mounted see-through display device. In some embodiments, a see-through display device such as see-through display device 104 may be integrated as depicted in FIG. 7. In alternative embodiments, the see-through display device may be a modular computing system. The computing system may include a see-through display and one or more other components communicatively coupled to the see-through display.
  • See-through display 702 is at least partially transparent, thus allowing light to pass through the see-through display to the eyes of a user. Furthermore, the see-through display is configured to visually augment an appearance of a physical space to a user viewing the physical space through the see-through display. For example, the see-through display may display virtual objects that the user can see when the user looks through the see-through display. As such, the user is able to view the virtual objects that do not exist within the physical space at the same time that the user views the physical space. This creates the illusion that the virtual objects are part of the physical space.
  • See-through display device 104 also includes a virtual reality engine 704. The virtual reality engine 704 may be configured to cause the see-through display to visually present one or more virtual objects as an augmentation of real-world objects. The virtual objects can simulate the appearance of real-world objects. To a user viewing the physical space through the see-through display, the virtual objects appear to be integrated with the physical space and/or with real-world objects. For example, the virtual objects and/or other images displayed via the see-through display may be positioned relative to the eyes of a user such that the displayed virtual objects and/or images appear, to the user, to occupy particular locations within the physical space. In this way, the user is able to view objects that are not actually present in the physical space. Virtual reality engine 704 may include software, hardware, firmware, or any combination thereof.
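Anchoring a displayed virtual object so that it appears to occupy a fixed location in the physical space amounts to projecting its world-space position into display coordinates given the head pose. The pinhole-camera sketch below is one common way to do this, not the patent's method; the focal length, the display resolution, and the no-rotation assumption are illustrative:

```python
# Illustrative pinhole projection: given the head position and a world-space
# virtual object, compute where on the see-through display to draw it so it
# appears anchored in the physical space. Assumes the head looks down the
# +z axis with no rotation; focal length and display size are assumptions.

def project_to_display(obj_world, head_pos, focal_px=500.0,
                       display_center=(640.0, 360.0)):
    """Project a world point (x, y, z) into display pixel coordinates."""
    x = obj_world[0] - head_pos[0]
    y = obj_world[1] - head_pos[1]
    z = obj_world[2] - head_pos[2]
    if z <= 0:
        return None  # object is behind the viewer: nothing to draw
    u = display_center[0] + focal_px * x / z
    v = display_center[1] + focal_px * y / z
    return (u, v)


# An object 2 m straight ahead renders at the display center; as the user
# steps sideways, the drawn position shifts so the object appears fixed.
centered = project_to_display((0.0, 0.0, 2.0), (0.0, 0.0, 0.0))
shifted = project_to_display((0.0, 0.0, 2.0), (0.5, 0.0, 0.0))
```

In practice the engine would also apply head orientation from the sensor subsystem, but the translation-only case already shows why displayed positions must be updated as the user moves.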
  • See-through display device 104 may include a speaker subsystem 706 and a sensor subsystem 708. The sensor subsystem may include a variety of different sensors in different embodiments. As nonlimiting examples, a sensor subsystem may include a microphone 710, one or more forward-facing (away from user) infrared and/or visible light cameras 712, and/or one or more rearward-facing (towards user) infrared and/or visible light cameras 714. The forward-facing camera(s) may include one or more depth cameras, and/or the rearward-facing cameras may include one or more eye-tracking cameras. In some embodiments, an onboard sensor subsystem may communicate with one or more off-board sensors that send observation information to the onboard sensor subsystem. For example, a depth camera used by a gaming console may send depth maps and/or modeled virtual skeletons to the sensor subsystem of the head-mounted display.
  • See-through display device 104 may also include one or more features that allow the see-through display device to be worn on the head of a user. In the illustrated example, see-through display device 104 takes the form of eye glasses and includes a nose rest 716 and ear rests 718a and 718b. In other embodiments, a head-mounted display may include a hat or helmet with an in-front-of-the-face see-through visor. Furthermore, while described in the context of a head-mounted see-through display, the concepts described herein may be applied to see-through displays that are not head mounted (e.g., a windshield) and to displays that are not see-through (e.g., an opaque display that renders real objects observed by a camera together with virtual objects that are not within the camera's field of view).
  • See-through display device 104 may also include a communication subsystem 720. Communication subsystem 720 may be configured to communicate with one or more off-board computing devices. As an example, the communication subsystem may be configured to wirelessly receive a video stream, audio stream, coordinate information, virtual object descriptions, and/or other information to render augmentation information as an augmented reality experience.
  • In some embodiments, the methods and processes described above may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
  • FIG. 8 schematically shows a non-limiting embodiment of a computing system 800 that can enact one or more of the methods and processes described above. Computing system 800 is shown in simplified form. It will be understood that virtually any computer architecture may be used without departing from the scope of this disclosure. In different embodiments, computing system 800 may take the form of a head-mounted see-through display device (e.g., see-through display device 104), gaming device (e.g., gaming system 502), mobile computing device, mobile communication device (e.g., smart phone), desktop computer, laptop computer, tablet computer, home-entertainment computer, network computing device, mainframe computer, server computer, etc.
  • Computing system 800 includes a logic subsystem 802 and a storage subsystem 804. Computing system 800 may optionally include a display subsystem 806 (e.g., a see-through display), input subsystem 808, communication subsystem 810, and/or other components not shown in FIG. 8.
  • Logic subsystem 802 includes one or more physical devices configured to execute instructions. For example, the logic subsystem may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
  • The logic subsystem may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of the logic subsystem may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing. The logic subsystem may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
  • Storage subsystem 804 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by the logic subsystem to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed—e.g., to hold different data.
  • Storage subsystem 804 may include removable media and/or built-in devices. Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • It will be appreciated that storage subsystem 804 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
  • In some embodiments, aspects of logic subsystem 802 and of storage subsystem 804 may be integrated together into one or more hardware-logic components through which the functionality described herein may be enacted. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC) systems, and complex programmable logic devices (CPLDs), for example.
  • The terms “program” and “engine” may be used to describe an aspect of computing system 800 implemented to perform a particular function. In some cases, a program or engine may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804. It will be understood that different programs and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “program” and “engine” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
  • It will be appreciated that a “service”, as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.
  • When included, display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804. This visual representation may take the form of images that appear to augment a physical space, thus creating the illusion of an augmented reality. As the herein described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure (e.g., a head-mounted display), or such display devices may be peripheral display devices.
  • When included, input subsystem 808 may comprise or interface with one or more user-input devices such as a game controller, gesture input detection device, voice recognizer, inertial measurement unit, keyboard, mouse, or touch screen. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
  • When included, communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices. Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 800 to send and/or receive messages to and/or from other devices via a network such as the Internet.
  • It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
  • The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims (22)

1. A method for hosting a plurality of game sessions at a server system, the method comprising:
joining a first computing device of a first user to a first multiplayer gaming session, the first computing device including a see-through display;
sending augmentation information to the first computing device for the first multiplayer gaming session to provide an augmented reality experience to the first user;
joining a second computing device of a second user to the first multiplayer gaming session; and
sending experience information to the second computing device for the first multiplayer gaming session to provide a cross-platform representation of the augmented reality experience to the second user.
2. The method of claim 1, wherein the cross-platform representation of the augmented reality experience is configured for visual presentation via a display device connected to the second computing device.
3. The method of claim 1, wherein the cross-platform representation of the augmented reality experience is presented to the second user in a first-person mode.
4. The method of claim 1, wherein the cross-platform representation of the augmented reality experience is presented to the second user in a third-person mode.
5. The method of claim 1, wherein the experience information includes aspects of a physical environment detected by the see-through display.
6. The method of claim 1, wherein the augmentation information includes character information that augments an appearance of a third user when the third user is viewed through the see-through display.
7-8. (canceled)
9. A computing system for participating in a multiplayer game, the computing system comprising:
a see-through display;
one or more sensors;
a logic subsystem; and
a storage subsystem configured to store instructions that, when executed, cause the computing system to:
determine a location of the computing system;
send the location of the computing system to a game server;
receive augmentation information from the game server for a first multiplayer gaming session, the first multiplayer gaming session corresponding to the location of the computing system;
present, on the see-through display, an augmented reality experience with the augmentation information, the augmented reality experience representing the first multiplayer gaming session as an augmentation of a physical environment viewable through the see-through display;
detect aspects of the physical environment via the one or more sensors; and
send experience information corresponding to the detected aspects of the physical environment to the game server to provide a cross-platform representation of the augmented reality experience to one or more remote computing devices participating in the first multiplayer gaming session.
10. The computing system of claim 9, wherein the cross-platform representation of the augmented reality experience is configured for visual presentation via a display device connected to the one or more remote computing devices.
11. The computing system of claim 9, wherein the location of the computing system is determined by detecting aspects of the physical environment of the computing system via the one or more sensors.
12. The computing system of claim 9, wherein the first multiplayer gaming session is one of a plurality of multiplayer gaming sessions, each of the plurality of multiplayer gaming sessions corresponding to a different real world region.
13. The computing system of claim 9, wherein the instructions, when executed, further cause the system to display, on the see-through display, character information as an augmentation of a user participating in the first multiplayer gaming session when the user is viewed through the see-through display.
14. The computing system of claim 13, wherein the character information includes aspects of a character appearance that are displayed in a location of the see-through display corresponding to a real world location of the user.
15. The computing system of claim 9, wherein the instructions, when executed, further cause the computing system to detect a user input and perform a corresponding command associated with the augmented reality experience.
16. The computing system of claim 15, wherein the user input includes a gesture detected by a gesture detection device.
17. The computing system of claim 15, further comprising a microphone, wherein the user input includes a voice command detected by the microphone.
18. The computing system of claim 9, wherein a location of a representation of a user of the computing system within the augmented reality experience is dynamically updated to correspond to the location of the computing system.
19-20. (canceled)
21. The computing system of claim 9, wherein the one or more remote computing devices participating in the first multiplayer gaming session that are provided with the cross-platform representation of the augmented reality experience include at least one computing device including a see-through display and operating on a first platform, and at least one computing device including a stationary non see-through display and operating on a second platform different from the first platform.
22. The computing system of claim 9, wherein the augmented reality experience includes a first virtual representation of a first player viewable through the see-through display, wherein the first virtual representation is a virtual augmentation of the first player, wherein the augmented reality experience includes a second virtual representation of a second player located remotely so as not to be viewable through the see-through display, and wherein the second virtual representation is a virtual avatar of the second player.
23. The computing system of claim 22, wherein the cross-platform augmented reality experience is provided to a see-through display of the first player and the cross-platform augmented reality experience is provided to a stationary non see-through display of the second player.
24. The computing system of claim 23, wherein the cross-platform augmented reality experience provided to the see-through display of the first player visually differs from the cross-platform augmented reality experience provided to the stationary non see-through display of the second player.
US13/670,258 2012-11-06 2012-11-06 Cross-platform augmented reality experience Abandoned US20140128161A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/670,258 US20140128161A1 (en) 2012-11-06 2012-11-06 Cross-platform augmented reality experience
PCT/US2013/068351 WO2014074465A1 (en) 2012-11-06 2013-11-04 Cross-platform augmented reality experience

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/670,258 US20140128161A1 (en) 2012-11-06 2012-11-06 Cross-platform augmented reality experience

Publications (1)

Publication Number Publication Date
US20140128161A1 true US20140128161A1 (en) 2014-05-08

Family

ID=49640170

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/670,258 Abandoned US20140128161A1 (en) 2012-11-06 2012-11-06 Cross-platform augmented reality experience

Country Status (2)

Country Link
US (1) US20140128161A1 (en)
WO (1) WO2014074465A1 (en)

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140287836A1 (en) * 2013-03-21 2014-09-25 Nextbit Systems Inc. Location based game state synchronization
US20140364209A1 (en) * 2013-06-07 2014-12-11 Sony Corporation Entertainment America LLC Systems and Methods for Using Reduced Hops to Generate an Augmented Virtual Reality Scene Within A Head Mounted System
US20150005052A1 (en) * 2013-06-27 2015-01-01 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US9061210B2 (en) 2013-03-21 2015-06-23 Nextbit Systems Inc. Synchronizing an instance of an application between multiple devices
WO2016137675A1 (en) * 2015-02-27 2016-09-01 Microsoft Technology Licensing, Llc Molding and anchoring physically constrained virtual environments to real-world environments
US9451051B1 (en) * 2014-02-13 2016-09-20 Sprint Communications Company L.P. Method and procedure to improve delivery and performance of interactive augmented reality applications over a wireless network
US20160371888A1 (en) * 2014-03-10 2016-12-22 Bae Systems Plc Interactive information display
US9533226B2 (en) 2013-04-29 2017-01-03 Kabam, Inc. Dynamic adjustment of difficulty in an online game based on hardware or network configuration
US9555324B1 (en) 2013-07-02 2017-01-31 Kabam, Inc. Dynamic effectiveness for virtual items
US9606782B2 (en) 2013-03-21 2017-03-28 Nextbit Systems Inc. Game state synchronization and restoration across multiple devices
US20170160093A9 (en) * 2013-09-04 2017-06-08 Essilor International (Compagnie Générale d'Optique) Navigation method based on a see-through head-mounted device
US20170270713A1 (en) * 2016-03-21 2017-09-21 Accenture Global Solutions Limited Multiplatform based experience generation
US9836117B2 (en) 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
WO2017212208A1 (en) * 2016-06-08 2017-12-14 Companion Limited System providing a shared environment
US9852546B2 (en) 2015-01-28 2017-12-26 CCP hf. Method and system for receiving gesture input via virtual control objects
US20180028915A1 (en) * 2015-02-27 2018-02-01 Sony Interactive Entertainment Inc. Display control program, dislay control apparatus and display control method
US20180034867A1 (en) * 2016-07-29 2018-02-01 Jessica Ellen Zahn Private communication with gazing
US9898864B2 (en) 2015-05-28 2018-02-20 Microsoft Technology Licensing, Llc Shared tactile interaction and user safety in shared space multi-person immersive virtual reality
US9901818B1 (en) * 2016-02-19 2018-02-27 Aftershock Services, Inc. Systems and methods for regulating access to game content of an online game
US9919218B1 (en) * 2016-02-19 2018-03-20 Aftershock Services, Inc. Systems and methods for providing virtual reality content in an online game
US20180154253A1 (en) * 2016-12-01 2018-06-07 Lenovo (Singapore) Pte. Ltd. Systems and methods for presentation of content at headset based on rating
CN108159688A (en) * 2017-12-28 2018-06-15 努比亚技术有限公司 Interface sharing method, mobile terminal and computer readable storage medium
US10035068B1 (en) 2016-02-19 2018-07-31 Electronic Arts Inc. Systems and methods for making progress of a user character obtained in an online game via a non-virtual reality interface available in a virtual reality interface
US20180285050A1 (en) * 2017-03-29 2018-10-04 Disney Enterprises, Inc. Incorporating external guests into a virtual reality environment
US10096204B1 (en) * 2016-02-19 2018-10-09 Electronic Arts Inc. Systems and methods for determining and implementing platform specific online game customizations
US10134227B1 (en) 2016-02-19 2018-11-20 Electronic Arts Inc. Systems and methods for making game content from a single online game accessible to users via multiple platforms
EP3422145A1 (en) * 2017-06-28 2019-01-02 Nokia Technologies Oy Provision of virtual reality content
US20190080519A1 (en) * 2016-09-30 2019-03-14 Sony Interactive Entertainment Inc. Integration of tracked facial features for vr users in virtual reality environments
US10235965B2 (en) * 2017-03-15 2019-03-19 Kuato Games (UK) Limited Virtual reality system using an actor and director model
WO2019221952A1 (en) * 2018-05-18 2019-11-21 Microsoft Technology Licensing, Llc Viewing a virtual reality environment on a user device
US10576379B1 (en) 2016-02-19 2020-03-03 Electronic Arts Inc. Systems and methods for adjusting online game content and access for multiple platforms
US10645126B2 (en) 2018-08-23 2020-05-05 8 Bit Development Inc. System and method for enabling simulated environment collaboration across a plurality of platforms
US10725297B2 (en) 2015-01-28 2020-07-28 CCP hf. Method and system for implementing a virtual representation of a physical environment using a virtual reality environment
US10726625B2 (en) 2015-01-28 2020-07-28 CCP hf. Method and system for improving the transmission and processing of data regarding a multi-user virtual environment
US10905943B2 (en) 2013-06-07 2021-02-02 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
WO2021138161A1 (en) * 2019-12-31 2021-07-08 Snap Inc. Augmented reality objects registry
US11090561B2 (en) 2019-02-15 2021-08-17 Microsoft Technology Licensing, Llc Aligning location for a shared augmented reality experience
US11097194B2 (en) * 2019-05-16 2021-08-24 Microsoft Technology Licensing, Llc Shared augmented reality game within a shared coordinate space
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US20230032824A1 (en) * 2018-05-03 2023-02-02 Pcms Holdings, Inc. Systems and methods for physical proximity and/or gesture-based chaining of vr experiences
US11656680B2 (en) 2016-07-21 2023-05-23 Magic Leap, Inc. Technique for controlling virtual image generation system using emotional states of user
US11660536B2 (en) 2015-02-27 2023-05-30 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US11864054B1 (en) * 2017-06-20 2024-01-02 Roblox Corporation Proximity friending
US12482131B2 (en) 2023-07-10 2025-11-25 Snap Inc. Extended reality tracking using shared pose data

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657589B (en) * 2017-11-16 2021-05-14 上海麦界信息技术有限公司 Mobile phone AR positioning coordinate axis synchronization method based on three-datum-point calibration

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6425825B1 (en) * 1992-05-22 2002-07-30 David H. Sitrick User image integration and tracking for an audiovisual presentation system and methodology
US20070281765A1 (en) * 2003-09-02 2007-12-06 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
US20090215536A1 (en) * 2008-02-21 2009-08-27 Palo Alto Research Center Incorporated Location-aware mixed-reality gaming platform
US20100295771A1 (en) * 2009-05-20 2010-11-25 Microsoft Corporation Control of display objects
US20130117377A1 (en) * 2011-10-28 2013-05-09 Samuel A. Miller System and Method for Augmented and Virtual Reality

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120122570A1 (en) * 2010-11-16 2012-05-17 David Michael Baronoff Augmented reality gaming experience
US20120231887A1 (en) * 2011-03-07 2012-09-13 Fourth Wall Studios, Inc. Augmented Reality Mission Generators

Cited By (97)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9606782B2 (en) 2013-03-21 2017-03-28 Nextbit Systems Inc. Game state synchronization and restoration across multiple devices
US9002992B2 (en) * 2013-03-21 2015-04-07 Nextbit Systems Inc. Location based game state synchronization
US9061210B2 (en) 2013-03-21 2015-06-23 Nextbit Systems Inc. Synchronizing an instance of an application between multiple devices
US20140287836A1 (en) * 2013-03-21 2014-09-25 Nextbit Systems Inc. Location based game state synchronization
US10146790B2 (en) 2013-03-21 2018-12-04 Razer (Asia-Pacific) Pte. Ltd. Game state synchronization and restoration across multiple devices
US9533226B2 (en) 2013-04-29 2017-01-03 Kabam, Inc. Dynamic adjustment of difficulty in an online game based on hardware or network configuration
US9757653B1 (en) 2013-04-29 2017-09-12 Kabam, Inc. Dynamic adjustment of difficulty in an online game based on hardware or network configuration
US20140364209A1 (en) * 2013-06-07 2014-12-11 Sony Computer Entertainment America LLC Systems and Methods for Using Reduced Hops to Generate an Augmented Virtual Reality Scene Within A Head Mounted System
US10137361B2 (en) * 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
US11697061B2 (en) * 2013-06-07 2023-07-11 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
US10905943B2 (en) 2013-06-07 2021-02-02 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
US10629029B2 (en) 2013-06-27 2020-04-21 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US9721431B2 (en) 2013-06-27 2017-08-01 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US11308759B2 (en) 2013-06-27 2022-04-19 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US12283159B2 (en) * 2013-06-27 2025-04-22 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US10127769B2 (en) 2013-06-27 2018-11-13 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US9959705B2 (en) 2013-06-27 2018-05-01 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US9403093B2 (en) * 2013-06-27 2016-08-02 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US20150005052A1 (en) * 2013-06-27 2015-01-01 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US11847887B2 (en) 2013-06-27 2023-12-19 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US20240127670A1 (en) * 2013-06-27 2024-04-18 Kabam, Inc. System and method for dynamically adjusting prizes or awards based on a platform
US9555324B1 (en) 2013-07-02 2017-01-31 Kabam, Inc. Dynamic effectiveness for virtual items
US20170160093A9 (en) * 2013-09-04 2017-06-08 Essilor International (Compagnie Générale d'Optique) Navigation method based on a see-through head-mounted device
US9976867B2 (en) * 2013-09-04 2018-05-22 Essilor International Navigation method based on a see-through head-mounted device
US9451051B1 (en) * 2014-02-13 2016-09-20 Sprint Communications Company L.P. Method and procedure to improve delivery and performance of interactive augmented reality applications over a wireless network
US20160371888A1 (en) * 2014-03-10 2016-12-22 BAE Systems plc Interactive information display
US9852546B2 (en) 2015-01-28 2017-12-26 CCP hf. Method and system for receiving gesture input via virtual control objects
US10725297B2 (en) 2015-01-28 2020-07-28 CCP hf. Method and system for implementing a virtual representation of a physical environment using a virtual reality environment
US10726625B2 (en) 2015-01-28 2020-07-28 CCP hf. Method and system for improving the transmission and processing of data regarding a multi-user virtual environment
US11660536B2 (en) 2015-02-27 2023-05-30 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US9911232B2 (en) 2015-02-27 2018-03-06 Microsoft Technology Licensing, Llc Molding and anchoring physically constrained virtual environments to real-world environments
WO2016137675A1 (en) * 2015-02-27 2016-09-01 Microsoft Technology Licensing, Llc Molding and anchoring physically constrained virtual environments to real-world environments
US10596464B2 (en) * 2015-02-27 2020-03-24 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US11154778B2 (en) * 2015-02-27 2021-10-26 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US20180028915A1 (en) * 2015-02-27 2018-02-01 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US12434142B2 (en) 2015-02-27 2025-10-07 Sony Interactive Entertainment Inc. Display control program, display control apparatus and display control method
US9836117B2 (en) 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
US9898864B2 (en) 2015-05-28 2018-02-20 Microsoft Technology Licensing, Llc Shared tactile interaction and user safety in shared space multi-person immersive virtual reality
US10576379B1 (en) 2016-02-19 2020-03-03 Electronic Arts Inc. Systems and methods for adjusting online game content and access for multiple platforms
US10134227B1 (en) 2016-02-19 2018-11-20 Electronic Arts Inc. Systems and methods for making game content from a single online game accessible to users via multiple platforms
US9901818B1 (en) * 2016-02-19 2018-02-27 Aftershock Services, Inc. Systems and methods for regulating access to game content of an online game
US10096204B1 (en) * 2016-02-19 2018-10-09 Electronic Arts Inc. Systems and methods for determining and implementing platform specific online game customizations
US10183223B2 (en) * 2016-02-19 2019-01-22 Electronic Arts Inc. Systems and methods for providing virtual reality content in an online game
US9919218B1 (en) * 2016-02-19 2018-03-20 Aftershock Services, Inc. Systems and methods for providing virtual reality content in an online game
US10232271B2 (en) * 2016-02-19 2019-03-19 Electronic Arts Inc. Systems and methods for regulating access to game content of an online game
US11383169B1 (en) 2016-02-19 2022-07-12 Electronic Arts Inc. Systems and methods for adjusting online game content and access for multiple platforms
US10035068B1 (en) 2016-02-19 2018-07-31 Electronic Arts Inc. Systems and methods for making progress of a user character obtained in an online game via a non-virtual reality interface available in a virtual reality interface
US12318698B1 (en) 2016-02-19 2025-06-03 Electronic Arts Inc. Systems and methods for adjusting online game content and access for multiple platforms
US20180178116A1 (en) * 2016-02-19 2018-06-28 Electronic Arts Inc. Systems and methods for regulating access to game content of an online game
AU2017200358B2 (en) * 2016-03-21 2017-11-23 Accenture Global Solutions Limited Multiplatform based experience generation
US10642567B2 (en) 2016-03-21 2020-05-05 Accenture Global Solutions Limited Multiplatform based experience generation
US10115234B2 (en) * 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
CN107219916A (en) * 2016-03-21 2017-09-29 埃森哲环球解决方案有限公司 Generated based on multi-platform experience
US20170270713A1 (en) * 2016-03-21 2017-09-21 Accenture Global Solutions Limited Multiplatform based experience generation
WO2017212208A1 (en) * 2016-06-08 2017-12-14 Companion Limited System providing a shared environment
US11471778B2 (en) 2016-06-08 2022-10-18 Companion Limited System providing a shared environment
GB2551323A (en) * 2016-06-08 2017-12-20 Companion Ltd System providing a shared environment
US10881969B2 (en) 2016-06-08 2021-01-05 Companion Limited System providing a shared environment
GB2551323B (en) * 2016-06-08 2021-02-10 Companion Ltd System providing a shared environment
US11656680B2 (en) 2016-07-21 2023-05-23 Magic Leap, Inc. Technique for controlling virtual image generation system using emotional states of user
US12158985B2 (en) 2016-07-21 2024-12-03 Magic Leap, Inc. Technique for controlling virtual image generation system using emotional states of user
US10572005B2 (en) * 2016-07-29 2020-02-25 Microsoft Technology Licensing, Llc Private communication with gazing
US20180034867A1 (en) * 2016-07-29 2018-02-01 Jessica Ellen Zahn Private communication with gazing
US20190080519A1 (en) * 2016-09-30 2019-03-14 Sony Interactive Entertainment Inc. Integration of tracked facial features for vr users in virtual reality environments
US10636217B2 (en) * 2016-09-30 2020-04-28 Sony Interactive Entertainment Inc. Integration of tracked facial features for VR users in virtual reality environments
US20180154253A1 (en) * 2016-12-01 2018-06-07 Lenovo (Singapore) Pte. Ltd. Systems and methods for presentation of content at headset based on rating
US10252154B2 (en) * 2016-12-01 2019-04-09 Lenovo (Singapore) Pte. Ltd. Systems and methods for presentation of content at headset based on rating
US10235965B2 (en) * 2017-03-15 2019-03-19 Kuato Games (UK) Limited Virtual reality system using an actor and director model
US11294615B2 (en) * 2017-03-29 2022-04-05 Disney Enterprises, Inc. Incorporating external guests into a virtual reality environment
US20180285050A1 (en) * 2017-03-29 2018-10-04 Disney Enterprises, Inc. Incorporating external guests into a virtual reality environment
US11864054B1 (en) * 2017-06-20 2024-01-02 Roblox Corporation Proximity friending
US10970932B2 (en) 2017-06-28 2021-04-06 Nokia Technologies Oy Provision of virtual reality content
EP3422145A1 (en) * 2017-06-28 2019-01-02 Nokia Technologies Oy Provision of virtual reality content
CN108159688A (en) * 2017-12-28 2018-06-15 努比亚技术有限公司 Interface sharing method, mobile terminal and computer readable storage medium
US12541256B2 (en) 2018-05-03 2026-02-03 Interdigital Vc Holdings, Inc. Systems and methods for physical proximity and/or gesture-based chaining of VR experiences
US20230032824A1 (en) * 2018-05-03 2023-02-02 Pcms Holdings, Inc. Systems and methods for physical proximity and/or gesture-based chaining of vr experiences
US12093466B2 (en) * 2018-05-03 2024-09-17 Interdigital Vc Holdings, Inc. Systems and methods for physical proximity and/or gesture-based chaining of VR experiences
US11165837B2 (en) * 2018-05-18 2021-11-02 Microsoft Technology Licensing, Llc Viewing a virtual reality environment on a user device by joining the user device to an augmented reality session
WO2019221952A1 (en) * 2018-05-18 2019-11-21 Microsoft Technology Licensing, Llc Viewing a virtual reality environment on a user device
US10771512B2 (en) * 2018-05-18 2020-09-08 Microsoft Technology Licensing, Llc Viewing a virtual reality environment on a user device by joining the user device to an augmented reality session
EP3910905A1 (en) * 2018-05-18 2021-11-17 Microsoft Technology Licensing, LLC Viewing a virtual reality environment on a user device
US10645126B2 (en) 2018-08-23 2020-05-05 8 Bit Development Inc. System and method for enabling simulated environment collaboration across a plurality of platforms
US11044281B2 (en) 2018-08-23 2021-06-22 8 Bit Development Inc. Virtual three-dimensional user interface object having a plurality of selection options on its outer surface for interacting with a simulated environment, and system for providing a simulated environment that uses same
US11090561B2 (en) 2019-02-15 2021-08-17 Microsoft Technology Licensing, Llc Aligning location for a shared augmented reality experience
CN113840637A (en) * 2019-05-16 2021-12-24 微软技术许可有限责任公司 Shared augmented reality games within a shared coordinate space
US12172088B2 (en) * 2019-05-16 2024-12-24 Microsoft Technology Licensing, Llc Shared augmented reality game within a shared coordinate space
US20210346810A1 (en) * 2019-05-16 2021-11-11 Microsoft Technology Licensing, Llc Shared Augmented Reality Game Within a Shared Coordinate Space
US11097194B2 (en) * 2019-05-16 2021-08-24 Microsoft Technology Licensing, Llc Shared augmented reality game within a shared coordinate space
US11977553B2 (en) 2019-12-30 2024-05-07 Snap Inc. Surfacing augmented reality objects
US11429618B2 (en) 2019-12-30 2022-08-30 Snap Inc. Surfacing augmented reality objects
US12298987B2 (en) 2019-12-30 2025-05-13 Snap Inc. Surfacing augmented reality objects
US11943303B2 (en) 2019-12-31 2024-03-26 Snap Inc. Augmented reality objects registry
CN114902208A (en) * 2019-12-31 2022-08-12 斯纳普公司 Augmented reality object registry
US11343323B2 (en) 2019-12-31 2022-05-24 Snap Inc. Augmented reality objects registry
EP4614350A3 (en) * 2019-12-31 2025-11-12 Snap Inc. Augmented reality objects registry
WO2021138161A1 (en) * 2019-12-31 2021-07-08 Snap Inc. Augmented reality objects registry
US12482131B2 (en) 2023-07-10 2025-11-25 Snap Inc. Extended reality tracking using shared pose data

Also Published As

Publication number Publication date
WO2014074465A1 (en) 2014-05-15

Similar Documents

Publication Publication Date Title
US20140128161A1 (en) Cross-platform augmented reality experience
EP3491781B1 (en) Private communication by gazing at avatar
US20140125698A1 (en) Mixed-reality arena
US8894484B2 (en) Multiplayer game invitation system
US9361732B2 (en) Transitions between body-locked and world-locked augmented reality
US10373392B2 (en) Transitioning views of a virtual model
US20140240351A1 (en) Mixed reality augmentation
US9430038B2 (en) World-locked display quality feedback
US10768426B2 (en) Head mounted display system receiving three-dimensional push notification
US9956487B2 (en) Variable audio parameter setting
EP2948826A2 (en) Mixed reality filtering
JP2015116336A (en) Mixed reality arena
CN103760972A (en) Cross-platform augmented reality experience
US12354200B2 (en) Computing system and method for rendering avatars
EP2886172A1 (en) Mixed-reality arena
EP2886171A1 (en) Cross-platform augmented reality experience
KR20150071824A (en) Cross-platform augmented reality experience
JP2015116339A (en) Cross-platform augmented reality experience
KR20150071611A (en) Mixed-reality arena
JP2019033894A (en) Information processing method, apparatus, and program for causing computers to execute the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LATTA, STEPHEN;MCCULLOCH, DANIEL;SCOTT, JASON;AND OTHERS;SIGNING DATES FROM 20121025 TO 20121101;REEL/FRAME:031803/0654

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417

Effective date: 20141014

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION