
HK1163580A - System and method for simulating events in a real environment - Google Patents


Info

Publication number
HK1163580A
HK1163580A (application HK12104555.5A)
Authority
HK
Hong Kong
Prior art keywords
real
virtual
location
data
data object
Prior art date
Application number
HK12104555.5A
Other languages
Chinese (zh)
Inventor
Juan Manuel Rejen
Original Assignee
Iopener Media Gmbh
Application filed by Iopener Media Gmbh filed Critical Iopener Media Gmbh
Publication of HK1163580A


Description

System and method for simulating events in a real environment
RELATED APPLICATIONS
This application claims priority to U.S. provisional application 61/099,697, filed on September 24, 2008, the entire teachings of which are incorporated herein by reference.
Technical Field
The present invention relates generally to computer-based methods and apparatus, including computer program products, for simulating events in a real environment.
Background
Today's computer games are increasingly concerned with realism and strive to expand the link between reality and the game world. One way to achieve this is to integrate real-world objects seamlessly into the virtual environment of the game. For example, a player sitting at home plays a car racing game; however, the opponent in that race is not a non-player character but the avatar of a real car, driven by a real driver who, at that moment in time, is racing along a real route somewhere in the real world. Real-time participation in such real-world competitions is challenging because the actions of the real-world competitors are unpredictable.
Accordingly, there is a need in the art for techniques that can integrate reality with the game world to achieve an optimal gaming experience for the user.
Disclosure of Invention
One way to simulate events in a real environment is a method. The method includes determining a user location of a user control object in a virtual environment; determining a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and controlling a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
Another way to simulate events in a real environment is a method. The method includes determining projected intersections in the virtual environment between one or more real-world objects and one or more virtual objects; and determining an alternative location for each real-world object that is projected to intersect with at least one virtual object, based on the projected intersections between the one or more real-world objects and the one or more virtual objects.
Another way to simulate events in a real environment is a method. The method includes identifying a virtual location and a real-world location of a real-world object; identifying a virtual location of a virtual object; determining a projected intersection between the real-world object and the virtual object based on the virtual location of the real-world object, the real-world location of the real-world object, the virtual location of the virtual object, or any combination thereof; and modifying the virtual location of the real-world object based on the projected intersection and one or more stored virtual locations associated with the real-world object.
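As a rough illustration of the projected-intersection and alternative-location ideas above, the following Python sketch treats each object as a circle of fixed radius. The function names and the circle-based collision test are assumptions chosen for illustration; the patent does not specify a particular geometric test.

```python
import math


def projected_intersection(pos_a, pos_b, radius=1.0):
    """Check whether two objects, modeled as circles of equal radius,
    are projected to intersect at the given (x, y) positions."""
    return math.dist(pos_a, pos_b) < 2 * radius


def alternative_location(real_pos, virtual_pos, radius=1.0):
    """Determine an alternative location for the real-world object by
    pushing it out along the line joining the two objects until the
    projected intersection no longer occurs."""
    dx = real_pos[0] - virtual_pos[0]
    dy = real_pos[1] - virtual_pos[1]
    dist = math.hypot(dx, dy) or 1e-9  # guard against coincident positions
    scale = (2 * radius) / dist
    return (virtual_pos[0] + dx * scale, virtual_pos[1] + dy * scale)
```

In this sketch, an avatar that would overlap a virtual object is displaced just far enough along the separating line that the two circles no longer touch.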
Another way of simulating events in a real environment is a computer program product. The computer program product is embodied in an information carrier and comprises instructions operable to cause data processing apparatus to: determine a user location of a user control object in a virtual environment; determine a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and control a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
Another way to simulate events in a real environment is a system. The system includes a virtual data location module configured to determine a user location of a user control object in a virtual environment; a real data location module configured to determine a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and a location control module configured to control a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
Another way to simulate events in a real environment is a system. The system includes a real data location module configured to identify a virtual location and a real-world location of a real-world object; a virtual data location module configured to identify a virtual location of a virtual object; a location projection module configured to determine a projected intersection between the real-world object and the virtual object based on the virtual location of the real-world object, the real-world location of the real-world object, the virtual location of the virtual object, or any combination thereof; and a location control module configured to modify the virtual location of the real-world object based on the projected intersection and one or more stored virtual locations associated with the real-world object.
Another way to simulate events in a real environment is a system. The system includes means for determining a user location of a user control object in a virtual environment; means for determining a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and means for controlling a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
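The first method above can be sketched in code. The following Python sketch is illustrative only: the class name, the relative-coordinate mapping, and the blending scheme are assumptions for this sketch, not details taken from the patent. It maps a real-world fix into the virtual environment relative to the user control object and controls the current virtual location using the saved location history.

```python
from dataclasses import dataclass, field


@dataclass
class RealDataObject:
    """A real-world object (e.g., a race car) mirrored in the virtual world."""
    real_location: tuple                      # latest (x, y) fix from reality
    virtual_location: tuple = (0.0, 0.0)
    saved_locations: list = field(default_factory=list)  # history of fixes


def control_virtual_location(obj, user_location, smoothing=0.5):
    """Determine the object's virtual location relative to the user and
    control its current virtual location using the saved locations."""
    # Blend the newest real fix with the previous saved fix so that the
    # avatar moves smoothly instead of jumping between sparse updates.
    if obj.saved_locations:
        prev = obj.saved_locations[-1]
        blended = (prev[0] + smoothing * (obj.real_location[0] - prev[0]),
                   prev[1] + smoothing * (obj.real_location[1] - prev[1]))
    else:
        blended = obj.real_location
    obj.saved_locations.append(blended)
    # Express the controlled location relative to the user control object.
    obj.virtual_location = (blended[0] - user_location[0],
                            blended[1] - user_location[1])
    return obj.virtual_location
```

The blending step stands in for whatever smoothing or correction the location control module applies; the key structure is that the current virtual location depends on both the latest real location and the saved history.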
In other examples, any of the approaches described above may include one or more of the following features.
In some examples, the method further comprises determining whether a next real location of the real data object is available; and controlling the current virtual location of the real data object in the virtual environment based on a predefined path associated with the real environment and the determination of whether the next real location of the real data object is available.
In other examples, the method further comprises determining whether additional real locations of the real data object are available; identifying a next user location of the user control object in the virtual environment; determining one or more future virtual locations of the real data object in the virtual environment based on the determination of whether additional real locations of the real data object are available and the next user location, the one or more future virtual locations being associated with a path that moves the current virtual location to a virtual location associated with the additional real locations; and controlling the current virtual location of the real data object in the virtual environment based on the one or more future virtual locations.
In some examples, the method further comprises identifying a next user location of the user control object in the virtual environment; determining a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment; and controlling the current virtual location of the real data object based on the next virtual location and the actual distance between the next virtual location and the next user location.
In other examples, the method further comprises determining an additional virtual location of the real data object in the virtual environment based on the one or more saved real locations.
In some examples, the method further comprises identifying an additional user location of the user control object in the virtual environment; determining a virtual location of a next real data object in the virtual environment based on a real location of the next real data object in the real environment; and controlling a next virtual location of the next real data object in the virtual environment based on the virtual location, the actual distance between the virtual location and the additional user location of the user control object, and a chronological order associated with the next virtual location of the real data object.
In other examples, the method further comprises determining an additional virtual location of the real data object in the virtual environment based on the one or more saved locations, the additional virtual location being associated with a next chronological identification; and determining a next virtual location of the next real data object in the virtual environment based on one or more next saved locations and the next chronological identification.
In some examples, the method further comprises determining a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment, the next virtual location being different from the next real location and in front of the user control object; and controlling the current virtual location of the real data object based on the next virtual location of the real data object.
In other examples, the virtual location of the real data object in the virtual environment is different from the real location of the real data object in the real environment.
In some examples, the method further comprises determining a virtual location of a next real data object in the virtual environment relative to the user location of the user control object based on a real location of the next real data object in the real environment; and controlling a current virtual location of the next real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the next real data object.
In other examples, the determining of the virtual location occurs in real-time or near real-time with the movement of the real data object in the real environment.
In some examples, the method further comprises positioning each real-world object that is projected to intersect at its respective alternative location.
In other examples, the method further comprises determining whether location data for the one or more real-world objects is lost; and determining a missing location for each real-world object with lost data based on the one or more saved locations associated with the respective real-world object.
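Several of the examples above handle missing real-world data, including falling back to a predefined path associated with the real environment when the next real location is unavailable. A minimal Python sketch of that fallback (the waypoint list and names are illustrative assumptions, not the patent's representation):

```python
def control_with_fallback(next_fix, predefined_path, path_index):
    """Use the next real location fix when one is available; otherwise
    advance the object along a predefined path associated with the real
    environment (e.g., the known centerline of the race track)."""
    if next_fix is not None:
        return next_fix, path_index
    # No incoming data: step to the next waypoint of the predefined path.
    path_index = (path_index + 1) % len(predefined_path)
    return predefined_path[path_index], path_index
```

When data transmission resumes, the object's location snaps back to real fixes; a fuller implementation would also blend the transition so the avatar does not jump.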
In some examples, the system further comprises a real data location module further configured to determine whether a next real location of the real data object is available; and a location control module further configured to control the current virtual location of the real data object in the virtual environment based on a predefined path associated with the real environment and the determination of whether the next real location of the real data object is available.
In other examples, the system further comprises a real data location module further configured to determine whether additional real locations of the real data object are available; a virtual data location module further configured to identify a next user location of the user control object in the virtual environment; a location projection module configured to determine one or more future virtual locations of the real data object in the virtual environment based on the determination of whether additional real locations of the real data object are available and the next user location, the one or more future virtual locations being associated with a path that moves the current virtual location to a virtual location associated with the additional real locations; and a location control module further configured to control the current virtual location of the real data object in the virtual environment based on the one or more future virtual locations.
In some examples, the system further comprises a virtual data location module further configured to identify a next user location of the user control object in the virtual environment; a real data location module further configured to determine a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment; and a location control module further configured to control the current virtual location of the real data object based on the next virtual location and the actual distance between the next virtual location and the next user location.
In other examples, the system further comprises a real data location module further configured to determine an additional virtual location of the real data object in the virtual environment based on the one or more saved real locations.
In some examples, the system further comprises a virtual data location module further configured to identify an additional user location of the user control object in the virtual environment; a real data location module further configured to determine a virtual location of a next real data object in the virtual environment based on a real location of the next real data object in the real environment; and a location control module further configured to control a next virtual location of the next real data object in the virtual environment based on the virtual location, the actual distance between the virtual location and the additional user location of the user control object, and a chronological order associated with the next virtual location of the real data object.
In other examples, the system further comprises a real data location module further configured to: determine an additional virtual location of the real data object in the virtual environment based on the one or more saved locations, the additional virtual location being associated with a next chronological identification; and determine a next virtual location of the next real data object in the virtual environment based on one or more next saved locations and the next chronological identification.
In some examples, the system further comprises a real data location module further configured to determine a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment, the next virtual location being different from the next real location and in front of the user control object; and a location control module further configured to control the current virtual location of the real data object based on the next virtual location of the real data object.
In other examples, the system further comprises a real data location module further configured to determine a virtual location of a next real data object in the virtual environment relative to the user location of the user control object based on a next real location of the next real data object in the real environment; and a location control module further configured to control a current virtual location of the next real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the next real data object.
In some examples, the system further includes a location intersection module configured to determine projected intersections in the virtual environment between the one or more real-world objects and the one or more virtual objects; and a location projection module configured to determine an alternative location for each real-world object that is projected to intersect with at least one virtual object, based on the projected intersections between the one or more real-world objects and the one or more virtual objects.
In other examples, the system further includes a location control module configured to position each real-world object that is projected to intersect at its respective alternative location.
In some examples, the system further comprises a real data location module configured to determine whether location data for the one or more real-world objects is lost; and a location projection module further configured to determine a missing location for each real-world object with lost data based on one or more saved locations associated with the respective real-world object.
The techniques described herein for simulating events in a real environment provide one or more of the following advantages. One advantage of the event simulation is that the illusion of realism (i.e., credibility) can be preserved by implementing the techniques described herein, thereby improving the quality of the user's gaming experience. Another advantage of the event simulation is that the techniques described herein can be carried out in real time, ensuring that the data presented to the user corresponds to real-world data and thereby improving the quality of the user's gaming experience.
Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
Drawings
The above and other objects, features and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of the various embodiments, when read in conjunction with the accompanying drawings.
FIG. 1 is an illustration of an exemplary gaming system;
FIG. 2 is an illustration of another exemplary gaming system;
FIG. 3 is a block diagram of an exemplary game server;
FIG. 4 is a flow diagram of an exemplary gaming process;
FIG. 5 is another flow diagram of an exemplary gaming process;
FIG. 6 is another flow diagram of an exemplary gaming process for collision avoidance;
FIG. 7 is an illustration of an exemplary object in an exemplary gaming system;
FIG. 8 is another illustration of an exemplary object in an exemplary gaming system;
FIG. 9 is another flow diagram of an exemplary gaming process;
FIG. 10 is another illustration of an exemplary object in an exemplary gaming system;
FIG. 11 is another illustration of an exemplary object in an exemplary gaming system;
FIG. 12 is another flow diagram of an exemplary gaming process;
FIG. 13 is a screen shot of an exemplary object in another exemplary gaming system;
FIG. 14 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 15 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 16 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 17 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 18 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 19 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 20 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 21 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 22 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 23 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 24 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 25 is another screen shot of an exemplary object in another exemplary gaming system;
FIG. 26 is an illustration of another exemplary gaming system;
FIG. 27 is another flow diagram of an exemplary gaming process; and
FIG. 28 is another flow diagram of an exemplary gaming process.
Detailed Description
In general, today's computer games are increasingly concerned with realism and strive to expand the link between reality and the game world. One example of extending reality is seamlessly integrating real-world objects into the virtual environment of a game. For example, a user sitting at home plays a car racing game; however, the opponent in that race is not a non-player character but the avatar of a real car, driven by a real driver who, at that moment, is racing along a real route somewhere in the real world. The system enables real-time participation in real-world competitions, i.e., competitions actually taking place elsewhere in the world. Although real-time racing games are used as examples herein, other events, sports, and/or games can also utilize the system to integrate real-world objects into virtual environments.
As a further overview of a system for simulating events in a real environment, the system captures information from physical events (e.g., automobile races, sporting events, etc.) in which real-world objects (e.g., cars, people, bulldozers, etc.) interact with the surrounding environment and with each other. The system generates a virtual representation of the physical event (e.g., a computer simulation, a computer game, etc.), including a virtual representation of the real-world objects, and allows the end user to participate in the virtual representation by inserting a virtual object. The system may advantageously capture state information from the event to make the virtual representation of the event as realistic as possible. The end user manipulates the virtual object within the virtual representation using a control device (e.g., keyboard, mouse, joystick, steering wheel, etc.).
FIG. 1 is an illustration of an exemplary gaming system 100 for an example automobile race. The system 100 includes an automotive device 112 (e.g., a GPS receiver) that is mounted on a real-world automobile (i.e., a dynamic object). For example, the GPS receiver 112 periodically receives signals from a plurality of GPS satellites 105 and determines the location of the car throughout the race event 110. The automobile may be configured with other equipment 112 as shown, such as an Inertial Measurement Unit (IMU), telemetry, mobile radio, and/or other types of communications (e.g., WiMax, CDMA, etc.). A base station 114 (i.e. a communication scheme) is also provided locally to form a radio (communication) link with the mobile radio of the car. The base station 114 receives the information from the car and relays it to a networked server 116. The server 116 may transmit the information from the automobile to the database 132 via the network 120.
The radio transmitter sends the location information, and any other telemetry data that may be collected from the dynamic object, to the radio base station 114. Preferably, the location information is updated rapidly, such as at a rate of at least 30 Hz. Note that the relevant latency of the system 100 is not merely the delay in radio communication but the overall delay between the actual event 110 and its presentation on the client device 150.
Other event information 118, such as weather, flags, etc., is transmitted from an event information system (not shown) to the web server 116. The server 116 may transmit the event information to the database 132 via the network 120.
Preferably, the radio messages of the different dynamic vehicles are distinguishable from each other and may be separated in time or frequency. The communication between the car and the base station 114 is not limited to radio communication but can also be carried over other types of links (e.g., WiFi, WiMax, infrared light, laser, etc.).
An event toolset 134 processes the database 132 to normalize data and/or identify event scenarios. The web service 136 provides a web interface for retrieving and/or analyzing the database 132. One or more media players 138 process the database 132 to provide real-time or near real-time data streams of real-world events to the game server 142, game engine 148, and/or client devices 150. The game server 142 may process the data stream and provide simulated events to a plurality of users. A client device 150 may process the data stream and provide simulated events to the user.
The game engine 148 receives the data stream from the media player 138 via the input/output module 144 and/or Artificial Intelligence (AI) module 146. Game engine 148 processes the data stream and provides simulated events to the user.
Although FIG. 1 relates to an automobile race, the techniques are applicable to virtually any race event in which a virtual user may participate in a virtual representation of a real-world race event (e.g., sports, games, auto opens, rowing, horse racing, motorcycle racing, bicycle racing, etc.).
FIG. 2 is an illustration of another exemplary gaming system 200. The system 200 includes a media player 210, a database 212 connected to the media player 210, a network 220, a game server 230, and a game engine 240.
The game engine 240 includes an input/output module 241 and an input/output subsystem 243 for sending information to and receiving information from the networked game server 230 via the network 220. The game engine 240 also includes an input subsystem 255 for receiving user input from user controls 270 (e.g., joystick, keyboard, mouse, etc.) and an Artificial Intelligence (AI) subsystem 245 (e.g., for determining a path around a projected intersection, determining a path back to the current real-world location, etc.).
Other subsystems or modules of the game engine 240 include a script engine 244 (e.g., for executing scripts associated with the virtual environment), a timer 246, a physics engine 247 (e.g., for ensuring that objects in the virtual environment comply with real-world physical constraints and for enforcing realism by applying rules), a sound manager 248, a scene manager 249, a spatial partitioning module 250, a collision detection module 251 (e.g., for detecting potential collisions), an animation engine 252, a sound renderer 253, and a graphics renderer 254. The game engine 240 stores game data, receives in-game parameters of real-world objects from the networked game server 230, and receives in-game data from the AI subsystem 245 and data from other sources, such as user input received through the user controls 270. The game engine 240 also reads locally stored data, communicates with the game server 230, and generates graphics, sound, and other feedback to present a virtual representation of the physical event, including virtual objects. The graphics, sound, and other feedback are rendered by the game engine 240 on the user display 260.
The system 200 is capable of processing amateur competitor performance information, but it does not forward such data, directly or indirectly, to the networked server 230 or a media center. Where the system 200 relies on applications hosted on the web, such applications are downloaded from the web to the end-user client prior to use, so that any rendering of the display images is generated on the end user's console rather than on the web server.
FIG. 3 is a block diagram of an exemplary game server 330. The game server 330 includes a communication module 331, a real data location module 332, a virtual data location module 333, a location control module 334, a location projection module 335, a location intersection module 336, a location history module 337, a processor 338, and a storage device 339. The game server 330 includes various modules and/or devices for operating the game server 330. The modules and/or devices may be hardware and/or software. For example, the modules and/or devices illustrated in the game server 330 can utilize and/or include a processor (e.g., an encryption processing unit, a field-programmable gate array processing unit, etc.) for executing computer-executable instructions. It will be appreciated that the game server 330 may include other modules, devices, and/or processors and/or variations of the illustrated modules, devices, and/or processors, as is known in the art.
The communication module 331 transmits information and/or data to and from the game server 330. The real data location module 332 determines a virtual location of the real data object in the virtual environment relative to the user location based on the real location of the real data object in the real environment. The real data location module 332 can determine whether a next real location of the real data object is available (e.g., determine whether data transmission from the real data object has ceased, determine whether there is no incoming data transmission from the real data object, etc.). In some examples, the virtual location is associated with a chronological identification (e.g., time 4:34.23, time 45, etc.). In other examples, the real data location module 332 determines a virtual location of the real data object based on the one or more saved locations and the chronological identification. The real data location module 332 can also determine whether the location for one or more real-world objects is lost.
The virtual data location module 333 determines a user location of the user control object in the virtual environment. The virtual data location module 333 can identify the next user location of the user control object in the virtual environment.
The location control module 334 controls the current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object. The location control module 334 can control the current virtual location of the real data object in the virtual environment based on the predefined path associated with the real environment and the determination of whether the next real location of the real data object is available. The location control module 334 can control the current virtual location of the real data object in the virtual environment based on one or more future virtual locations. The location control module 334 can also control the current virtual location of the real data object based on the virtual location and the actual distance between the virtual location and the user location.
The location projection module 335 determines one or more future virtual locations of the real data object in the virtual environment based on the determination of whether additional real locations of the real data object are available and the next user location. The one or more future virtual locations may be associated with a path that moves the current virtual location to a virtual location associated with an additional real location.
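One plausible way for a location projection module to produce such future virtual locations is linear extrapolation from the last two saved fixes. This is a simplification for illustration; the patent does not specify the projection method.

```python
def project_future_locations(saved_locations, steps=3):
    """Linearly extrapolate future virtual locations from the last two
    saved fixes, forming a path the object can follow until additional
    real locations arrive."""
    if not saved_locations:
        return []
    if len(saved_locations) < 2:
        return [saved_locations[-1]] * steps   # no velocity estimate yet
    (x0, y0), (x1, y1) = saved_locations[-2], saved_locations[-1]
    vx, vy = x1 - x0, y1 - y0                  # per-update velocity estimate
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, steps + 1)]
```

Interpolation back toward the next real fix, once data resumes, could use the same structure with the end point known in advance.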
The location intersection module 336 determines a projected intersection in the virtual environment between the one or more real world objects and the one or more virtual objects. The location history module 337 stores the location of one or more real data objects and/or one or more user control objects. The processor 338 executes an operating system and/or any other computer-executable instructions for the game server 330.
Storage 339 stores the systems described herein and/or any other data associated with game server 330. Storage 339 may include multiple storage devices. Storage 339 may include, for example, long-term storage (e.g., hard disk drives, tape storage, flash memory, etc.), short-term storage (e.g., random access memory, graphics memory, etc.), and/or any other type of computer-readable storage.
Fig. 4 is a flow diagram 400 of an exemplary game process utilizing, for example, game server 330 of fig. 3. The communication module 331 receives (410) data associated with the real data object. The real data location module 332 checks (420) the validity of the data (e.g., correct format, correct parameters, etc.) and processes the data (e.g., converts the data to an internal storage format, converts measurements to standard units, etc.). The real data location module 332 determines (430) whether the next real location of the real data object is available (e.g., whether required data is missing, etc.). If the next location is not available, the location projection module 335 determines (435) one or more future virtual locations of the real data object (e.g., by interpolation, by extrapolation, by projection, etc.). If the next location is available, the location history module 337 stores (440) the data. The position control module 334 processes (450) the data to modify the virtual location of the real world object in the virtual environment. The communication module 331 communicates (460) the data (including the modified virtual location) to the game engine 240 of fig. 2.
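The flow of FIG. 4 can be sketched in a few lines of code. This is a hypothetical simplification, not part of the original disclosure: representing a sample as a `(time, location)` tuple, and projecting a missing sample from the last two stored ones, are illustrative assumptions.

```python
def process_frame(raw, history):
    """One pass of the FIG. 4 pipeline (hypothetical simplification:
    'raw' is a (time, location) tuple, or None when the next real
    location is unavailable)."""
    if raw is None:
        # step 435: project a future location from the last two samples
        (t1, p1), (t2, p2) = history[-2], history[-1]
        return (t2 + 1, round(p2 + (p2 - p1) / (t2 - t1), 1))
    history.append(raw)   # step 440: store the real location
    # step 450: the received location becomes the new virtual location
    return raw

history = [(0, 1.3), (1, 1.5)]
process_frame((2, 1.7), history)   # stores and returns (2, 1.7)
process_frame(None, history)       # projects (3, 1.9) from the history
```

In the sketch, the returned tuple stands in for the modified virtual location that step 460 would forward to the game engine.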
Fig. 5 is another flow diagram 500 of an exemplary game process utilizing, for example, game server 330 of fig. 3. The communication module 331 receives (510) data from one or more network components (e.g., the database 132, one or more media players 138, etc. of fig. 1). The location history module 337 stores (520) the data in storage 339. The real data location module 332 determines (530) the current mode of operation of the simulated event.
If the current mode of operation is the real mode, the communication module 331 outputs (540) the current frame to the game engine 148 of fig. 1. The virtual data location module 333 examines (542) the data of the virtual object (e.g., identifies the location of the virtual object, identifies the orientation of the virtual object, etc.). The location intersection module 336 determines (544) whether there is a projected intersection between the virtual object and the real world object. If there is no projected intersection, processing of the input data continues. If there is a projected intersection, game server 330 changes (546) the mode of operation to AI mode.
If the current mode of operation is AI mode, the real data location module 332 checks (550) the data of the virtual object (e.g., checks that the data is accurate, checks that the data is complete, etc.). The location intersection module 336 determines (552) whether there is still a projected intersection between the virtual object and the real world object. If there is still a projected intersection, the position control module 334 controls (553) the real world object in the virtual environment to take appropriate evasive action. If there is no projected intersection, the location projection module 335 determines (554) a realistic path that returns the virtual location of the real world object to its real location in the virtual environment. The position control module 334 moves (555) the virtual location of the real world object along that path. The position control module 334 determines (556) whether the virtual location matches the current real location of the real world object. If the virtual location does not match the real location, the position control module 334 continues to move the virtual location of the real world object along the path. If the virtual location matches the real location, game server 330 changes (557) the mode to real mode.
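The mode switching of FIG. 5 amounts to a small state machine. The sketch below is a hypothetical simplification: the two boolean inputs stand in for the checks the modules perform at steps 544/552 and 556.

```python
REAL, AI = "real", "AI"

def next_mode(mode, projected_intersection, back_at_real_location):
    """One step of the FIG. 5 mode logic (hypothetical simplification)."""
    if mode == REAL:
        # a projected intersection with a virtual object switches to AI mode (546)
        return AI if projected_intersection else REAL
    # AI mode: keep evading while an intersection is still projected (553);
    # otherwise steer back toward the real location (554-555) and hand
    # control back once the virtual location matches the real one (557)
    if projected_intersection:
        return AI
    return REAL if back_at_real_location else AI
```

A usage example: an object in AI mode with no remaining projected intersection stays in AI mode until it has fully returned to its real location, matching steps 554-557.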
FIG. 6 is another flow diagram 600 of an exemplary game process for collision avoidance utilizing, for example, the game server 330 of FIG. 3. The real data location module 332 identifies (610) the current location of the real world object, and the virtual data location module 333 identifies (610) the current location of the virtual object. The location projection module 335 determines (620) whether a collision is about to occur based on the current locations of the real world object and the virtual object (e.g., whether they are within a set distance, etc.). If a collision is about to occur, the position control module 334 controls (625) the location of the real world object to prevent the collision. If no collision is about to occur, the real data location module 332 determines (630) whether the virtual location of the real world object is delayed relative to the real location of the real world object.
If the virtual location is not delayed relative to the real location, the position control module 334 controls (635) the virtual location of the real world object to allow the virtual object to overtake it. If the virtual location is delayed relative to the real location, the virtual data location module 333 determines (640) whether an overtake of the virtual object by the real world object is possible. If the overtake is possible, the position control module 334 takes over (645) control of the virtual location of the real world object to avoid the collision. If the overtake is not possible, the position control module 334 controls (635) the virtual location of the real world object to allow the virtual object to overtake it.
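The decision branches of FIG. 6 can be condensed into a single function. The sketch below is illustrative only: the flag names and returned action strings are assumptions, not part of the original system.

```python
def avoidance_action(collision_imminent, virtual_delayed, overtake_possible):
    """Collision-avoidance decision sketched from FIG. 6.
    All parameter names and return values are illustrative."""
    if collision_imminent:
        return "steer real object clear"        # step 625
    if not virtual_delayed:
        return "let virtual object pass"        # step 635
    if overtake_possible:
        return "real object overtakes"          # step 645
    return "let virtual object pass"            # step 635
```

For example, when the virtual location lags its real location and a realistic overtake is possible, control is taken over so the real object can pass (step 645); otherwise the virtual object is allowed to pass (step 635).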
FIG. 7 is an illustration of exemplary objects 710, 720a, and 730a in an exemplary gaming system and illustrates user control object 710 overtaking real data objects 720a and 730a. As illustrated, each real data object 720a and 730a includes a history of one or more previous locations 720 (i.e., 720b, 720c, and 720d) and 730 (i.e., 730b, 730c, and 730d), respectively. When user control object 710 passes real data objects 720a and 730a, the real data objects are positioned at locations from their respective histories, outside the actual distance 740. In this example, each real data object 720a and 730a is positioned based on the history and chronological order of the corresponding real data object. For example, real data object 720a is positioned at location 720d (time position 3), and real data object 730a is positioned at location 730d (time position 3). In this example, the time positions of real data objects 720a and 730a being overtaken by user control object 710 are the same.
FIG. 8 is another illustration of exemplary objects 810, 820a, and 830a in an exemplary gaming system and illustrates user control object 810 overtaking real data objects 820a and 830a. As illustrated, each real data object 820a and 830a includes a history of one or more previous locations 820 (i.e., 820b, 820c, and 820d) and 830 (i.e., 830b, 830c, and 830d), respectively. In reality, real data objects 820a and 830a are overtaking user control object 810. However, because real data objects 820a and 830a are within the actual distance 840 of user control object 810, the virtual locations of real data objects 820a and 830a are held at virtual locations 820b and 830b, respectively. In this example, the virtual locations of real data objects 820a and 830a correspond to the same point in chronological order, i.e., time position 1.
Fig. 9 is another flow diagram 900 of an exemplary game process utilizing the game server 330 of fig. 3. Flow diagram 900 illustrates a user control object overtaking a real data object. The location history module 337 stores (910) the location of the real data object in storage 339 and/or any other type of storage (e.g., a storage area network, etc.). The position control module 334 determines (920) whether the real data object has been overtaken by the user control object. If not, the location history module 337 continues to store (910) the location of the real data object. If there is an overtake, the position control module 334 determines (930) whether other real data objects have already been overtaken.
If other real data objects have been overtaken, the real data location module 332 locates (935) the time frame and historical location of the real data object based on the time frame of the previously overtaken real data objects. The position control module 334 controls (937) the location of the real data object based on that time frame and historical location.
If no other real data objects have been overtaken, the real data location module 332 locates (940) a location based on the historical locations of the real data object. The position control module 334 controls (945) the location of the real data object based on the historical locations.
In some examples, the system detects an overtake by analyzing the front position of the user control object and/or the front position of the user control object plus an actual distance (e.g., a percentage of the length of the user control object, a set distance, etc.).
In other examples, a real data object becomes object X after it is overtaken by the user control object. At that point, object X and the objects Y begin to use information from historical time frames in the history list rather than the data actually received. Object X steps back through the history list until objects X and Y reach a time frame whose associated location is the actual distance behind the user control object. From that point on, object X continuously uses the historical time frame (i.e., one or more saved locations) with the relevant information to position itself at the actual distance behind the user control object. The difference between the actual time frame and the historical time frame in use is referred to as dT (also referred to as the time position).
In some examples, to maintain the positions and relative locations of all real data objects behind the user control object (i.e., the objects Y), all real data objects located behind object X rewind the same number of data frames (dT) in their respective history lists as object X. In other words, dT may be continuously the same for all real data objects behind object X. In this way, all real data objects behind the user control object occupy historical locations from the same point in time.
In other examples, the actual distance from the user control object may vary based on the user control object's position on the track, the maneuvers of the user control object, and/or simply at random. The time information (i.e., dT) may be updated accordingly based on the actual distance.
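The dT bookkeeping described in the preceding paragraphs can be sketched as follows. This is a hypothetical simplification: the class layout, history depth, and method names are illustrative assumptions.

```python
from collections import deque

class RealDataObject:
    """Hypothetical holder for a real object's saved (time frame, location)
    samples; dT is how many frames it is rewound (the "time position")."""
    def __init__(self):
        self.history = deque(maxlen=100)   # saved locations, oldest first
        self.dT = 0                        # 0 -> live data

    def record(self, time_frame, location):
        self.history.append((time_frame, location))

    def displayed_location(self):
        # dT == 0 -> latest received sample; dT > 0 -> step back dT frames
        return self.history[-1 - self.dT][1]

def sync_followers(object_x, followers):
    # every real data object behind object X rewinds by the same dT, so
    # relative spacing and chronological order are preserved
    for obj in followers:
        obj.dT = object_x.dT
```

In this sketch, setting `object_x.dT` and calling `sync_followers` corresponds to object X stepping back through its history list and the objects Y rewinding by the same amount.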
FIG. 10 is an illustration of exemplary objects 1010, 1020a, and 1030a in an exemplary gaming system and illustrates real data objects 1020a and 1030a overtaking user control object 1010. As illustrated, each real data object 1020a and 1030a includes a history of one or more previous locations 1020 (i.e., 1020b, 1020c, and 1020d) and 1030 (i.e., 1030b, 1030c, and 1030d), respectively. The virtual locations of real data objects 1020a and 1030a are at time position 3 (locations 1020d and 1030d, respectively), i.e., outside the actual distance 1040 from user control object 1010.
FIG. 11 is another illustration of exemplary objects 1110, 1120a, and 1130a in an exemplary gaming system and illustrates real data object 1120a overtaking user control object 1110. As illustrated, each real data object 1120a and 1130a includes a history of one or more previous locations 1120 (i.e., 1120b, 1120c, and 1120d) and 1130 (i.e., 1130b, 1130c, and 1130d), respectively. When the real location of real data object 1120a passes user control object 1110, the virtual location of real data object 1120a is moved back to the real location 1120a. After real data object 1120a returns to its real location, control of the timing passes to real data object 1130a (e.g., the next in chronological order, time position 2). Accordingly, the virtual location of real data object 1130a moves to virtual location 1130c, because that location is closest to real location 1130a while still outside the actual distance 1140.
Fig. 12 is another flow diagram 1200 of an exemplary game process utilizing, for example, game server 330 of fig. 3. The real data location module 332 processes (1210) each real data object behind the user control object (object X and the objects Y) that is positioned from a historical time frame (dT > 0), continuously checking whether the location of the real data object at its actual time frame is ahead of the user control object. The real data location module 332 determines (1220) whether the real data object overtakes the user control object. If the real data object does not overtake the user control object, the process continues (1210).
If the real data object does overtake the user control object, the position control module 334 determines (1230) whether the overtake can occur in a realistic and achievable manner. If the overtake cannot occur in a realistic and achievable manner, the process continues (1210). If it can, the position control module 334 lets the real data object overtake (1240) the user control object and brings the real data object back to its actual time frame and location ahead of the user control object in a realistic manner.
The real data location module 332 determines (1250) whether the real data object is object X (i.e., the first real data object behind the user control object). If the real data object is object X, the real data location module 332 designates (1260) the next real data object behind the user control object as the new object X. If the real data object is not object X, the process continues (1210). In some examples, all other real data objects behind the overtaking real data object advance through the history list (i.e., the relevant time frames and locations) at the same time, until one of them becomes the first real data object behind the user control object and becomes the new object X.
Fig. 13 is a screenshot 1300 of an exemplary object in another exemplary gaming system and illustrates a user control object 1327 in a virtual environment 1320 and a real data object 1325 corresponding to the real data object 1315 in a real environment 1310.
FIG. 14 is another screenshot 1400 of exemplary objects in another exemplary gaming system and illustrates a user control object 1427 and real data objects in a virtual environment 1420. As illustrated, the two real data objects 1412a and 1412b in the real environment 1410 are within the actual distance 1430 and are not shown behind the user control object 1427 in the virtual environment 1420.
FIG. 15 is another screenshot 1500 of an exemplary object in another exemplary gaming system and illustrates a user control object 1527 and a real data object in a virtual environment 1520. As illustrated, real data objects 1512 in real environment 1510 are within actual distance 1530 and are not shown behind user control objects 1527 in virtual environment 1520.
Fig. 16 is another screenshot 1600 of an exemplary object in another exemplary gaming system and illustrates a user control object 1627 and real data objects 1622a and 1622b in a virtual environment 1620. As illustrated, two real data objects 1612a and 1612b in real environment 1610 are partially within an actual distance. However, in this example, the two real data objects 1622a and 1622b are shown in front of the user control object 1627 in the virtual environment 1620.
FIG. 17 is another screenshot 1700 of an exemplary object in another exemplary gaming system and illustrates a real data object 1728 in a virtual environment 1720 behind a user control object 1727. As illustrated, the real location of the real data object 1712 in the real environment 1710 differs from the virtual location of the real data object 1728, as the virtual location is controlled by a history list of real data object locations.
FIG. 18 is another screenshot 1800 of an exemplary object in another exemplary gaming system and illustrates a real data object 1828 behind a user control object 1827 in a virtual environment 1820. As illustrated, the real location of the real data object 1812b in the real environment 1810 is different from the virtual location of the real data object 1828, as the virtual location is controlled by a historical list of real data object locations. Furthermore, as illustrated, real data object 1812a is not within virtual environment 1820 because the virtual positioning of real data object 1812a is beyond the reach of virtual environment 1820 (i.e., outside the visible range of user control object 1827).
FIG. 19 is another screenshot 1900 of exemplary objects in another exemplary gaming system and illustrates two real data objects 1928a and 1928b behind a user control object 1927 in a virtual environment 1920. Real data objects 1928a and 1928b follow user control object 1927 based on their respective history lists, but the time frame of the locations used is controlled by the first real data object 1928b (i.e., object X), which controls the timing of which location is utilized. The virtual locations of real data objects 1928a and 1928b differ from the real locations of real data objects 1912a and 1912b in real environment 1910 because the real locations are within the actual distance of user control object 1927 in virtual environment 1920.
FIG. 20 is another screenshot 2000 of an exemplary object in another exemplary gaming system and illustrates a real data object 2028 behind a user control object 2027 in a virtual environment 2020. The real data object 2028 follows the user control object 2027 based on the history list of the real data object 2028. The virtual positioning of the real data object 2028 is different from the real positioning of the real data object 2012 in the real environment 2010.
FIG. 21 is another screenshot 2100 of an exemplary object in another exemplary gaming system and illustrates a real data object 2128 in a virtual environment 2120 behind a user control object 2127. Real data object 2128 follows user control object 2127 based on the history list of real data objects 2128. The virtual positioning of real-data objects 2128 is different from the real positioning of real-data objects 2112 in real environment 2110.
FIG. 22 is another screenshot 2200 of exemplary objects in another exemplary gaming system and illustrates an actual distance 2230 around a user control object 2227 in a virtual environment 2220. The two real data objects 2212a and 2212b are within the actual distance 2230 of user control object 2227 when their real locations in the real environment 2210 are placed within the virtual environment 2220. In other words, if the real locations of the two real data objects 2212a and 2212b corresponded to the virtual locations of the real data objects, the virtual locations would be within the actual distance 2230 around the user control object 2227. In this example, the two real data objects are placed at locations corresponding to historical time frames of real data objects 2228a and 2228b (e.g., time position 2, two frames behind the current location).
FIG. 23 is another screenshot 2300 of exemplary objects in another exemplary gaming system and illustrates an actual distance 2330 around a user control object 2327 in a virtual environment 2320. The three real data objects 2312a, 2312b, and 2312c are within the actual distance 2330 of the user control object 2327 when their real locations in the real environment 2310 are placed within the virtual environment 2320. Thus, the three real data objects 2312a, 2312b, and 2312c are not illustrated in virtual environment 2320 because their virtual locations are out of the line of sight of the user control object 2327.
FIG. 24 is another screenshot 2400 of exemplary objects in another exemplary gaming system and illustrates an actual distance 2430 around a user control object 2427 in virtual environment 2420. The real location of real data object 2412 in real environment 2410, when placed within virtual environment 2420, is outside the actual distance 2430 of user control object 2427. As such, the real data object is placed at the virtual location of real data object 2428 in the virtual environment 2420, which corresponds to the real location of real data object 2412 in the real environment 2410.
FIG. 25 is another screenshot 2500 of exemplary objects in another exemplary gaming system and illustrates an actual distance 2530 around a user control object 2527 in a virtual environment 2520. As illustrated, the real location of real data object 2512a in real environment 2510 is within the actual distance 2530. The real data object is therefore placed at the virtual location of real data object 2528a in the virtual environment 2520 based on a historical time frame. Furthermore, because the real location of real data object 2512b in real environment 2510 is behind the real location of real data object 2512a, the virtual location of real data object 2528b is at a historical time frame corresponding to the time position of the virtual location of real data object 2528a (e.g., both real data objects 2528a and 2528b are at time position = 2).
Table 1 illustrates an exemplary history list of the locations of a real data object. Although Table 1 illustrates time in seconds and locations in miles, the history list may utilize any type of time metric (e.g., milliseconds, actual time, etc.) and/or any type of location metric (e.g., GPS coordinates, longitude/latitude, feet, etc.).
TABLE 1 History List of locations
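A history list of the kind Table 1 describes can be modeled as an append-only pair of arrays with a binary-search lookup. This is a hypothetical sketch; the units of seconds and miles are assumed from the surrounding text, and the class and method names are illustrative.

```python
import bisect

class HistoryList:
    """Table 1-style history of (time, location) rows; seconds and
    miles are assumed units (illustrative only)."""
    def __init__(self):
        self.times, self.locations = [], []

    def store(self, t, location):
        # rows arrive in chronological order, so append keeps times sorted
        self.times.append(t)
        self.locations.append(location)

    def location_at(self, t):
        # most recent saved location at or before time frame t
        i = bisect.bisect_right(self.times, t) - 1
        return self.locations[i] if i >= 0 else None
```

A lookup such as `location_at(t)` is what positioning a real data object at a historical time frame (dT frames behind the live feed) would use.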
In some examples, depending on the type of game and/or tactics allowed, the system may take over control of the real data object to let it interact with the user control object. The system may use one or more of the following parameters for the interaction:
1. the deviation from reality is as small as needed;
2. other real data objects are not affected;
3. allowing interaction;
4. the interaction is realistic (e.g., within physical limits, etc.);
5. interaction is within the expectations of the user/player; and/or
6. the interaction enhances the user/player gaming experience.
After the interaction, the system can realistically return the real data objects to their valid real locations.
The above described interactions may also occur in a virtual world, where multiple user control objects are present simultaneously. In other words, the control of the real data object by the system may occur simultaneously for multiple user control objects.
The virtual world may be a three-dimensional computer-based environment having objects, logic, rules, states, and/or goals. The virtual world may be represented graphically, be a simulated representation of a real-world environment, and/or be a computer game.
In some examples, information about the location, orientation, and state of the object is needed to represent the object in the virtual world. This information comes from the data source. The data source may be one or more of the following: i) computer input devices such as keyboards, mice, joysticks, steering wheels, game pads, and the like; ii) another computer or computer network; iii) real world objects being monitored; iv) the stored data file; v) data flowing over the network; vi) a set of algorithms to generate performance information; and/or vii) any other type of data source (e.g., database, externally generated data, internally generated data, etc.). However, it should be understood that this list is not all inclusive.
In other examples, the data source may provide information in real time and/or with a delay. If multiple objects in the virtual world obtain their behavior information from different data sources without awareness of each other, their behavior may cause the virtual world to be inconsistent with a realistic presentation (i.e., the presentation does not match the objects, logic, rules, states, and/or goals of the virtual world).
In some examples, a Real World Object (RWO) is a moving object that (1) exists in the real world, (2) has associated steering intelligence, and/or (3) is represented by an avatar within the virtual environment (world). Depending on context, RWO refers both to the object in the real world and to its avatar in the virtual world. For example, in a racing game, this is any tracked real-world race car (including its driver).
In other examples, Virtual Objects (VOs) are moving objects that (1) exist only in the virtual environment without any real-world equivalents, and/or (2) have some associated manipulation intelligence. The virtual object may be user controlled or controlled by artificial intelligence. For example, in a racing game, this is a racing car controlled by a player.
In some examples, an Artificial Intelligence (AI) module is part of the system. The AI module may change information for the object (e.g., information from a data source) in a manner that causes the representation of the object in the virtual world to match the object, logic, rules, status, and/or goals of the virtual world. The AI module may further simulate awareness of the presence of other objects that are also present in the virtual world.
The AI module may advantageously keep the deviation from the "no intervention" situation as small as possible so that the virtual world stays as close to the real world as possible. The AI module may advantageously, gradually, and consistently return the real world object to the "no intervention" situation.
FIG. 26 is a diagram of another exemplary gaming system 2600 and illustrates a racing game (i.e., a virtual world) having two cars (i.e., objects). The system 2600 includes a virtual world 2610, a data source A 2620 corresponding to a user control object, and a data source B 2630 corresponding to a real world object. The virtual world 2610 receives data from data sources A 2620 and B 2630. The virtual world 2610 communicates with the AI module 2640 to simulate real world events in the virtual world (e.g., determine intersections between objects, determine alternative paths, etc.). The virtual world 2610 includes objects 2612 (e.g., real world objects, user control objects, etc.), logic 2613 (e.g., two objects cannot occupy the same space, etc.), rules 2614 (e.g., speeds, physical laws, etc.), states 2615 (e.g., race standings, flags, etc.), and goals 2616 (e.g., finish line, exits, etc.). For example, one car is controlled by the user (i.e., data source A) and the other car is controlled by telemetry data received over the internet from the real car (i.e., data source B).
As an additional example, both cars are represented in the game. The user-controlled car A is a few meters ahead of the telemetry-controlled car B. Both cars are subject to the rules of the racing game and are rendered in accordance with the data received from their corresponding data sources.
As another example, the user applies the brakes and car A begins to decelerate. The AI module 2640 determines that a collision between car A and car B may occur. In some embodiments, such a collision is not a desired outcome of the racing game based on the logic, rules, and/or goals of the virtual environment. The AI module 2640 therefore changes the data for the objects involved: the course and speed of car B are altered in order to prevent the collision.
As an additional example, once the risk of collision according to the actual data is minimal based on the logic, rules, and/or goals, the AI module 2640 gradually adjusts the route and speed of car B so that car B returns quickly and realistically to its actual location, route, and speed.
AI module 2640 may operate, for example, in virtual environment 2610 for prediction and interpolation management and/or for avoiding overlap. The AI module 2640 advantageously predicts when two moving objects are at risk of an imminent collision. AI module 2640 may continuously monitor virtual environment 2610 and determine where objects may go given the parameters of the current situation. Via such monitoring and determination, the AI module 2640 may determine whether an evasive maneuver is required.
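The look-ahead check described above can be sketched for one-dimensional motion. This is an illustrative simplification (real racing lines are multi-dimensional, and the constant-velocity assumption is hypothetical):

```python
def collision_imminent(pos_a, vel_a, pos_b, vel_b, horizon, min_gap):
    """Flag any step within `horizon` frames where the gap between two
    objects (1-D positions, constant velocities) falls below `min_gap`."""
    for step in range(1, horizon + 1):
        a = pos_a + vel_a * step
        b = pos_b + vel_b * step
        if abs(a - b) < min_gap:
            return True
    return False

# a faster car closing on a stopped one triggers the check
collision_imminent(0.0, 1.0, 10.0, 0.0, horizon=12, min_gap=2.0)  # True
# two cars at the same speed never close the gap
collision_imminent(0.0, 1.0, 10.0, 1.0, horizon=12, min_gap=2.0)  # False
```

A positive result is what would prompt the module to plan an evasive maneuver for the trailing object.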
In some examples, prediction is important when the data stream received from the real-world object is interrupted. In other words, the avatar still needs to behave realistically, so AI module 2640 needs to predict the location of the real-world object based on its current location and previously known locations (i.e., historical information). Table 2 illustrates real world data points and predicted data points.
TABLE 2
Time (seconds) | Real-world location | Predicted location
0              | 1.3 miles           | -
1              | 1.5 miles           | -
2              | 1.7 miles           | -
3              | 2.1 miles           | -
4              | no data             | 2.5 miles
5              | no data             | 2.9 miles
6              | no data             | 3.3 miles
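The predicted locations in Table 2 are consistent with a simple constant-velocity extrapolation from the last two received samples. The sketch below reproduces them under that assumption (one possible projection method, not necessarily the one the system uses):

```python
def extrapolate(history, n):
    """Project n future (time, location) samples by repeating the
    velocity implied by the last two received samples."""
    (t1, p1), (t2, p2) = history[-2], history[-1]
    step = (p2 - p1) / (t2 - t1)
    return [(t2 + k, round(p2 + k * step, 1)) for k in range(1, n + 1)]

history = [(0, 1.3), (1, 1.5), (2, 1.7), (3, 2.1)]
extrapolate(history, 3)   # [(4, 2.5), (5, 2.9), (6, 3.3)], as in Table 2
```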
The AI module 2640 may advantageously interpolate intervening data points between actual data points. In other words, if the AI module 2640 only receives data points from a real-world object every three seconds, the AI module 2640 may insert data points for the real-world object in between. Table 3 illustrates real world data points and interpolated data points.
TABLE 3
Time (seconds) | Real-world location | Interpolated location
0              | 1.3 miles           | -
1              | 1.4 miles           | -
2              | no data             | 1.5 miles
3              | 1.6 miles           | -
4              | 1.7 miles           | -
5              | no data             | 1.8 miles
6              | 1.9 miles           | -
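The inserted locations in Table 3 are consistent with linear interpolation between the nearest known samples. The sketch below reproduces them under that assumption (one possible interpolation method; the function name and sample layout are illustrative):

```python
def interpolate(known, t):
    """Linearly interpolate a location at time t from the nearest
    known (time, location) samples before and after t."""
    before = max((s for s in known if s[0] <= t), key=lambda s: s[0])
    after = min((s for s in known if s[0] >= t), key=lambda s: s[0])
    if before[0] == after[0]:
        return before[1]   # t falls exactly on a known sample
    frac = (t - before[0]) / (after[0] - before[0])
    return round(before[1] + frac * (after[1] - before[1]), 1)

known = [(0, 1.3), (1, 1.4), (3, 1.6), (4, 1.7), (6, 1.9)]
interpolate(known, 2)   # 1.5, as in Table 3
interpolate(known, 5)   # 1.8, as in Table 3
```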
AI module 2640 may, for example, operate to avoid overlap between any objects at all times (objects may contact each other, but never occupy the same space). In the virtual environment 2610, real world objects are assumed to exist simultaneously in the real world and therefore never occupy the same space there. Thus, typically, only the position of the virtual object relative to the real-world objects must be tested (unless the position of a real-world object has been changed to avoid overlap).
If the virtual object and a real-world object come into close proximity (e.g., a collision is imminent), the AI module 2640 may take action to maintain realism. For example, if two cars in a racing game are close together, a real driver would initiate an avoidance maneuver to prevent colliding with the other car.
The AI module 2640 advantageously operates to maintain a goal 2616 for the virtual environment. Goal 2616 may include trustworthiness, authenticity, real-time, and/or stability of the virtual environment.
AI module 2640 may operate to preserve the illusion of trustworthiness for the user. Even when the influence of the virtual object on the current situation makes it impossible to model the actual situation precisely, the illusion should always be good enough for the player to believe it is fully real. For example, if the overlap problem were solved by simply holding the other cars back and then suddenly jumping them forward to their real positions ahead, the user would notice and the gaming experience would suffer.
The AI module 2640 may operate to preserve the illusion of authenticity. Authenticity is usually more stringent than trustworthiness and less pragmatic. As an example of the difference: suppose a speed only slightly higher than the actual top speed is needed to return to the correct situation. Strict authenticity would not allow it, but trustworthiness would, given that no user is likely to notice the difference. In this way, AI module 2640 may prioritize the goals of the virtual environment to ensure an optimally balanced user experience.
AI module 2640 may operate the virtual environment in real time and/or based on stored information. The AI module 2640 may operate in real time, with a short delay, and/or based on stored information. The AI module 2640 may operate based on stored information, for example, to provide a pay-per-view service after the actual real-world event has occurred. In other words, AI module 2640 may replay the game event multiple times based on the stored information. The AI module 2640 may further calculate solutions (e.g., pass-through maneuvers, overtaking maneuvers, etc.) in real-time (e.g., relative to actual real-world events, relative to the time frame of stored events, etc.) given only the data currently available. The AI module 2640 may calculate the next state before it is actually displayed to the user.
The AI module 2640 may operate to maintain a stable virtual environment. A stable virtual environment requires that any deviation from the data source terminates within a reasonable time and/or that overlap between displaced real-world objects is limited. For example, once a real-world object is shifted to prevent an overlap with a virtual object, the shifted real-world object may come to overlap with another real-world object in the virtual environment. In this way, the displacement of real-world objects can become unstable, with each displacement triggering another displacement, and so on. The AI module 2640 operates to ensure that such displacement chains terminate and, preferably, that unnecessarily many real-world objects are not displaced. In this way, the AI module 2640 operates to make the virtual environment behave as closely as possible to reality.
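The bounded displacement behavior described above can be sketched as follows, assuming simple circular bounding regions; all names (`Obj`, `resolve_overlaps`) and the pass limit are illustrative assumptions, not the patented method:

```python
import math

class Obj:
    def __init__(self, x, y, radius, movable=True):
        self.x, self.y, self.radius, self.movable = x, y, radius, movable

def overlap(a, b):
    return math.hypot(a.x - b.x, a.y - b.y) < a.radius + b.radius

def resolve_overlaps(objects, max_passes=10):
    """Shift movable objects apart until no overlaps remain or the pass
    limit is reached, guaranteeing the displacement chain terminates."""
    for _ in range(max_passes):
        moved = False
        for i, a in enumerate(objects):
            for b in objects[i + 1:]:
                if not overlap(a, b):
                    continue
                if not a.movable and not b.movable:
                    continue            # nothing we can displace
                target = a if a.movable else b
                other = b if target is a else a
                d = math.hypot(target.x - other.x, target.y - other.y) or 1e-9
                push = (target.radius + other.radius - d) + 0.01
                # push the movable object away along the separation axis
                target.x += (target.x - other.x) / d * push
                target.y += (target.y - other.y) / d * push
                moved = True
        if not moved:
            return True    # stable: no overlaps left
    return False           # pass limit hit; chain forcibly terminated
```

The hard pass limit is what guarantees the displacement chain terminates even when each shift triggers another.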
Fig. 27 is another flowchart 2700 of an exemplary game process utilizing, for example, the AI module 2640 of fig. 26. The AI module 2640 receives (2710) data associated with real-world objects. The AI module 2640 processes (2720) the received data and associates (2730) the processed data with the real-world objects. The AI module 2640 determines (2740) whether data for a real-world object is missing (i.e., unavailable). If data is not available for the real-world object, the AI module 2640 estimates (2745) the missing data (e.g., by interpolation). If data is available, the AI module 2640 determines (2750) whether there are any intersections or projected intersections between the real-world objects and/or the user-controlled objects. If there is no intersection or projected intersection, processing continues (2710). If there is an intersection or projected intersection, the AI module 2640 determines (2755) an alternative location for each real-world object in the intersection or projected intersection.
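The flow of flowchart 2700 can be sketched as a single per-frame function. The helper logic here (linear extrapolation for missing data, an axis-aligned proximity test, a fixed sideways alternative location) is an illustrative assumption rather than the claimed method:

```python
def game_frame(received, objects, user_positions, history):
    """received: {obj_id: (x, y)} from the feed, possibly incomplete (2710).
    objects: ids of all real-world objects; history: {obj_id: [(x, y), ...]}.
    Returns {obj_id: (x, y)} positions to render for this frame."""
    out = {}
    for oid in objects:
        if oid in received:                      # (2720)/(2730)
            pos = received[oid]
        elif history.get(oid):                   # (2740): data is missing
            pts = history[oid]
            if len(pts) >= 2:                    # (2745): extrapolate from
                (x0, y0), (x1, y1) = pts[-2], pts[-1]   # the last two fixes
                pos = (2 * x1 - x0, 2 * y1 - y0)
            else:
                pos = pts[-1]
        else:
            continue                             # no data at all yet
        history.setdefault(oid, []).append(pos)
        out[oid] = pos
    for oid, (x, y) in list(out.items()):        # (2750): projected intersection
        for ux, uy in user_positions:
            if abs(x - ux) < 1.0 and abs(y - uy) < 1.0:
                out[oid] = (x, y + 2.0)          # (2755): alternative location
    return out
```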
Fig. 28 is another flow diagram 2800 of an exemplary game process utilizing, for example, the AI module 2640 of fig. 26. The AI module 2640 identifies (2810) a real-world object whose virtual location in the virtual environment does not correspond to the real-world location of that object. The AI module 2640 determines (2820) whether the identified real-world object can return to its real-world location. If it cannot, processing continues (2810). If it can, the AI module 2640 causes the real-world object to return (2830) to its real-world location in a manner consistent with reality (e.g., speed constraints, location constraints, etc.).
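The constrained return of flowchart 2800 can be sketched as a per-frame step that caps movement at a plausible speed so the object never jumps; the function name and parameters are illustrative assumptions:

```python
import math

def step_toward(current, target, max_step):
    """Move from current toward the real-world target position, limited to
    max_step per frame, so the return looks like a maneuver, not a jump."""
    dx, dy = target[0] - current[0], target[1] - current[1]
    d = math.hypot(dx, dy)
    if d <= max_step:
        return target                         # (2830) real position reached
    return (current[0] + dx / d * max_step,
            current[1] + dy / d * max_step)
```

Applying the step repeatedly moves the displaced object back to its real-world location under the stated speed constraint.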
In some examples, the AI module 2640 may operate to predict collisions, insert data points, and/or avoid overlaps.
In some examples, the system allows a user-controlled object to compete with objects controlled by real-world information in a race and/or any other type of event. The information is presented to the user in such a way that the user feels that he/she is actually participating in that race. The user-controlled object may be presented in the area of the real data objects while maintaining the relative positioning of the real data objects in front of and/or behind the user-controlled object, as in the real world.
The interaction between real data objects and user-controlled objects may be managed, for example, by a client utilizing an Artificial Intelligence (AI) engine (also referred to as an AI module). The AI engine includes, for example, a collision detection module to manage (i.e., prevent) collisions between virtual race cars and real-world race cars (also referred to as GPS-managed race cars). Although the interaction between real data objects and user-controlled objects is described for a racing event, the interaction can occur in any type of event (e.g., track and field, football, dance, etc.) that can include real-world objects and virtual objects.
In some examples, the interaction between real-world objects and virtual objects is managed using a polygon channel projected from the virtual car according to the speed and/or heading of the virtual car. When the end user places the virtual car in close proximity to one of the GPS-managed cars, the polygon channel intersects that GPS-managed car, identifying a potential collision between the two vehicles.
In other examples, interactions between real-world objects and virtual objects are managed using actual distance zones (e.g., dynamically generated distances, predetermined distances, etc.) and/or the history of the real data objects. When the end user places the virtual car in close proximity to one of the GPS-managed cars, that GPS-managed car enters the actual distance zone of the virtual car, identifying a potential collision between the vehicles.
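Both proximity tests can be sketched in simplified 2D form; the channel dimensions, the speed scaling, and all thresholds below are illustrative assumptions, not values from the source:

```python
import math

def in_distance_zone(virtual_pos, real_pos, zone_radius):
    """Actual-distance-zone test: a GPS-managed car entering the zone
    around the virtual car flags a potential collision."""
    return math.dist(virtual_pos, real_pos) <= zone_radius

def in_projected_channel(virtual_pos, heading_deg, speed, real_pos, width=2.0):
    """Polygon-channel test: project a rectangular channel ahead of the
    virtual car, its length scaling with speed, and check whether the
    real car falls inside it."""
    h = math.radians(heading_deg)
    fwd = (math.cos(h), math.sin(h))
    left = (-fwd[1], fwd[0])
    rel = (real_pos[0] - virtual_pos[0], real_pos[1] - virtual_pos[1])
    along = rel[0] * fwd[0] + rel[1] * fwd[1]     # distance ahead of the car
    across = rel[0] * left[0] + rel[1] * left[1]  # lateral offset
    length = 1.0 + 0.5 * speed                    # channel grows with speed
    return 0.0 <= along <= length and abs(across) <= width / 2
```

A real implementation would use the actual polygon geometry projected from the car model, but the ahead/lateral decomposition shown here is the core of the test.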
For example, upon detecting a potential collision, the AI engine temporarily takes over control of the GPS-managed car, operating it in an autonomous mode. The AI engine may initiate an overtaking sequence, determining whether it is advisable to overtake the virtual car at the particular point on the track and whether the overtake can be accomplished at a sensible speed given that location on the track. If the AI engine decides that the autonomous car should overtake the virtual car, the AI engine performs the overtaking sequence, passing the virtual car and recalculating its position frame by frame. When the autonomous car completes the overtake, the car is reset to the actual location of the GPS-managed car. This reset occurs gradually over a series of frames to provide a smooth and realistic transition. Once the autonomous car reaches the location of the GPS-managed car, the car is again driven by GPS data from the real-world car.
In some examples, the AI engine manages the overtaking of a virtual object by a real-world object. In the racing game example, an overtaking problem occurs when a real-world car is behind a virtual car and the real-world car is traveling faster than the virtual car. Left uncorrected, the real-world car would have to pass through the virtual car, which is of course not a realistic situation. In this example, control of the real-world car is temporarily taken over by the AI engine. The AI engine now has several interrelated goals: the car should start from where it is currently located, should overtake the virtual car in a plausible way, should return to the track after overtaking and, most particularly, should return to a data point at the exact time that the real-world object is at that data point, while avoiding all other real-world objects and virtual objects. To do so, the system may take the following steps: (i) calculate the current distance between the projection of the virtual car onto the actual path and the real-world car; (ii) derive an offset, centered at 0, as a function f(distance). The shape of the curve should suit the application; relevant factors include the relative speed, relative size, and maneuverability of the real-world and virtual objects. The offset function should return 0 when the starting distance is given as a parameter (since no offset is applied at the beginning of the displacement). As a final requirement, the function should ensure that the objects cannot hit each other, even in tight corners; (iii) at each time step, calculate the distance between the actual positions of the cars along the actual path and use this distance as the input to the offset function.
The result of this offset function is the distance by which the car should be displaced, perpendicular to the local tangent of the actual path. The offset should be applied in the most logical direction: if the virtual car is to the left of the actual path, the offset should move the real-world car to the right.
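Steps (i)-(iii) and the perpendicular displacement can be sketched as follows. The cosine-shaped curve is one plausible choice satisfying the stated requirements (f(start distance) = 0, maximal clearance when the cars are level); the curve shape and all names are assumptions for illustration:

```python
import math

def make_offset(start_distance, max_offset):
    """Build f(distance) -> lateral displacement. f(start_distance) == 0
    (no offset as the maneuver begins); f(0) == max_offset (full
    clearance when the cars are level on the path)."""
    def f(distance):
        d = min(max(distance, 0.0), start_distance)
        t = 1.0 - d / start_distance          # 0 at the start, 1 when level
        return max_offset * (1.0 - math.cos(math.pi * t)) / 2.0
    return f

def apply_offset(real_pos, path_tangent, side, offset):
    """Displace perpendicular to the path's local tangent; side is +1 to
    move right of the path (virtual car on the left) or -1 otherwise."""
    tx, ty = path_tangent
    n = math.hypot(tx, ty) or 1.0
    # right-hand normal of the (normalized) tangent
    return (real_pos[0] + side * (ty / n) * offset,
            real_pos[1] - side * (tx / n) * offset)
```

Choosing `max_offset` at least as large as the combined half-widths of the two cars is what would satisfy the no-contact requirement; verifying it in tight corners would need the actual path curvature.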
Described herein are examples of interactions between user-controlled objects and real data objects. In these examples, a user-controlled object, a real data object, and an object X are utilized as described below. A user-controlled object is an object in the virtual world whose position and other properties are controlled by a user (e.g., a player, referee, etc.). A real data object is an object in the virtual world whose position and other properties are obtained from a real object in the real world. For each real data object, at least the position for each time frame is stored in a history list. Other information from the real data object may also be stored for that time frame (e.g., speed, heading, orientation, etc.). Object X is the first real data object behind the user-controlled object.
In some examples, the system ensures that real-world objects remain in their actual positions as much as possible, while also taking into account virtual objects. In particular, the system can ensure that the representation of real world objects (also referred to as real data objects) takes into account and reacts appropriately to virtual objects (also referred to as user control objects).
In other examples, non-stationary real world objects are referred to as dynamic objects, while stationary ones are referred to as static objects. The information captured by the system allows the system to determine, for example, where dynamic objects are, what they are doing, and/or what they represent.
In some examples, the system collects and distributes detailed information (e.g., actual location, relative location, etc.) about the location of real-world dynamic objects during the course of an event. The system may also collect status information from events (e.g., flags, symbols, weather, etc.).
In other examples, the system includes a position-locating device to continuously determine the real-world position of a dynamic object relative to static objects within the environment during the event. The position-locating device may include, for example, one or more position sensors that provide real-time position updates of the dynamic objects during the course of an event. By way of example, each dynamic object may include a respective position sensor, such as a Global Positioning System (GPS) receiver. The GPS receiver can recalculate its position at rates of up to 50 Hz. If necessary (e.g., where the end-user display refresh rate differs from the position update rate), the system can interpolate between successive fixes.
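The interpolation between successive fixes can be sketched as follows; linear interpolation is an illustrative assumption (a real system might use splines or a motion model instead):

```python
def interpolate_position(fix_a, fix_b, t):
    """fix_a, fix_b: (timestamp, x, y) from consecutive GPS updates;
    t: render time with fix_a[0] <= t <= fix_b[0].
    Returns the linearly interpolated (x, y) at time t."""
    ta, xa, ya = fix_a
    tb, xb, yb = fix_b
    if tb == ta:
        return (xa, ya)          # degenerate case: identical timestamps
    u = (t - ta) / (tb - ta)     # fraction of the way between the fixes
    return (xa + u * (xb - xa), ya + u * (yb - ya))
```

With 50 Hz fixes and, say, a 60 Hz display, each rendered frame falls between two fixes and is computed this way.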
In some examples, the dynamic object may also include additional sensors that sense other information related to the dynamic object (e.g., RPM, speed, throttle position, gear position, an Inertial Measurement Unit (IMU) that detects the current rate of acceleration and changes in rotational attributes, including pitch (rotation about the y-axis), roll (rotation about the x-axis), and yaw (rotation about the z-axis), etc.). In other examples, speed information may be derived from position rather than obtained directly from a speed sensor (such as a speedometer on the real-world object).
In some examples, the system includes features for increasing the position resolution obtained by the GPS receiver to about +/-10 cm, preferably approximately 1 cm horizontally and 2 cm vertically. Such enhancement features include, for example, Differential GPS (DGPS), Carrier-Phase Enhanced GPS (CPGPS), OmniSTAR correction messages, ground-based reference stations, NovAtel Waypoint software, and/or combination with an IMU. The system may also include one or more sensors that collect information from static objects and/or event states (e.g., flags, symbols, weather, etc.).
In some examples, some of the event information, such as weather, flags, symbols, etc., may be collected (e.g., manually, automatically with sensors, etc.) and fed into a networked server.
In other examples, a networked server may access a storage device (e.g., a database) and/or include a management terminal. All systems connected to the internet may include firewalls and/or other security measures for protection and privacy.
In some examples, the end-user gaming station receives data from the media player over the internet and/or any other type of communication network. The end-user gaming stations may include personal computers, mobile devices (e.g., mobile phones, other handheld communication and transmission devices, etc.), and/or gaming platforms (e.g., the XBOX gaming platform, the PS3 gaming platform, etc.). Although the GPS positioning scheme may include a GPS time value, the timing within the virtual representation need not be synchronized to any GPS timing information, for example.
Referring again to FIG. 1, the networked server receives all of the raw information from the dynamic objects and the local environment. At least some of this information reaches the networked server via a communication scheme that may include a radio base station and/or any other type of transceiver. The networked server stores this data in a database and filters, optimizes, and/or repairs the data as needed. For example, the networked server performs a Cyclic Redundancy Check (CRC) and checks for telecommunication interruptions. The networked server stores the data in an appropriate format for further processing (e.g., by a media player).
In some examples, the media player is a server connected to the internet and configured to retrieve event data from storage and send the data in a continuous stream to an end-user gaming station (often referred to as a game client) under the control of an end-user (i.e., player). The data may include location data, telemetry data (when available), and more generally any data obtained or derived from a physical event.
In other examples, multiple media players may be provided in a geographically dispersed arrangement (e.g., worldwide) to provide optimal connectivity to the game client. The client may retrieve streaming data from a local media center. The data stream to the game client can optionally be protected by encryption.
In some examples, the system may include one or more services, such as a receiving service, a database service, a filtering and optimization service, and/or a game server. The receiving service application runs in the background to receive raw data and store it in the database. The database service may be a standard off-the-shelf database application configured for high-volume data traffic. Several databases may be created to store information related to dynamic objects (e.g., cars), environments (e.g., tracks), and other information. The filtering and optimization service is an application that examines data stored in the database, filters out outliers, calculates and optimizes values, and fills in missing values in the database (i.e., data interruptions).
The game server is an application that makes it possible for game clients to connect to the media player. The game server sends instructions to the database controller to select which data from the database is to be transferred (a real-time or historical race). The game server also collects the selected data from the database and sends it as data packets to the connected game clients. Although fig. 1 illustrates the game server separate from the other services, the game server may be integrated into these other services, multiple game servers may operate within the system, and/or the game server may be integrated into any other portion of the system.
In other examples, the system includes features to handle small data interruptions. For example, the system uses Kalman filtering to filter and, ultimately, to predict through the small data interruptions that may be experienced due to lost or corrupted data packets. The system also counts the number of lost packets and predicts the values of the lost packets. For large data interruptions (e.g., 1-2 seconds or more, for which the Kalman filter can no longer reliably predict where a dynamic object may be), the networked server sends a signal to the client. During the interruption period, the client manages the dynamic objects in an autonomous mode, as described in more detail below. In some instances, a delay is provided and maintained between the time streaming data is received and the time such data is played or used.
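A minimal one-dimensional constant-velocity Kalman filter of the kind described, which also counts consecutive lost packets and predicts through them, can be sketched as follows; the noise parameters and class layout are illustrative assumptions:

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one coordinate."""
    def __init__(self, q=0.1, r=1.0):
        self.x = [0.0, 0.0]                  # state: [position, velocity]
        self.p = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
        self.q, self.r = q, r                # process / measurement noise
        self.lost = 0                        # consecutive lost packets

    def predict(self, dt):
        x, v = self.x
        self.x = [x + v * dt, v]             # x' = F x, F = [[1, dt], [0, 1]]
        p = self.p                           # P' = F P F^T + Q
        p00 = p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + self.q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + self.q
        self.p = [[p00, p01], [p10, p11]]
        return self.x[0]

    def update(self, z, dt):
        """Fold in a received position measurement z."""
        self.predict(dt)
        self.lost = 0
        s = self.p[0][0] + self.r            # innovation covariance
        k0, k1 = self.p[0][0] / s, self.p[1][0] / s   # Kalman gain
        y = z - self.x[0]                    # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p = self.p                           # P' = (I - K H) P
        self.p = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]
        return self.x[0]

    def on_lost_packet(self, dt):
        """Count the lost packet and predict its value from the model."""
        self.lost += 1
        return self.predict(dt)
```

When `lost` exceeds a threshold corresponding to 1-2 seconds of data, the object would be handed over to the autonomous mode instead.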
In some examples, the system includes features that allow a user to pause, rewind, and/or fast-forward an event. The pause, rewind, and/or fast-forward features may be used in recorded playback of events and/or in live broadcasts of events. For example, a user may be simulating a race car in a live race and need a break. In this example, the user may pause the simulation and continue it after the break. The user may continue from the paused position, playing from there as a recorded simulation, and/or the user may fast-forward the race back to the live simulation (e.g., resetting it based on the simulated car's past performance, jumping ahead, etc.).
In other examples, the system includes one or more client applications that connect to a networked server over a network (such as the internet) in a client-server configuration. The input to a client application is a data stream from the networked server. The exact format of the data may be defined, such as: a message ID; a car ID; the total cell state; GPS signals, etc. The client application graphically displays real-time (or near real-time) data that can be parsed and visualized in the virtual world. The application also displays areas where end users (i.e., game participants) can interact with the virtual world.
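One possible fixed-layout encoding of such a stream message can be sketched with Python's struct module; the field layout below is an illustrative assumption, since the source only names example fields (message ID, car ID, GPS data):

```python
import struct

# Assumed layout: little-endian message ID, car ID, then GPS fields.
PACKET = struct.Struct("<HHdddd")   # msg_id, car_id, timestamp, lat, lon, alt

def pack_update(msg_id, car_id, ts, lat, lon, alt):
    """Encode one position update for the stream."""
    return PACKET.pack(msg_id, car_id, ts, lat, lon, alt)

def parse_update(payload):
    """Decode one position update received by the client."""
    msg_id, car_id, ts, lat, lon, alt = PACKET.unpack(payload)
    return {"msg_id": msg_id, "car_id": car_id, "timestamp": ts,
            "lat": lat, "lon": lon, "alt": alt}
```

A real deployment would add framing, versioning, and the encryption mentioned above for the stream to the game client.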
In other examples, the client includes initialization capabilities. Such capabilities may include initializing dynamic and virtual objects within the virtual representation, initializing a graphics engine, opening a log file, and/or configuring user controls (e.g., mouse, keyboard, joystick, steering wheel, etc.). The user controls allow an end user (i.e., a player) to control a virtual object that is added to the virtual representation of the physical event. The initialization capability also handles configuration settings, such as selectable user perspective views of virtual events (e.g., a top-down view centered on the active car, or a behind-the-car view). The client also reads a set of points that describe static objects in the real-world environment, such as a race track (route).
In some examples, the representation of the local environment of the event includes location information for static objects (i.e., tracks). For example, the location information includes latitude, longitude, and altitude of points along the race track. Such points may be obtained from a terrain map (such as Google Earth) and/or any other terrain map source.
In other examples, for situations in which there is a significant data interruption (i.e., where more data is lost than the latency covers, so that interpolation is not possible), each affected GPS-managed car is temporarily controlled by the AI engine in an autonomous mode. The AI engine transfers the car, in a frame-by-frame process, from the last known GPS location onto the best possible path previously determined for the given track (e.g., an ideal path determined for a given environment such as a race track, a shortest-length path, a shortest-time path, a path defined by waypoints, along a curve, along the inside of a curve, along the outside of a curve, etc.), continuing with the last known speed, heading, and acceleration. The game engine continues attempting to receive valid data from the server. Once valid data is obtained, the AI engine moves the autonomously controlled car from the base path back to the actual location in a smooth and realistic manner, frame by frame.
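The frame-by-frame transfer onto the best possible path can be sketched as a simple blend; linear blending over a fixed frame count is an illustrative assumption (the same blend, run toward the actual fix, covers the return once valid data arrives):

```python
def blend_to_path(last_known, path_point, frame, blend_frames=30):
    """Blend the car from its last known GPS position onto the precomputed
    best path over blend_frames frames; frame counts up from the outage."""
    u = min(frame / blend_frames, 1.0)   # 0 at the outage, 1 once on the path
    return (last_known[0] + u * (path_point[0] - last_known[0]),
            last_known[1] + u * (path_point[1] - last_known[1]))
```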
In some examples, the system allows one or more end users to access event data from the networked server and to participate, through the insertion of a virtual object, in a virtual representation of a physical event that includes dynamic objects of the real world. The end user's participation may occur in real-time, or at least near real-time, with the event, using streaming event data from the networked server. The end user may also choose to use previously recorded data, obtained from the database through the networked server, to participate in a virtual rendering of an earlier event. In either case, the system provides a realistic experience for the end user through the various features described herein, as if the end user were present at the physical event, participating alongside the real-world objects.
The systems and methods described above may be implemented in digital electronic circuitry, in computer hardware, firmware, and/or software. The implementation may be as a computer program product, i.e., a computer program tangibly embodied in an information carrier, e.g., in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus. The data processing apparatus may be, for example, a programmable processor, a computer, and/or multiple computers.
A computer program can be written in any form of programming language, including compiled and/or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a subroutine, element, and/or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site.
Method steps can be performed by one or more programmable processors executing a computer program to perform functions of the invention by operating on input data and generating output. Method steps also may be performed by, and apparatus may be implemented as, dedicated logic. For example, the circuitry may be an FPGA (field programmable gate array) and/or an ASIC (application specific integrated circuit). Modules, subroutines, and software agents may refer to portions of a computer program, processor, dedicated circuitry, software, and/or hardware that implement that function.
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer can include, or be operatively coupled to receive data from and/or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
Data transmission and instructions may also occur over a communication network. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices. The information carrier may, for example, be an EPROM, an EEPROM, a flash memory device, a magnetic disk, an internal hard disk, a removable disk, a magneto-optical disk, a CD-ROM and/or a DVD-ROM disk. The processor and the memory can be supplemented by, and/or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, the techniques described above may be implemented on a computer having a display device. The display device may be, for example, a Cathode Ray Tube (CRT) and/or a Liquid Crystal Display (LCD) monitor. Interaction with the user may be, for example, displaying information to the user, and the user may provide input (e.g., interacting with user interface elements) to the computer through a keyboard and a pointing device (e.g., a mouse or a trackball). Other types of devices may be used to provide for interaction with the user. The other means may be, for example, feedback provided to the user in any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback). Input from the user may be received in any form, including acoustic, speech, and/or tactile input, for example.
The techniques described above can be implemented in a distributed computing system that includes a back-end component. The back-end component can be, for example, a data server, a middleware component, and/or an application server. The techniques described above can be implemented with a distributed computing system that includes a front-end component. The front-end component can be, for example, a client computer having a graphical user interface, a web browser through which a user can interact with the example implementations, and/or other graphical user interfaces for a delivery device. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a Local Area Network (LAN), a Wide Area Network (WAN), the internet, a wired network, and/or a wireless network.
The system may include a client and a server. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
The packet-based network may include, for example, the Internet, a carrier-grade Internet Protocol (IP) network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), a Campus Area Network (CAN), a Metropolitan Area Network (MAN), a Home Area Network (HAN), a private IP network, an IP private branch exchange (IPBX), a wireless network (e.g., a Radio Access Network (RAN), an 802.11 network, an 802.16 network, a General Packet Radio Service (GPRS) network, a HiperLAN)), and/or other packet-based networks.
The client devices may include, for example, computers with browser devices, telephones, IP phones, mobile devices (e.g., cellular phones, Personal Digital Assistant (PDA) devices, laptop computers, email devices), and/or other communication devices. The browser device includes, for example, a computer (e.g., a desktop computer, laptop computer) with a web browser (e.g., Microsoft® Internet Explorer™ available from Microsoft Corporation, Mozilla® Firefox available from Mozilla Corporation). Mobile computing devices include, for example, Personal Digital Assistants (PDAs).
The terms "comprises" and "comprising," and/or plural forms of each, are open-ended and include the listed parts and can include additional parts that are not listed. "And/or" is open-ended and includes one or more of the listed parts and combinations of the listed parts.
While the present invention has been particularly shown and described with reference to preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the invention encompassed by the appended claims.

Claims (30)

1. A method for simulating events in a real environment, the method comprising:
determining a user position of a user control object in a virtual environment;
determining a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and
controlling a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
2. The method of claim 1, further comprising:
determining whether a next real position fix of the real data object is available; and
controlling a current virtual location of the real-data object in the virtual environment based on a predefined path associated with the real environment and a determination of whether a next real location of the real-data object is available.
3. The method of claim 2, further comprising:
determining whether additional real position fixes of the real data object are available;
identifying a next user location of the user control object in the virtual environment;
determining one or more future virtual positions of the real-data object in the virtual environment based on the determination of whether additional real positions of the real-data object are available and the next user position, the one or more future virtual positions associated with a path to move the current virtual position to a virtual position associated with the additional real positions; and
controlling a current virtual location of the real data object in the virtual environment based on the one or more future virtual locations.
4. The method of claim 1, further comprising:
identifying a next user location of the user control object in the virtual environment;
determining a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment; and
controlling a current virtual location of the real data object based on the next virtual location and an actual distance between the next virtual location and the next user location.
5. The method of claim 4, further comprising determining an additional virtual location of the real data object in the virtual environment based on the one or more saved real locations.
6. The method of claim 4, further comprising:
identifying additional user locations of the user control object in the virtual environment;
determining a virtual location of a next real-data object in the virtual environment based on a real location of the next real-data object in the real environment; and
controlling a current virtual location of a next real data object in the virtual environment based on the virtual location, an actual distance between the virtual location and an additional user location of the user-controlled object, and a chronological identification associated with the next virtual location of the real data object.
7. The method of claim 6, further comprising:
determining an additional virtual location of the real-data object in the virtual environment based on the one or more saved locations, the additional virtual location associated with a next chronological identification; and
determining a next virtual location of the next real data object in the virtual environment based on one or more next saved locations and the next chronological identification.
8. The method of claim 1, further comprising:
determining a next virtual location of the real data object in the virtual environment based on a next real location of the real data object in the real environment, the next virtual location being different from the next real location and in front of the user control object; and
controlling a current virtual location of the real data object based on a next virtual location of the real data object.
9. The method of claim 1, wherein a virtual location of the real data object in the virtual environment is different from a real location of the real data object in the real environment.
10. The method of claim 1, further comprising:
determining a virtual location of a next real data object in the virtual environment relative to a user location of the user control object based on a real location of the next real data object in the real environment; and
controlling a current virtual location of the next real-data object in the virtual environment based on the virtual location and one or more saved real locations associated with the next real-data object.
11. The method of claim 1, wherein the determining the virtual location occurs in real-time or near real-time with movement of the real data object in the real environment.
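The core operation recited in claims 1 through 11 — controlling an object's current virtual location from its latest mapped real location together with one or more saved real locations — can be illustrated with a minimal sketch. This is only one possible reading, not the claimed implementation; the blending weight, the 1-D track coordinate, and all names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float   # timestamp in seconds
    x: float   # position along a 1-D track, for simplicity

def control_virtual_location(saved, latest, now, blend=0.5):
    """Blend the latest mapped real location with a velocity
    extrapolation (dead reckoning) from the two most recent
    saved samples, yielding the controlled virtual location."""
    if len(saved) < 2:
        return latest.x                      # nothing to extrapolate from
    a, b = saved[-2], saved[-1]
    velocity = (b.x - a.x) / (b.t - a.t)     # speed from saved locations
    predicted = b.x + velocity * (now - b.t) # dead-reckoned position
    return blend * latest.x + (1 - blend) * predicted

saved = [Sample(0.0, 0.0), Sample(1.0, 10.0)]
latest = Sample(2.0, 21.0)
print(control_virtual_location(saved, latest, now=2.0))
```

Blending rather than snapping to each new real location keeps the avatar's motion smooth when real-world position updates arrive irregularly.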
12. A method for simulating events in a real environment, the method comprising:
determining projected intersections in the virtual environment between one or more real-world objects and one or more virtual objects; and
determining, based on the projected intersections between the one or more real-world objects and the one or more virtual objects, an alternative location for each real-world object projected to intersect at least one virtual object.
13. The method of claim 12, further comprising positioning each real-world object projected to intersect at its respective alternative location.
14. The method of claim 12, further comprising:
determining whether positioning data for the one or more real-world objects is lost; and
determining a missing location for each real-world object whose data is lost, based on one or more saved locations associated with the respective real-world object.
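The lost-positioning recovery of claims 14 and 28 — estimating a missing location from saved locations when live data drops out — admits a simple sketch. Linear extrapolation of the last displacement is only one possible choice, and the function name is hypothetical:

```python
def estimate_missing_location(saved_locations):
    """Estimate the next (x, y) location of an object whose live
    positioning data was lost, from its saved locations."""
    if not saved_locations:
        raise ValueError("no saved locations to extrapolate from")
    if len(saved_locations) == 1:
        return saved_locations[-1]           # hold the last known position
    (x0, y0), (x1, y1) = saved_locations[-2], saved_locations[-1]
    return (2 * x1 - x0, 2 * y1 - y0)        # continue the last displacement

print(estimate_missing_location([(0, 0), (3, 4)]))
```

In practice a game engine would repeat this each frame until real data resumes, then converge the estimated track back onto the live positions.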
15. A method for simulating events in a real environment, the method comprising:
identifying a virtual location and a real-world location of a real-world object;
identifying a virtual location of a virtual object;
determining that the real-world object is projected to intersect the virtual object based on the virtual location of the real-world object, the real-world location of the real-world object, the virtual location of the virtual object, or any combination thereof; and
modifying the virtual location of the real-world object based on the projected intersection and one or more stored virtual locations associated with the real-world object.
16. A computer program product, tangibly embodied in an information carrier, the computer program product including instructions operable to cause data processing apparatus to:
determining a user position of a user control object in a virtual environment;
determining a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and
controlling a current virtual location of the real data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real data object.
17. A system for simulating events in a real environment, the system comprising:
a virtual data positioning module configured to determine a user positioning of a user-controlled object in a virtual environment;
a real-data localization module configured to determine a virtual localization of a real-data object in the virtual environment relative to the user localization based on a real localization of the real-data object in the real environment; and
a positioning control module configured to control a current virtual positioning of the real-data object in the virtual environment based on the virtual positioning and one or more saved real locations associated with the real-data object.
18. The system of claim 17, further comprising:
a real-data location module further configured to determine whether a next real location of the real-data object is available; and
a positioning control module further configured to control a current virtual positioning of the real-data object in the virtual environment based on a predefined path associated with the real environment and a determination of whether a next real positioning of the real-data object is available.
19. The system of claim 18, further comprising:
a real-data localization module further configured to determine whether additional real localization of the real-data object is available;
a virtual data location module further configured to identify a next user location of the user control object in the virtual environment;
a location projection module configured to determine one or more future virtual locations of the real-data object in the virtual environment based on the determination of whether additional real locations of the real-data object are available and the next user location, the one or more future virtual locations associated with a path to move the current virtual location to a virtual location associated with the additional real locations; and
a positioning control module further configured to control a current virtual positioning of the real-data object in the virtual environment based on the one or more future virtual locations.
20. The system of claim 17, further comprising:
a virtual data location module further configured to identify a next user location of the user control object in the virtual environment;
a real-data localization module further configured to determine a next virtual localization of the real-data object in the virtual environment based on a next real localization of the real-data object in the real environment; and
a positioning control module further configured to control a current virtual positioning of the real-data object based on the next virtual positioning and an actual distance between the next virtual positioning and the next user positioning.
21. The system of claim 20, further comprising a real data positioning module further configured to determine an additional virtual positioning of the real data object in the virtual environment based on the one or more saved real positions.
22. The system of claim 20, further comprising:
a virtual data location module further configured to identify additional user locations of the user control object in the virtual environment;
a real-data localization module further configured to determine a virtual localization of a next real-data object in the virtual environment based on a real localization of the next real-data object in the real environment; and
a positioning control module further configured to control a current virtual location of the next real-data object in the virtual environment based on the virtual location, an actual distance between the virtual location and the additional user location of the user-controlled object, and a chronological identification associated with the next virtual location of the real-data object.
23. The system of claim 22, further comprising:
a real data location module further configured to:
determining an additional virtual location of the real-data object in the virtual environment based on the one or more saved locations, the additional virtual location associated with a next chronological identification; and
determining a next virtual location of the next real data object in the virtual environment based on one or more next saved locations and the next chronological identification.
24. The system of claim 17, further comprising:
a real-data positioning module further configured to determine a next virtual positioning of the real-data object in the virtual environment based on a next real positioning of the real-data object in the real environment, the next virtual positioning being different from the next real positioning and in front of the user-controlled object; and
a positioning control module further configured to control a current virtual positioning of the real-data object based on a next virtual positioning of the real-data object.
25. The system of claim 17, further comprising:
a real-data positioning module further configured to determine a virtual location of a next real-data object in the virtual environment relative to a user location of the user control object based on a real location of the next real-data object in the real environment; and
a positioning control module further configured to control a current virtual positioning of the next real-data object in the virtual environment based on the virtual positioning and one or more saved real positions associated with the next real-data object.
26. A system for simulating events in a real environment, the system comprising:
a positioning intersection module configured to determine projected intersections in the virtual environment between one or more real-world objects and one or more virtual objects; and
a location projection module configured to determine, based on the projected intersections between the one or more real-world objects and the one or more virtual objects, an alternative location for each real-world object projected to intersect at least one virtual object.
27. The system of claim 26, further comprising a position control module configured to position each real-world object projected to intersect at its respective alternative location.
28. The system of claim 26, further comprising:
a real-data location module configured to determine whether a location for the one or more real-world objects is lost; and
a location projection module further configured to determine a missing location for each real-world object that loses data based on one or more saved locations associated with the respective real-world object.
29. A system for simulating events in a real environment, the system comprising:
a real-data location module configured to identify a virtual location and a real-world location of a real-world object;
a virtual data location module configured to identify a virtual location of a virtual object;
a location projection module configured to determine that the real-world object is projected to intersect the virtual object based on the virtual location of the real-world object, the real-world location of the real-world object, the virtual location of the virtual object, or any combination thereof; and
a positioning control module configured to modify the virtual location of the real-world object based on the projected intersection and one or more stored virtual locations associated with the real-world object.
30. A system for simulating events in a real environment, the system comprising:
means for determining a user location of a user control object in a virtual environment;
means for determining a virtual location of a real data object in the virtual environment relative to the user location based on a real location of the real data object in the real environment; and
means for controlling a current virtual location of the real-data object in the virtual environment based on the virtual location and one or more saved real locations associated with the real-data object.
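The projected-intersection claims (12, 15, 26, and 29) can be illustrated with a minimal sketch that detects an upcoming overlap between a real-world object's projected path and a virtual object, and computes an alternative location just outside the overlap. The time horizon, sampling resolution, collision radius, and all names are hypothetical choices for illustration, not part of the claims:

```python
import math

def projected_intersects(pos, vel, horizon, obstacle, radius):
    """Return True if a point moving from `pos` with velocity `vel`
    passes within `radius` of `obstacle` within `horizon` seconds."""
    steps = 20
    for i in range(steps + 1):
        t = horizon * i / steps
        p = (pos[0] + vel[0] * t, pos[1] + vel[1] * t)
        if math.dist(p, obstacle) < radius:
            return True
    return False

def alternative_location(pos, obstacle, radius):
    """Push a position radially to just outside the obstacle's radius."""
    d = math.dist(pos, obstacle)
    if d >= radius:
        return pos                                   # no overlap to resolve
    if d == 0:
        return (obstacle[0] + radius, obstacle[1])   # arbitrary direction
    scale = radius / d
    return (obstacle[0] + (pos[0] - obstacle[0]) * scale,
            obstacle[1] + (pos[1] - obstacle[1]) * scale)

obstacle, radius = (5.0, 0.0), 1.0
print(projected_intersects((0.0, 0.0), (1.0, 0.0), 6.0, obstacle, radius))
print(alternative_location((5.0, 0.5), obstacle, radius))
```

A radial push-out is the simplest resolution; a production system could instead pick the alternative location along the track, as the claims' "saved locations" language suggests.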
HK12104555.5A 2008-09-24 2009-09-24 System and method for simulating events in a real environment HK1163580A (en)

Applications Claiming Priority (1)

Application Number: US 61/099,697 — Priority/Filing Date: 2008-09-24

Publications (1)

Publication Number: HK1163580A — Publication Date: 2012-09-14

Similar Documents

Publication Publication Date Title
KR20110069824A (en) System and method for simulating events in real environment
CN112807681B (en) Game control method, game control device, electronic equipment and storage medium
US8160994B2 (en) System for simulating events in a real environment
JP7629942B2 (en) Methods of haptic response and interaction
CN100363074C (en) Spatial position sharing system, data sharing system, network game system, and client for network game
TWI891997B (en) Method and non-transitory computer-readable storage medium for panoptic segmentation forecasting for augmented reality
US11477610B2 (en) Gaming location pre-emptive loss correction
JP7747996B2 (en) Game program, computer, and game system
JP2022087213A (en) Game program, game device, and game system
KR20240018476A (en) Systems and methods for facilitating virtual participation in racing events
US12465857B2 (en) Systems and methods of processing player interactions in a multiplayer virtual game space
US20250073595A1 (en) Systems and methods of processing player interactions in a multiplayer virtual game space
JP2021040871A (en) Game program, game apparatus and game system
JP6974780B2 (en) Game programs, computers, and game systems
JP6836676B2 (en) Game equipment and game programs
JP7082302B2 (en) Game programs, game devices, and game systems
GB2518602A (en) Systems and methods for virtual participation in a real, live event
JP2021053266A (en) Game program, computer and game system
JP2025147467A (en) program
HK1160973A (en) System for simulating events in a real environment