WO2018037269A1 - Interference mitigation for continuous game play - Google Patents
Interference mitigation for continuous game play
- Publication number
- WO2018037269A1 (PCT/IB2016/057845)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cameras
- server
- control system
- tracking
- optimum
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- This invention relates to the field of virtual reality and in particular to systems and methods which track object locations using colored tracking marker lights and a plurality of color cameras.
- A tracking system using colored marker lights has advantages in its ability to differentiate players and objects; however, accurately tracking markers over a relatively broad space or other virtual reality environment is difficult without using a plurality of cameras.
- Using a plurality of cameras presents a challenge with respect to position calibration, coordination, and synchronization.
- Issues related to colored tracking markers conflicting with each other as players roam about a space used for the virtual reality environment also present a challenge, as markers with the same color may come into close proximity to one another.
- Many different cameras may see the same object, but some cameras frequently report the object's position more precisely than others. Therefore, a method is needed to dynamically determine, without interrupting system operation and game play, which cameras track objects more accurately, and to de-emphasize those cameras that are less accurate.
- a control system includes a memory containing machine readable medium comprising machine executable code having stored thereon instructions for operating the control system comprising at least one processor coupled to the memory, wherein in order to detect and compensate for non-optimum camera operation the control system is configured to execute the machine executable code to cause the control system to: track the movements of the at least one object; determine non-optimum behavior of one or more cameras; and compensate for the non-optimum behavior of the one or more cameras by de-emphasizing cameras with less-optimum behavior in favor of one or more cameras with more optimum behavior.
- a computerized method wherein one or more processors determine a position of an object in a virtual reality environment while de-emphasizing cameras having less than optimum positioning accuracy, the method including locating an object in images received from two or more cameras; determining one or more intersections of vectors, wherein each vector originates at a camera and passes near the object; establishing a system-wide tracking error margin that represents the maximum allowable error for any detected intersection; initializing an object position vector for the object and a total position weight for the object; for a first intersection formed between a pair of vectors, performing the steps of: (a) identifying an intersect error margin for the intersection that comprises the closest distance between the pair of vectors; (b) computing a weight by subtracting the intersect error margin from the tracking error margin and dividing the result by the tracking error margin; (c) determining a position for the intersection based on the pair of intersecting vectors; and (d) multiplying the position by the weight, and adding the product to the object position
- a system for operating a virtual reality environment including mitigation for non-optimum camera operation includes a processor; a memory containing machine readable medium comprising machine executable code having stored thereon instructions for operating the system, wherein the processor is coupled to the memory, wherein in order to detect and compensate for non-optimum camera operation, the system is configured to execute the machine executable code to cause the control system to: track the movements of the at least one object and determine an intersect error margin value for each pair of cameras with respect to each tracked object to produce a plurality of intersect error margin values respective of the object; determine non-optimum behavior of the one or more cameras by comparing the intersect error margin values for all camera pairs viewing each tracked object; and compensate for the non- optimum behavior of the one or more cameras by de-emphasizing cameras with less- optimum behavior in favor of cameras with more optimum behavior.
- FIG. 1 depicts a system comprising a plurality of cameras which track objects such as players and controllers with tracking markers attached thereto, according to an exemplary embodiment
- FIG. 2 depicts a flowchart for initial color assignment before play as well as dynamic reassignment of colors during game play, according to an exemplary embodiment
- FIG. 3 depicts a system comprising a plurality of cameras, players, and controllers connected to a hierarchical server architecture, according to an exemplary embodiment
- FIG. 4 depicts a flowchart for synchronizing a plurality of cameras with consistent and accurate location of game objects, according to an exemplary embodiment
- FIG. 5 depicts a system for position calibration of each of a plurality of cameras using a calibration object, according to an exemplary embodiment.
- FIG. 6 depicts a flowchart for initial position calibration of each of a plurality of cameras using a calibration object, according to an exemplary embodiment.
- FIG. 7 depicts a block diagram of a system of a plurality of cameras communicating with a server, according to another exemplary embodiment.
- FIG. 8 depicts a plurality of cameras observing a trackable object, with focus on a particular pair of cameras where accuracy will be analyzed.
- FIG. 9 depicts the pair of cameras of figure 8 and the trackable object while showing both top and side views of an intersection of vectors emanating from each of the two cameras of the pair.
- FIG. 10 depicts an overview of an exemplary process for determining an Intersect Error Margin for a pair of cameras observing an object.
- FIG. 11 depicts a detailed process for comparing weighted intersect error margins for a plurality of camera pairs viewing an object including determination of de-emphasis to be applied to certain cameras.
- the objects may include players, controllers, and devices related to the game or another virtual reality experience.
- One or more color cameras are used to view one or more spaces, and track positions and orientations of players and other objects according to the attached marker lights.
- a hierarchical system of servers is used to process positions and orientations of objects and provide controls as necessary for the system.
- a method for color assignment is described as well as a calibration process, and a dynamic optimization process.
- a synchronization process is also described that ensures that a plurality of cameras and attached servers are properly coordinated. Head-mounted devices may also be used in conjunction with marker lights to provide information regarding players.
- FIG. 1 depicts a system comprising a plurality of cameras which track objects such as players and controllers with tracking markers attached thereto, according to an exemplary embodiment.
- The system includes a plurality of color cameras 102 viewing one or more spaces 104 of a virtual reality environment.
- A plurality of spaces or other virtual reality environments in the same physical space may be supported by a logical or virtual division of the physical space into a plurality of virtual spaces, where a single game may be operated in one of the plurality of virtual spaces or other virtual reality environments.
- Cameras 102 may be any optical detectors capable of detecting radiation from tracking markers 108, including infrared detectors, RGB cameras, hyperspectral sensors, and others.
- The space or spaces viewed by the cameras may include any kind of space used by a user/player to participate in the virtual reality experience, which may comprise a virtual reality game or any other form of virtual reality experience.
- At least two cameras 102 may be utilized to observe the one or more spaces 104 or other virtual reality environments; however, the number of cameras 102 is not limited thereto, and a single camera or more than two cameras may be utilized to observe the one or more spaces 104.
- Cameras 102 may be connected to a hierarchical server architecture 110 which analyzes images viewed by cameras 102 and communicates with players 106 and other objects such as game controllers, simulated weapons etc., all of which include tracking markers 108 for observation by cameras 102.
- the hierarchical server architecture 110 will be described in more detail below, with reference to FIG. 3 and FIG. 4.
- Connections 112 between cameras 102 and server architecture 110 may be either hardwired such as Ethernet, or alternately wirelessly connected such as, for example, Wi-Fi connectivity.
- connection 112 is not limited thereto and other forms of establishing a network may be used.
- Communication between server architecture 110 and players 106 and other game objects for both control and sensing purposes may be performed through wireless connectivity 114 which may include Wi-Fi connectivity or other forms of wireless connectivity.
- communication between the server architecture 110 and players 106 may be performed through a wired connection.
- players 106 may carry a form of backpack PC 116 which may interface electronically with a form of head-mounted device and/or a controller or simulated weapon device carried by the player.
- Backpack PC 116 may communicate wirelessly and directly with the head-mounted device and/or the controller or simulated weapon device carried by the player; however, this form of communication is not limited thereto and the communication may be performed via a wired connection.
- An example process for initial color assignment for the tracking marker lights 108 before play, and for dynamic color reassignment for the marker lights 108 during play, is shown in FIG. 2.
- In step S202, a first marker light 108 is set to white, viewed by one or more cameras 102, and located by the tracking system depicted in FIG. 1. The first marker light 108 is then changed to a first color, for example red.
- In step S204, a next marker light 108 is set to white and located by the tracking system in the same manner as in step S202. Subsequently, this next marker light 108 is changed to a different available color, for example green.
- the tracking marker lights 108 may be other light or radiation sources, including fluorescent light sources, infrared bulbs, or other types of light sources.
- In step S206, it is determined whether all assignable colors have been assigned to marker lights 108. If not, step S204 is executed again with a next marker light 108, which is changed to a next available color, for example blue, since red and green have already been assigned. If all assignable colors have been assigned to marker lights 108, the process proceeds to step S208.
- an exemplary list of assignable colors may comprise White (R,G,B), Red (R), Blue (B), Green (G), Yellow (R,G), Cyan (B,G), Magenta (R,B). This list of assignable colors is merely exemplary and color variations in-between the listed available colors are also possible.
- In step S208, the process starts assigning colors to new, unassigned marker lights 108, where each color has already been assigned to at least one other marker light 108. As such, the system considers the distance from the new unassigned marker light 108 to the previously assigned marker lights 108 in making a color choice.
- a next unassigned marker light 108 is set to white and located in the tracking system. Subsequently its color is changed to be the same as whichever tracking marker, previously assigned with a color, is farthest from this next unassigned marker light 108.
- In step S210, it is determined whether all tracking marker lights 108 have been assigned a color. If not, step S208 is repeated until all marker lights 108 have been assigned a color. Otherwise, the process proceeds to dynamic color reassignment during operation of a game.
- In step S212, whenever a tracking marker 108 is determined during a game to have moved within a specified minimum distance of another tracking marker 108 having the same light color, the color of one of the two tracking markers is changed to another color such that the distances between markers having the same color are maximized (a sketch of this reassignment follows below).
- the specified distance may vary based on the size of the game arena.
- one of the tasks of the server architecture 110 is to keep track of all distances between tracking markers 108 having the same color, and compare those distances with the specified minimum distance.
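- As an illustration of the reassignment in step S212 and the distance bookkeeping above, the following is a minimal sketch of one possible routine. It is not the patent's implementation: the names `Marker`, `PALETTE`, and `MIN_SAME_COLOR_DISTANCE` are hypothetical, and the rule used here (recolor a conflicting marker with the color whose nearest same-colored marker is farthest away) is only one reading of the maximization described in step S212.

```python
from dataclasses import dataclass
from itertools import combinations
import math

PALETTE = ["white", "red", "blue", "green", "yellow", "cyan", "magenta"]
MIN_SAME_COLOR_DISTANCE = 2.0  # meters; the patent notes this varies with arena size

@dataclass
class Marker:
    marker_id: int
    color: str
    position: tuple  # (x, y, z) in meters

def distance(a: Marker, b: Marker) -> float:
    return math.dist(a.position, b.position)

def reassign_conflicting_colors(markers: list) -> None:
    """If two same-colored markers drift too close, recolor one of them with the
    color whose nearest same-colored marker is farthest away."""
    for a, b in combinations(markers, 2):
        if a.color != b.color or distance(a, b) >= MIN_SAME_COLOR_DISTANCE:
            continue
        best_color, best_clearance = b.color, 0.0
        for color in PALETTE:
            others = [m for m in markers if m.color == color and m is not b]
            clearance = min((distance(b, m) for m in others), default=float("inf"))
            if clearance > best_clearance:
                best_color, best_clearance = color, clearance
        b.color = best_color  # the new color would then be pushed to the physical marker light
```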
- FIG. 3 depicts a system comprising a plurality of cameras, players, and controllers connected to a hierarchical server architecture, according to an exemplary embodiment.
- one bank of color cameras 302 connects with slave tracking server 306, while another bank of color cameras 304 connects with slave tracking server 308.
- Positions and movements of game objects tracked by slave tracking servers 306 and 308 are consolidated in master server 310 which may optionally have one or more local cameras 312 connected to it.
- calibration of tracking marker 108 positions may be performed locally on the server(s) observing that tracking marker.
- the number of slave tracking servers and master server depicted in FIG. 3 is merely exemplary and not limited thereto.
- the functionality of the slave tracking server and the master tracking server may be combined into a single server, according to an exemplary embodiment.
- When a slave tracking server such as 306 or 308 receives an image, it immediately processes the image to identify any tracking markers in the optical data of the image.
- The slave tracking server 308 then immediately sends the processed data to the master server 310 and performs no further processing on that particular image, according to an exemplary embodiment. The processed data may include a pixel row and column location of the tracking marker 108, together with a time stamp and a camera identification.
- Master server 310 interfaces with game server 314, which communicates wirelessly 316 with players 106 and other devices 318, which may include, for example, controller devices such as simulated weapons, according to one exemplary embodiment. The communication may also be conducted via a wired connection, according to another exemplary embodiment.
- The master server 310 collects all the processed data from both local cameras 312 and slave servers 306 and 308. It continues to store this information until it has a complete set of data from each camera in the system, or until it receives repeated data from the same camera. Once the data set is considered complete, it performs the next stage of processing on each individual camera image to create a list of all the intersections of the data points from the cameras where the tracking marker is a match. Positions of these intersection points are then averaged to create the final processed position for each tracking marker. Where not enough information is available to create an accurate intersection, or the information conflicts within a threshold, the information may optionally be discarded.
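- The hand-off from slave servers to the master server described above could be represented with data structures along the following lines; this is an illustrative sketch, not the patent's wire format, and the field names (camera_id, pixel_row, pixel_col, timestamp) are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class MarkerObservation:
    camera_id: int
    marker_color: str
    pixel_row: int
    pixel_col: int
    timestamp: float

@dataclass
class FrameAccumulator:
    expected_cameras: set
    reports: dict = field(default_factory=dict)  # camera_id -> list of MarkerObservation

    def add(self, camera_id: int, observations: list) -> bool:
        """Store one camera's processed results; return True when the set is complete."""
        if camera_id in self.reports:
            return True  # repeated data from the same camera closes out the set, as described above
        self.reports[camera_id] = observations
        return self.expected_cameras.issubset(self.reports)
```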
- FIG. 4 depicts a flowchart illustrating this process for synchronizing a plurality of cameras with consistent and accurate locations of objects, according to an exemplary embodiment.
- In step S402, tracking markers in the space are located using cameras 302 and 304 communicating with slave servers 306, 308.
- In step S404, positions of tracking markers are communicated from the various slave servers 306, 308 to master server 310.
- In step S406, a process operating on the master server creates a list of all intersection points where a position of a first marker seen by one camera matches a position of a second marker seen by another camera.
- In step S408, for each intersection point in the list of intersection points, the positions of the first and second tracking markers are averaged to create a processed position for that intersection point, which represents a position of a composite tracking marker corresponding to both the first and second tracking markers and is used thenceforth in operation of the game.
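- A rough sketch of steps S406-S408 follows, assuming each slave server has already converted its camera reports into approximate world-space positions keyed by marker color; the function and field names are illustrative, not taken from the patent.

```python
from collections import defaultdict
import numpy as np

def consolidate_marker_positions(reports):
    """reports: iterable of (camera_id, marker_color, position) tuples gathered from the
    slave servers. Returns one averaged position per marker color seen by 2+ cameras."""
    by_color = defaultdict(list)
    for camera_id, color, position in reports:
        by_color[color].append((camera_id, np.asarray(position, dtype=float)))
    composite = {}
    for color, observations in by_color.items():
        cameras = {camera_id for camera_id, _ in observations}
        if len(cameras) < 2:
            continue  # a match requires the marker to be seen by at least two cameras (step S406)
        composite[color] = np.mean([p for _, p in observations], axis=0)  # averaging per step S408
    return composite
```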
- The master server 310 and the slave servers 306, 308 are an exemplary embodiment of the server hierarchy, in which the master server 310 may have unidirectional control over the slave servers 306, 308.
- The master and slave servers may also be incorporated into a single server which performs the functions, described herein, of both the master server 310 and the slave servers 306, 308, according to an exemplary embodiment.
- FIG. 5 depicts a system performing an initial calibration using a calibration object 502.
- one exemplary calibration configuration involves laying out a 1 meter grid on the flat ground. This could be achieved using masking tape or other available means. This grid is visible to the cameras. The grid serves as a guide for where the virtual space will be defined, and a center point is chosen in the room to be the center in the virtual space (x:0, y:0, z:0).
- A calibration device 502 is placed on the 1-meter grid.
- One exemplary configuration for the calibration device is an L-shape (90 degree angle) with arms each measuring 1 meter long (dimension "d" 504), with a colored ball or calibration marker light 506 at each end of the arm and also at the center.
- the length mentioned above is merely exemplary and a different shape and size of calibration device with a different number of marker lights 506 may be used.
- These colored balls or calibration marker lights 506 may be powered, and set to fixed colors.
- An exemplary configuration would include Green in the center, Blue on one arm and Red on the other, however different colors may be used.
- The calibration software can automatically detect the location of any camera which can see all three colored markers on the calibration device.
- the detected orientation of the calibration device is converted into a position and orientation for the camera.
- Step S602 describes placing a calibration object in a space so it is visible to one or more color cameras, the calibration object comprising at least three colored calibration marker lights 506 mounted in a specified orientation on the calibration object, wherein the calibration object is placed in the space in a specified orientation relative to the one or more cameras.
- In step S604, for each camera, a position of each of the calibration marker lights is determined in a captured image, and each of these positions is converted to a vector relative to a zero origin.
- The vectors are analyzed to determine a best fit for the position of each calibration marker light, and the detected orientation of the calibration object is converted into a position and orientation for the camera, for use thenceforth in operation of the game.
- the detected calibration for the camera can be validated by the operator as the system may also draw a dotted line over the video feed to show where it believes the grid on the floor should be. In the instance where the calibration device is not available, the cameras may be configured manually using the dotted line overlay. All camera calibration data is then stored on the tracking system server that the cameras are connected to (be it a slave server, master server or a combination of the two).
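- One way to turn the three detected calibration markers into a camera position and orientation is a standard rigid-transform (Kabsch) fit, sketched below. This assumes the tracker can already estimate each marker's 3D position in the camera's own coordinate frame, and a y-up world frame centered on the grid; it is an illustrative approach, not necessarily the patent's exact procedure.

```python
import numpy as np

# Known world positions of the L-shaped device's markers, assuming the green marker sits at the
# grid centre and the blue/red markers sit at the ends of the two 1 m arms on the ground plane.
WORLD_MARKERS = np.array([[0.0, 0.0, 0.0],   # green (centre)
                          [1.0, 0.0, 0.0],   # blue (end of one arm)
                          [0.0, 0.0, 1.0]])  # red (end of the other arm)

def camera_pose_from_markers(camera_markers: np.ndarray):
    """camera_markers: 3x3 array of the same three markers measured in the camera's own frame.
    Returns (R, t) such that world_point = R @ camera_point + t; t is the camera position."""
    cw = WORLD_MARKERS.mean(axis=0)
    cc = camera_markers.mean(axis=0)
    H = (camera_markers - cc).T @ (WORLD_MARKERS - cw)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                             # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cw - R @ cc                                      # camera origin expressed in world space
    return R, t
```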
- FIG. 7 depicts a block diagram of a gaming system 700, according to another exemplary embodiment.
- the system 700 includes Cameras 702 and 704 and VR server 724.
- the cameras, 702 and 704 may be capable of accessing the VR server 724 either directly or indirectly over a network 714.
- the cameras, 702 and 704, may access the VR server 724 over the network 714 using wireless or wired connections supporting one or more point-to-point links, shared local area networks (LAN), wide area networks (WAN), or other access technologies.
- These cameras 702 and 704 may be transmitting video, audio or other kinds of data to the VR server 724.
- The VR system 700 is a type of system that provides tracking of marker lights on players, their controllers, or other game objects using cameras 702 and 704, together with storage devices 728, 730 and multiple processors 718.
- alternate embodiments of the VR system 700 may use a single processor and storage device and the depicted embodiment is merely exemplary.
- Although FIG. 7 depicts a single server 724, the VR system may comprise multiple servers splitting up the functionalities which are performed by the depicted server 724, as described in the exemplary embodiment of FIG. 1.
- the VR server 724 may receive the location information and other action/state information regarding a user holding a controller, colors assigned to the tracking lights on the controller or other game objects etc. in a space using the cameras 702 and 704.
- the VR server 724 may be realized as a software program stored in a memory and executing on a central processing unit (CPU).
- the VR server 724 may use video images from the tracking cameras 702,704.
- The VR server 724 receives video images over video cables connected to the cameras; however, the images may also be transferred wirelessly.
- Possible video cable types include analog formats such as composite video, S-Video and VGA; and digital formats such as HDMI and DVI, however these are mere exemplary embodiments and the possibilities are not limited thereto.
- the slave server receives video images over a wireless communication connection.
- the VR server 724 may follow the procedures described in FIG. 2, FIG. 4 and FIG. 6 for assignment and reassignment of colors to the tracking marker lights, and synchronization and calibration of the cameras 702,704.
- the present disclosure emulates a real-world experience for players, and as such the experience players have is quite real, just as pilots in flight simulators experience all the aspects of flying a real airplane.
- the disclosure is deeply intertwined with computer and networking technology without which it would not be possible.
- The functions described herein are extremely time-sensitive in their operation in order to achieve a true virtual reality experience and, without an intimate integration with the described hardware, would not function properly, if at all.
- the dynamic reassignment of colors during a game based on changing distances between light markers having the same color is a function grounded in reality.
- the use of a physical calibration device to calibrate distances for each camera as well as the process for synchronizing positions among a plurality of cameras, are all concrete functionalities.
- a further embodiment of the invention deals with the identification and de-emphasis of problematic or not well calibrated cameras that are incapable of optimum identification and position tracking.
- The impact of these identified cameras on a virtual reality game is reduced on the fly without disturbing operation of a game in progress. Most of the time, a plurality of cameras will see an object, and images from those cameras are mixed to identify the object and its location. When such mixing of images happens, identification of non-optimum or poorly calibrated cameras is enabled and performed during the game. The contribution of such cameras to game play is then dynamically de-emphasized or reduced, providing higher emphasis to images from well-calibrated cameras. As such, game stoppage for re-calibration of these few cameras is not required, and game play can continue uninterrupted.
- Camera calibration includes where the system believes the camera is situated in the physical space and its orientation. This is simply stored as X, Y, Z coordinates and roll, pitch and yaw (Euler angles). A bad calibration is one where any of these values are incorrect for the camera, which can be caused by anything from a hard physical knock to tiny vibrations causing the camera to move slightly.
- A camera cannot be used in the system at all if it is not calibrated; the system typically ignores any camera which is not yet calibrated. A non-calibrated camera will have all of its values set to 0, whereas a non-optimum camera will have values that are not quite right. In an instance where only poor camera data is available for all cameras that see a tracked object, there is no good tracking data available to improve the result; in this instance, tracking quality and user experience will degrade as the tracking will not be optimum.
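- The per-camera calibration record described above might be represented as follows; the class and field names are assumptions used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class CameraCalibration:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0
    pitch: float = 0.0
    yaw: float = 0.0

    def is_calibrated(self) -> bool:
        # A never-calibrated camera still has every value at 0 and is ignored by the system.
        return any(v != 0.0 for v in (self.x, self.y, self.z, self.roll, self.pitch, self.yaw))
```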
- FIG. 8 shows a plurality of cameras including camera pair 804 and 806 which observe trackable object 802.
- Vector lines 808 and 810 emanating respectively from each camera intersect in the vicinity of trackable object 802.
- FIG. 9 shows the camera pair 804/806 of figure 8 and trackable object 802.
- FIG. 9 focuses on the intersection of vector lines 808 and 810 as they intersect - or come close to intersecting - in the vicinity of trackable object 802.
- vector lines 808 and 810 appear to be intersecting, while a cross section 904 taken vertically into the page reveals a side-view cross-section diagram 906 where in fact vector lines 808 and 810 do not intersect.
- vector lines 808 and 810 come within a distance of intersecting that is hereinafter called an Intersect Error Margin 908.
- In determining camera accuracy, and eventually which cameras should be de-emphasized because of inferior accuracy, it is this intersect error margin, determined among pairs of cameras, that is the determining factor.
- the following process describes how data is mixed to get position information.
- the process begins by getting the first/next trackable object in the list.
- the process continues by going through all the camera results to find all result data that matches that trackable object (as identified by the camera's ID).
- All the matching results are looped through, comparing the results in pairs to find where the two results of each pair intersect in space. This is done by calculating a line from each camera to the point in space where it detected the trackable marker. As the lines will almost never intersect exactly, the nearest points along both lines are calculated, and the process then finds the middle of those points. The error margin at this stage is simply calculated as the distance between those two points.
- If this error margin exceeds the system-wide tracking error margin, the intersect is discarded.
- the bad intersect is also logged to help identify poor calibration.
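- The nearest-points calculation described above can be sketched as follows, assuming each camera report has been reduced to an origin and a direction vector toward the detected marker; the function name and the parallel-ray fallback are illustrative choices, not the patent's code.

```python
import numpy as np

def intersect_midpoint_and_error(o1, d1, o2, d2):
    """o1, o2: camera origins; d1, d2: direction vectors toward the detected marker (numpy arrays).
    Returns the midpoint of the two closest points and their separation (the Intersect Error Margin)."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2   # a == c == 1 after normalisation
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                  # approaches 0 for (near-)parallel rays
    if abs(denom) < 1e-9:
        s, t = 0.0, e / c                  # fallback: project o1 onto the second ray
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    p1, p2 = o1 + s * d1, o2 + t * d2      # the nearest points on each ray
    return (p1 + p2) / 2.0, float(np.linalg.norm(p1 - p2))
```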
- Each intersect is then stored in a list along with the detected error margin. The list is then processed to find the final position result. Each result from the list is appended to the final result based on how close it is to perfect accuracy.
- NewPosition = Vector3(0, 0, 0); PositionWeight = 0;  // accumulators for the weighted position sum
- FinalPosition = NewPosition / PositionWeight;  // evaluated after every intersect has been appended
- the final processed position is then applied to the trackable marker/object.
- The same process is repeated from the start if the trackable object also has its secondary marker active.
- A secondary marker on an object also allows rotational position sensing (orientation) for the object when the position of the secondary marker is compared with the primary marker on the object. Finally, the process repeats from the start for the next trackable object.
- The system has a global variable set which is the 'Tracking Error Margin'. This variable represents the maximum allowable error for the 'Intersect Error Margin' for any detected intersection. For example, assume the error margin is set to 0.1, i.e. 10 cm.
- The system starts with an empty Object Position Vector (representing the three axes X, Y and Z in 3D space) called 'NewPosition', and the 'Total PositionWeight' starts at '0'.
- The weight for the intersect is calculated as (TrackingErrorMargin - IntersectErrorMargin) divided by the TrackingErrorMargin, where IntersectErrorMargin is the intersect's detected error. The lower the detected error margin, the higher the resulting value. Therefore, intersects with a very low error margin are given much more weight in the position calculation for the object.
- Each intersect's position is then multiplied by the Weight and added to the 'NewPosition' Object Position Vector. The Weight of this calculation is then added to the 'Total Position Weight' for later use. Once all intersects have been added together in this way, the process divides the final Object Position Vector by the final Total Position Weight, which results in a position accurate to the object's real-world position as determined by the weighting given to each position.
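- Putting the pieces together, the weighted mixing described above can be sketched as follows; the list of (position, error margin) pairs is assumed to come from the pairwise-camera step, and the names are illustrative.

```python
import numpy as np

TRACKING_ERROR_MARGIN = 0.1  # meters: the system-wide maximum allowable Intersect Error Margin

def mix_intersects(intersects):
    """intersects: list of (position, intersect_error_margin) pairs for one trackable object.
    Returns the weighted final position, or None if no usable intersects remain."""
    new_position = np.zeros(3)   # the 'NewPosition' Object Position Vector
    total_weight = 0.0           # the 'Total PositionWeight'
    for position, error_margin in intersects:
        if error_margin > TRACKING_ERROR_MARGIN:
            continue             # discarded (and optionally logged) as described above
        weight = (TRACKING_ERROR_MARGIN - error_margin) / TRACKING_ERROR_MARGIN
        new_position += weight * np.asarray(position, dtype=float)
        total_weight += weight
    if total_weight == 0.0:
        return None
    return new_position / total_weight
```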
- A simple example showing how data is mixed to get position information follows. Assume a system is tracking two objects, defined as objects 1 and 2. Cameras A, B and C are tracking the objects.
- Each camera reports a Camera Position and a Result Direction (the result direction is the direction from the camera position to the tracked object). This provides an origin and a vector to help determine the object's position.
- In step 1002, a first trackable object is located from a plurality of cameras. Then, within that plurality of cameras, pairs of cameras that observe the object are identified per step 1004. In step 1006, for each pair of cameras that observe the object, a line (vector) is traced from each camera to the object. In step 1008, a point is determined along each of the two vectors where the vectors come closest to intersecting. The distance between the two points is then recorded as an Intersect Error Margin for that intersect. Per step 1010, a list of all intersects and associated Intersect Error Margins is created relative to the object.
- FIG. 11 depicts the detailed process for evaluating one or more pairs of cameras that view each object, and as a result determines a final position for the object while de-emphasizing cameras that are poorly calibrated with respect to position.
- the process starts 1102 for each object and then locates the object 1104 with two or more cameras to establish intersections for each pair of vectors that position the object.
- the values for "Object Position Vector” and "Total Position Weight” are set to zero.
- a process is executed for each intersection to identify 1108 the "Intersect Error Margin" for that intersection.
- A Weight 1110 is computed for the intersection according to the following equation: Weight = (Tracking Error Margin - Intersect Error Margin) / Tracking Error Margin.
- The Tracking Error Margin is a system-wide parameter representing the maximum allowable error margin for intersections of vectors tracking any object.
- a position corresponding to the intersecting vectors is multiplied 1112 by the Weight value, and added to the previous value of the Object Position Vector to create a new value for the Object Position Vector.
- the final position is equal to the final Object Position Vector, after all intersects have been processed, divided by the Total Position Weight.
- the Total Position Weight is the result of step 1114 where the Weight value computed in step 1110 for each intersection has been accumulated over the processing of all intersections related to the object, to create a value for the Total Position Weight.
- the embodiments disclosed herein can be implemented as hardware, firmware, software, or any combination thereof.
- the software is preferably implemented as an application program tangibly embodied on a program storage unit or computer readable medium.
- the application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
- the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPUs"), one or more memories, and one or more input/output interfaces.
- the computer platform may also include an operating system and micro-instruction code.
- the various processes and functions described herein may be either part of the micro-instruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such computer or processor is explicitly shown.
- various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit.
- the processes are implemented as a control system which can reside within the hierarchical server architecture shown in Figure 8; however, it will be appreciated that the control system may be located on a server in a remote location, at a location in the Cloud, or distributed among a combination of the above locations, including being distributed among a plurality of servers within the hierarchical server architecture. It will be appreciated that the control system can be located anywhere as long as it has access to the data it needs to perform the processing.
- the control system typically contains a processor or CPU and memory, as discussed above.
- The memory of the control system contains the machine readable medium comprising machine executable code with instructions for operating the control system, as discussed above.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Processing Or Creating Images (AREA)
Abstract
In a system for operating a game or a virtual reality environment, a method is provided for identifying problematic or poorly calibrated cameras that are incapable of optimally identifying and tracking game objects. The impact of the identified cameras on the game is reduced on the fly. Most of the time, several cameras see an object. The images are mixed to identify the object and its location according to vectors established between cameras and trackable objects. When such mixing of images occurs, the identification of non-optimum, poorly calibrated cameras is enabled and carried out. The impact of such cameras is then reduced within the overall game, giving greater weight to images from well-calibrated cameras. In this way, it is not necessary to stop the game and recalibrate those few cameras, permitting continuous game play.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662379049P | 2016-08-24 | 2016-08-24 | |
| US62/379,049 | 2016-08-24 | ||
| US15/362,611 US10486061B2 (en) | 2016-03-25 | 2016-11-28 | Interference damping for continuous game play |
| US15/362,611 | 2016-11-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018037269A1 true WO2018037269A1 (fr) | 2018-03-01 |
Family
ID=61246582
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2016/057845 Ceased WO2018037269A1 (fr) | 2016-12-21 | Interference mitigation for continuous game play |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018037269A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10071306B2 (en) | 2016-03-25 | 2018-09-11 | Zero Latency PTY LTD | System and method for determining orientation using tracking cameras and inertial measurements |
| US10421012B2 (en) | 2016-03-25 | 2019-09-24 | Zero Latency PTY LTD | System and method for tracking using multiple slave servers and a master server |
| US10430646B2 (en) | 2016-03-25 | 2019-10-01 | Zero Latency PTY LTD | Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects |
| US10486061B2 (en) | 2016-03-25 | 2019-11-26 | Zero Latency Pty Ltd. | Interference damping for continuous game play |
| US10717001B2 (en) | 2016-03-25 | 2020-07-21 | Zero Latency PTY LTD | System and method for saving tracked data in the game server for replay, review and training |
| US10751609B2 (en) | 2016-08-12 | 2020-08-25 | Zero Latency PTY LTD | Mapping arena movements into a 3-D virtual world |
- 2016-12-21: PCT application PCT/IB2016/057845 filed, published as WO2018037269A1 (fr); status: not active, Ceased
Non-Patent Citations (3)
| Title |
|---|
| ARAR, N. M. ET AL.: "Estimating Fusion Weights of a Multi-Camera Eye Tracking System by Leveraging User Calibration Data", PROC. ETRA '16 PROCEEDINGS OF THE NINTH BIENNIAL ACM SYMPOSIUM ON EYE TRACKING RESEARCH & APPLICATIONS, March 2016 (2016-03-01), pages 225 - 228, XP058079672 * |
| EHRL, J. ET AL.: "A Reliability Measure for Merging Data from Multiple Cameras in Optical Motion Correction", PROC. ISMRM SCIENTIFIC WORKSHOP - MOTION CORRECTION IN MRI, 20 July 2014 (2014-07-20) * |
| MANNBERG, M. ET AL.: "High Precision Real-time 3D Tracking Using Cameras", INFOTECH@AEROSPACE 2011, March 2011 (2011-03-01), pages 1 - 11 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10486061B2 (en) | Interference damping for continuous game play | |
| US10430646B2 (en) | Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects | |
| WO2018037269A1 (fr) | Interference mitigation for continuous game play | |
| US10717001B2 (en) | System and method for saving tracked data in the game server for replay, review and training | |
| US10421012B2 (en) | System and method for tracking using multiple slave servers and a master server | |
| CN101116101B (zh) | Position/orientation measurement method and apparatus | |
| US10071306B2 (en) | System and method for determining orientation using tracking cameras and inertial measurements | |
| CN106803271B (zh) | Camera calibration method and device for a vision-navigation unmanned aerial vehicle | |
| EP2728548B1 (fr) | Automated frame-of-reference calibration for augmented reality | |
| CN106170676B (zh) | Method, device and system for determining the movement of a mobile platform | |
| CN102257456B (zh) | Correcting angle error in a tracking system | |
| CN105320271A (zh) | Head-mounted display calibration using direct geometric modeling | |
| KR20200100102A (ko) | Method for calibrating an augmented reality device | |
| KR102369989B1 (ko) | Color identification using infrared imaging | |
| JP2011179908A (ja) | Three-dimensional measurement apparatus, processing method therefor, and program | |
| CN102257830A (zh) | Tracking system calibration with minimal user input | |
| CN106546230B (zh) | Method and device for arranging positioning points, and method and apparatus for measuring three-dimensional coordinates of positioning points | |
| Sušanj et al. | Effective area coverage of 2D and 3D environments with directional and isotropic sensors | |
| JP5030953B2 (ja) | Method and system for determining the relative position of a first object with respect to a second object, and corresponding computer program and computer-readable recording medium | |
| JP2021518953A (ja) | Method and system for navigating | |
| CN109933081A (zh) | Unmanned aerial vehicle obstacle avoidance method, obstacle-avoiding unmanned aerial vehicle, and unmanned aerial vehicle obstacle avoidance device | |
| JP2005270484A (ja) | Calibration method | |
| CN110455292A (zh) | Flight trajectory determination method and device, flight trajectory derivation system, and storage medium | |
| CN102591456B (zh) | Detection of body and props | |
| CN118302717A (zh) | Camera tracking via dynamic viewpoints | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16914106; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16914106; Country of ref document: EP; Kind code of ref document: A1 |