CN113680059B - Outdoor scene AR game positioning device and method - Google Patents
Outdoor scene AR game positioning device and method
- Publication number
- CN113680059B (application CN202111010213.9A)
- Authority
- CN
- China
- Prior art keywords
- game
- luminous
- camera
- targets
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/02—Non-photorealistic rendering
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/10—Image enhancement or restoration using non-spatial domain filtering
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/30—Erosion or dilatation, e.g. thinning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/75—Determining position or orientation of objects or cameras using feature-based methods involving models
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8082—Virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20048—Transform domain processing
- G06T2207/20061—Hough transform
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention belongs to the technical field of AR and specifically relates to an outdoor scene AR game positioning device and method. The device comprises a game instruction server, a remote controller that changes the targets' luminous color, and luminous target objects. The positioning system calculates the positions of the AR game machine's camera and the player in the game scene simply and efficiently, with low overall cost and high algorithmic precision, making it suitable for widespread use.
Description
Technical field:
The invention belongs to the technical field of AR and specifically relates to an outdoor scene AR game positioning device and method.
Background technology:
Augmented reality (AR) is a technology that calculates the position and angle of the camera image in real time and adds corresponding images, videos and 3D models; its goal is to fit a virtual world around the real world on a screen and let the user interact with it. The technique was first proposed in 1990. Current AR implementations rely primarily on AR devices, which fall into two categories. Handheld AR devices: represented by Apple's ARKit development platform and Android's ARCore, these record the real world through a camera, mix in virtual objects algorithmically, and display the mixed result on a screen. Head-mounted AR devices: represented by Microsoft HoloLens, the player sees the real world through glasses while the system projects virtual objects directly onto the lenses, so the mixed image forms on the lens.
On existing AR gaming machine devices, three ways of locating the device are generally used:
1. Marker-Based AR
2. Marker-Less AR
3. LBS-Based AR
The shortcomings of these three positioning modes are analyzed as follows:
1. Marker-Based AR
This approach requires a Marker (for example a template card or a two-dimensional code of a given specification). The Marker is placed at a position in the real world, which amounts to fixing a plane in the real scene. The camera identifies the Marker and evaluates its pose (Pose Estimation) to determine the Marker's position, and a coordinate system with the Marker's center as origin, called Marker Coordinates (the template coordinate system), is established. What we actually need is a transformation that maps the template coordinate system onto the screen coordinate system, so that graphics drawn on the screen according to this transformation appear attached to the Marker. Understanding this requires some knowledge of 3D projective geometry: the transformation from the template coordinate system to the screen coordinate system goes through a rotation and translation into the camera coordinate system (Camera Coordinates), followed by a mapping from the camera coordinate system to the screen coordinate system.
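This coordinate chain can be made concrete with a few lines of linear algebra. The following is an illustrative sketch only: the intrinsics K and the pose R, t are placeholder values standing in for what camera calibration and the pose-estimation step would produce.

```python
import numpy as np

# Minimal sketch of the chain described above; K, R and t are placeholders.
K = np.array([[800.0,   0.0, 320.0],   # fx,  0, cx
              [  0.0, 800.0, 240.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # marker-to-camera rotation
t = np.array([0.0, 0.0, 2.0])          # marker 2 m in front of the lens

def marker_to_screen(p_marker):
    """Marker Coordinates -> Camera Coordinates -> screen pixels."""
    p_cam = R @ p_marker + t           # rotate-translate into the camera frame
    uvw = K @ p_cam                    # project onto the image plane
    return uvw[:2] / uvw[2]            # perspective divide

print(marker_to_screen(np.array([0.1, 0.0, 0.0])))  # a marker corner
```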
Disadvantages: in a game scene, such markers are obtrusive and do not blend well with the surrounding environment.
2. Marker-Less AR
The basic principle is the same as Marker-Based AR, but any object with enough feature points (such as a book cover) can serve as the plane reference, so no special template needs to be made in advance, removing the template's constraint on AR applications. Feature points are extracted from a template object by algorithms such as SURF, ORB or FERN and are recorded or learned. When the camera scans the surrounding scene, it extracts the scene's feature points and compares them with the recorded template feature points; if the number of matches exceeds a threshold, the template is considered scanned, a Tm matrix is estimated from the corresponding feature-point coordinates, and graphics are then drawn according to Tm (similar to Marker-Based AR).
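As a hedged sketch of this flow (not the patent's own code), the following uses OpenCV's ORB detector and brute-force matcher; the file names and the match threshold are assumptions.

```python
import cv2
import numpy as np

# Marker-Less sketch: match ORB features of a stored template against the
# current frame; enough matches -> estimate the homography that plays the
# role of the Tm matrix. "template.jpg"/"frame.jpg" are placeholder inputs.
template = cv2.imread("template.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(template, None)
kp_f, des_f = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des_t, des_f)

THRESHOLD = 30                         # assumed minimum match count
if len(matches) >= THRESHOLD:
    src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    Tm, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print("template found, Tm =\n", Tm)
```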
Disadvantages: the environment must be analyzed and feature points of the surrounding scene extracted, which is easily affected by weather, illumination, reconstruction, redecoration and similar factors.
3. LBS-Based AR
The basic principle is to obtain the player's geographic position through GPS, obtain POI information for objects near that position (such as surrounding restaurants, banks and schools) from data sources such as Wikipedia or Google, and obtain the direction and inclination of the player's handheld device from the mobile device's electronic compass and acceleration sensor; from this information a plane reference for the target object in the real scene (equivalent to a Marker) is established, after which coordinate transformation and display proceed much as in Marker-Based AR.
This approach implements AR using the device's GPS and sensors, removing the application's dependence on a Marker. The player experience is better than Marker-Based AR, and since there is no need to recognize Marker pose or compute feature points in real time, performance is better than both Marker-Based and Marker-Less AR, so LBS-Based AR is better suited to mobile devices than either.
Disadvantages: positioning accuracy is inferior to the former two methods, and a huge external database (wiki, google, etc.) is required, and the computational task is heavy due to the complexity of nearby objects (the problem of information synchronization of the database and the real object).
However, the existing positioning methods have high computational complexity and place high demands on the environment and illumination conditions. There has been no device or method that lets the AR game machine's data processor achieve self-positioning with a simple algorithm.
Summary of the invention:
According to the invention, luminous target objects are set up in the outdoor scene, which greatly simplifies the computation required to position the camera in the AR game, so that positioning can be achieved with a simple algorithm.
An outdoor scene AR game positioning device comprises a game instruction server, a luminous target object and a remote controller capable of controlling the luminous target object.
The game command server controls the remote controller through the wireless network, and the remote controller controls the luminous color and brightness of the luminous targets through the wireless network. The game command server wirelessly transmits each luminous target's luminous characteristics and its position in the game scene to the AR game machine, and the AR game machine calculates the player's position from the corresponding features in the image captured by its physical camera.
The luminous target is arranged in the outdoor scene and consists of a base, a supporting part and a light part; the base is connected to the light part through the supporting part; an energy supply component is arranged in the base, and a wireless transceiver module is arranged in the luminous target. Preferably, the base is made of a corrosion-resistant material or a low-cost material such as reinforced cement, and can be placed on the ground and/or buried underground so that the whole stands firmer and more stable.
The light part is a lamp assembly capable of emitting two or more colors, and the colors are non-similar colors: colors such as red, yellow, blue and green that contrast strongly with one another. Preferably, the light part is shaped as a vertical column, a sphere or an ellipsoid, so that a computer can conveniently extract effective lines from the image by Hough transform. The advantage is that the lit luminous target presents distinct image features in the scene image captured by the physical camera, which improves the accuracy of luminous-target recognition.
The top of the light part carries a solar and/or wind power generation module; the energy supply component is a storage battery and/or a charge-discharge battery pack, and the generation module is connected to the charge-discharge battery pack so that surplus electric energy can be stored for use when solar and/or wind generation is insufficient.
An outdoor scene AR game positioning method: the player carries an AR device consisting of a physical camera, a display screen, an acceleration sensor and a data processor; the AR device establishes wireless signal communication with the game instruction server;
S1, a player enters a game area of the outdoor scene, and the game instruction server, through the remote controller, lights the luminous targets of that game area in a designated color;
S2, scene images containing the lit luminous targets are collected through the physical camera, the images are filtered according to the color filter designated by the game command server, and the luminous targets are identified; lines of the specified type are extracted from the scene image by Hough transform; the effective lines extracted by the Hough transform are processed to form continuous lines. Lines of the designated color may be discontinuous where leaves or other objects occlude the targets; such broken lines are connected into continuous lines by a dilation-erosion algorithm (a code sketch follows step S4 below).
S3, collecting length and position data of the effective lines, and obtaining virtual camera position coordinates in the AR game through calculation processing of a data processor;
S4, after the position of the virtual camera is obtained, a rendering picture of the 3D world is obtained through the 3D game engine, the background is kept transparent, and then the rendering picture of the 3D world is fused with a picture shot by the AR camera, so that an AR picture can be synthesized and displayed on a display screen of a player.
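A minimal sketch of step S2 follows, assuming OpenCV, green as the designated color, and placeholder HSV bounds and Hough parameters; real thresholds would be tuned to the lamps and the scene.

```python
import cv2
import numpy as np

# Step S2 sketch: color-filter the frame, close occlusion gaps (dilate then
# erode), then extract near-vertical lines with the Hough transform.
# All numeric parameters here are assumptions to be tuned.
frame = cv2.imread("scene.jpg")                        # placeholder input
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

mask = cv2.inRange(hsv, (45, 80, 80), (75, 255, 255))  # "green" color filter

kernel = np.ones((3, 3), np.uint8)
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel, iterations=3)

lines = cv2.HoughLinesP(mask, 1, np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=10)
if lines is not None:
    for x1, y1, x2, y2 in (l[0] for l in lines):
        if abs(x2 - x1) < 5:            # column-shaped targets image as verticals
            print("candidate target line:", (x1, y1), (x2, y2))
```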
The number of lit luminous targets in the same game scene is not less than 3, and the lit targets are not on the same plane; this avoids data errors caused by luminous targets occluding one another due to the viewing angle.
If the camera recognizes fewer than 3 or more than 5 luminous targets, recognition is repeated;
If the camera recognizes 3, 4 or 5 luminous targets, the camera positions calculated from each pair of luminous targets are compared and the consistent value is taken as the calculation result; if some values deviate greatly, the outliers are removed and the average position is taken as the calculation result (see the sketch below).
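A sketch of this consistency rule, assuming a 1-meter agreement tolerance (the patent does not fix the outlier threshold):

```python
import numpy as np

# Each pair of recognized targets yields one camera-position estimate;
# keep the estimates that agree, drop outliers, average the survivors.
def fuse_estimates(estimates, tol=1.0):
    pts = np.asarray(estimates)          # one (x, z) estimate per target pair
    center = np.median(pts, axis=0)      # robust reference point
    keep = np.linalg.norm(pts - center, axis=1) <= tol
    return pts[keep].mean(axis=0)

# e.g. three pairwise estimates, one off due to partial occlusion:
print(fuse_estimates([(1.2, -8.9), (1.25, -8.85), (4.0, -6.0)]))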
The physical camera recognizes 3 luminous targets simultaneously: luminous target a, luminous target b and luminous target c. Let the position coordinate of the virtual camera be $(C_x, C_z)$, the distance between targets a and b be $2n$, the distance from the physical camera to target a be $d_a$, and the distance to target b be $d_b$. With a at $(-n, 0)$ and b at $(n, 0)$, the virtual camera position is obtained from

$$C_x = \frac{d_a^2 - d_b^2}{4n}, \qquad C_z = \pm\sqrt{d_a^2 - (C_x + n)^2} = \pm\sqrt{d_b^2 - (C_x - n)^2}$$

The solution yields two values of $C_z$, one positive and one negative; the positive value is discarded and the negative value taken, giving the position coordinates $(C_x, C_z)$ of the virtual camera;
Two different luminous targets a and c are taken, whose tops (or centers) stand at heights $H_a$ and $H_c$ above the horizontal ground reference plane, at distances $D_a$ and $D_c$ from the physical camera; $dh_{prj}$ is the difference in height of their tops in the captured image, and $d_{scn}$ is the virtual distance from the virtual camera to the virtual screen, a known constant related to the screen resolution and the design of the 3D scene. The camera height then follows from

$$C_y = \frac{H_a/D_a - H_c/D_c - dh_{prj}/d_{scn}}{1/D_a - 1/D_c}$$

(this form is consistent with the worked figures in embodiment 4). To sum up, the position coordinate of the virtual camera is obtained as $(C_x, C_y, C_z)$.
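The formulas above can be checked directly; the sketch below reproduces the figures of embodiment 4 using the patent's variable names.

```python
import math

# Minimal sketch of step S3 built from the formulas above.
def camera_xz(d_a, d_b, n):
    """Targets a at (-n, 0) and b at (n, 0); returns (C_x, C_z) with C_z < 0."""
    c_x = (d_a**2 - d_b**2) / (4 * n)
    c_z = -math.sqrt(d_a**2 - (c_x + n) ** 2)   # discard the positive root
    return c_x, c_z

def camera_y(H_a, H_c, D_a, D_c, dh_prj, d_scn):
    """Camera height from two targets' tip heights and projected difference."""
    return (H_a / D_a - H_c / D_c - dh_prj / d_scn) / (1 / D_a - 1 / D_c)

# Figures from embodiment 4: d_a = 12.5 m, d_b ≈ 14.3 m, n = 10 m
c_x, c_z = camera_xz(12.5, 14.3, 10)                 # ≈ (-1.2, -8.88)
c_y = camera_y(3.0, 3.0, 12.5, 20.83, 0.012, 0.25)   # ≈ 1.5
print(c_x, c_y, c_z)
```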
The acceleration sensor collects the player's acceleration values during travel, and these are fused with the computer-vision positioning through inertial navigation, correcting the position coordinates of the virtual camera in the AR game.
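The patent does not specify the fusion algorithm, so the following is only an illustrative complementary-filter sketch: dead-reckon from the acceleration sensor between vision fixes, then blend each new vision position in. The weighting scheme is an assumption.

```python
import numpy as np

# Illustrative fusion only; alpha (trust in the vision fix) is an assumption.
class PositionFuser:
    def __init__(self, alpha=0.9):
        self.alpha = alpha
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def predict(self, accel, dt):
        """Dead-reckon from the acceleration sensor between vision fixes."""
        self.vel += np.asarray(accel) * dt
        self.pos += self.vel * dt

    def correct(self, vision_pos):
        """Blend the inertial estimate toward the computer-vision position."""
        self.pos = self.alpha * np.asarray(vision_pos) + (1 - self.alpha) * self.pos
```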
Here the Hough transform is an important method for detecting boundary shapes from broken points: it fits straight lines and curves by transforming the image coordinate space into a parameter space. In step S2, after the required lines are extracted from the target image, the boundary lines are dilated and then eroded to form continuous lines. Erosion and dilation are terms from morphological image processing: erosion shrinks or thins regions of a binary image, while dilation lengthens or thickens them.
Erosion eliminates boundary points and shrinks the boundary inward; it can be used to remove small, meaningless objects. Each pixel of the image is scanned with a 3×3 structuring element, and the structuring element is ANDed with the binary image it covers: the result pixel is 1 only if all covered pixels are 1, and 0 otherwise. The effect is that the binary image shrinks by one ring.
Dilation merges all background points in contact with an object into the object, expanding the boundary outward; it can be used to fill holes in objects. Each pixel of the image is scanned with a 3×3 structuring element, and the structuring element is ORed with the binary image it covers: the result pixel is 0 only if all covered pixels are 0, and 1 otherwise. The effect is that the binary image grows by one ring.
Erosion followed by dilation is called the opening operation; it removes small objects, separates objects at thin connections, and smooths the boundaries of larger objects without significantly changing their area. Dilation followed by erosion is called the closing operation; it fills small holes in objects, connects adjacent objects, and smooths boundaries, likewise without significantly changing the objects' area.
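The pixel rules above translate directly into code. The sketch below implements them for a binary image and a 3×3 all-ones structuring element (border pixels are left untouched for brevity); dilate-then-erode is the closing used in step S2.

```python
import numpy as np

# Direct transcription of the erosion/dilation rules for a binary image.
def erode(img):
    out = np.zeros_like(img)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            # pixel survives only if the whole 3x3 neighborhood is 1
            out[y, x] = int(img[y-1:y+2, x-1:x+2].all())
    return out

def dilate(img):
    out = np.zeros_like(img)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            # pixel becomes 1 if any pixel under the element is 1
            out[y, x] = int(img[y-1:y+2, x-1:x+2].any())
    return out

img = np.zeros((7, 7), np.uint8)
img[2:5, 2] = img[2:5, 4] = 1        # two thin bars with a gap between them
closed = erode(dilate(img))          # dilate-then-erode = closing
print(closed)
```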
Here $d_{scn}$ is the distance from the virtual camera to the virtual screen, a virtual distance that is a known constant related to the screen resolution and the 3D scene design. The constant can be computed from the camera's technical data, or calibrated by photographing a standard object: for example, after the physical camera photographs a rod of length $h_{obj}$ standing at the coordinate origin at distance $d_{obj}$, the rod's length $h_{prj}$ in the image is recorded, and $d_{scn}$ is obtained from

$$d_{scn} = d_{obj} \cdot \frac{h_{prj}}{h_{obj}}$$

The distance from the camera to any target of known physical height $h_{obj}$ and image height $h_{prj}$ then follows:

$$d = d_{scn} \cdot \frac{h_{obj}}{h_{prj}}$$
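In code, the one-off calibration and the per-target distance are each a one-liner; the numbers below are the ones used in embodiment 4.

```python
# One-off calibration of d_scn: photograph a rod of known height h_obj at a
# known distance d_obj and measure its image height h_prj.
def calibrate_d_scn(d_obj, h_prj, h_obj):
    return d_obj * h_prj / h_obj

def distance_to_target(d_scn, h_obj, h_prj):
    return d_scn * h_obj / h_prj

d_scn = calibrate_d_scn(10.0, 0.025, 1.0)     # = 0.25, as in embodiment 4
print(distance_to_target(d_scn, 0.8, 0.016))  # = 12.5 m to target a
```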
Wherein: let the coordinates of target a in the virtual game space be $(-n, 0)$ and of target b be $(n, 0)$; let the camera's physical distance to a be $d_a$, to b be $d_b$, and the camera position be $(C_x, C_z)$. Then

$$C_x = \frac{d_a^2 - d_b^2}{4n}, \qquad C_z = \pm\sqrt{d_a^2 - (C_x + n)^2} = \pm\sqrt{d_b^2 - (C_x - n)^2}$$

The solution yields two values of $C_z$, one positive and one negative; the positive value is discarded and the negative value taken.
Finally, after the virtual camera position is obtained, a rendered picture of the 3D world is produced by the 3D game engine with all background other than the AR luminous targets kept transparent; this render is then fused with the picture shot by the AR camera, synthesizing the AR picture displayed on the player's screen (see the compositing sketch below).
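The composition step amounts to alpha blending the engine's transparent-background render over the camera frame. A minimal sketch, assuming a float RGBA render with values in [0, 1]:

```python
import numpy as np

# Overlay the engine's RGBA render (alpha = 0 where the background should
# show through) on the camera frame. Array shapes are assumptions:
# render HxWx4, camera frame HxWx3, both float in [0, 1].
def compose_ar(render_rgba, camera_rgb):
    alpha = render_rgba[..., 3:4]
    return alpha * render_rgba[..., :3] + (1 - alpha) * camera_rgb

frame = np.random.rand(480, 640, 3)              # stand-in camera image
render = np.zeros((480, 640, 4))
render[100:200, 100:200] = (1, 0, 0, 1)          # an opaque red virtual object
ar_frame = compose_ar(render, frame)
```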
The invention has the beneficial effects that the positioning system calculates the positions of the AR game machine's camera and the player in the game scene simply and efficiently, with low overall cost and high algorithmic precision, making it suitable for widespread use.
Description of the drawings:
FIG. 1 is a simulated view of a game scenario of the present invention;
FIG. 2 is a view of a target object captured in a current game scene area according to the present invention;
FIG. 3 shows the invention using the Hough transform to obtain the desired lines;
FIG. 4 shows the continuous lines formed after the Hough-transform result is processed;
FIG. 5 is a diagram of locating the lit luminous targets in the physical camera image;
FIG. 6 is a graphical representation of determining the distance from the virtual camera to the virtual screen;
FIG. 7 is a diagram of determining the position coordinates $(C_x, C_z)$ of the virtual camera;
FIG. 8 is a diagram of determining the position coordinate $C_y$ of the virtual camera;
The specific embodiments are as follows:
The application will be further described with reference to the accompanying drawings and specific embodiments. It is to be understood that these examples are illustrative of the present application and are not intended to limit the scope of the present application. Further, it will be understood that various changes and modifications may be made by those skilled in the art after reading the teachings of the application, and equivalents thereof fall within the scope of the application as defined by the claims.
Example 1:
An outdoor scene AR game positioning device comprises a game instruction server, a luminous target object and a remote controller capable of controlling the luminous target object.
The game command server controls the remote controller through the wireless network, and the remote controller controls the luminous color and brightness of the luminous target through the wireless network. The game command server wirelessly transmits the luminous target's characteristics and its position in the game scene to the AR game machine, and the AR game machine calculates the player's position from the corresponding features in the image captured by its physical camera.
The luminous target object is arranged in an outdoor scene and consists of a base, a supporting part and a lamplight part; the base is connected with the lamplight part through the supporting part; an energy supply component is arranged in the base, and a wireless transceiver module is arranged in the luminous target object. The base is made of reinforced cement and buried underground, so that the whole body is firmer and more stable.
The light part is composed of red and green lamps and is arranged as a vertical column, so that a computer can conveniently extract effective lines from the image by Hough transform. The lit luminous target thus presents distinct image features in the scene image captured by the physical camera, improving recognition accuracy.
The top of the lamplight part is provided with a solar power generation module, the energy supply component is a charge-discharge battery pack, and the solar power generation module is connected with the charge-discharge battery pack. The purpose is to be able to store excess electrical energy for use when the solar energy power generation is insufficient.
Example 2:
An outdoor scene AR game positioning device comprises a game instruction server, a luminous target object and a remote controller capable of controlling the luminous target object.
The game command server controls the remote controller through the wireless network, and the remote controller controls the luminous color and brightness of the luminous target through the wireless network. The lit luminous target presents distinct image features in the scene image captured by the camera, improving the accuracy with which the computer algorithm identifies it. Meanwhile, the game command server wirelessly transmits the luminous target's characteristics and its position in the game scene to the AR game machine, and the AR game machine calculates the player's position from features in the captured image.
The luminous target object is arranged in an outdoor scene and consists of a base, a supporting part and a lamplight part; the base is connected with the lamplight part through the supporting part; an energy supply component is arranged in the base, and a wireless transceiver module is arranged in the luminous target object. The base is made of corrosion-resistant materials and is placed on the ground.
The light part is formed by lamps capable of emitting red, yellow and blue; since buildings in urban scenes already present a large number of straight lines, the light part is made spherical so that a computer can conveniently extract effective lines from the image by Hough transform.
The top of the lamplight part is provided with a wind power generation module, the energy supply assembly is a storage battery and a charging and discharging battery pack, and the wind power generation module is connected with the charging and discharging battery pack. The purpose is to be able to store excess electrical energy for use when the amount of wind energy production is insufficient.
Example 3:
The apparatus of embodiment 1 or 2 is used. An outdoor scene AR game positioning method: the player carries an AR device consisting of a physical camera, a display screen, an acceleration sensor and a data processor; the AR device establishes wireless signal communication with the game instruction server;
S1, a player enters a game area of an outdoor scene, a game command server lights up a luminous target object of the game area through a control remote controller, and the luminous target object shows specified red and green colors;
S2, scene images containing the lit luminous targets are collected through the physical camera, the images are filtered according to the color filter designated by the game command server, and the luminous targets are identified; lines of the specified type are extracted from the scene image by Hough transform; the effective lines extracted by the Hough transform are processed to form continuous lines. Lines of the designated color may be discontinuous where leaves or other objects occlude the targets; such broken lines are connected into continuous lines by the dilation-erosion algorithm.
S3, collecting length and position data of the effective lines, and obtaining virtual camera position coordinates in the AR game through calculation processing of a data processor;
S4, after the position of the virtual camera is obtained, a rendering picture of the 3D world is obtained through the 3D game engine, other backgrounds except AR luminous targets are kept transparent, and then the rendering picture of the 3D world and a picture shot by the AR camera are fused, so that an AR picture can be synthesized and displayed on a display screen of a player.
When the number of lit luminous targets in the same game scene is 4, the camera positions calculated from each pair of luminous targets are compared and the consistent value is taken as the calculation result; if some values deviate greatly, the outliers are removed and the average position is taken as the calculation result.
The physical camera recognizes 3 luminous targets simultaneously: luminous target a, luminous target b and luminous target c. Let the position coordinate of the virtual camera be $(C_x, C_z)$, the distance between targets a and b be $2n$, the distance from the physical camera to target a be $d_a$, and the distance to target b be $d_b$. With a at $(-n, 0)$ and b at $(n, 0)$:

$$C_x = \frac{d_a^2 - d_b^2}{4n}, \qquad C_z = \pm\sqrt{d_a^2 - (C_x + n)^2} = \pm\sqrt{d_b^2 - (C_x - n)^2}$$

The solution yields two values of $C_z$, one positive and one negative; the positive value is discarded and the negative value taken, giving the position coordinates $(C_x, C_z)$ of the virtual camera;
Two different luminous targets a and c are taken, whose tops (or centers) stand at heights $H_a$ and $H_c$ above the horizontal ground reference plane, at distances $D_a$ and $D_c$ from the physical camera; $dh_{prj}$ is the difference in height of their tops in the captured image, and $d_{scn}$ is the virtual distance from the virtual camera to the virtual screen, a known constant related to the screen resolution and the design of the 3D scene. Then

$$C_y = \frac{H_a/D_a - H_c/D_c - dh_{prj}/d_{scn}}{1/D_a - 1/D_c}$$

To sum up, the position coordinate of the virtual camera is obtained as $(C_x, C_y, C_z)$.
The acceleration sensor collects the player's acceleration values during travel, and these are fused with the computer-vision positioning through inertial navigation, correcting the position coordinates of the virtual camera in the AR game.
Example 4:
Using the method of embodiment 3, n = 10 (meters) is measured in the game scene; the position of target a is $(-n, 0)$ and the position of target b is $(n, 0)$.
A standard rod with height 1 meter is placed at the origin of the physical scene coordinate system (in the game virtual world this position is also set as the coordinate origin, and the virtual world's axes point the same way as the physical scene's), so $h_{obj} = 1$ meter. The physical camera stands $d_{obj} = 10$ meters from the rod; with the camera's optical axis aimed at the rod, the rod's height in the image is $h_{prj} = 0.025$ meter. Substituting into the formula:

$$d_{scn} = d_{obj} \cdot \frac{h_{prj}}{h_{obj}} = 10 \times \frac{0.025}{1} = 0.25 \text{ (meters)}$$

In the present system $d_{scn}$ is used as a constant and does not change after this single measurement.
Knowing that the light portions of physical luminous targets a, b and c are all 0.8 meter tall ($h_a = h_b = h_c = 0.8$ meter), and that their heights in the image shot by the physical camera are $ha_{prj} = 0.016$ meter and $hb_{prj} = 0.014$ meter respectively, substituting into the formula

$$d = d_{scn} \cdot \frac{h_{obj}}{h_{prj}}$$

gives $d_a = 0.25 \times 0.8 / 0.016 = 12.5$ (meters) and $d_b = 0.25 \times 0.8 / 0.014 \approx 14.3$ (meters).
Substituting into the formula:

$$C_x = \frac{d_a^2 - d_b^2}{4n} = \frac{12.5^2 - 14.3^2}{40} \approx -1.2 \text{ (meters)}$$

(the camera is nearer target a at $(-n, 0)$, hence the negative value).
Substituting into the formula:

$$C_z = \pm\sqrt{d_a^2 - (C_x + n)^2} = \pm\sqrt{12.5^2 - 8.8^2}$$

or:

$$C_z = \pm\sqrt{d_b^2 - (C_x - n)^2} = \pm\sqrt{14.3^2 - 11.2^2}$$

The two formulas give the same result, $C_z = \pm 8.88$ (meters); taking the negative value, $C_z = -8.88$ (meters), so the virtual camera position $(C_x, C_z)$ is $(-1.2, -8.88)$.
Let another luminous target c be at $(-p, q)$. Targets a and c were installed to the same specification, so the height of the light-portion tip above the ideal horizontal ground is the same known quantity, $H_c = H_b = H_a = 3$ meters, and target c's light portion measures $hc_{prj} = 0.0096$ meter in the image shot by the physical camera. Substituting into the formula:

$$D_c = d_{scn} \cdot \frac{h_c}{hc_{prj}} = 0.25 \times \frac{0.8}{0.0096} \approx 20.83 \text{ (meters)}$$
When the physical camera is not held level, i.e. the lens pitches or rolls, the image can be corrected by an image-rotation method: the direction of gravitational acceleration g is measured with the acceleration sensor so that the camera's horizontal plane is kept perpendicular to g, fusing inertial navigation with the computer-vision positioning. The measured difference between the imaged tips of luminous targets a and c is $dh_{prj} = 0.012$ meter. Substituting into the formula:

$$C_y = \frac{H_a/D_a - H_c/D_c - dh_{prj}/d_{scn}}{1/D_a - 1/D_c} = \frac{3/12.5 - 3/20.83 - 0.012/0.25}{1/12.5 - 1/20.83} = 1.5 \text{ (meters)}$$

Finally the position coordinates $(C_x, C_y, C_z)$ of the virtual camera are $(-1.2, 1.5, -8.88)$.
The acceleration sensor is used for correcting the position coordinate information of the virtual camera in the AR game by collecting acceleration values of a player in the advancing process and fusing the acceleration values with the visual positioning of the computer through inertial navigation.
In the 3D virtual world there is also a camera, and its position relative to the 3D world determines the 'viewing angle' onto that world. The key point of an AR game is that the real world and the virtual world are overlapped: as the player moves the AR game machine in the real world, the data processor correspondingly moves the virtual camera in the 3D virtual world, keeping the virtual world's viewing angle consistent with the real world's.
The position of the camera in the physical world is equal to the position of the virtual camera in the virtual world, the two worlds are in an equivalent relationship, and the images can be overlapped. After the virtual camera position is obtained, a rendering picture of the 3D world is obtained through the 3D game engine, the background is kept transparent, and then the rendering picture of the 3D world is fused with a picture shot by the AR camera, so that an AR picture can be synthesized and displayed on a display screen of a player.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from its spirit and scope; the present invention is intended to cover such modifications and variations provided they fall within the scope of the claims and their technical equivalents.
Claims (7)
1. An outdoor scene AR game positioning method, characterized in that: the player carries an AR device composed of a physical camera, a display screen, an acceleration sensor and a data processor; the AR device establishes wireless signal communication with the game instruction server;
S1, a player enters a game area of the outdoor scene, and the game instruction server, through the remote controller, lights the luminous targets of that game area in a designated color;
S2, collecting scene images containing the lit luminous targets through the physical camera, filtering the images according to the color filter designated by the game command server, and identifying the luminous targets; extracting lines of a specified type from the scene image by Hough transform; processing the effective lines extracted by the Hough transform to form continuous lines;
s3, collecting length and position data of the effective lines, and obtaining virtual camera position coordinates in the AR game through calculation processing of a data processor;
S4, obtaining a rendering picture of the 3D world through a 3D game engine after obtaining the position of the virtual camera, keeping the background transparent, and fusing the rendering picture of the 3D world with a picture shot by the AR camera to synthesize an AR picture, and displaying the AR picture on a display screen of a player;
the number of lit luminous targets in the same game scene is not less than 3, and the lit targets are not on the same plane;
if the camera recognizes fewer than 3 or more than 5 luminous targets, recognition is repeated;
if the camera recognizes 3, 4 or 5 luminous targets, the camera positions calculated from each pair of luminous targets are compared and the consistent value is taken as the calculation result; if some values deviate greatly, the outliers are removed and the average position is taken as the calculation result;
the physical camera simultaneously recognizes 3 luminous targets, namely luminous target a, luminous target b and luminous target c, and the position coordinate of the virtual camera is set as $(C_x, C_z)$;
the distance between luminous target a and luminous target b is $2n$, the distance between the physical camera and target a is $d_a$, and the distance between the physical camera and target b is $d_b$; with a at $(-n, 0)$ and b at $(n, 0)$:

$$C_x = \frac{d_a^2 - d_b^2}{4n}, \qquad C_z = \pm\sqrt{d_a^2 - (C_x + n)^2} = \pm\sqrt{d_b^2 - (C_x - n)^2}$$

the solution yields two values of $C_z$, one positive and one negative; the positive value is discarded and the negative value taken, obtaining the position coordinates $(C_x, C_z)$ of the virtual camera;
two different luminous targets a and c are taken, whose tops or centers stand at heights $H_a$ and $H_c$ above the horizontal ground reference plane, at distances $D_a$ and $D_c$ from the physical camera; $dh_{prj}$ is the difference in height of their tops in the captured image, and $d_{scn}$ is the virtual distance from the virtual camera to the virtual screen, a known constant related to the screen resolution and the design of the 3D scene; then

$$C_y = \frac{H_a/D_a - H_c/D_c - dh_{prj}/d_{scn}}{1/D_a - 1/D_c}$$

to sum up, the position coordinate of the virtual camera is obtained as $(C_x, C_y, C_z)$.
2. The method of claim 1, wherein the acceleration sensor corrects the virtual camera position coordinate information in the AR game by collecting acceleration values of the player during the travel, and fusing the acceleration values with the computer vision positioning through inertial navigation.
3. A positioning device employing the method of claim 1, wherein the device comprises a game command server, a lighted target, and a remote control that can control the lighted target.
4. The apparatus of claim 3, wherein the game command server controls a remote controller through a wireless network, and the remote controller controls the light emitting color and brightness of the light emitting object through the wireless network.
5. The device according to claim 4, wherein the luminous object is arranged in an outdoor scene and consists of a base, a supporting part and a lamplight part; the base is connected with the lamplight part through the supporting part; an energy supply component is arranged in the base, and a wireless transceiver module is arranged in the luminous target object.
6. The device of claim 5, wherein the light part is a lamp assembly capable of emitting two or more colors, the colors being non-similar colors.
7. The device according to claim 6, wherein a solar energy and/or wind energy generating module is arranged at the top of the light part, the energy supply component is a storage battery and/or a charge-discharge battery pack, and the solar energy and/or wind energy generating module is connected with the charge-discharge battery pack.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111010213.9A CN113680059B (en) | 2021-08-31 | 2021-08-31 | Outdoor scene AR game positioning device and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111010213.9A CN113680059B (en) | 2021-08-31 | 2021-08-31 | Outdoor scene AR game positioning device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113680059A (en) | 2021-11-23
CN113680059B (en) | 2024-05-14
Family
ID=78584322
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111010213.9A Active CN113680059B (en) | 2021-08-31 | 2021-08-31 | Outdoor scene AR game positioning device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113680059B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115487493B (en) * | 2022-09-02 | 2025-06-13 | 湖南快乐阳光互动娱乐传媒有限公司 | A spatial positioning method, device and system |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103400409A (en) * | 2013-08-27 | 2013-11-20 | 华中师范大学 | 3D (three-dimensional) visualization method for coverage range based on quick estimation of attitude of camera |
CN104436634A (en) * | 2014-11-19 | 2015-03-25 | 重庆邮电大学 | Real person shooting game system adopting immersion type virtual reality technology and implementation method of real person shooting game system |
WO2015096806A1 (en) * | 2013-12-29 | 2015-07-02 | 刘进 | Attitude determination, panoramic image generation and target recognition methods for intelligent machine |
CN104919507A (en) * | 2012-06-14 | 2015-09-16 | 百利游戏技术有限公司 | Systems and methods for augmented reality games |
CN106390454A (en) * | 2016-08-31 | 2017-02-15 | 广州麦驰网络科技有限公司 | Reality scene virtual game system |
WO2017029279A2 (en) * | 2015-08-17 | 2017-02-23 | Lego A/S | Method of creating a virtual game environment and interactive game system employing the method |
CN107833280A (en) * | 2017-11-09 | 2018-03-23 | 交通运输部天津水运工程科学研究所 | A kind of outdoor moving augmented reality method being combined based on geographic grid with image recognition |
CN107979418A (en) * | 2017-11-22 | 2018-05-01 | 吴东辉 | Determine that its client corresponds to the AR method and systems of id based on mobile phone flashlight |
CN108325208A (en) * | 2018-03-20 | 2018-07-27 | 昆山时记信息科技有限公司 | Augmented reality implementation method applied to field of play |
JP6410874B1 (en) * | 2017-05-30 | 2018-10-24 | 株式会社タカラトミー | AR video generator |
KR20190001348A (en) * | 2017-06-27 | 2019-01-04 | (주)셀빅 | Virtual reality·argumented reality complex arcade game system |
CN109840949A (en) * | 2017-11-29 | 2019-06-04 | 深圳市掌网科技股份有限公司 | Augmented reality image processing method and device based on optical alignment |
CN110187774A (en) * | 2019-06-06 | 2019-08-30 | 北京悉见科技有限公司 | The AR equipment and its entity mask method of optical perspective formula |
CN111192365A (en) * | 2019-12-26 | 2020-05-22 | 江苏艾佳家居用品有限公司 | Virtual scene positioning method based on ARkit and two-dimensional code |
WO2021102566A1 (en) * | 2019-11-25 | 2021-06-03 | Eidos Interactive Corp. | Systems and methods for improved player interaction using augmented reality |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9511291B2 (en) * | 2010-11-15 | 2016-12-06 | Bally Gaming, Inc. | System and method for enhanced augmented reality tracking |
US8401343B2 (en) * | 2011-03-27 | 2013-03-19 | Edwin Braun | System and method for defining an augmented reality character in computer generated virtual reality using coded stickers |
US9132342B2 (en) * | 2012-10-31 | 2015-09-15 | Sulon Technologies Inc. | Dynamic environment and location based augmented reality (AR) systems |
US9367961B2 (en) * | 2013-04-15 | 2016-06-14 | Tencent Technology (Shenzhen) Company Limited | Method, device and storage medium for implementing augmented reality |
CN109840947B (en) * | 2017-11-28 | 2023-05-09 | 广州腾讯科技有限公司 | Implementation method, device, equipment and storage medium of augmented reality scene |
- 2021-08-31: CN application CN202111010213.9A filed, granted as patent CN113680059B (status: active)
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104919507A (en) * | 2012-06-14 | 2015-09-16 | 百利游戏技术有限公司 | Systems and methods for augmented reality games |
CN103400409A (en) * | 2013-08-27 | 2013-11-20 | 华中师范大学 | 3D (three-dimensional) visualization method for coverage range based on quick estimation of attitude of camera |
WO2015096806A1 (en) * | 2013-12-29 | 2015-07-02 | 刘进 | Attitude determination, panoramic image generation and target recognition methods for intelligent machine |
CN104436634A (en) * | 2014-11-19 | 2015-03-25 | 重庆邮电大学 | Real person shooting game system adopting immersion type virtual reality technology and implementation method of real person shooting game system |
WO2017029279A2 (en) * | 2015-08-17 | 2017-02-23 | Lego A/S | Method of creating a virtual game environment and interactive game system employing the method |
CN106390454A (en) * | 2016-08-31 | 2017-02-15 | 广州麦驰网络科技有限公司 | Reality scene virtual game system |
JP6410874B1 (en) * | 2017-05-30 | 2018-10-24 | 株式会社タカラトミー | AR video generator |
KR20190001348A (en) * | 2017-06-27 | 2019-01-04 | (주)셀빅 | Virtual reality·argumented reality complex arcade game system |
CN107833280A (en) * | 2017-11-09 | 2018-03-23 | 交通运输部天津水运工程科学研究所 | A kind of outdoor moving augmented reality method being combined based on geographic grid with image recognition |
CN107979418A (en) * | 2017-11-22 | 2018-05-01 | 吴东辉 | Determine that its client corresponds to the AR method and systems of id based on mobile phone flashlight |
CN109840949A (en) * | 2017-11-29 | 2019-06-04 | 深圳市掌网科技股份有限公司 | Augmented reality image processing method and device based on optical alignment |
CN108325208A (en) * | 2018-03-20 | 2018-07-27 | 昆山时记信息科技有限公司 | Augmented reality implementation method applied to field of play |
CN110187774A (en) * | 2019-06-06 | 2019-08-30 | 北京悉见科技有限公司 | The AR equipment and its entity mask method of optical perspective formula |
WO2021102566A1 (en) * | 2019-11-25 | 2021-06-03 | Eidos Interactive Corp. | Systems and methods for improved player interaction using augmented reality |
CN111192365A (en) * | 2019-12-26 | 2020-05-22 | 江苏艾佳家居用品有限公司 | Virtual scene positioning method based on ARkit and two-dimensional code |
Non-Patent Citations (2)
Title |
---|
SenseTime: innovative breakthroughs and applications of visual positioning technology for augmented reality; Zhang Guofeng; Hangzhou Science and Technology; 2019-12-15 (Issue 06); full text *
Real-time light-source detection and realistic rendering framework for augmented-reality scenes; Yao Yuan; Zhu Miaoliang; Lu Guang; Journal of Computer-Aided Design & Computer Graphics; 2006-08-20 (Issue 08); full text *
Also Published As
Publication number | Publication date |
---|---|
CN113680059A (en) | 2021-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110568447B (en) | Visual positioning method, device and computer readable medium | |
JP5075182B2 (en) | Image processing apparatus, image processing method, and image processing program | |
CN110443898A (en) | A kind of AR intelligent terminal target identification system and method based on deep learning | |
CN109520500A (en) | One kind is based on the matched accurate positioning of terminal shooting image and streetscape library acquisition method | |
WO2023280038A1 (en) | Method for constructing three-dimensional real-scene model, and related apparatus | |
CN110120099A (en) | Localization method, device, recognition and tracking system and computer-readable medium | |
CN111028358A (en) | Augmented reality display method and device for indoor environment and terminal equipment | |
CN110858414A (en) | Image processing method and device, readable storage medium and augmented reality system | |
CN110634138A (en) | Bridge deformation monitoring method, device and equipment based on visual perception | |
CN112348886A (en) | Visual positioning method, terminal and server | |
CN110119190A (en) | Localization method, device, recognition and tracking system and computer-readable medium | |
CN113409438B (en) | Digital photogrammetry method, electronic equipment and system | |
CN108564662A (en) | The method and device that augmented reality digital culture content is shown is carried out under a kind of remote scene | |
CN112348887B (en) | Terminal posture determination method and related device | |
CN106774910A (en) | Streetscape implementation method and device based on virtual reality | |
US20250239014A1 (en) | Vector data projection and feature matching to determine three-dimensional structure | |
CN113536854B (en) | A method, device and server for generating high-precision map road signs | |
CN113680059B (en) | Outdoor scene AR game positioning device and method | |
JP3791186B2 (en) | Landscape modeling device | |
CN117057086B (en) | Three-dimensional reconstruction method, device and equipment based on target identification and model matching | |
CN110120100A (en) | Image processing method, device and recognition and tracking system | |
CN114719759B (en) | Object surface perimeter and area measurement method based on SLAM algorithm and image instance segmentation technology | |
CN114723923B (en) | Transmission solution simulation display system and method | |
CN107730584A (en) | A kind of panoramic space constructing system based on the technology of taking photo by plane | |
CN110120062B (en) | Image processing method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||