CN116939362A - Method, system and client for AR virtual shooting based on cloud rendering - Google Patents
Abstract
The application relates to the technical field of AR virtual shooting, and in particular to a method, a system and a client for AR virtual shooting based on cloud rendering, aiming to reduce hardware cost and remove platform limitations. The method for performing AR virtual shooting based on cloud rendering provided by the application is applicable to an AR shooting client and comprises the following steps: receiving a scene picture and a camera-movement instruction uploaded by a director end; receiving performance data uploaded by an actor end; placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data; and sending the video generated after rendering to the director end and the actor end respectively in real time. The AR shooting client is arranged in a container of a cloud rendering server; the director end and the actor end are mobile terminals. With the application, a high-quality AR virtual shooting experience can be obtained on the cloud using only an ordinary mobile phone or tablet computer, which effectively reduces the hardware investment cost and is not limited to any platform.
Description
Technical Field
The application relates to the technical field of AR virtual shooting, in particular to a method, a system and a client for AR virtual shooting based on cloud rendering.
Background
AR (Augmented Reality) captures images and videos of the real-world environment with a camera and superimposes virtual elements onto the real scene through computer vision technology, so that the user perceives the virtual elements as part of the real scene. AR virtual shooting is an emerging technology that uses AR to realize virtual film shooting.
In current AR virtual shooting technology, achieving a high-quality rendering effect requires a device with relatively strong processing capability, that is, a high-end device. Users therefore need to purchase expensive devices to enjoy a better AR virtual shooting experience. In addition, most existing AR virtual shooting products are designed for a specific platform, such as Apple's ARKit and Google's ARCore; these products perform very well on their own platform but poorly on others.
Disclosure of Invention
In order to solve the problems in the prior art, the application provides a method, a system and a client for AR virtual shooting based on cloud rendering, which reduce hardware cost and are not limited by a platform.
In a first aspect of the present application, a method for performing AR virtual shooting based on cloud rendering is provided, the method is applicable to an AR shooting client, and the method includes:
receiving a scene picture and a camera-movement instruction uploaded by a director end;
receiving performance data uploaded by an actor end;
placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data;
transmitting the video generated after rendering to the director end and the actor end respectively in real time;
wherein:
the AR shooting client is arranged in a container of the cloud rendering server;
the director end and the actor end are mobile terminals.
Preferably, the performance data comprises: expression data and motion data of the actor.
Preferably, the step of "placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data" comprises:
placing the virtual digital person in the scene picture;
driving the expression and motion of the virtual digital person according to the expression data and the motion data;
and rendering the scene picture and the virtual digital person in real time according to the camera-movement instruction.
Preferably, the step of "placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data" further comprises:
dubbing the video generated after rendering.
Preferably, the expression data is output to the actor end by a facial capture device worn by the actor;
the motion data comprises: a gesture ID or pose data;
the gesture ID is obtained by the actor end shooting the actor's performance video and analyzing it;
and the pose data is obtained by processing, through a data relay device, the output data of the motion-capture suit worn by the actor, and is sent to the AR shooting client through the actor end.
Preferably, the method further comprises:
receiving a start or end instruction uploaded by the director end, so as to start or end the AR virtual shooting.
Preferably, the method further comprises:
sending the video generated after rendering to a streaming media server, so as to realize real-time stream pushing and/or recording.
Preferably, there are one or more actor ends;
the scene picture, the camera-movement instruction and the performance data are all uploaded to the AR shooting client through a UDP protocol;
and the video generated after rendering is sent to the director end and the actor end in real time through the UDP protocol.
In a second aspect of the present application, an AR shooting client is provided, the client comprising:
the director end data receiving module is used for receiving the scene picture and the camera-movement instruction uploaded by the director end;
the actor end data receiving module is used for receiving the performance data uploaded by the actor end;
the rendering module is used for placing the virtual digital person in the scene picture and rendering in real time according to the camera-movement instruction and the performance data;
the sending module is used for sending the video generated after rendering to the director end and the actor end respectively in real time;
wherein:
the AR shooting client is arranged in a container of the cloud rendering server;
the director end and the actor end are mobile terminals.
In a third aspect of the present application, a system for performing AR virtual shooting based on cloud rendering is provided, the system comprising: a director end, an actor end, and the AR shooting client as described above;
wherein:
the director end is used for shooting the scene picture, uploading the scene picture and the camera-movement instruction to the AR shooting client, and receiving the video rendered by the AR shooting client;
and the actor end is used for uploading the performance data and receiving the video rendered by the AR shooting client.
The application has the following beneficial effects:
according to the cloud rendering method, the rendering task is transferred from the local equipment to the cloud for processing through the cloud rendering technology, and a large amount of professional hardware support is not needed. A user can obtain high-quality AR virtual shooting experience on the cloud only by using a common mobile phone or a tablet personal computer, so that the hardware input cost is effectively reduced, and the popularity of AR virtual shooting is greatly improved.
The application adopts cloud rendering technology to realize AR virtual shooting, is not limited to any platform, and can be used on any operating system, including iOS, Android, Windows and the like. This design improves the flexibility and universality of the product and brings a better user experience. Cloud rendering also addresses the stability and security problems of AR applications, and the system configuration can be adjusted dynamically as the virtual scene changes, ensuring the quality and detail of the virtual shooting.
The application supports multi-person online AR virtual shooting. This is very useful for virtual reality and related technologies, which generally require several users to participate synchronously; at the same time, multi-person online virtual shooting enables multi-role shooting and storytelling, enhancing the interest and practical value of virtual shooting.
In addition, the application can transmit the virtual shooting content generated in real time directly to a streaming media server over a network connection, and can also record it in real time, which facilitates later editing and fan interaction.
Drawings
Fig. 1 is a schematic diagram of the main steps of a first embodiment of a method for performing AR virtual shooting based on cloud rendering according to the present application;
Fig. 2 is a schematic diagram of the main steps of the second embodiment of the method for performing AR virtual shooting based on cloud rendering according to the present application;
Fig. 3 is a schematic diagram of the main components of an embodiment of the AR shooting client of the present application.
Detailed Description
Preferred embodiments of the present application are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are merely for explaining the technical principles of the present application, and are not intended to limit the scope of the present application.
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
It should be noted that in the description of the present application, the terms "first," "second," and the like are merely used for convenience of description and are not to be construed as limiting the application as to the relative importance of the device, element or parameter being described or implied. In addition, the term "and/or" in the present application is merely an association relationship describing the association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In this context, unless otherwise specified, the term "/" generally indicates that the associated object is an "or" relationship.
In the following embodiments, the AR shooting client is developed with Unreal Engine 5.0; the client can perform operations such as network communication, asset assembly and real-time rendering.
Fig. 1 is a schematic diagram of the main steps of the first embodiment of the method for performing AR virtual shooting based on cloud rendering according to the present application. This embodiment is applicable to an AR shooting client. As shown in fig. 1, the method of this embodiment includes steps A10-A40:
and step A10, the AR shooting client receives the scene picture and the mirror-carrying instruction uploaded by the director.
And step A20, receiving performance data uploaded by the actor.
In this embodiment, the performance data includes the actor's expression data and motion data.
The expression data is output to the actor end by a facial capture device worn by the actor, and the actor end then uploads it to the AR shooting client.
The motion data may include a gesture ID or pose data.
When the actor does not wear a motion-capture suit (for example, in outdoor location scenes), the actor can strike one of several fixed gestures, each preset with a corresponding ID; the actor end shoots the actor's performance video, analyzes it to obtain the corresponding gesture ID, and uploads the gesture ID to the AR shooting client. When the actor wears a motion-capture suit, the output data of the suit can be processed by a data relay device to obtain pose data, which is then sent to the AR shooting client through the actor end.
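The two motion-data paths above (a preset gesture ID when no motion-capture suit is worn, full pose data when one is) can be sketched as follows. This is a hypothetical Python illustration: the message layout, field names, and gesture library are assumptions for this sketch, not taken from the application.

```python
import json

# Hypothetical library of preset gestures, each with a fixed ID (assumption).
GESTURE_LIBRARY = {1: "wave", 2: "bow", 3: "point"}

def build_motion_message(gesture_id=None, pose_data=None):
    """Package motion data as JSON: either a gesture ID or raw pose data."""
    if gesture_id is not None:
        if gesture_id not in GESTURE_LIBRARY:
            raise ValueError(f"unknown gesture ID: {gesture_id}")
        # No-suit path: only the small preset-gesture ID is uploaded.
        return json.dumps({"type": "gesture_id", "id": gesture_id})
    if pose_data is not None:
        # Suit path: per-joint pose data produced by the relay device.
        return json.dumps({"type": "pose", "joints": pose_data})
    raise ValueError("either gesture_id or pose_data is required")

msg = build_motion_message(gesture_id=2)
```

Uploading only an ID in the no-suit case keeps the actor-end message tiny, which suits the low-latency UDP transport described later.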
Step A30: placing the virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data. This step may specifically comprise steps A31-A33:
Step A31: placing the virtual digital person in the scene picture.
Step A32: driving the expression and motion of the virtual digital person according to the actor's expression data and motion data.
Step A33: rendering the scene picture and the virtual digital person in real time according to the camera-movement instruction.
Step A40: sending the video generated after rendering to the director end and the actor end respectively in real time.
The AR shooting client is arranged in a container of the cloud rendering server; the director end and the actor end are mobile terminals.
Preferably, after step A33, the method may further include:
Step A34: dubbing the video generated after rendering, for example adding dialogue between the virtual digital persons and background music.
Fig. 2 is a schematic diagram of the main steps of the second embodiment of the method for performing AR virtual shooting based on cloud rendering according to the present application. This embodiment is likewise applicable to an AR shooting client. As shown in fig. 2, the method of this embodiment includes steps B10-B80:
and step B10, receiving a start instruction uploaded by the director side, so as to start AR virtual shooting.
And step B20, the AR shooting client receives the scene picture and the mirror-carrying instruction uploaded by the director.
And step B30, receiving the performance data uploaded by the actor.
And step B40, placing the virtual digital person in a scene picture, and rendering in real time according to the mirror-transporting instruction and the performance data.
And step B50, transmitting the video generated after rendering to a director end and an actor end respectively in real time.
And step B60, transmitting the video generated after rendering to a streaming media server, and further realizing real-time push and/or recording.
The streaming media server can adopt RTMP and other protocols when transmitting the streaming media server.
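One common way to implement the stream push described in step B60 is to pipe raw rendered frames into FFmpeg and let it publish over RTMP. The Python sketch below only builds a generic FFmpeg argument list; the server URL is a placeholder and the whole pipeline is an assumption for illustration, not the application's actual implementation.

```python
import subprocess  # would be used to launch FFmpeg as a child process

def ffmpeg_rtmp_args(width, height, fps, url):
    """Build FFmpeg arguments that read raw BGR frames on stdin and push RTMP."""
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgr24",      # raw frames, no container
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",                                   # read frames from stdin
        "-c:v", "libx264", "-preset", "veryfast",
        "-tune", "zerolatency",                      # favor low latency
        "-f", "flv", url,                            # RTMP carries FLV
    ]

args = ffmpeg_rtmp_args(1920, 1080, 30, "rtmp://example.com/live/stream")
# proc = subprocess.Popen(args, stdin=subprocess.PIPE)
# proc.stdin.write(frame_bytes)  # one raw frame per write
```

The same FFmpeg process can simultaneously write a local file for the recording path, which is what makes later editing straightforward.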
Step B70: judging whether an end instruction uploaded by the director end has been received; if yes, go to step B80; otherwise, repeat steps B20-B60.
Step B80: ending the shooting.
After shooting is finished, the video recorded in the streaming media server can be used for later editing.
Although the steps are described in the above-described sequential order in the above-described embodiments, it will be appreciated by those skilled in the art that in order to achieve the effects of the present embodiments, the steps need not be performed in such order, and may be performed simultaneously (in parallel) or in reverse order, and such simple variations are within the scope of the present application.
In the first and second embodiments, there may be one or more actor ends, that is, multiple actors may perform simultaneously, driving multiple virtual digital persons accordingly; the scene picture, the camera-movement instruction and the performance data can all be uploaded to the AR shooting client through the UDP protocol; and the video generated after rendering is sent to the director end and the actor end in real time through the UDP protocol.
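UDP's connectionless, datagram-oriented delivery is what makes it attractive for this transport: low latency at the cost of delivery guarantees. A minimal loopback sketch in Python (the message content is a made-up example; a real deployment would add sequencing and loss handling on top):

```python
import socket

def udp_send(payload: bytes, addr):
    """Fire-and-forget datagram send: no handshake, no retransmission."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(payload, addr)
    finally:
        sock.close()

# Loopback demonstration: bind a receiver first, then send one control message.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # port 0 = let the OS pick a free port
port = recv_sock.getsockname()[1]
udp_send(b"camera:pan_left", ("127.0.0.1", port))
data, _ = recv_sock.recvfrom(65507)     # 65507 = max UDP payload over IPv4
recv_sock.close()
```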
The application also provides an embodiment of the AR shooting client based on the same technical conception as the embodiment of the method.
Fig. 3 is a schematic diagram of the main components of an embodiment of the AR shooting client of the present application. As shown in fig. 3, the client 100 of this embodiment includes: a director end data receiving module 110, an actor end data receiving module 120, a rendering module 130, and a sending module 140.
The director end data receiving module 110 is configured to receive the scene picture and the camera-movement instruction uploaded by the director end 200; the actor end data receiving module 120 is configured to receive the performance data uploaded by the actor end 300; the rendering module 130 is configured to place the virtual digital person in the scene picture and render in real time according to the camera-movement instruction and the performance data; and the sending module 140 is configured to send the video generated after rendering to the director end 200 and the actor end 300 in real time, respectively.
In this embodiment, the AR shooting client is arranged in a container of the cloud rendering server; both the director end 200 and the actor end 300 are mobile terminals.
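The four modules can be wired together as plain classes to show the data flow through the client. The class names mirror the embodiment's module names, but every interface below is an assumption for illustration, not the patented implementation.

```python
class DirectorReceiver:
    """Receives the scene picture and camera-movement instruction (module 110)."""
    def receive(self, packet):
        return packet["scene"], packet["camera"]

class ActorReceiver:
    """Receives performance data from the actor end (module 120)."""
    def receive(self, packet):
        return packet["performance"]

class Renderer:
    """Stand-in for the real-time renderer (module 130)."""
    def render(self, scene, camera, performance):
        return {"scene": scene, "camera": camera, "performance": performance}

class Sender:
    """Distributes rendered video to the endpoints (module 140)."""
    def __init__(self):
        self.sent = []
    def send(self, video, endpoints):
        for ep in endpoints:
            self.sent.append((ep, video))

class ARShootingClient:
    """Composes the four modules: receive, render, distribute."""
    def __init__(self):
        self.director_rx, self.actor_rx = DirectorReceiver(), ActorReceiver()
        self.renderer, self.tx = Renderer(), Sender()

    def tick(self, director_packet, actor_packet):
        scene, camera = self.director_rx.receive(director_packet)
        performance = self.actor_rx.receive(actor_packet)
        video = self.renderer.render(scene, camera, performance)
        self.tx.send(video, ["director", "actor"])
        return video
```

A usage example: `ARShootingClient().tick({"scene": "street", "camera": "pan_left"}, {"performance": {"id": 1}})` runs one receive-render-send cycle and returns the rendered frame.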
Further, the application also provides a system for performing AR virtual shooting based on cloud rendering. The system of this embodiment comprises: a director end, an actor end, and the AR shooting client as described above. The director end is used for shooting the scene picture, uploading the scene picture and the camera-movement instruction to the AR shooting client, and receiving the video rendered by the AR shooting client; the actor end is used for uploading the performance data and receiving the video rendered by the AR shooting client.
Those of skill in the art will appreciate that the various illustrative method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of functionality in order to clearly illustrate the interchangeability of electronic hardware and software. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not intended to be limiting.
Thus far, the technical solution of the present application has been described in connection with the preferred embodiments shown in the drawings. However, it will be readily appreciated by those skilled in the art that the scope of the application is obviously not limited to these specific embodiments. Equivalent modifications and substitutions for related technical features may be made by those skilled in the art without departing from the principles of the present application, and such modifications and substitutions will be within the scope of the present application.
Claims (10)
1. A method for performing AR virtual shooting based on cloud rendering, wherein the method is applicable to an AR shooting client, and the method comprises:
receiving a scene picture and a camera-movement instruction uploaded by a director end;
receiving performance data uploaded by an actor end;
placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data;
transmitting the video generated after rendering to the director end and the actor end respectively in real time;
wherein:
the AR shooting client is arranged in a container of the cloud rendering server;
the director end and the actor end are mobile terminals.
2. The method for AR virtual shooting based on cloud rendering according to claim 1, wherein the performance data comprises: expression data and motion data of the actor.
3. The method for AR virtual shooting based on cloud rendering according to claim 2, wherein the step of "placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data" comprises:
placing the virtual digital person in the scene picture;
driving the expression and motion of the virtual digital person according to the expression data and the motion data;
and rendering the scene picture and the virtual digital person in real time according to the camera-movement instruction.
4. The method for AR virtual shooting based on cloud rendering according to claim 3, wherein the step of "placing a virtual digital person in the scene picture, and rendering in real time according to the camera-movement instruction and the performance data" further comprises:
dubbing the video generated after rendering.
5. The method for AR virtual shooting based on cloud rendering according to claim 2, wherein
the expression data is output to the actor end by a facial capture device worn by the actor;
the motion data comprises: a gesture ID or pose data;
the gesture ID is obtained by the actor end shooting the actor's performance video and analyzing it;
and the pose data is obtained by processing, through a data relay device, the output data of the motion-capture suit worn by the actor, and is sent to the AR shooting client through the actor end.
6. The method for AR virtual shooting based on cloud rendering according to claim 1, further comprising:
receiving a start or end instruction uploaded by the director end, so as to start or end the AR virtual shooting.
7. The method for AR virtual shooting based on cloud rendering according to claim 1, further comprising:
sending the video generated after rendering to a streaming media server, so as to realize real-time stream pushing and/or recording.
8. The method for AR virtual shooting based on cloud rendering according to any one of claims 1 to 7, wherein
there are one or more actor ends;
the scene picture, the camera-movement instruction and the performance data are all uploaded to the AR shooting client through a UDP protocol;
and the video generated after rendering is sent to the director end and the actor end in real time through the UDP protocol.
9. An AR shooting client, the client comprising:
the director end data receiving module is used for receiving the scene picture and the camera-movement instruction uploaded by the director end;
the actor end data receiving module is used for receiving the performance data uploaded by the actor end;
the rendering module is used for placing the virtual digital person in the scene picture and rendering in real time according to the camera-movement instruction and the performance data;
the sending module is used for sending the video generated after rendering to the director end and the actor end respectively in real time;
wherein:
the AR shooting client is arranged in a container of the cloud rendering server;
the director end and the actor end are mobile terminals.
10. A system for performing AR virtual shooting based on cloud rendering, the system comprising: a director end, an actor end, and the AR shooting client according to claim 9;
wherein:
the director end is used for shooting the scene picture, uploading the scene picture and the camera-movement instruction to the AR shooting client, and receiving the video rendered by the AR shooting client;
and the actor end is used for uploading the performance data and receiving the video rendered by the AR shooting client.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202310787887.2A | 2023-06-29 | 2023-06-29 | Method, system and client for AR virtual shooting based on cloud rendering
Publications (1)
Publication Number | Publication Date
---|---
CN116939362A | 2023-10-24
Family
ID=88381934
2023-06-29: CN202310787887.2A patent application filed (status: Pending)
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |