Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the following detailed description of the embodiments of the present application will be given with reference to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the present application. All other embodiments obtained by one of ordinary skill in the art without inventive effort, based on the embodiments of the present application, are intended to fall within the scope of the present application.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. In the description of the present application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and are not necessarily used to describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The word "if" as used herein may be interpreted as "when," "upon," or "in response to determining."
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A alone, both A and B, and B alone. The character "/" generally indicates that the objects before and after it are in an "or" relationship.
As will be appreciated by those skilled in the art, the terms "client" and "terminal device" as used herein include devices having only a wireless signal transmitter, devices having only a wireless signal receiver, and devices having both receiving and transmitting hardware capable of two-way communication over a two-way communication link. Such devices may include: personal computers, tablet computers, and cellular or other communication devices, with or without a multi-line display; PCS (Personal Communications Service) devices, which may combine voice, data processing, facsimile, and/or data communication capabilities; PDAs (Personal Digital Assistants), which may include a radio frequency receiver, pager, internet/intranet access, web browser, notepad, calendar, and/or GPS (Global Positioning System) receiver; and conventional laptop and/or palmtop computers or other devices having and/or including a radio frequency receiver. As used herein, a "client" or "terminal device" may be portable, transportable, installed in a vehicle (aeronautical, maritime, and/or land-based), or adapted and/or configured to operate locally and/or in a distributed fashion at any other location(s) on earth and/or in space. As used herein, a "client" or "terminal device" may also be a communication terminal, an internet terminal, or a music/video playing terminal, for example a PDA, a MID (Mobile Internet Device), and/or a mobile phone with a music/video playing function, or a device such as a smart TV, a set-top box, or the like.
The server, client, service node, and similar hardware referred to in the present application are essentially computer devices, such as personal computers, having the components required by the von Neumann architecture: a central processing unit (including an arithmetic unit and a controller), memory, input devices, and output devices. A computer program is stored in the memory; the central processing unit calls the program stored in the external memory, runs it, executes the instructions in the program, and interacts with the input and output devices, thereby completing specific functions.
It should be noted that the concept of the present application, called "server", is equally applicable to the case of server clusters. According to the network deployment principle understood by those skilled in the art, each server should be logically divided, and in physical space, the servers may be independent from each other but may be invoked through interfaces, or may be integrated into one physical computer or one set of computers. Those skilled in the art will appreciate this variation and should not be construed as limiting the implementation of the network deployment approach of the present application.
Referring to fig. 1, fig. 1 is a schematic diagram of an application scenario of a live-room game interaction method with separated front and back ends, where the application scenario includes an anchor client 110, a viewer client 120, and a server 130.
The anchor client 110 interacts with the viewer client 120 through the server 130. Specifically, the anchor client 110 and the viewer client 120 may access the internet through a network access manner and establish a data communication link with the server 130. The network may be a communication medium of various connection types capable of enabling communication between the anchor client 110 and the server 130 and between the viewer client 120 and the server 130, such as a wired communication link, a wireless communication link, or a fiber optic cable; the present application is not limited in this respect.
It should be noted that the clients proposed in the embodiment of the present application include the anchor client 110 and the viewer client 120.
It should be noted that in the prior art there are several understandings of the concept of a "client": it may be understood, for example, as an application installed in a computer device, or as the hardware device corresponding to a server.
In the embodiment of the application, the term "client" refers to a hardware device corresponding to a server, and more specifically, refers to a computer device, such as a smart phone, a smart interaction tablet, a personal computer, and the like.
When the client is a mobile device such as a smart phone or a smart interactive tablet, the user can install a matching mobile application on the client, or access a Web application on the client.
When the client is a non-mobile device such as a personal computer (PC), the user can install a matching PC application on the client, or access a Web application on the client.
The mobile terminal application program refers to an application program which can be installed in mobile equipment, the PC terminal application program refers to an application program which can be installed in non-mobile equipment, and the Web terminal application program refers to an application program which needs to be accessed through a browser.
Specifically, the Web application may be further divided into a mobile version and a PC version according to the difference of client types, and there may be a difference between the page layout manner and the available server support of the two.
In the embodiment of the application, the types of live broadcast application programs provided for users are divided into mobile live broadcast application programs, PC live broadcast application programs and Web live broadcast application programs. The user can autonomously select the mode of participating in the network live broadcast according to different types of the client.
The present application divides clients into the anchor client 110 and the viewer client 120 according to the difference in the user identities with which they enter the living room. It should be noted that in practical applications, the functions of the viewer client 120 and the anchor client 110 may be performed by the same client at different times. Thus, the same client may act as a viewer client 120 when viewing a webcast and as an anchor client 110 when publishing live video.
The anchor client 110 refers to an end that transmits a live video, and is typically a client used by an anchor user in live webcasting. The hardware pointed to by the anchor client 110 essentially refers to a computer device, and in particular, as shown in fig. 1, may be a smart phone, a smart interactive tablet, a personal computer, or the like.
The viewer client 120 refers to the end that receives and views the live video, and is typically the client employed by the viewer user viewing the video in the live video. The hardware pointed to by the viewer client 120 is essentially a computer device, and in particular, as shown in fig. 1, may be a smart phone, a smart interactive tablet, a personal computer, or the like.
The server 130 may act as a service server that may be responsible for further interfacing with related audio data servers, video streaming servers, and other servers providing related support, etc., to form a logically associated service cluster for serving related end devices, such as the anchor client 110 and the viewer client 120 shown in fig. 1.
In the embodiment of the present application, the anchor client 110 and the viewer client 120 may join the same live broadcast room (i.e., live broadcast channel), where the live broadcast room refers to a chat room implemented by means of the internet technology and the server 130, and generally has an audio/video playing control function. The anchor user plays live in the live room through the anchor client 110, and the viewer user of the viewer client 120 can log into the server 130 to watch live in the live room.
In the live broadcast room, interaction between the anchor user and viewer users can be realized through well-known online interaction modes such as voice, video, and text. Generally, the anchor user performs for viewer users in the form of audio/video streams, while viewer users can interact with the anchor user via text or by giving virtual gifts; economic transactions may occur during the interaction. Of course, the application form of the live broadcast room is not limited to online entertainment and can be extended to other related scenarios.
Specifically, a viewer user watches a live broadcast as follows: the viewer user clicks to open a live broadcast application installed on the viewer client 120 and selects any live broadcast room to enter, which triggers the viewer client 120 to load the live room interface for the viewer user. The live room interface comprises several interaction components, such as a video component, a virtual gift bar component, and a public screen component. By loading these interaction components, the viewer user can watch the live broadcast in the live room and perform various online interactions, including but not limited to giving virtual gifts, participating in live broadcast activities, and chatting on the public screen.
It should be noted that the application scenario in fig. 1 is only an exemplary application scenario and is not intended to limit the scheme of the present application. The scheme of the present application can also be applied to other forms of network live broadcast application scenarios, which are not described here.
In the game industry, in order to provide users with a realistic physical experience, the real motion, rotation, and collision state data of each game object are generally obtained through simulation calculation based on game event data. In the related art, the simulation calculation of the motion, rotation, and collision state data of each game object runs on the client together with the game rendering. This approach makes it easy for malicious users to crack the client and tamper with the data, and, especially when game interaction is introduced into network live broadcast, it increases the development complexity and workload of the client.
Based on the above problems, a first embodiment of the present application provides a live room game interaction method with separated front and back ends. Referring to fig. 2, fig. 2 is a flow chart of the live room game interaction method with separated front and back ends according to the first embodiment of the present application. The method may be executed by a viewer client and a server as execution subjects, and includes the following steps:
Step S101, a server responds to a game interaction starting instruction of a live broadcasting room, analyzes the game interaction starting instruction to obtain a user identifier, and transmits game page data to a target client corresponding to the user identifier, wherein the game page data comprises game object data and game picture data.
As shown in fig. 3, the live room interface 10 of the live room client displays a game interaction opening control 11, through which a user can trigger game interaction. After detecting that the user has triggered the game interaction opening control 11, the live room client acquires the user identifier, generates a game interaction start instruction according to the user identifier, and sends the instruction to the server.
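As a minimal sketch of this step, the client might package the user identifier into a start instruction that the server then parses. The message format, field names, and use of JSON here are illustrative assumptions, not the format specified by the application:

```python
import json

def build_start_instruction(user_id: str) -> str:
    """Client side: wrap the user identifier into a start instruction (illustrative format)."""
    return json.dumps({"type": "game_interaction_start", "user_id": user_id})

def parse_start_instruction(message: str) -> str:
    """Server side: parse the instruction and extract the user identifier."""
    payload = json.loads(message)
    if payload.get("type") != "game_interaction_start":
        raise ValueError("not a game interaction start instruction")
    return payload["user_id"]
```

The server would then look up the target client corresponding to the returned identifier and send it the game page data.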
In an alternative embodiment, the user identifier may be a viewer user identifier: the viewer user triggers the game interaction start instruction through the game interaction opening control on the viewer client, so that a game interaction page is displayed in the live broadcast room on the viewer client and the viewer user can watch the live broadcast and perform game interaction at the same time.
In another alternative embodiment, the user identifier may be an anchor user identifier: the anchor user triggers the game interaction start instruction through the game interaction opening control on the anchor client, so that a game interaction page is displayed in the live broadcast room on the anchor client and the anchor user can perform game interaction while streaming live.
It can be appreciated that the anchor user can also broadcast his game interaction page, so that users entering the anchor's live broadcast room can watch it.
In yet another alternative embodiment, the user identifier may comprise a plurality of anchor user identifiers, and the server may further establish a connection between the anchors who trigger the game interaction opening control at the same time, so that these anchors can interact with each other in the game and the game interaction page is synchronized to each anchor's live broadcast room, allowing users entering each anchor's room to watch the game interaction between the anchors.
It is understood that the game page data is data required for generating a game interaction page, and specifically, the game page data includes game object data and game screen data.
The game object of the embodiment of the application refers to an object which can be selected or seen in a game interaction page. In particular, game object data includes, but is not limited to, object identification, object type, object position data, object rotation data, object speed data, and the like.
The game picture data of the embodiment of the application includes game map data. The game map gives the game model its visual effect; it can show various colors/textures and describe the material of a game object's surface. It is understood that game map data refers to data describing the visual effect of the game model.
Step S102, the target client renders and displays a game interaction page in a preset area of the live broadcasting room page according to the game page data.
The target client determines the position state of each game object in the game map corresponding to the game map data according to the game object data, and thereby renders and displays the game interaction page.
It should be noted that, when the server detects the target client triggering the game interaction start instruction for the first time, the server sends both the game object data and the game picture data to the target client, and the target client stores the game picture data. Then, in subsequent interactions between the server and the target client, or when the game interaction start instruction is triggered again, the server transmits only the game object data and not the game picture data, which improves data transmission efficiency.
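The first-start caching policy described above can be sketched as a small server-side service that remembers which clients have already received the picture data. The class and field names are assumptions made for illustration:

```python
class GamePageDataService:
    """Sketch: send game picture data only on a client's first start,
    so that later payloads carry only game object data."""

    def __init__(self):
        self._picture_sent = set()  # clients that already cached the picture data

    def build_payload(self, client_id, object_data, picture_data):
        payload = {"game_object_data": object_data}
        if client_id not in self._picture_sent:
            # First start for this client: include the (larger) picture data once.
            payload["game_picture_data"] = picture_data
            self._picture_sent.add(client_id)
        return payload
```

On every transmission after the first, the payload omits the picture data, matching the efficiency goal stated above.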
It should be appreciated that the game interaction page and the live room page may be separate pages. In an alternative embodiment, the game interaction page is superimposed on the live room page; as shown in FIG. 4, the game interaction page 20 may float in the lower right portion of the live room page 10. In another alternative embodiment, the game interaction page and the live room page are displayed in a split layout; for example, the live room page occupies the upper part of the display page and the game interaction page occupies the lower part.
It will be appreciated that the display position and display size of the game interaction page may or may not be fixed. When they are not fixed, the user can drag the game interaction page as needed to adjust its display position and display size.
Step S103, the target client responds to the interactive operation on the game interactive page and sends game event data corresponding to the interactive operation to the server.
It can be understood that a plurality of game interaction controls are arranged on the game interaction page, and a user can conduct game interaction through the game interaction controls.
The target client generates corresponding game event data according to the user's triggering operation on a game interaction control and sends the game event data to the server.
Here an event refers to a certain activity, or series of activities, of a game player; such activities can be categorized into instant activities and non-instant activities.
An instant activity is an activity that completes at the moment the player issues an instruction. For example, when a player buys a potion, the player's money is deducted and the potion is obtained as soon as the instruction is issued; all of this completes at that instant, and it can be understood that an instant activity can be presented in a single data frame.
A non-instant activity is an activity that takes a period of time to complete after the player issues an instruction. For example, when the player clicks a place on the map, the game character automatically walks to the clicked place; this movement is a non-instant process that consumes a certain amount of time, which the player perceives, and it can be understood that it is displayed over multiple data frames.
Game event data refers to data indicating such an event, and it is understood that different events correspond to different game event data. For example, for the event of a player buying a potion, the player triggers a purchase control, and the game event data includes the control identifier; for the event of a player clicking a place on the map so that the game character automatically walks there, the game event data includes the player's click operation identifier, the identifier of the clicked position on the map, and the like.
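The two examples above can be sketched as small event payloads. Field names ("event", "control_id", "target") are illustrative assumptions, not a format defined by the application:

```python
def make_purchase_event(control_id: str) -> dict:
    """Instant activity: buying a potion only needs the triggered control's identifier."""
    return {"event": "purchase", "control_id": control_id}

def make_move_event(x: float, y: float) -> dict:
    """Non-instant activity: clicking the map carries the click operation
    identifier and the clicked position, from which the server simulates
    movement over many frames."""
    return {"event": "move", "operation": "map_click", "target": {"x": x, "y": y}}
```

The client would serialize such a dictionary and send it to the server as the game event data of step S103.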
Step S104, the server calculates the position state data of each game object according to the game event data, and sends the position state data of each game object to the target client.
It will be appreciated that the position state data of a game object indicates the simulated real state of the game object, such as its motion and rotation, and may include, but is not limited to, one or more of a game object identifier, a game object type, game object position data, and game object rotation data.
The server may use a physical engine or another data computing model to simulate the position state of each game object in the game page based on the game event data.
Here, a physical engine is a component that endows objects with real physical properties and simulates their motion, rotation, and collision responses. Specifically, the physical engine of the embodiment of the present application may be any physical engine capable of implementing the method of the present application, for example Bullet (a physical simulation computing engine), Havok (Havok Game Dynamics SDK), PhysX, and the like. The present application is not limited in this respect.
It will be appreciated that since the physical engine is provided in the server, the position status data of each game object can be calculated by modifying or replacing the physical engine as needed.
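The server-side simulation of step S104 can be sketched with a toy integrator standing in for a real engine such as Bullet or PhysX. This simplification (velocity integrated into position, no rotation or collision handling) is an assumption for illustration only:

```python
def simulate_step(objects, dt):
    """Toy server-side physics step: integrate each object's velocity into
    its position over a time step dt. A real physical engine would also
    handle rotation and collision responses."""
    result = []
    for obj in objects:
        x, y = obj["position"]
        vx, vy = obj["velocity"]
        result.append({
            "id": obj["id"],
            "position": (x + vx * dt, y + vy * dt),
            "velocity": (vx, vy),
        })
    return result
```

The resulting per-object position state is what the server sends back to the target client for rendering.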
It should be appreciated that for game event data, in addition to simulating the position states of individual game objects in a game page, game logic data may also be calculated from the game logic service. Where game logic services refer to game design logic, for example, where a player moves to a location, corresponding special effects are produced, etc.
The server may, according to the game event data, simulate and calculate the position state data of each game object through the physical engine and calculate the game logic data through the game logic service, and then send the position state data of each game object and the game logic data to the target client respectively. Alternatively, the server may first calculate the game logic data through the game logic service according to the game event data, and then input the game logic data and the game event data into the physical engine to simulate and calculate the position state data of each game object.
Since the physical engine does not itself involve gameplay logic, synchronously processing all kinds of gameplay logic through it would reduce its computational efficiency. Therefore, the present application can also implement the interaction between the physical engine and the game logic service through asynchronous event triggering, ensuring that the physical engine is not affected by game logic processing.
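The asynchronous event triggering described above can be sketched with a simple in-process queue: the physics side posts events and keeps stepping, while the game logic service drains the queue on its own schedule. The event names and the "special effect" reaction are illustrative assumptions:

```python
import queue

def physics_step(events: "queue.Queue", obj):
    """Physics side: advance state and only *post* gameplay events;
    it never blocks waiting on game logic."""
    obj["x"] += obj["vx"]
    if obj["x"] >= obj["goal_x"]:
        events.put({"event": "arrived", "id": obj["id"]})  # fire-and-forget
    return obj

def drain_game_logic(events: "queue.Queue"):
    """Game logic service: consume events asynchronously, e.g. to trigger
    a special effect when a player reaches a location."""
    effects = []
    while not events.empty():
        evt = events.get_nowait()
        if evt["event"] == "arrived":
            effects.append(f"play_effect_for:{evt['id']}")
    return effects
```

Because the physics side only enqueues, its step time is independent of how expensive the game logic reactions are.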
Step S105, the target client updates and renders the game interaction page according to the game picture data and the position state data of each game object.
It can be understood that the target client stores the game picture data; after receiving the position state data of each game object, the target client only needs to render and display each game object according to that position state data, on the basis of the stored game picture data, so the game interaction page can be updated rapidly.
In the method of the present application, the server responds to a game interaction start instruction in the live broadcast room, parses the instruction to obtain a user identifier, and sends game page data, comprising game object data and game picture data, to the target client corresponding to the user identifier; the target client renders and displays a game interaction page in a preset area of the live room page according to the game page data; the target client responds to an interactive operation on the game interaction page and sends the corresponding game event data to the server; the server calculates the position state data of each game object according to the game event data and sends it to the target client; and the target client updates and renders the game interaction page according to the game picture data and the position state data of each game object. In this way, the simulation calculation of the motion, rotation, and collision position state data of each game object is moved to the server, and the client only needs to perform game rendering. Back-end data processing is thereby separated from front-end data rendering: the client no longer needs to handle game logic processing or data calculation, which reduces its development complexity and workload, and the risk of malicious users cracking the client and tampering with the data is also reduced.
In an optional embodiment, the server is provided with a physical engine service cluster, wherein the physical engine service cluster comprises a plurality of physical engines, and the step of calculating the position state data of each game object according to the game event data by the server in step S104 and sending the position state data of each game object to the target client comprises the following steps:
In step S1041, the server determines, according to a preset master selection manner, one of the physical engines to be the master physical engine from the physical engine service cluster.
The preset master selection manner may be an election based on zookeeper (a distributed coordination service) or on redis (Remote Dictionary Server). In the embodiment of the application, zookeeper-based election is adopted for master selection, so that the whole physical engine service cluster is stable and highly available.
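In a real deployment this election would go through ZooKeeper (e.g. its ephemeral sequential-node leader election recipe) or Redis; as a hedged, in-memory stand-in, the idea can be sketched as "lowest live sequence number wins". The class and method names are assumptions for illustration:

```python
class EngineCluster:
    """In-memory stand-in for ZooKeeper-style master election:
    each physical engine registers with an increasing sequence number,
    and the live engine holding the lowest number is the master."""

    def __init__(self):
        self._seq = 0
        self._members = {}  # engine_id -> registration sequence number

    def register(self, engine_id):
        self._seq += 1
        self._members[engine_id] = self._seq

    def remove(self, engine_id):
        # In ZooKeeper this happens automatically when an ephemeral node's
        # session expires, i.e. when the engine fails.
        self._members.pop(engine_id, None)

    def master(self):
        if not self._members:
            return None
        return min(self._members, key=self._members.get)
```

When the current master is removed (fails), the next-lowest registrant automatically becomes the new master, which is the property the cluster relies on for high availability.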
In step S1042, the server establishes an association relationship between the main physical engine and the target client, calculates the position status data of each game object corresponding to the game event data by the main physical engine, and sends the position status data of each game object to the target client corresponding to the association relationship.
It can be understood that in a single-player game interaction, each target client corresponds to one main physical engine, which serves it; in a multi-player game interaction, the target clients in the same game correspond to, and are served by, the same main physical engine.
It should be understood that after the server establishes the association between a main physical engine and a target client, if a new target client triggers the game interaction start instruction, the server will determine a new main physical engine from the physical engine service cluster according to the preset master selection manner and serve the new target client through it, where the new main physical engine is one of the physical engines in the cluster other than the existing main physical engine.
By arranging the physical engine cluster in the server and determining a main physical engine through master selection to serve the target client, the embodiment of the application reduces the development complexity and workload of the client and improves the stability and high availability of the physical engine service.
In an alternative embodiment, after the step in step S102 of rendering and displaying the game interaction page in the preset area of the live room page according to the game page data, the method includes:
step S1021, the server sends updated game object data to the target client according to the preset time interval;
Step S1022, the target client updates and renders the game interaction page according to the game object data and the game picture data.
It should be understood that, in order to achieve a real-time dynamic effect of game interaction, after the game interaction page is displayed, the server generates the game object data of the game interaction page in real time according to the game logic service and sends it to the target client at the preset time interval, so that the target client can render in real time according to the game object data and the game picture data, improving the dynamic realism of the game interaction.
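The preset-interval push of steps S1021/S1022 can be sketched as a small scheduler that decides, given the current time, whether an update is due. The interval value and names are illustrative assumptions:

```python
class UpdateScheduler:
    """Sketch: decide when the server should push updated game object data
    to the target client, based on a preset time interval."""

    def __init__(self, interval_s=0.05):  # assumed 50 ms interval
        self.interval_s = interval_s
        self._last_sent = None

    def should_send(self, now_s):
        """Return True when at least one interval has elapsed since the last push."""
        if self._last_sent is None or now_s - self._last_sent >= self.interval_s:
            self._last_sent = now_s
            return True
        return False
```

The server's push loop would call should_send on each tick and transmit the freshly generated game object data only when it returns True.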
In an optional embodiment, the step S1042 of establishing an association between the main physical engine and the target client by the server, calculating, by the main physical engine, position status data of each game object corresponding to the game event data, and sending the position status data of each game object to the target client corresponding to the association includes:
In step S10421, the server establishes a message storage queue corresponding to the main physical engine, stores the position state data of each game object calculated by the main physical engine through the message storage queue, and sends the position state data of each game object to the target client through the message storage queue.
Each main physical engine corresponds to one message storage queue, and the main physical engines sequentially store the calculated position state data of each game object to the message storage queue for consumption by the target client.
According to the embodiment of the application, the message storage queue is used for storing the position state data of each game object calculated by the main physical engine, and the message storage queue is used for sending the position state data of each game object to the target client, so that the throughput of the server can be improved, and the reliability of message storage can be improved.
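The per-engine message storage queue described in step S10421 can be sketched with an in-memory FIFO: the main physical engine appends frames of position state data in order, and the target client consumes them in the same order. A production system would use a durable message queue; this in-memory version is an illustrative assumption:

```python
from collections import deque

class EngineMessageQueue:
    """One queue per main physical engine: the engine stores each frame of
    position state data in order, and the target client consumes in order."""

    def __init__(self, engine_id):
        self.engine_id = engine_id
        self._frames = deque()

    def store(self, position_state):
        self._frames.append(position_state)

    def consume(self):
        """Return the oldest stored frame, or None if the queue is empty."""
        return self._frames.popleft() if self._frames else None
```

Because the client only ever talks to the queue, the engine behind it can later be swapped without the client noticing, which is the property the failover step below relies on.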
Further, after step S10421, the method further includes:
Step S10422, when the main physical engine fails and service to the target client is switched to a slave physical engine, the connection between the main physical engine and the message storage queue is disconnected, a connection between the slave physical engine and the message storage queue is established, and the slave physical engine calculates the position state data of each game object and stores it in the message storage queue, where the slave physical engine is a physical engine in the physical engine cluster other than the main physical engine.
In the embodiment of the application, when the main physical engine fails, service is switched from the main physical engine to a slave physical engine, while the message storage queue continues to serve the target client. Only the server knows that the physical engine has been switched; the switch is imperceptible to the external target client, which improves the stability and reliability of the server.
Further, in order to ensure the continuity of the position state data of each game object, the data previously stored by the main physical engine needs to be loaded before the slave physical engine provides service. When service is switched from the main physical engine to the slave physical engine, the slave physical engine obtains from the database the position state data of each game object previously stored by the main physical engine, so that the calculated data remains continuous. The database may be any database capable of implementing the method of the application, such as a redis database.
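The failover flow just described can be sketched as: on master failure, the slave engine first loads the master's last persisted position state from the database, then takes over serving the game. A plain dictionary stands in for the redis database here, and all names are illustrative assumptions:

```python
class FailoverManager:
    """Sketch of the failover flow: a slave physical engine resumes from the
    position state the failed main engine last persisted, so the calculated
    data stays continuous."""

    def __init__(self, database):
        self.database = database  # stands in for e.g. a redis instance

    def promote_slave(self, slave_engine, game_id):
        saved_state = self.database.get(game_id, {})
        slave_engine["state"] = dict(saved_state)  # resume from persisted data
        slave_engine["serving"] = game_id
        return slave_engine
```

After promotion, the slave engine binds to the existing message storage queue (see step S10422), so the target client keeps consuming frames with no visible interruption.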
In an alternative embodiment, the step S104 of calculating the position status data of each game object according to the game event data by the server, and sending the position status data of each game object to the target client includes:
Step S1043, the server calculates and stores the position state data of each game object according to the game event data;
step S1044, when the server detects that the stored position state data of each game object meets the frame aggregation condition, aggregating the stored position state data of each game object into a game data packet and sending the game data packet to the target client;
In step S105, the step of updating the rendered game interaction page by the target client according to the game picture data and the game position status data of each game object includes:
in step S1051, the target client de-aggregates the game data packet to obtain the position state data of multiple frames of game objects, and renders the position state data of each frame to update the game interaction page.
Since the position state data of each game object calculated by the physical engine needs to be transmitted to the target client over the network, delay is unavoidable and should be reduced as much as possible. According to the embodiment of the application, the position state data of multiple frames of game objects is aggregated into one game data packet in a frame aggregation mode and then sent to the target client, so that the target client has enough data when rendering, thereby improving the rendering effect and the game smoothness.
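The frame aggregation and de-aggregation of steps S1043–S1051 can be sketched as follows. The aggregation threshold of 4 frames and all names are illustrative assumptions, not values from the original disclosure.

```python
AGGREGATE_FRAMES = 4  # assumed frame aggregation condition: 4 buffered frames

def aggregate(buffer, frame_state, packets):
    # Server side: buffer per-frame state; when the aggregation condition is
    # met, pack the buffered frames into one game data packet.
    buffer.append(frame_state)
    if len(buffer) >= AGGREGATE_FRAMES:
        packets.append({"frames": list(buffer)})  # one packet carries many frames
        buffer.clear()

def de_aggregate(packet):
    # Client side: split the packet back into per-frame states for rendering.
    return packet["frames"]

buffer, packets = [], []
for frame_id in range(8):
    aggregate(buffer, {"frame": frame_id, "objects": [{"id": 1, "x": frame_id}]}, packets)
```

Sending 4 frames per packet means one network round trip delivers enough data for several render ticks, which is how aggregation masks per-packet network delay.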
In an alternative embodiment, a virtual gift giving control is displayed on the live broadcast room page, and the live broadcast room game interaction method with separated front and back ends further comprises the following steps:
step S1061, the target client responds to the triggering operation of the virtual gift-giving control and sends a virtual gift-giving request to the server;
Step S1062, the server acquires a live broadcasting room identifier of the target client and virtual gift data triggered by the target client according to the virtual gift giving request, generates a virtual gift giving instruction according to the live broadcasting room identifier and the virtual gift data, and sends the virtual gift giving instruction to the target client;
In step S1063, the target client responds to the virtual gift giving instruction, reduces the display proportion of the game interaction page according to the preset adjustment mode, and displays the virtual gift special effect corresponding to the virtual gift data on the live broadcast room page.
And step S1064, after the special effects of the virtual gift are displayed, the target client controls and restores the display proportion of the game interaction page.
When the user triggers a virtual gift and the corresponding virtual gift special effect is displayed, adjusting the display size of the game interaction page allows the user to watch the special effect in full, avoiding the situation where the game interaction page blocks the virtual gift special effect and degrades the viewing experience.
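The shrink-then-restore behavior of steps S1063–S1064 can be sketched as a small client-side state machine. The 0.5 scale ratio, class, and method names are illustrative assumptions only.

```python
class GameInteractionPage:
    """Minimal sketch of the game interaction page's display-scale handling."""

    def __init__(self):
        self.scale = 1.0            # normal display proportion
        self.effect_visible = False

    def show_gift_effect(self, preset_ratio=0.5):
        # S1063: reduce the display proportion by the preset adjustment mode,
        # then display the virtual gift special effect on the live room page.
        self.scale = preset_ratio
        self.effect_visible = True

    def on_effect_finished(self):
        # S1064: after the special effect finishes, restore the display proportion.
        self.effect_visible = False
        self.scale = 1.0

page = GameInteractionPage()
page.show_gift_effect()
mid_scale = page.scale    # reduced while the effect plays
page.on_effect_finished()
```

The same pattern covers the user-entrance flow of steps S1071–S1073, with the effect-finished trigger replaced by a display-time timer.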
In an alternative embodiment, the live room game interaction method with separated front and back ends further comprises the following steps:
Step S1071, a target client monitors user entrance information of a live broadcasting room;
step S1072, if the target client monitors that the user entrance information is preset entrance information, the target client reduces the display proportion of the game interaction page according to a preset adjustment mode, and the user entrance corresponding to the user entrance information is displayed on the live broadcasting room page.
And step S1073, when the display time of the user entrance information reaches the preset display time, the target client controls and restores the display proportion of the game interaction page.
The user entry information may include information such as user identification, user rating, etc. The preset entrance information may include a preset user level. When the current entrance level of the user reaches the preset entrance level, the target client reduces the display proportion of the game interaction page according to the preset adjustment mode, and the user entrance corresponding to the user entrance information is displayed on the live broadcasting room page.
When the embodiment of the application detects that the user entrance information is the preset entrance information, the display proportion of the game interaction page is reduced according to the preset adjustment mode, and the user entrance corresponding to the user entrance information is displayed on the live broadcast room page. This prevents the game interaction page from blocking the entering user's information, avoids the entering audience user being overlooked, and avoids reducing the interaction between the host and the audience and among audience members.
Referring to fig. 5, a second embodiment of the present application provides a live room game interaction method with separated front and back ends, which includes the following steps:
Step S201, responding to a game interaction starting instruction of a live broadcasting room, analyzing the game interaction starting instruction to obtain a user identifier, and transmitting game page data to a target client corresponding to the user identifier, wherein the game page data comprises game object data and game picture data.
Step S202, game event data fed back by a target client side is received, wherein the game event data is event data corresponding to received interaction operation on a game interaction page rendered and displayed in a preset area of a live broadcasting room page by the target client side according to the game page data.
Step S203, calculating the position state data of each game object according to the game event data, and sending the position state data of each game object to the target client so that the target client updates the rendered game interaction page according to the game picture data and the game position state data of each game object.
The second embodiment of the present application describes a live broadcasting room game interaction method with separated front and rear ends in which the server is the execution subject. It belongs to the same concept as the live broadcasting room game interaction method with separated front and rear ends provided in the first embodiment of the present application; the detailed implementation process is shown in the method embodiment and is not repeated here.
The third embodiment of the application also provides a live broadcasting room game interaction system with separated front and back ends, which comprises a server and a target client;
the server responds to a game interaction starting instruction of the live broadcasting room, analyzes the game interaction starting instruction to obtain a user identifier, and transmits game page data to a target client corresponding to the user identifier, wherein the game page data comprises game object data and game picture data;
The target client renders and displays a game interaction page in a preset area of the live broadcasting room page according to the game page data;
The target client responds to the interactive operation on the game interactive page and sends game event data corresponding to the interactive operation to the server;
The server calculates the position state data of each game object according to the game event data, and sends the position state data of each game object to the target client;
And the target client updates the rendered game interaction page according to the game picture data and the position state data of each game object.
The live broadcasting room game interaction system with separated front and back ends provided by the third embodiment of the present application belongs to the same concept as the live broadcasting room game interaction method with separated front and back ends provided by the first embodiment of the present application; its detailed implementation process is shown in the method embodiment and is not repeated here.
Referring to fig. 6, a live room game interaction device 300 with separated front and rear ends according to a fourth embodiment of the present application is further provided, including:
The game page data acquisition module 301 is configured to respond to a game interaction starting instruction in a live broadcasting room, analyze the game interaction starting instruction to acquire a user identifier, and send game page data to a target client corresponding to the user identifier;
The game event data receiving module 302 is configured to receive game event data fed back by the target client, where the game event data is event data corresponding to received interaction operations on a game interaction page rendered and displayed by the target client in a preset area of a live broadcasting room page according to the game page data;
The position state data calculating module 303 is configured to calculate position state data of each game object according to the game event data, and send the position state data of each game object to the target client, so that the target client updates the rendered game interaction page according to the game picture data and the game position state data of each game object.
It should be noted that the division of functional modules described above for the live broadcasting room game interaction device provided by the application is merely exemplary; in practical application, the functions may be allocated to different functional modules as needed, i.e., the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the live broadcasting room game interaction device with separated front and rear ends provided by the embodiment of the application belongs to the same concept as the live broadcasting room game interaction method with separated front and rear ends of the first embodiment of the application; the detailed implementation process is shown in the method embodiment and is not repeated here.
The embodiment of the live broadcasting room game interaction device with separated front and rear ends can be applied to an electronic device, such as a server, and can be realized by software, by hardware, or by a combination of hardware and software. Taking software implementation as an example, the device in a logical sense is formed by the processor of the electronic device where the device is located reading the corresponding computer program instructions from non-volatile memory into memory. At the hardware level, the electronic device in which the device resides may include a processor, a network interface, memory, and non-volatile storage, which are interconnected by a data bus or other well-known means.
Fig. 7 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present application. As shown in Fig. 7, the electronic device 16 may include a processor 160, a memory 161, and a computer program 162 stored in the memory 161 and executable on the processor 160, such as a program implementing the live-room game interaction method with separated front and back ends; the steps of the live-room game interaction method with separated front and back ends of the above embodiments are implemented when the processor 160 executes the computer program 162.
Wherein the processor 160 may include one or more processing cores. The processor 160 connects various portions within the electronic device 16 using various interfaces and lines, and performs various functions of the electronic device 16 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 161 and invoking data in the memory 161. Optionally, the processor 160 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 160 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU renders and draws the content to be displayed on the touch display screen; and the modem handles wireless communication. It will be appreciated that the modem may also not be integrated into the processor 160 and may instead be implemented by a separate chip.
The memory 161 may include Random Access Memory (RAM) or Read-Only Memory (ROM). Optionally, the memory 161 includes a non-transitory computer-readable storage medium. The memory 161 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 161 may include a stored-program area and a stored-data area: the stored-program area may store instructions for implementing an operating system, instructions for at least one function (such as touch instructions), instructions for implementing the various method embodiments described above, and the like; the stored-data area may store the data referred to in the various method embodiments described above. The memory 161 may also optionally be at least one storage device located remotely from the aforementioned processor 160.
The sixth embodiment of the present application further provides a computer storage medium in which a plurality of instructions may be stored, the instructions being adapted to be loaded and executed by a processor; for the specific implementation procedure, reference may be made to the specific description of the foregoing embodiments, which is not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the method of the above embodiment, or may be implemented by a computer program to instruct related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each method embodiment described above may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, executable files or in some intermediate form, etc.
The present invention is not limited to the above-described embodiments; any modifications or variations that do not depart from the spirit and scope of the present invention are intended to fall within the scope of the claims and their equivalents.