CN117582672A - Data processing method, device, electronic equipment and storage medium - Google Patents
Info
- Publication number
- CN117582672A (application number CN202311552131.6A)
- Authority
- CN
- China
- Prior art keywords
- event
- auxiliary
- target behavior
- game
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/85—Providing additional services to players
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/847—Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/57—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of game services offered to the player
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6009—Methods for processing data by generating or executing the game program for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application provides a data processing method, a data processing apparatus, an electronic device, and a storage medium. The method includes: displaying a virtual object in a graphical user interface; in response to a mutual-assistance request instruction initiated for a target behavior event of a game, entering a content creation page of the game, where the target behavior event is a behavior event currently being performed by the virtual object; and in response to a content publishing instruction, providing an event auxiliary label through a preset publishing channel, where the event auxiliary label is used to assist in completing the target behavior event and includes content publishing information determined on the content creation page and a mutual-assistance link generated based on the target behavior event. The application improves the convenience of assisting with game events, so that game events are completed more quickly and server consumption is reduced.
Description
Technical Field
The present disclosure relates to the field of computer applications, and in particular, to a data processing method, apparatus, electronic device, and storage medium.
Background
In some related games, a player may initiate game events such as building upgrades or recruitment for a virtual object. After such a game event is started, a certain amount of time must elapse before it completes, which can leave the initiating player waiting too long and place a relatively heavy load on the server. Another related game provides a technique for inviting other players to complete a game event together. Although this can reduce the time needed to finish the game event to some extent, the invited players are often unwilling to participate because they know too little about the event involved or find it too difficult; one party readily sends an invitation while the other party responds late or refuses outright, so the interaction efficiency is low.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide at least a data processing method, apparatus, electronic device, and storage medium, so as to overcome at least one of the foregoing drawbacks.
In a first aspect, exemplary embodiments of the present application provide a data processing method, the method including: displaying a virtual object in a graphical user interface; in response to a mutual-assistance request instruction initiated for a target behavior event of a game, entering a content creation page of the game, where the target behavior event is a behavior event currently being performed by the virtual object; and in response to a content publishing instruction, providing an event auxiliary label through a preset publishing channel, where the event auxiliary label is used to assist in completing the target behavior event and includes content publishing information determined on the content creation page and a mutual-assistance link generated based on the target behavior event.
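By way of illustration only, the flow of the first aspect can be sketched in Python. All names here (`create_mutual_aid_link`, `publish_event_auxiliary_label`, the URL, and the channel name) are hypothetical and are not part of the claims:

```python
import uuid

def create_mutual_aid_link(player_id: str, event_id: str) -> str:
    """Generate a mutual-assistance link bound to the target behavior event."""
    # uuid5 is deterministic, so the same event always yields the same token
    token = uuid.uuid5(uuid.NAMESPACE_URL, f"{player_id}:{event_id}")
    return f"https://game.example.com/assist?token={token}"

def publish_event_auxiliary_label(player_id: str, event_id: str,
                                  content_info: str, channel: str) -> dict:
    """Combine user-edited content with the generated link into an event
    auxiliary label and hand it to a preset publishing channel."""
    return {
        "event_id": event_id,    # target behavior event being assisted
        "content": content_info, # text edited on the content creation page
        "link": create_mutual_aid_link(player_id, event_id),
        "channel": channel,      # preset publishing channel
    }

label = publish_event_auxiliary_label(
    "player_1", "upgrade_castle_7",
    "Help me finish this upgrade!", "world_chat")
```

The label bundles both pieces the claim names: the publishing information determined on the content creation page and the mutual-assistance link generated from the target behavior event.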
In a second aspect, embodiments of the present application further provide a data processing apparatus, the apparatus including: a display control module, configured to display a virtual object in a graphical user interface; an editing module, configured to enter a content creation page of a game in response to a mutual-assistance request instruction initiated for a target behavior event of the game, where the target behavior event is a behavior event currently being performed by the virtual object; and a publishing module, configured to provide an event auxiliary label through a preset publishing channel in response to a content publishing instruction, where the event auxiliary label is used to assist in completing the target behavior event and includes content publishing information determined on the content creation page and a mutual-assistance link generated based on the target behavior event.
In a third aspect, embodiments of the present application further provide an electronic device including a processor, a storage medium, and a bus. The storage medium stores machine-readable instructions executable by the processor. When the electronic device runs, the processor communicates with the storage medium through the bus, and the processor executes the machine-readable instructions to perform the steps of the data processing method described above.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, performs the steps of the data processing method described above.
The data processing method, apparatus, electronic device, and storage medium provided by the present application help improve the convenience of assisting with game events, so that game events are completed more quickly and server consumption is reduced.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting of the scope; other related drawings can be obtained from these drawings by a person skilled in the art without inventive effort.
FIG. 1 shows a flow chart of a data processing method provided by an exemplary embodiment of the present application;
FIG. 2 illustrates a flowchart of the steps provided by exemplary embodiments of the present application to generate a mutual aid request instruction;
FIG. 3 illustrates a schematic diagram of a general portal interface under a virtual combat platform provided by exemplary embodiments of the present application;
FIG. 4 shows a schematic diagram of a first operation page under the virtual combat platform provided by an exemplary embodiment of the present application;
FIG. 5 shows a flowchart of the steps for editing published content provided by an exemplary embodiment of the present application;
FIG. 6 shows a schematic diagram of a content creation page under the content publishing platform provided by an exemplary embodiment of the present application;
FIG. 7 is a schematic diagram of communication channels within a virtual combat platform provided in accordance with an exemplary embodiment of the present application;
FIG. 8 is a flowchart illustrating steps provided by exemplary embodiments of the present application for multi-channel secondary publication of a targeted behavioral event;
FIG. 9 shows a flowchart of steps for presenting interactive pages provided by an exemplary embodiment of the present application;
FIG. 10 illustrates a schematic diagram of an interactive details interface for a target behavioral event provided by an exemplary embodiment of the present application;
FIG. 11 is a schematic diagram showing a structure of a data processing apparatus according to an exemplary embodiment of the present application;
FIG. 12 shows a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions are described below clearly and completely with reference to the accompanying drawings. It should be understood that the drawings in the present application are only for illustration and description and are not intended to limit the protection scope of the present application; in addition, the schematic drawings are not drawn to scale. A flowchart used in this application illustrates operations implemented according to some embodiments. It should be appreciated that the operations of a flowchart may be implemented out of order, and steps without a logical dependency may be performed in reverse order or concurrently. Moreover, guided by this application, a person skilled in the art may add one or more other operations to a flowchart or remove one or more operations from it.
In this specification, the terms "a," "an," "the," and "said" denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are inclusive and mean that there may be additional elements/components/etc. beyond those listed; the terms "first," "second," and the like are used merely as labels and do not limit the number of their objects.
It should be understood that in embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship. "Comprising A, B, and/or C" means comprising any one, any two, or all three of A, B, and C.
It should be understood that in the embodiments of the present application, "B corresponding to A" or "A corresponding to B" means that B is associated with A and that B may be determined from A. Determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
In the related art, a user can initiate a group-buying or ticket order and send an assistance link to other users through a social application; the other users provide assistance by clicking the link, and when the assistance value meets a condition, corresponding service resources are allocated to the initiating user.
However, this assistance-sharing process has the following drawbacks:
(1) When the inviter shares for assistance, the shared content is fixed and automatically generated by the system. The inviter cannot edit more customized shared content and can only supplement it through chat messages in the social application.
(2) The inviter can view the assistance records of the assisters, but the assisters provide no related feedback; they simply perform a click-to-assist operation, so no richer content feedback can be obtained.
(3) Both the assister and the inviter must be users of the application to which the initiated order belongs; assistance performed under different channels cannot be distinguished, and cross-application resource assistance cannot be achieved.
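For concreteness, the related-art assist flow described above can be sketched as follows. The class name, threshold, and user identifiers are illustrative assumptions, not features of any particular cited system:

```python
class AssistOrder:
    """Minimal model of a related-art assist order: each click on the shared
    link adds to an assist value, and the condition is met at a threshold."""

    def __init__(self, initiator: str, threshold: int):
        self.initiator = initiator
        self.threshold = threshold
        self.assists = set()       # assisters are only recorded; no feedback

    def click_assist(self, user_id: str) -> bool:
        """A click on the shared link; returns True once the condition holds."""
        self.assists.add(user_id)  # repeated clicks by one user count once
        return len(self.assists) >= self.threshold

order = AssistOrder("alice", threshold=3)
done = [order.click_assist(u) for u in ("bob", "carol", "dave")]
```

Note how the model exhibits exactly the drawbacks listed above: the assister contributes nothing beyond a click, and nothing distinguishes which channel the click came from.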
The embodiments of the present application consider applying the above assistance-sharing approach to a game. Currently, in a game (e.g., a strategy game), a user may perform various behavior events on a plot of land in a virtual scene, where each behavior event has a corresponding behavior completion condition.
In one case, the user can achieve the behavior completion condition of a behavior event independently. In this case, interactivity with other users is poor, and the user must spend more time and cost, resulting in low execution efficiency.
Alternatively, the behavior completion condition of the behavior event may be achieved through multi-user collaboration. In this case, an interactive page for assisting with the behavior event needs to be provided to more users, so as to improve the efficiency of achieving the behavior completion condition.
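The contrast between the two paths (solo progress versus multi-user collaboration) can be sketched as a time-based completion condition; the durations and the per-helper reduction are purely illustrative numbers:

```python
def remaining_time(base_seconds: int, helper_count: int,
                   seconds_per_helper: int = 600) -> int:
    """Each assisting user shortens the wait for the behavior event;
    the remaining time never drops below zero."""
    return max(0, base_seconds - helper_count * seconds_per_helper)

# solo: the initiator waits the full duration of the behavior event
solo = remaining_time(3600, helper_count=0)

# collaboration: three helpers cut the wait substantially
helped = remaining_time(3600, helper_count=3)
```

Under this toy model, the more users the interactive page reaches, the faster the behavior completion condition is achieved, which is the motivation for widening the publishing channels.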
If the assistance-sharing method of the related art were applied to the game, the above drawbacks would remain, and assistance could only be sought among users who have installed the game application, so the interaction efficiency would be low.
In view of at least one of the foregoing problems, the present application proposes a data processing method, apparatus, electronic device, and storage medium, so as to improve the transmissibility and interactivity of content.
First, names involved in the embodiments of the present application will be described.
Terminal equipment:
The terminal device in the embodiments of the present application mainly refers to a smart device that presents a game screen (such as in-game setting/configuration interfaces and an interface presenting the virtual scene) and supports control operations on a virtual character. The terminal device may include, but is not limited to, any of the following: a smartphone, a tablet computer, a portable computer, a desktop computer, a game console, a personal digital assistant (PDA), an e-book reader, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and the like. The terminal device installs and runs an application supporting a virtual scene, such as an application supporting a three-dimensional virtual scene. The application may include, but is not limited to, any of a virtual reality application, a three-dimensional map application, a military simulation application, a MOBA (Multiplayer Online Battle Arena) game, a multiplayer gunfight survival game, or a third-person shooter (TPS) game. Alternatively, the application may be a stand-alone application, such as a stand-alone 3D (three-dimensional) game program, or a networked online application.
Graphical user interface:
is an interface display format through which a person communicates with a computer. It allows a user to manipulate icons, logos, or menu options on a screen using an input device such as a mouse, keyboard, and/or joystick, and also allows a user to select commands, start programs, or perform other tasks by touching icons or menu options on the touch screen of a touch terminal.
An interface corresponding to the application program is provided or displayed through the graphical user interface, and the interface is a picture corresponding to at least one observation mode for observing the virtual scene. Here, the at least one observation mode may include, but is not limited to: a viewing angle, a viewing configuration (e.g., whether a night-vision device is turned on), and a viewing center. For example, the interface may be a picture obtained by observing the virtual scene from an observation angle at a certain lens height, with a certain virtual character or a certain coordinate position in the virtual scene as the observation center. By way of example, the graphical user interface may include virtual characters that execute game logic in the virtual scene, such as game characters, NPCs (non-player characters), and AI (artificial intelligence) characters.
The graphical user interface includes any visible visual controls or elements, for example game controls (e.g., skill controls, movement controls, function controls), indication identifiers (e.g., direction indicators, character indicators), information presentation areas (e.g., kill counts, game time), or game setting controls (e.g., system settings, store, gold coins), and may also include controls such as pictures, input boxes, and text boxes; some UI (user interface) controls respond to user operations.
Virtual scene:
is a virtual environment displayed (or provided) by an application when it runs on a terminal device or server. Optionally, the virtual scene is a simulation of the real world, a semi-simulated and semi-fictional virtual environment, or a purely fictional virtual environment. The virtual scene may be any of a two-dimensional, 2.5-dimensional, or three-dimensional virtual environment, and the virtual environment may be sky, land, sea, or the like. The virtual scene is the scene in which a user-controlled virtual character completes the game logic; optionally, the virtual scene is also used for combat between at least two virtual characters, and virtual resources available to the at least two virtual characters are arranged in the virtual scene.
Virtual roles:
may be a character manipulated by a player in the virtual environment, including but not limited to at least one of a virtual person, a virtual animal, a cartoon character, a virtual warship, a virtual vehicle, a virtual plane, or a virtual vessel, or may be a non-player character (NPC). Optionally, when the virtual environment is three-dimensional, the virtual characters are three-dimensional models, each having its own shape and volume and occupying part of the space in the three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional character constructed based on three-dimensional human-skeleton technology, or a three-dimensional object constructed based on three-dimensional technology, and the virtual character takes on different appearances by being given different skins. In some implementations, the virtual character may also be implemented using a 2.5-dimensional or two-dimensional model, which is not limited by the embodiments of the present application.
There may be multiple virtual characters in the virtual scene; these are characters manipulated by players (i.e., objects controlled through input devices) or artificial intelligences trained to fight in the virtual environment. Optionally, a virtual character is a virtual figure/virtual object performing competitive activities in the virtual scene. Optionally, the number of virtual characters in a virtual-scene battle is preset, or is dynamically determined according to the number of terminal devices joining the battle, which is not limited in the embodiments of the present application. In one possible implementation, a user can control a virtual character to move in the virtual scene and can also control it to fight other virtual characters using virtual skills, virtual props, and the like provided by the application.
In an alternative embodiment, the terminal device may be a local terminal device. Taking a game as an example, the local terminal device stores the game program and presents the game screen. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in various ways; for example, the interface may be rendered and displayed on the display screen of the terminal device, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game screen, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
The application scenarios applicable to the present application are introduced below. The present application can be applied to the technical field of games, in which multiple players participating in a game join the same virtual game together.
Before entering the virtual game, a player may select different character attributes, e.g., identity attributes, for the virtual characters in the virtual game; by assigning different character attributes, different camps are determined, so that a player wins the match by performing the tasks assigned at different stages of the game. For example, multiple virtual characters having character attribute A "eliminate" the virtual character having character attribute B at some stage of the match to win. Here, when entering the virtual game, a character attribute may be randomly assigned to each participating virtual character.
The implementation environment provided in one embodiment of the present application may include: a first terminal device, a server, and a second terminal device. The first terminal device and the second terminal device each communicate with the server to realize data communication. In this embodiment, the first and second terminal devices each have installed an application program for executing the data processing method provided in the present application, and the server is the server side executing the data processing method. The first and second terminal devices can each communicate with the server through the application program.
Taking the first terminal device as an example, the first terminal device establishes communication with the server by running the application. In an alternative embodiment, the server creates the virtual game according to a game request from the application. The parameters of the virtual game may be determined according to the parameters in the received game request; for example, they may include the number of participants in the virtual game, the character level required to participate, and the like. When the first terminal device receives the server's response, it displays the virtual scene corresponding to the virtual game through its graphical user interface. The first terminal device is a device controlled by a first user; the virtual character displayed in its graphical user interface is the player character controlled by the first user, and the first user inputs operation instructions through the graphical user interface to control the virtual character to perform corresponding operations in the virtual scene.
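The match-creation handshake described above might be modeled as follows; the parameter names, the admission rule, and the returned state shape are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class GameRequest:
    player_count: int    # number of persons participating in the virtual game
    min_role_level: int  # role level required to participate

def create_virtual_game(request: GameRequest,
                        joining_levels: list[int]) -> dict:
    """Server side: build the match from the request parameters and admit
    only players whose role level meets the requested minimum."""
    admitted = [lv for lv in joining_levels if lv >= request.min_role_level]
    return {
        "capacity": request.player_count,
        "admitted": admitted[: request.player_count],  # cap at capacity
    }

game = create_virtual_game(GameRequest(player_count=2, min_role_level=10),
                           joining_levels=[12, 8, 15, 9])
```

The server's response to such a request is what triggers each terminal device to display the corresponding virtual scene.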
Taking the second terminal device as an example, the second terminal device establishes communication with the server by running the application program. In an alternative embodiment, the server establishes the virtual game according to a game request from the application program. The parameters of the virtual game may be determined according to the parameters carried in the received game request; for example, the parameters of the virtual game may include the number of participants in the virtual game, the level of the characters participating in the virtual game, and the like. When the second terminal device receives the response of the server, a virtual scene corresponding to the virtual game is displayed through a graphical user interface of the second terminal device. The second terminal device is a device controlled by a second user, the virtual character displayed in the graphical user interface of the second terminal device is a player character controlled by the second user, and the second user inputs operation instructions through the graphical user interface to control the virtual character to perform corresponding operations in the virtual scene.
The server performs data calculation according to the game data reported by the first terminal device and the second terminal device, and synchronizes the calculated game data to the first terminal device and the second terminal device, so that the first terminal device and the second terminal device control their graphical user interfaces to render the corresponding virtual scenes and/or virtual characters according to the synchronization data issued by the server.
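The report-compute-synchronize loop described above can be sketched as follows; the class and method names are illustrative assumptions rather than the server's actual API.

```python
class GameServer:
    """Minimal sketch of the loop above: terminals report game data,
    the server folds it into authoritative state, and the same computed
    state is synchronized back to every terminal for rendering."""

    def __init__(self):
        self.state = {}            # authoritative game state, keyed by terminal
        self.pending_reports = []  # game data reported since the last tick

    def report(self, terminal_id, data):
        """Receive game data reported by a terminal device."""
        self.pending_reports.append((terminal_id, data))

    def tick(self):
        """Fold every report into the authoritative state, then return
        the synchronization data issued to all terminals."""
        for terminal_id, data in self.pending_reports:
            self.state.setdefault(terminal_id, {}).update(data)
        self.pending_reports.clear()
        return dict(self.state)
```

A terminal that receives the returned state re-renders its virtual scene and/or virtual characters from it, so both terminals always draw from the same computed data.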
In this embodiment, the virtual character controlled by the first terminal device and the virtual character controlled by the second terminal device are virtual characters in the same virtual game. The virtual character controlled by the first terminal device and the virtual character controlled by the second terminal device may have the same character attribute or different character attributes; accordingly, the two virtual characters may belong to the same camp, or to different camps in a hostile relationship. A game AI character participating in the virtual game may belong to the camp of the virtual character controlled by the first terminal device, or may not belong to the camp of the virtual character controlled by the second terminal device, and may interact with the other virtual characters in the virtual game.
It should be noted that two or more virtual characters may be included in the virtual game, and different virtual characters may correspond to different terminal devices; that is, in one virtual game there may be two or more terminal devices that each perform transmission and synchronization of game data with the game server.
The data processing method provided by the embodiments of the present application can be applied to any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a multiplayer online battle arena (MOBA) game, a multiplayer gunfight survival game, a third-person combat game, a first-person combat game, and a strategy game.
The data processing method in one embodiment of the present application may be executed on a local terminal device or on a server. When the method runs on a server, it can be implemented and executed based on a cloud interaction system, where the cloud interaction system includes the server and a client device.
In an alternative embodiment, various cloud applications may run under the cloud interaction system, for example, a cloud game. Taking a cloud game as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program and the body that presents the game picture are separated: the storage and running of the data processing method are completed on a cloud game server, while the function of the client device is only to receive and send data and to present the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palmtop computer, while the cloud game server that performs the information processing is in the cloud. When playing the game, the player operates the client device to send operation instructions to the cloud game server; the cloud game server runs the game according to the operation instructions, encodes and compresses data such as game pictures, and returns the data to the client device through the network, and the client device finally decodes the data and outputs the game pictures.
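One cloud-game round trip described above (operation instruction in, encoded and compressed picture data over the network, decoded on the client) might be sketched as below; `zlib` merely stands in for a real video codec, and all names are illustrative assumptions.

```python
import json
import zlib

def cloud_game_round_trip(operation_instruction, run_game):
    """Sketch of one round trip in the cloud-game mode: the cloud server
    runs the game for the client's operation instruction, encodes and
    compresses the resulting frame data, and the client decodes it."""
    frame = run_game(operation_instruction)               # server runs the game
    payload = zlib.compress(json.dumps(frame).encode())   # encode + compress
    # ...payload travels over the network to the client device...
    return json.loads(zlib.decompress(payload).decode())  # client decodes
```

The point of the split is that `run_game` executes only in the cloud, while the client sees nothing but the decoded picture data.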
In an alternative embodiment, taking a game as an example, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, the interface may be rendered and displayed on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including the game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, an embodiment of the present application provides a data processing method in which a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
For the convenience of understanding the present application, the data processing method, apparatus, electronic device and storage medium provided in the embodiments of the present application are described in detail below.
Referring to fig. 1, which is a flowchart of a data processing method according to an exemplary embodiment of the present application, the method is generally applied to a game server, for example, the cloud game server described above, but the present application is not limited thereto.
As shown in fig. 1, the data processing method in the exemplary embodiment of the present application specifically includes:
step S101: the virtual object is presented in a graphical user interface.
In the embodiment of the application, the virtual scene of the game may be presented in the graphical user interface, and the virtual scene may be a two-dimensional virtual environment and/or a three-dimensional virtual environment, and accordingly, the virtual object displayed in the virtual scene may also be a two-dimensional model and/or a three-dimensional model.
In a preferred embodiment of the present application, the virtual scene presented in the graphical user interface may be steplessly zoomed, e.g. zoomed in or zoomed out.
In an embodiment of the present application, at least one virtual object may be included in the virtual scene. Following a zoom operation on the virtual scene, at least one virtual object can be displayed in the virtual scene shown in the graphical user interface. Under different map zoom scales, the virtual object may be displayed in the virtual scene in text form or picture form, or the internal details of the virtual object may be displayed.
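A minimal sketch of choosing how a virtual object is displayed under different map zoom scales follows; the threshold values and mode names are illustrative assumptions, not values from the application.

```python
def object_display_mode(zoom_scale):
    """Pick the display form of a virtual object for a given map zoom
    scale (thresholds are illustrative assumptions)."""
    if zoom_scale < 0.5:
        return "text"     # zoomed far out: name label only
    if zoom_scale < 1.5:
        return "picture"  # mid range: iconic picture form
    return "details"      # zoomed in: internal details of the object
```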
Here, the virtual object may refer to an area divided in advance in the virtual scene; illustratively, the virtual object is an area that can be contended for and occupied by virtual characters belonging to different camps. By way of example, virtual objects in a virtual scene may include, but are not limited to, virtual cities created in the virtual scene and/or virtual plots that divide the virtual scene.
For example, in a virtual game, a player may select a character attribute or be randomly assigned a character attribute, and then perform virtual tasks or battles in the virtual scene. Here, the virtual object may be a point capable of providing virtual resources, and behavior events can be performed on an occupied virtual object, e.g., building a city at the virtual object, developing virtual resources, recruiting troops, treasure hunting, or upgrading grain or troops, so as to enhance the virtual capacity (e.g., combat capacity) in the virtual scene and complete the virtual task corresponding to the possessed character attribute.
For example, land, mountains, rivers, wharves, passes, and the like may be distributed in the virtual scene. Camps may be formed based on the character attributes of the virtual characters operated by the players in the current virtual game, and the units operated by a player (e.g., armies, troops) may move in the virtual scene to occupy individual virtual objects (e.g., virtual plots) and obtain the corresponding virtual resources. After a virtual object is occupied, the owning party of the virtual object may perform various behavior events on the virtual object to improve the virtual capabilities of itself (or of the camp to which it belongs).
Step S102: in response to a mutual assistance request instruction initiated for a target behavior event of the game, entering a content creation page of the game.
Here, the initiated mutual assistance request instruction is a control instruction issued by the game player through the terminal device, and the mutual assistance request instruction is used to trigger or request an auxiliary interaction behavior for the target behavior event.
As an example, the above-described mutual-assistance request instruction may be generated based on at least one of: the operation of icons or buttons on the graphical user interface for triggering auxiliary interaction, the operation of shortcut keys on the terminal device and/or shortcut keys on an external input device connected to the terminal device. For example, in the case where the terminal device is a device having a touch screen, the above-described operation may be a touch operation performed on the touch screen for an icon or the like.
In this embodiment of the present application, the mutual assistance request instruction carries a behavior identifier, where the behavior identifier is used to indicate the target behavior event executed for a virtual object; that is, the mutual assistance request instruction indicates for which behavior event, executed on which virtual object, the auxiliary interaction behavior is to be triggered. The auxiliary interaction behavior may include behavior that brings a positive impact on reaching the behavior completion condition of the target behavior event, which may include, for example, improving the execution efficiency of completing the target behavior event on the virtual object.
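A mutual assistance request instruction carrying such a behavior identifier could be modeled as follows; the field names and the identifier format are illustrative assumptions rather than the claimed data layout.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MutualAssistRequest:
    """Sketch of the mutual assistance request instruction: the behavior
    identifier names both the virtual object and the behavior event for
    which the auxiliary interaction behavior is requested."""
    player_id: str
    object_id: str   # which virtual object the event is executed on
    event_type: str  # e.g. "train_troops", "construct_building"

    @property
    def behavior_id(self) -> str:
        # The identifier indicates: this event, on this object.
        return f"{self.object_id}:{self.event_type}"
```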
Here, the above-mentioned virtual object may refer to a virtual object occupied, in the virtual scene, by the camp (or family) to which the user of the terminal device belongs. Exemplary behavior events that can be performed on an occupied virtual object may include, but are not limited to, at least one of the following: constructing a building, building a fortified pass, setting up an iron wall, deploying a strategy, training troops, developing resources, and transferring resources. The target behavior event is a behavior event currently performed on the virtual object.
Here, constructing a building may refer to being able to build various types of buildings on the land of the virtual object, which is not limited by the present application. In addition, a fortified pass may be built on the occupied virtual object; the fortified pass may include, but is not limited to, multiple rooms, hallways, stairs, doors, or other structures, troops may be mobilized to the fortified pass, and the pass may also serve front-line combat troops. Setting up an iron wall adds a defensive attribute to the virtual object so as to increase its defensive capability: when the virtual object is attacked by an enemy, the attack can be effectively resisted, protecting the party's own buildings, passes, and troops, so as to reduce the damage caused by ordinary attacks and strategy attacks in battle.
The above-described strategy may include a strategy effect that, in battle, influences the affected party so as to reduce the damage to the virtual object from strategy attacks. In the virtual game, a player may own at least one army; an army has a troop strength, the troop strength of an army can be increased by recruiting soldiers, and the combat capability of the troops can be improved by training them, thereby increasing the hit points (HP) and/or attack attributes of the army.
A variety of virtual resources may exist at a virtual object, and these virtual resources may be obtained by performing a resource development behavior on the virtual object. In addition, transferring resources refers to transferring virtual resources developed at the virtual object to virtual characters in the same camp or in different camps.
It should be understood that each of the above-listed behavior events that can be executed for the virtual object is only an example, and the present application is not limited thereto, and other behaviors than the above-listed behavior events may be executed for the virtual object.
In the embodiment of the application, completing a behavior event for a virtual object consumes virtual resources and time in the game; for example, each behavior event may have a corresponding behavior time, where the behavior time indicates the duration required to complete the corresponding behavior event on the virtual object. For example, the auxiliary interaction behavior may be used to shorten the behavior time for completing the corresponding behavior event on the virtual object, so as to improve the execution efficiency of completing the behavior event. In addition, the auxiliary interaction behavior may also be used to provide the user of the terminal device and/or the camp to which the user belongs with more of the virtual resources required to complete the behavior event, which is not limited in this application.
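Shortening the behavior time through auxiliary interaction behaviors can be sketched as below, assuming a fixed reduction per completed assist; the specific numbers and the linear policy are illustrative assumptions.

```python
def remaining_behavior_time(base_seconds, assist_count,
                            reduction_per_assist=60, floor=0):
    """Each completed auxiliary interaction shortens the behavior time
    of the target behavior event; the per-assist reduction and the
    lower bound are illustrative assumptions."""
    return max(floor, base_seconds - assist_count * reduction_per_assist)
```

For example, a 600-second training event with three assists would finish in 420 seconds under these assumed parameters, and the remaining time never drops below the floor.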
Several specific ways of generating the mutual assistance request instruction are described below in connection with figs. 2-4.
In the first case, the mutual aid request instruction is generated based on the total portal interface under the virtual combat platform.
Here, the general entry interface may be entered by operating a trigger control displayed on the graphical user interface, and the trigger control may be displayed resident on the graphical user interface. For example, when the combat scene corresponding to the virtual combat platform is presented on the graphical user interface, the trigger control may be displayed on the graphical user interface; illustratively, the display of the trigger control may be synchronized with the display of the combat scene corresponding to the virtual combat platform, e.g., the trigger control is displayed while the combat scene is displayed, and the trigger control is dismissed when the combat scene is switched away. Alternatively, entry into the total entry interface may be triggered by a shortcut key, a gesture operation, and/or a voice operation, which is not limited in this application.
Fig. 2 shows a flowchart of the steps provided by an exemplary embodiment of the present application to generate a mutual aid request instruction.
Referring to fig. 2, in step S201, a target behavior event is determined in response to a second interactive operation performed on a general portal interface under a virtual combat platform, and a second operation page of the target behavior event is entered.
Here, the game server can provide different kinds of service platforms for users to meet different demands of the users. By way of example, the provided service platform may include, but is not limited to: virtual fight platform, content release platform, resource exchange platform, task platform.
For example, under the virtual combat platform, a player can carry out virtual battles with other players in the virtual scene of the game, i.e., display the virtual scene, contend for virtual resources, and perform combat interactions. The content publishing platform may refer to a community (or a forum) in which various game guides, lineup recommendations, novice tutorials, game information, requests for game help, and the like can be published.
The resource exchange platform may refer to a store, which includes, but is not limited to, at least one of the following: using virtual resources accumulated on the terminal device, or directly paying a certain amount of real money, to achieve an equivalent exchange, so as to obtain a virtual character (e.g., a hero) in the game, or to obtain, upgrade, or change the appearance (e.g., the skin) of the virtual character, a virtual weapon, or a virtual accessory, or to obtain qualification to participate in an event.
The task platform may issue game tasks periodically or from time to time; completing game tasks is one of the main ways to obtain game rewards.
In an embodiment of the present application, the total portal interface includes a plurality of behavior type controls, each behavior type control is used for characterizing a behavior event executed for the virtual object, and the second interaction operation includes a selection operation for a behavior type control used for characterizing the target behavior event in the plurality of behavior type controls. That is, the above-mentioned total entry interface is an interface for initiating the auxiliary interaction behavior, and the behavior event corresponding to the auxiliary interaction behavior is selected under the total entry interface.
Fig. 3 shows a schematic diagram of a general portal interface under a virtual combat platform provided in an exemplary embodiment of the present application.
As shown in FIG. 3, the total entry interface 60 may be entered under a virtual combat platform, with a plurality of behavior type controls, shown as behavior type controls 111-999, displayed on the total entry interface 60.
Here, the above-described second interactive operation may refer to an operation for selecting a behavior type control, and may include, but is not limited to, at least one of the following: controlling a cursor or a hover control point to move to the position of the behavior type control, pressing a preset physical key on the terminal device, pressing a physical key on an external device of the terminal device, and a touch operation (such as a single-click, double-click, long-press, or slide operation) on the behavior type control.
In this example, the second operation page of the target behavior event may be entered directly after the behavior type control for characterizing the target behavior event is selected through the second interactive operation, or may be entered through a selection operation of "ok" on the total portal interface 60 after a certain behavior type control is selected.
Returning to fig. 2, in step S202, in response to the third interactive operation performed on the second operation page, an execution object corresponding to the target behavior event is determined from the virtual scene of the game, so as to generate a mutual assistance request instruction.
In this embodiment of the present application, the second operation page is used to provide at least one object in the virtual scene on which the target behavior event can be executed; that is, each provided object is a virtual object occupied by the user of the terminal device and/or by the camp to which the user belongs, and supports the target behavior event.
Illustratively, the third interaction includes a selection operation for a target object of the at least one object. The third interaction may include, but is not limited to, at least one of: the method comprises the steps of controlling a cursor or a suspension control point to move to a position where a target object is located, pressing a preset physical key on terminal equipment, pressing a physical key on external equipment of the terminal equipment, and touching the target object.
In a second case, a mutual aid request instruction is generated based on a first operation page under the virtual fight platform.
For example, the mutual aid request instruction may be initiated according to a first interactive operation performed on a first operation page under the virtual fight platform.
For example, the first operation page may be triggered to be displayed on the graphical user interface in response to a selected operation on the virtual object. Here, the first operation page is a page for executing a target behavior event with respect to the virtual object, and illustratively, the first operation page may include, but is not limited to: a levying interface, a building interface and an upgrade facility interface.
An auxiliary control may be displayed on the first operation page, where the auxiliary control is the control used to initiate the auxiliary interaction behavior for the target behavior event. In this case, the first interactive operation may include a selection operation on the auxiliary control; that is, the auxiliary interaction behavior for the target behavior event is initiated by operating the auxiliary control displayed on the first operation page.
Fig. 4 shows a schematic diagram of a first operation page under the virtual fight platform according to an exemplary embodiment of the present application.
In this example, as shown in fig. 4, the terminal device 11 provides the graphical user interface 22, and a scene picture corresponding to the virtual scene observed from a third-person viewing angle is displayed in the graphical user interface 22; the virtual scene content displayed in the graphical user interface 22 can be changed in response to a drag operation on the virtual scene or a zoom instruction for the virtual scene.
In a strategy game, multiple characters may participate in the virtual game independently, or may be grouped to form a camp (which may also be referred to as a family or an organization). Each character in a camp may use virtual resources in the virtual scene to attack a virtual object, or may control virtual characters in the virtual scene to attack a virtual object; after the virtual object is occupied by a virtual character of a camp, each virtual character in that camp may perform various behavior events on the occupied virtual object.
In the example shown in fig. 4, a plurality of virtual objects (a plurality of plots as shown in the figure) are included in the virtual scene, wherein the virtual objects filled with the pattern D1 and the virtual objects filled with the pattern D2 respectively belong to different camps.
As shown in fig. 4, a first operation page 33 is provided on the graphical user interface 22 in response to a trigger operation on the virtual object DK. Preferably, the first operation page 33 may be displayed at the uppermost layer of the graphical user interface 22; in addition, the transparency of the first operation page 33 may be adjusted so that the player can observe other content, on the graphical user interface 22 or in the virtual scene, that at least partially overlaps the display area of the first operation page 33. For example, the first operation page 33 may be displayed at the peripheral side of the virtual object DK without overlapping the display area of the virtual object DK.
In a preferred embodiment of the present application, at least one preset control may also be provided at the virtual object in response to a triggering operation for the virtual object. Here, each preset control is used to perform a corresponding preset function for the virtual object.
As shown in fig. 4, preset controls KJ may be displayed around the virtual object DK, and exemplary preset controls may include, but are not limited to, at least one of the following: a positioning control, a viewing control, a sharing control, and an auxiliary control. Here, the positioning control is used to set a positioning mark for the virtual object; the viewing control is used to view the battle report of the virtual object, where the battle report may include the beneficial results brought by the various behavior events the owning party performed on the virtual object, as well as the negative results brought by enemy attacks on the virtual object. The sharing control is used to share information about the virtual object (e.g., its coordinate location) with other players participating in the virtual game or registered in the game.
The above examples are embodiments in which the virtual scene is displayed in a landscape display mode; the present application is not limited thereto, and also supports providing the first operation page at the virtual object, or triggering the auxiliary interaction behavior, when the virtual scene is displayed in a portrait display mode.
In addition to the above manner, the first operation page may be presented by way of a presentation window.
For example, a presentation window is provided on the graphical user interface in response to a selected operation on the virtual object, and controls associated with the virtual object are displayed in the presentation window. Illustratively, the auxiliary control described above to initiate the auxiliary interaction behavior for the target behavior event is displayed in a presentation window.
Preferably, the presentation window may be displayed floating at the uppermost layer of the graphical user interface. For example, the display position of the presentation window on the graphical user interface may also be changed in response to a drag operation on the presentation window.
In a preferred embodiment of the present application, the platform for editing the published content and the platform for initiating the mutual assistance request instruction are different platforms in the game, so as to enhance the interactivity between users on different platforms and promote the spread of the shared content. In addition, for the case where the content publishing platform is a community, accessing the auxiliary interaction for the behavior event through the community promotes the social interaction of the game.
The content creation page is a page for editing published content under the content publishing platform. Preferably, the published content at least includes automatically generated content and/or custom content; as an example, the automatically generated content is a mutual link for assisting the target behavior event, and the custom content is information input by the user on the content creation page, such as a publication title and publication content.
Illustratively, the above-described mutual link may be a page link to an interaction page for assisting the target behavior event. Alternatively, in this embodiment, the interaction page may include, but is not limited to, an H5 (HTML5, the 5th generation of the HyperText Markup Language) interaction page. For example, the auxiliary interaction for the target behavior event is presented through an H5 interaction page.
In a preferred embodiment, users of the game may share auxiliary interactions, initiated for target behavior events performed on virtual objects under the virtual combat platform, to the content publishing platform, and the published content (e.g., community posts) under the content publishing platform may in turn be shared to other channels, such as external social applications, so that non-game users can also view and interact with the community posts as guests.
In a preferred embodiment of the present application, after the mutual assistance request instruction is received, H5 page structure data may be formed based on the information carried by the mutual assistance request instruction, where the H5 page structure data may include, but is not limited to: H5 page component information, H5 page layout information, and H5 page style rules; an H5 interaction page is then formed based on the H5 page structure data.
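Forming H5 page structure data from a mutual assistance request and rendering an interaction page from it might be sketched as follows; the structure keys, the markup, and the request fields are illustrative assumptions, not the application's actual data format.

```python
def build_h5_interaction_page(request):
    """Form illustrative H5 page structure data (component information,
    layout information, style rules) from a mutual assistance request,
    then render it into a minimal HTML document."""
    structure = {
        "components": [                         # H5 page component information
            {"type": "text", "value": f"Help with {request['event_type']}"},
            {"type": "button", "value": "Assist"},
        ],
        "layout": "vertical",                   # H5 page layout information
        "style": {"button": "primary"},         # H5 page style rules
    }
    body = "\n".join(
        f'<div class="{c["type"]}">{c["value"]}</div>'
        for c in structure["components"]
    )
    return (f'<html><body data-layout="{structure["layout"]}">\n'
            f'{body}\n</body></html>')
```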
Fig. 5 shows a flowchart of the steps for editing published content provided by an exemplary embodiment of the present application.
Referring to fig. 5, in step S301, first description information input by the user on the content creation page is received.
Here, the user may refer to a user who logs in on the terminal device using an account; the user may input a publication title on the content creation page and may edit the publication content. By way of example, the input description information may include text, emoticons, pictures, and the like.
In step S302, the first description information and preset second description information describing the target behavior event are both displayed as published content in the content creation page, that is, as the content publication information on the content creation page.
Here, on the content creation page, besides the H5 link entry of the auxiliary interaction behavior for the target behavior event, a custom description, emoticons, videos, and the like can be attached, making the sharing channels more diversified and the shared content richer. The content publication information presented on the content creation page may also be only one of the above two types of information (custom content and automatically generated content), which is not limited in this application.
Fig. 6 shows a schematic diagram of a content creation page under the content distribution platform provided in an exemplary embodiment of the present application.
The content creation page shown in fig. 6 includes an operation bar 10, a title bar 20, an editing area 30, an additional information bar 40, and a toolbar 50. Illustratively, the operation bar 10 includes a return control "<"; a click operation on the return control returns to the previous operation, for example, cancels the display of the content creation page and returns to the page of the previous step. The operation bar 10 further includes a "publish" button; through a click operation on the "publish" button, the publication title and publication content can be disclosed on the content publishing platform for other users to view.
The title of the published content is displayed in the title bar 20. When the content creation page is entered, a title can be automatically generated based on the information carried in the mutual assistance request instruction; information input by the user in the title bar 20 (such as XXXX) can also be received as the title, and editing operations by the user on the automatically generated title can be received. The editing area 30 is used to display the publication content.
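Automatically generating a title from the information carried in the mutual assistance request instruction, while still accepting a user-typed title, could look as follows; the wording of the generated title and the request fields are illustrative assumptions.

```python
def default_release_title(request, user_title=None):
    """Auto-generate the publication title from the mutual assistance
    request information, unless the user typed a title in the title bar.
    The generated wording is an illustrative assumption."""
    if user_title:
        return user_title
    return f"Assist my {request['event_type']} at {request['object_id']}!"
```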
In the example of fig. 6, the automatically generated content is a mutual link, which may include an image, text, and a control. The image may be an image representing the game, an image representing the current invitation, or an image representing the avatar of the user of the terminal device; the text may be description information of the auxiliary interaction behavior for the target behavior event (such as mmmmmmmmmm in the figure); and the control "B" is the control used to trigger the auxiliary interaction behavior for the target behavior event: by clicking the control "B", auxiliary interaction can be successfully provided for the target behavior event.
The custom content in the editing area 30 may include supplemental text information input by the user for the auxiliary interaction behavior, such as "YYYYYYYYY!!!", and may also include emoticon pictures.
The additional information bar 40 is used to provide auxiliary functions for the publication content, such as adding a real-time location when the publication content is disclosed. The toolbar 50 includes at least one control, for example: a location control A1, for adding a location to the additional information bar 40; an emoticon input control A2, through which an emoticon picture can be selected and inserted into the custom publication content; and an insert control A3, through which a picture or video can be inserted into the publication content.
In the case where the content publishing platform is a community, a community post can serve as the carrier for initiating auxiliary interaction; in addition to displaying the interaction page, community functions such as retaining related comments and likes can also be provided.
Returning to fig. 1, step S103: and responding to the content issuing instruction, and providing an event auxiliary label through a preset issuing channel.
Here, the event auxiliary tag is a tag for assisting in completing the target behavior event, and includes the content publication information determined on the content creation page and a mutual assistance link generated based on the target behavior event.
Here, the content publishing instruction is a control instruction for publicly publishing content on the content publishing platform; taking the example shown in fig. 6 above, it is generated by a selection operation on the "publish" button of the content creation page.
In the embodiment of the application, the event auxiliary label can be provided through various channels, and further auxiliary interaction of the target behavior event is completed through operation of the provided event auxiliary label.
In the first embodiment, the preset distribution channel includes a content distribution platform.
At this time, in response to the content distribution instruction, the content distribution platform presents the distribution content, that is, the event auxiliary label for assisting the target behavior event under the virtual fight platform is distributed through the content distribution platform.
Here, the release content released through the content release platform may include mutual links, expressions, pictures, texts, etc. which are custom edited, that is, all the content on the content creation page.
In a second embodiment, the preset distribution channel comprises a communication channel within the virtual combat platform.
At this time, in response to the content publishing instruction, a target communication channel to which the user belongs in the virtual fight platform is determined, and the event auxiliary tag is presented to each member in the target communication channel in the form of a channel message. In this case, the content published within the communication channel is preset second description information for describing the target behavior event, for example, the mutual assistance link generated based on the target behavior event.
Illustratively, the target communication channel may include, but is not limited to, any of the following: a family channel, a group channel, a world channel, and a designated contact channel. Each member of the family channel is a member of the family to which the user of the terminal device belongs, and the channel can be created automatically when the family is formed; the group channel can be a group chat organization that the user joins voluntarily; the world channel refers to a channel that can publish information to all users in the game; and, since users can have corresponding contact lists in the game, the designated contact channel can refer to a channel for communicating with any contact in the contact list.
Fig. 7 is a schematic diagram of communication channels in a virtual combat platform according to an exemplary embodiment of the present application.
As shown in fig. 7, by providing the graphical user interface 22 through the terminal device 11, a channel chat interface may be triggered and displayed under the virtual combat platform, where the channel chat interface may be exemplarily displayed at an upper layer of the virtual scene, and the channel chat interface includes a chat content field 44 and a selection field 55, where the chat content field 44 is used to display chat records of each member in the channel, and the selection field 55 displays a plurality of channels for the user to select.
In this example, taking the target communication channel as a family channel as an example, the H5 interaction page 66 is published in the family channel in the form of channel information (e.g., in the form of a channel link), so that each member in the family channel can perform auxiliary interactions for the target behavior event through the H5 interaction page.
Preferably, the interaction page may be automatically published under both publishing channels at the same time: for example, in response to a content publishing instruction, the content publishing platform to which the content creation page belongs and the target communication channel in the game may publish the event auxiliary tag simultaneously; alternatively, the event auxiliary tag may be published under a particular publishing channel according to a setting. In this way, a help-seeking post published in the community can be shared within the community, shared to other external social platforms to seek assistance, and synchronously shared as an H5 link to the game chat channel in the game.
Here, the completion degree of the target behavior event is updated based on first data obtained by the virtual object executing the target behavior event. Illustratively, the first data is related to the execution of the target behavior event and may change as the game progresses (e.g., as the virtual task corresponding to the target behavior event is carried out) or as game time elapses; for example, as game time increases, the completion degree of the target behavior event changes, e.g., approaches completion.
In a preferred exemplary embodiment, the first data may be determined based on the following manner: determining first increment data or first decrement data of a unit time based on the attribute of the virtual object; the first data is determined based on the executed time of the target behavior event and the first increment data or the first decrement data of the unit time.
By way of example, the attributes of the virtual object may include, but are not limited to, at least one of: a building level, a virtual skill possessed by the virtual character, and virtual equipment possessed by the virtual character. The building may be the object of the behavior event; for example, when the behavior event is upgrading a building, the building level refers to the level of the building to be upgraded. The virtual character may refer to an object that can be manipulated to execute the behavior event with respect to the virtual object; correspondingly, the possessed virtual skills may include skills that can affect the completion degree of the behavior event, and the possessed virtual equipment may include equipment that can affect the completion degree of the behavior event.
Here, the amount of change of the target behavior event per unit time, which is the variable that advances the completion degree of the target behavior event, may be determined based on the attributes of the virtual object, and may include first increment data or first decrement data. For example, in a game, the first increment data may be embodied as an increase in the number of recruited soldiers, and the first decrement data may be embodied as a decrease in the construction time of a building, or a decrease in the time to march to other plots.
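As an illustrative sketch only (not part of the claimed method), the per-unit-time change amount and the resulting first data could be computed as follows; the function names, the bonus formula, and all numeric factors are assumptions introduced for illustration:

```python
def unit_delta(building_level: int, skill_bonus: float = 0.0,
               equipment_bonus: float = 0.0) -> float:
    # First increment data per unit time, derived from the virtual
    # object's attributes (building level, skills, equipment).
    base = 1.0 + 0.5 * building_level   # higher level progresses faster (assumed factor)
    return base * (1.0 + skill_bonus + equipment_bonus)

def first_data(executed_seconds: float, delta_per_second: float) -> float:
    # First data = executed time of the target behavior event
    # multiplied by the change amount per unit time.
    return executed_seconds * delta_per_second

# e.g. a level-2 building with no bonuses running for 120 seconds
progress = first_data(120.0, unit_delta(building_level=2))
```

The two-step structure mirrors the text: attributes determine the per-unit-time change, and elapsed execution time scales it into the first data.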
In an exemplary embodiment of the present application, when the completion degree of the target behavior event is updated, second data may be superimposed on top of the update based on the first data.
In one case, the second data may be generated based on a direct trigger to the event assisted tag.
For example, in response to an auxiliary interaction of the event auxiliary tag through a preset distribution channel, second data is determined, and the completion of the target behavioral event is updated based on the first data and the second data.
Here, the auxiliary interaction operation may be generated through access to the event auxiliary tag provided under the preset publishing channel. By way of example, the auxiliary interaction operation may include a click operation on the mutual assistance link within a communication channel in the game client, a click operation on the mutual assistance link published within the community, or a click operation on the mutual assistance link published in an external social application; each of these generates an auxiliary interaction instruction indicating that the auxiliary interaction on the target behavior event is completed.
Alternatively, an interface may be triggered to display by an event assisted tab and the second data generated based on an operation on the interface.
For example, in response to a triggering operation of an event auxiliary tag through a preset distribution channel, entering an auxiliary interaction page of a target behavior event, in response to an auxiliary interaction operation performed in the auxiliary interaction page, determining second data, and updating the completion degree of the target behavior event based on the first data and the second data.
Here, the triggering operation is an operation for triggering the display of the auxiliary interaction page; for example, the triggering operation may be a selection operation on the event auxiliary tag. The auxiliary interaction page includes a page for triggering auxiliary interaction with respect to the target behavior event, and in this case the auxiliary interaction operation may include an operation performed on the auxiliary interaction page to complete the auxiliary interaction. For example, a "mutual assistance" control may be displayed on the auxiliary interaction page, and the auxiliary interaction may be completed by a selection operation on the "mutual assistance" control.
Taking the auxiliary interaction page as an H5 page as an example, the H5 page may be triggered to be displayed by a selection operation of an event auxiliary label, and the auxiliary interaction may be completed by a selection operation of an auxiliary interaction control ("mutual assistance" control) in the H5 page.
It should be understood that the second data may be generated in either one of the two cases, or both cases may be combined: for example, the triggering operation and the auxiliary interaction operation on the event auxiliary tag are different operations, so that different operations performed on the event auxiliary tag can trigger different subsequent processing procedures, which is not limited in this application.
Here, the second data may be determined by: and determining second data based on the operation times of the auxiliary interactive operation and the second increment data or the second decrement data, wherein the operation times have a preset upper limit value.
The number of operations of the auxiliary interaction operation corresponds to a variable triggering an amount of change, which is the variable that advances the completion degree of the target behavior event; the amount of change may include, for example, second increment data or second decrement data.
For example, if multiple auxiliary interactions are performed for the event auxiliary tag, each auxiliary interaction corresponds to a second data, and the completion of the target behavior event is updated.
Alternatively, the preset upper limit value may be set as needed; preferably, the preset upper limit value may be 1. In this case, if multiple auxiliary interaction operations are performed by the same user, only one piece of second data is determined, and the completion degree of the target behavior event is updated once based on that second data. The second data may correspond to the first of the multiple auxiliary interaction operations, or to any one of them, which is not limited in this application. If the auxiliary interaction operations are performed by different users, each auxiliary interaction operation can trigger an update of the completion degree of the target behavior event.
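A minimal sketch of this capping logic, assuming a per-user upper limit of 1 and an illustrative per-interaction increment; the class name, constants, and helper names are assumptions, not part of the claimed method:

```python
from collections import defaultdict

SECOND_DELTA = 30.0   # assumed second increment data per counted interaction
PER_USER_CAP = 1      # preset upper limit value on operation times

class CompletionTracker:
    def __init__(self):
        self._counts = defaultdict(int)  # auxiliary interactions per user
        self.second_data = 0.0

    def on_auxiliary_interaction(self, user_id: str) -> bool:
        # Count an auxiliary interaction; repeat interactions by the same
        # user beyond the cap do not update the completion degree.
        if self._counts[user_id] >= PER_USER_CAP:
            return False
        self._counts[user_id] += 1
        self.second_data += SECOND_DELTA
        return True

    def completion(self, first_data: float) -> float:
        # Completion degree is updated based on the first and second data.
        return first_data + self.second_data
```

With a cap of 1, a second click by the same user is ignored, while clicks from different users each contribute once, matching the two cases described above.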
The second data is related to the auxiliary interaction performed on the event auxiliary tag. By way of example, taking the target behavior event as upgrading a building, updating the target behavior event based on the second data may refer to a decrease in the building upgrade time; taking the target behavior event as recruiting soldiers, updating the target behavior event based on the second data may refer to an increase in the number of soldiers.
In a preferred embodiment of the present application, the interactive page of the auxiliary target behavior event may also be subjected to multi-channel secondary publishing, which is specifically shown in the flow chart of fig. 8.
FIG. 8 is a flowchart illustrating steps provided by exemplary embodiments of the present application for multi-channel secondary publication of targeted behavioral events.
Referring to fig. 8, in response to the auxiliary interactive operation, an interactive detail interface for a target behavior event is entered in step S401.
In a preferred embodiment, in response to the auxiliary interaction operation, an auxiliary interaction record for the target behavior event may be formed and saved. By way of example, an auxiliary interaction record may include a channel identifier, a user identifier, an application identifier, an interaction identifier, and auxiliary interaction description information. The channel identifier indicates the publishing channel on which the auxiliary interaction operation was performed; the user identifier indicates the user who performed the auxiliary interaction operation, such as the user's identity under that publishing channel (e.g., a user ID or user nickname); the application identifier indicates the application to which the publishing channel providing the event auxiliary tag belongs; the interaction identifier indicates whether the auxiliary interaction for the target behavior event succeeded; and the auxiliary interaction description information represents the feedback of the auxiliary interaction operation on the completion degree of the target behavior event, for example, the influence value of the second data on the completion degree of the target behavior event.
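The record just described could be sketched as a plain data structure; the field names below merely mirror the prose and are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class AuxiliaryInteractionRecord:
    channel_id: str        # publishing channel where the operation occurred
    user_id: str           # user ID / nickname under that channel
    app_id: str            # application the publishing channel belongs to
    interaction_ok: bool   # whether the auxiliary interaction succeeded
    description: str       # feedback on completion-degree impact, e.g. "D1D1D1"

# hypothetical record for a community-post click
record = AuxiliaryInteractionRecord("community", "CACA",
                                    "content_platform", True, "D1D1D1")
```

Storing such records server-side, as the next paragraph suggests, lets any client render the interaction detail interface from one shared source.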
Here, the interaction detail interface includes a page for presenting the mutual assistance details for the target behavior event under each publishing channel. Optionally, the auxiliary interaction records generated under different publishing channels may each be stored on the game server, so that when the interaction detail interface is entered, its display content is presented based on the content stored on the game server.
In this case, the auxiliary interaction information for the target behavior event can be synchronously updated in three-terminal data of the virtual fight platform, the external social application and the content distribution platform, so that the auxiliary interaction details can be checked by the user under the virtual fight platform, the user of the external social application and the user under the content distribution platform.
In step S402, in response to a first determination operation performed on the interactive detail interface, the event assist label is forwarded under the target distribution channel.
In this embodiment of the present application, at least one channel control is displayed in the interactive detail interface, where each channel control corresponds to a distribution channel, and the first determining operation includes a selection operation for a target channel control in the at least one channel control, where the target channel control is a control for indicating the target distribution channel.
By way of example, the target distribution channel may be any of the communication channels within the virtual fight platform, external social applications, content distribution platforms described above.
For example, in the case where the preset distribution channel is a content distribution platform, the target distribution channel includes a communication channel in the virtual fight platform, the content distribution platform, and an external communication application.
For example, taking the target publishing channel as the content publishing platform, all of the published content may be disclosed again under the content publishing platform, for example by forwarding the post within the community. Taking the target publishing channel as a communication channel in the virtual fight platform, the interaction page can be published again in the communication channel in the form of a channel post. Taking the target publishing channel as an external communication application, the interaction page can be published again in the form of a sub-link in the chat page of the external communication application.
In a preferred embodiment, for the case that the preset distribution channel is the content distribution platform, a control may be displayed at a position where the content is published under the content distribution platform, and the published content is directly sent to the chat page of the external communication application through a selection operation of the control, for example, the community post is shared in the chat page of the external communication application.
In the case where the preset publishing channel is a communication channel in the virtual fight platform, the target publishing channel includes the communication channel in the virtual fight platform and an external communication application.
For example, taking the target distribution channel as an example of a communication channel in the virtual fight platform, the interactive page may be reissued in the communication channel in the form of a channel link, and taking the target distribution channel as an example of an external communication application, the interactive page (or all of the distributed contents) may be reissued in the form of a sub link.
In an alternative embodiment of the present application, the response result for the auxiliary interaction instruction may be determined based on the game state in which the target behavior event is currently located.
Fig. 9 shows a flowchart of steps for presenting an interaction page provided by an exemplary embodiment of the present application.
Referring to fig. 9, in step S501, a game state of a target behavior event is determined in response to an auxiliary interactive operation.
As described above, the target behavior event has a corresponding behavior time. At the moment the auxiliary interaction instruction is received, it is determined whether the behavior time corresponding to the target behavior event has ended; if not, the game state of the target behavior event is determined to be in the execution phase, and if so, it is determined to be in the completion phase.
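This two-branch check can be sketched in a few lines; the function name, the timestamp representation, and the state strings are assumptions for illustration:

```python
import time
from typing import Optional

def game_state(behavior_end_ts: float, now: Optional[float] = None) -> str:
    # "execution" if the behavior time has not yet ended at the moment
    # the auxiliary interaction instruction is received, else "completion".
    now = time.time() if now is None else now
    return "execution" if now < behavior_end_ts else "completion"
```

The returned state then selects between the interaction detail interface (execution phase) and the interaction settlement interface (completion phase) described in steps S502 and S503.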
In step S502, in response to the target behavior event being in the execution phase, an interactive detail interface for the target behavior event is entered.
Illustratively, the interactive detail interface displays auxiliary interactive process descriptions of the game users and/or users of external communication applications aiming at the target behavior events. For example, the auxiliary interaction procedure description may include at least one of the auxiliary interaction records described above.
FIG. 10 illustrates a schematic diagram of an interactive details interface for a target behavioral event provided by an exemplary embodiment of the present application.
As shown in fig. 10, in the interaction detail interface 70, header information "ccccc" is displayed at the header position 71, which may indicate the target behavior event, the invitation game in which the user is currently participating, or the game application to which the target behavior event belongs. A plurality of auxiliary interaction records are displayed in the detail presentation area 72; for example, the plurality of auxiliary interaction records may be ordered by contribution value, where the contribution value may be determined according to the time at which the auxiliary interaction operation was performed for the target behavior event, with earlier operations receiving higher contribution values. Illustratively, the contribution value is used to characterize the degree of influence on improving the execution efficiency of the target behavior event for the virtual object.
In this example, each auxiliary interaction record may include a user identifier, such as the user nicknames "CACA", "NANA", "HAHA", and "ya" shown in the figure, and may also include auxiliary interaction description information, such as "D1D1D1", "D2D2D2", "D3D3D3", and "D4D4D4" shown in the figure. Preferably, for users from different applications or under different channels, the user identifiers of users from external social applications outside the game can be displayed differently, so as to distinguish different applications or different assistance channels. It should be understood that the assistance path may also be characterized by displaying a channel identifier or an application identifier in the auxiliary interaction record, which is not limited in this application.
Referring to FIG. 10, the behavior times for the target behavior event are displayed at display position 73, e.g., 23:23:23. Control Z1 and control Z2 are also displayed on the interactive detail interface 70, with different controls corresponding to different distribution channels.
In this example, the records are ranked by contribution value, and each auxiliary interaction record includes the user nickname under the corresponding publishing channel along with the contribution value; for example, the contribution value is the amount by which the time to reach the behavior completion condition is shortened. In particular, for users of non-game clients, the application identifier of the external social application can be displayed in the corresponding auxiliary interaction record, and the channel identifier can also be displayed in the auxiliary interaction record, so as to distinguish the auxiliary interaction conditions of different assistance paths in the game.
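The contribution-value ordering described above can be sketched as a simple sort; the record fields and the tie-breaking rule (earlier assistance first) are illustrative assumptions:

```python
def rank_records(records: list[dict]) -> list[dict]:
    # Sort descending by contribution value; ties are broken by the time
    # the auxiliary interaction was performed (earlier first).
    return sorted(records, key=lambda r: (-r["contribution"], r["timestamp"]))

# hypothetical records: CACA helped earlier and contributed more
ranked = rank_records([
    {"user": "NANA", "contribution": 5, "timestamp": 20},
    {"user": "CACA", "contribution": 8, "timestamp": 10},
])
```

The detail presentation area 72 would then render the sorted list top to bottom.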
Returning to fig. 9, in step S503, in response to the target behavior event being in the completion phase, an interactive settlement interface for the target behavior event is entered.
Here, at the moment the auxiliary interaction instruction is received, the behavior time corresponding to the target behavior event has ended, and the interaction settlement interface is therefore entered.
And displaying auxiliary interaction results of the game users and/or the users of the external communication application aiming at the target behavior event on the interaction settlement interface.
Referring to the example shown in fig. 10, a corresponding plurality of auxiliary interaction records are also displayed in the interaction settlement interface, which differs from the interaction details interface in that no controls for channel selection are provided on the interaction settlement interface.
In the related art, players on non-game clients cannot interact with players in game clients. Through the data processing and publishing of the game client in this application, game content can be shared outside the game through the content publishing platform, so that players on non-game clients can interact with the game content. In the case of an invitation game, members participating in the invitation game can obtain assistance from players not participating in the game, or from users of other applications, by sharing a help-seeking request according to the event conditions.
By this method, a focus of topic propagation is formed, the participation of players outside the event can be improved, and by effectively integrating community gameplay, social interaction is made stronger.
Based on the same application conception, the embodiment of the present application further provides a data processing device corresponding to the method provided by the foregoing embodiment, and since the principle of solving the problem by the device in the embodiment of the present application is similar to that of the data processing method in the foregoing embodiment of the present application, the implementation of the device may refer to the implementation of the method, and the repetition is omitted.
Fig. 11 is a schematic structural diagram of a data processing apparatus according to an exemplary embodiment of the present application. As shown in fig. 11, the data processing apparatus 200 includes:
a display control module 210, configured to display a virtual object in a graphical user interface;
an editing module 220, configured to enter a content creation page of the game in response to a mutual assistance request instruction initiated for a target behavior event of the game, wherein the target behavior event is a behavior event currently being performed by the virtual object;
the publishing module 230 is configured to provide the event assist tag through a preset publishing channel in response to a content publishing instruction, where the event assist tag is used for assisting in completing the target behavior event, and the event assist tag includes content publishing information determined on the content newly created page and a mutual assistance link generated based on the target behavior event.
In one possible implementation manner of the application, the completion degree of the target behavior event is updated based on first data obtained by the virtual object executing the target behavior event. The apparatus further comprises: an update module, configured to determine second data in response to an auxiliary interaction operation on the event auxiliary tag through a preset publishing channel, and update the completion degree of the target behavior event based on the first data and the second data.
In one possible implementation of the present application, the update module determines the first data based on: determining first increment data or first decrement data of a unit time based on the attribute of the virtual object; the first data is determined based on the executed time of the target behavior event and the first increment data or the first decrement data of the unit time.
In one possible embodiment of the present application, the update module determines the second data by: determining second data based on the operation times of the auxiliary interactive operation and second increment data or second decrement data; the operation times have a preset upper limit value.
In one possible embodiment of the present application, the editing module 220 is further configured to: and initiating the mutual assistance request instruction according to a first interactive operation executed on a first operation page under a virtual fight platform, wherein the first operation page is a page for executing the target behavior event aiming at the virtual object, the first operation page comprises an auxiliary control, the auxiliary control is a control for initiating auxiliary interaction aiming at the target behavior event, and the first interactive operation comprises a selection operation aiming at the auxiliary control.
In one possible embodiment of the present application, the editing module 220 is further configured to: determining a target behavior event in response to a second interactive operation executed on a total entry interface under a virtual fight platform, and entering a second operation page of the target behavior event, wherein the total entry interface comprises a plurality of behavior type controls, each behavior type control is used for representing a behavior event executed for a virtual object, and the second interactive operation comprises a selection operation for a behavior type control used for representing the target behavior event in the plurality of behavior type controls; and responding to a third interactive operation executed on the second operation page, determining an execution object corresponding to the target behavior event from the game, and generating the mutual assistance request instruction.
In one possible implementation manner of the application, the content release information determined on the content new page includes first description information input by a user on the content new page and/or second description information preset for describing the target behavior event.
In one possible implementation manner of the present application, the issuing module 230 issues the event assist tag simultaneously with the content issuing platform to which the content newly created page belongs and the target communication channel in the game in response to the content issuing instruction.
In one possible implementation manner of the present application, the updating module is further configured to: responding to triggering operation of the event auxiliary label through a preset release channel, entering an auxiliary interaction page of the target behavior event, wherein the auxiliary interaction page comprises a page for triggering auxiliary interaction aiming at the target behavior event; and determining second data in response to the auxiliary interaction operation performed in the auxiliary interaction page, and updating the completion degree of the target behavior event based on the first data and the second data.
In one possible implementation of the present application, the publishing module 230 is further configured to: responding to the auxiliary interaction operation, entering an interaction detail interface aiming at the target behavior event, wherein the interaction detail interface comprises pages for presenting mutual assistance details aiming at the target behavior event under each release channel; and forwarding the event auxiliary label under a target release channel in response to release operation executed on the interaction detail interface.
In one possible implementation of the present application, the interaction details interface includes at least one auxiliary interaction record for the target behavioral event, each auxiliary interaction record including at least one of: the system comprises a channel identifier, a user identifier and auxiliary interaction description information, wherein the channel identifier is used for indicating a release channel corresponding to the execution of the auxiliary interaction operation, the user identifier is used for indicating a user executing the auxiliary interaction operation, and the auxiliary interaction description information is used for representing feedback of the completion degree of the auxiliary interaction operation on the target behavior event.
In one possible implementation of the present application, the publishing module 230 is further configured to: determining a game state of the target behavior event in response to the auxiliary interaction operation, and entering an interaction detail interface aiming at the target behavior event in response to the target behavior event being in an execution stage; and responding to the target behavior event in a completion stage, and entering an interactive settlement interface aiming at the target behavior event.
Through the device, the convenience of assisting the game event is improved, so that the game event is completed in an accelerated manner, and the consumption of a server is reduced.
Referring to fig. 12, a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, the electronic device 300 includes a processor 310, a memory 320, and a bus 330.
The memory 320 stores machine-readable instructions executable by the processor 310. When the electronic device 300 is running, the processor 310 communicates with the memory 320 through the bus 330, and when the machine-readable instructions are executed by the processor 310, the steps of the data processing method in any of the foregoing embodiments may be performed, as follows:
displaying a virtual object in a graphical user interface; in response to a mutual assistance request instruction initiated for a target behavior event of a game, entering a content creation page of the game, wherein the target behavior event is a behavior event currently being performed by the virtual object; and in response to a content release instruction, providing an event auxiliary tag through a preset distribution channel, wherein the event auxiliary tag is used to assist in completing the target behavior event and includes content release information determined on the content creation page and a mutual assistance link generated based on the target behavior event.
The electronic device improves the convenience of assisting with game events, so that game events are completed faster and server consumption is reduced.
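The publish flow above can be sketched as follows. This is a hedged illustration only: the link scheme, field names, and channel names are invented for the example, and the patent does not specify any concrete format for the event auxiliary tag or the mutual assistance link.

```python
import uuid

def build_event_auxiliary_tag(event_id: str, release_info: str) -> dict:
    """Assemble an event auxiliary tag as described in the method:
    content release information edited on the content creation page,
    plus a mutual assistance link generated from the target behavior event."""
    # Hypothetical deep-link scheme pointing back to the target behavior event.
    mutual_aid_link = f"game://assist/{event_id}/{uuid.uuid4().hex}"
    return {
        "release_info": release_info,        # text determined on the content creation page
        "mutual_aid_link": mutual_aid_link,  # link helpers follow to assist the event
    }

def publish(tag: dict, channels: list[str]) -> dict:
    """Provide the tag through each preset distribution channel."""
    return {channel: tag for channel in channels}

# Release on both a content platform and an in-game communication channel.
tag = build_event_auxiliary_tag("evt1", "Help me finish this mining event!")
published = publish(tag, ["content_platform", "in_game_channel"])
```

Publishing the same tag object on every channel keeps the mutual assistance link identical everywhere, so helpers from any channel converge on the same target behavior event.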
The embodiments of the present application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, may perform the steps of the data processing method in any of the foregoing embodiments, specifically as follows:
displaying a virtual object in a graphical user interface; in response to a mutual assistance request instruction initiated for a target behavior event of a game, entering a content creation page of the game, wherein the target behavior event is a behavior event currently being performed by the virtual object; and in response to a content release instruction, providing an event auxiliary tag through a preset distribution channel, wherein the event auxiliary tag is used to assist in completing the target behavior event and includes content release information determined on the content creation page and a mutual assistance link generated based on the target behavior event.
The computer-readable storage medium helps improve the convenience of assisting with game events, so that game events are completed faster and server consumption is reduced.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific working procedures of the system and apparatus described above, reference may be made to the corresponding procedures in the foregoing method embodiments, which are not repeated here. In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application, in essence, or the part contributing to the prior art, or a part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto. Any change or substitution that readily occurs to a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (15)
1. A method of data processing, the method comprising:
displaying a virtual object in a graphical user interface;
in response to a mutual assistance request instruction initiated for a target behavior event of a game, entering a content creation page of the game, wherein the target behavior event is a behavior event currently being performed by the virtual object; and
in response to a content release instruction, providing an event auxiliary tag through a preset distribution channel, wherein the event auxiliary tag is used to assist in completing the target behavior event, and the event auxiliary tag includes content release information determined on the content creation page and a mutual assistance link generated based on the target behavior event.
2. The method of claim 1, wherein a completion degree of the target behavior event is updated based on first data obtained from the virtual object performing a target behavior; and
after the providing the event auxiliary tag through the preset distribution channel in response to the content release instruction, the method further comprises:
determining second data in response to an auxiliary interaction operation performed on the event auxiliary tag through the preset distribution channel, and updating the completion degree of the target behavior event based on the first data and the second data.
3. The method of claim 2, wherein the first data is determined by:
determining first increment data or first decrement data per unit time based on an attribute of the virtual object; and
determining the first data based on the executed duration of the target behavior event and the first increment data or the first decrement data per unit time.
4. The method of claim 2, wherein the determining the second data comprises:
determining the second data based on the number of times the auxiliary interaction operation is performed and second increment data or second decrement data, wherein the number of times has a preset upper limit.
5. The method of claim 1, wherein the mutual assistance request instruction is determined by:
initiating the mutual assistance request instruction according to a first interactive operation performed on a first operation page of a virtual battle platform,
wherein the first operation page is a page on which the target behavior event is performed for the virtual object, the first operation page includes an auxiliary control for initiating auxiliary interaction for the target behavior event, and the first interactive operation includes a selection operation on the auxiliary control.
6. The method of claim 1, wherein the mutual assistance request instruction is determined by:
determining the target behavior event in response to a second interactive operation performed on a general entry interface of a virtual battle platform, and entering a second operation page of the target behavior event, wherein the general entry interface includes a plurality of behavior type controls, each behavior type control represents a behavior event that can be performed for a virtual object, and the second interactive operation includes a selection operation on the behavior type control, among the plurality of behavior type controls, that represents the target behavior event; and
in response to a third interactive operation performed on the second operation page, determining from the game an execution object corresponding to the target behavior event, and generating the mutual assistance request instruction.
7. The method of claim 1, wherein the content release information determined on the content creation page includes first description information entered by a user on the content creation page and/or preset second description information describing the target behavior event.
8. The method of claim 1, wherein the providing the event auxiliary tag through the preset distribution channel in response to the content release instruction comprises:
in response to the content release instruction, simultaneously releasing the event auxiliary tag on the content release platform to which the content creation page belongs and on a target communication channel in the game.
9. The method of claim 2, wherein after the providing the event auxiliary tag through the preset distribution channel in response to the content release instruction, the method further comprises:
in response to a trigger operation performed on the event auxiliary tag through the preset distribution channel, entering an auxiliary interaction page of the target behavior event, wherein the auxiliary interaction page is a page on which auxiliary interaction for the target behavior event is triggered; and
determining the second data in response to the auxiliary interaction operation performed on the auxiliary interaction page, and updating the completion degree of the target behavior event based on the first data and the second data.
10. The method of claim 2, wherein the method further comprises:
in response to the auxiliary interaction operation, entering an interaction detail interface for the target behavior event, wherein the interaction detail interface includes a page presenting the mutual assistance details for the target behavior event under each distribution channel; and
forwarding the event auxiliary tag under a target distribution channel in response to a release operation performed on the interaction detail interface.
11. The method of claim 10, wherein the interaction detail interface includes at least one auxiliary interaction record for the target behavior event, and each auxiliary interaction record includes at least one of: a channel identifier, a user identifier, and auxiliary interaction description information, wherein the channel identifier indicates the distribution channel through which the auxiliary interaction operation was performed, the user identifier indicates the user who performed the auxiliary interaction operation, and the auxiliary interaction description information represents the feedback of the auxiliary interaction operation on the completion degree of the target behavior event.
12. The method of claim 10, wherein the entering the interaction detail interface for the target behavior event in response to the auxiliary interaction operation comprises:
in response to the auxiliary interaction operation, determining a game state of the target behavior event;
entering the interaction detail interface for the target behavior event in response to the target behavior event being in an execution stage; and
entering an interaction settlement interface for the target behavior event in response to the target behavior event being in a completion stage.
13. A data processing apparatus, the apparatus comprising:
a display control module configured to display a virtual object in a graphical user interface;
an editing module configured to enter a content creation page of a game in response to a mutual assistance request instruction initiated for a target behavior event of the game, wherein the target behavior event is a behavior event currently being performed by the virtual object; and
a publishing module configured to provide an event auxiliary tag through a preset distribution channel in response to a content release instruction, wherein the event auxiliary tag is used to assist in completing the target behavior event, and the event auxiliary tag includes content release information determined on the content creation page and a mutual assistance link generated based on the target behavior event.
14. An electronic device, comprising: a processor, a storage medium, and a bus, wherein the storage medium stores machine-readable instructions executable by the processor; when the electronic device is running, the processor and the storage medium communicate through the bus, and the processor executes the machine-readable instructions to perform the steps of the method according to any one of claims 1 to 12.
15. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, performs the steps of the method according to any one of claims 1 to 12.
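As an illustrative sketch of the completion-degree scheme in claims 2 to 4 (the patent gives no formulas; the function names, signatures, and the linear accumulation model below are all assumptions):

```python
def first_data(per_unit: float, elapsed_units: float) -> float:
    """First data: the virtual object's own progress, i.e. the per-unit-time
    increment (or decrement) determined from the object's attribute, scaled
    by the executed duration of the target behavior event (claim 3)."""
    return per_unit * elapsed_units

def second_data(op_count: int, per_op: float, op_cap: int) -> float:
    """Second data: progress contributed by helpers' auxiliary interaction
    operations, with the operation count clamped to the preset upper
    limit (claim 4)."""
    return min(op_count, op_cap) * per_op

def completion_degree(first: float, second: float, target: float) -> float:
    """Completion degree of the target behavior event, updated from both
    the first and second data (claim 2), capped at fully complete."""
    return min((first + second) / target, 1.0)

# Example: 2 progress/unit time for 5 units, plus 7 helper taps capped at 5,
# toward a target of 30 units of progress.
degree = completion_degree(first_data(2.0, 5.0), second_data(7, 1.0, 5), 30.0)
```

The cap on the operation count is what lets the server bound the work any one event auxiliary tag can trigger, consistent with the stated goal of reducing server consumption.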
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311552131.6A CN117582672A (en) | 2023-11-20 | 2023-11-20 | Data processing method, device, electronic equipment and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117582672A true CN117582672A (en) | 2024-02-23 |
Family
ID=89910870
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311552131.6A Pending CN117582672A (en) | 2023-11-20 | 2023-11-20 | Data processing method, device, electronic equipment and storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN117582672A (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||