CN114191817B - Virtual character shooting control method, device, electronic device and storage medium
- Publication number
- CN114191817B (application number CN202111648792.XA)
- Authority
- CN
- China
- Prior art keywords
- shooting
- virtual
- current
- stage
- prop
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A—HUMAN NECESSITIES; A63—SPORTS; GAMES; AMUSEMENTS; A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/837—Special adaptations for executing a specific game genre or game mode: Shooting of targets
- A63F2300/8076—Features of games using an electronically generated display having two or more dimensions specially adapted for executing a specific type of game: Shooting
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
The application provides a shooting control method and apparatus for a virtual character, an electronic device, and a computer-readable storage medium. The method includes: displaying, in a virtual scene, a virtual character and a virtual shooting prop held by a holding part of the virtual character; in response to a shooting trigger operation based on the virtual shooting prop, acquiring shake configuration information corresponding to the current shooting stage of the virtual shooting prop; acquiring shake data corresponding to the current shooting stage according to the shake configuration information; and, in the current shooting stage, controlling the holding part to drive the virtual shooting prop to shake correspondingly based on the shake data corresponding to the current shooting stage. The application can simulate real shooting performance, thereby improving the visual experience of a user controlling the virtual character to shoot.
Description
Priority description
This application claims priority to Chinese Patent Application No. 202110646706.5, filed on June 10, 2021, and entitled "Shooting Control Method and Apparatus for Virtual Character, Electronic Device, and Storage Medium."
Technical Field
The present application relates to the field of computer man-machine interaction technologies, and in particular, to a shooting control method and apparatus for a virtual character, an electronic device, and a computer readable storage medium.
Background
Human-computer interaction technology for virtual scenes, built on graphics processing hardware, enables diversified interactions among virtual objects controlled by users or by artificial intelligence according to actual application requirements, and has wide practical value. For example, a virtual scene of a game can simulate a real combat process between virtual objects.
Taking game scenes as an example, shooting games are competitive games deeply favored by users: they not only help users release pressure and relax, but can also improve users' reaction speed and sensitivity.
However, when presenting shooting performance, the related art generally simulates it with a single simple shooting animation: each time the user controls the virtual character to perform a shooting operation, the same animation is played repeatedly during continuous shooting. The shooting performance presented by the related art is therefore monotonous and inconsistent with how an actual firearm behaves when fired in the real world, resulting in unrealistic shooting performance and a poor visual experience for the user.
Disclosure of Invention
The embodiments of the application provide a shooting control method and apparatus for a virtual character, an electronic device, and a computer-readable storage medium, which can simulate real shooting performance and thereby improve the visual experience of a user controlling the virtual character to shoot.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides a shooting control method of a virtual character, which comprises the following steps:
displaying a virtual character in a virtual scene and a virtual shooting prop held by the virtual character through a holding part;
in response to a shooting trigger operation based on the virtual shooting prop, acquiring shake configuration information corresponding to the current shooting stage of the virtual shooting prop, and acquiring shake data corresponding to the current shooting stage according to the shake configuration information; and
in the current shooting stage, controlling the holding part to drive the virtual shooting prop to shake correspondingly based on the shake data corresponding to the current shooting stage.
In the above scheme, when both a curve resource and a program curve are configured in the shake configuration information corresponding to the current shooting stage, the method further comprises, after the shake data corresponding to the current shooting stage is determined, performing the following processing for each reference direction of the holding part: adding the displacement in that reference direction included in the shake data determined from the curve resource to the corresponding displacement included in the shake data determined from the program curve, to obtain a displacement sum; adding the rotation in that reference direction included in the shake data determined from the curve resource to the corresponding rotation included in the shake data determined from the program curve, to obtain a rotation sum; and updating the shake data corresponding to the current shooting stage based on the displacement sum and the rotation sum.
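For illustration only, the following is a minimal Python sketch of this per-direction combination step; the ShakeData structure and all identifiers are hypothetical and do not appear in the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ShakeData:
    # Displacement and rotation of the holding part, one component
    # per reference direction (X, Y, Z).
    displacement: Tuple[float, float, float]
    rotation: Tuple[float, float, float]

def combine_shake(curve: ShakeData, program: ShakeData) -> ShakeData:
    """Add the curve-resource shake and the program-curve shake
    component-wise to obtain the displacement sum and rotation sum,
    which replace the shake data of the current shooting stage."""
    displacement_sum = tuple(c + p for c, p in zip(curve.displacement,
                                                   program.displacement))
    rotation_sum = tuple(c + p for c, p in zip(curve.rotation,
                                               program.rotation))
    return ShakeData(displacement_sum, rotation_sum)
```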
The embodiment of the application provides a shooting control device for a virtual character, which comprises the following components:
a display module, configured to display, in a virtual scene, the virtual character and the virtual shooting prop held by the virtual character through the holding part;
an acquisition module, configured to acquire, in response to a shooting trigger operation based on the virtual shooting prop, shake configuration information corresponding to the current shooting stage of the virtual shooting prop;
the acquisition module being further configured to acquire shake data corresponding to the current shooting stage according to the shake configuration information; and
a control module, configured to control, in the current shooting stage, the holding part to drive the virtual shooting prop to shake correspondingly based on the shake data corresponding to the current shooting stage.
In the above scheme, the acquisition module is further configured to acquire the number of shots fired from the starting shooting moment to the current moment, and the determination module is further configured to determine the shooting stage corresponding to that number of shots as the current shooting stage of the virtual shooting prop, wherein each shooting stage comprises a fixed number of shots; alternatively, the determination module is further configured to determine the time difference between the current moment and the starting shooting moment, and to determine the shooting stage corresponding to that time difference as the current shooting stage of the virtual shooting prop, wherein each shooting stage comprises a fixed duration.
In the above scheme, the acquisition module is further configured to acquire the shake data corresponding to the current shooting stage according to a configured animation recoil mode.
In the above scheme, the acquisition module is further configured to acquire the change value of the recoil of the virtual shooting prop corresponding to the current shooting stage, and the determination module is further configured to determine, according to the change value, the shake data corresponding to the current shooting stage, wherein the shake data comprises displacement and rotation of the holding part relative to different reference directions.
In the above scheme, the acquisition module is further configured to acquire the change values of the recoil of the virtual shooting prop corresponding to each stage from the starting shooting stage to the current shooting stage, and the determination module is further configured to accumulate the plurality of change values and to determine, according to the accumulation result, the shake data corresponding to the current shooting stage, wherein the shake data comprises displacement and rotation of the holding part relative to different reference directions.
In the above scheme, the determination module is further configured to determine a curve resource according to correspondences between different shooting stages and offset ranges and to determine the shake data corresponding to the current shooting stage according to the curve resource; or to determine the shake data corresponding to the current shooting stage according to a program curve, wherein the program curve comprises at least one of a trigonometric-function-type program curve determined from a period, an amplitude and an initial value, and an attenuation-function-type program curve determined from an attenuation base, an attenuation frequency, a fade-in time and a fade-out time.
In the above scheme, the determination module is further configured to determine, according to the current shooting stage, the offset range corresponding to the current shooting stage in the curve resource, and to determine a corresponding offset value from that offset range; to determine the time difference between the current moment and the shooting moment corresponding to the current shooting stage; to determine, based on the time difference, the value corresponding to the time difference in an interpolation curve matched to the shooting interval of the virtual shooting prop; and to take the product of the offset value and that value as the shake data corresponding to the current shooting stage.
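A sketch of this lookup-and-interpolate step follows; the function and parameter names are illustrative assumptions, not the patent's API.

```python
import random

def shake_from_curve_resource(stage: int,
                              offset_ranges: dict,
                              interp_curve,
                              now: float,
                              stage_shot_time: float) -> float:
    """Pick an offset value from the offset range configured for the
    current shooting stage, then scale it by the interpolation curve
    evaluated at the time elapsed since the stage's shot."""
    low, high = offset_ranges[stage]      # offset range of this stage
    offset = random.uniform(low, high)    # a corresponding offset value
    t = now - stage_shot_time             # time difference
    return offset * interp_curve(t)       # product is the shake datum

# Usage: a triangular interpolation curve that rises and falls over
# one 0.1 s shot interval of the virtual shooting prop.
shake = shake_from_curve_resource(
    stage=3, offset_ranges={3: (0.5, 1.0)},
    interp_curve=lambda t: max(0.0, 1.0 - abs(t / 0.05 - 1.0)),
    now=10.02, stage_shot_time=10.0)
```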
In the above scheme, the determination module is further configured to perform the following processing for each reference direction corresponding to the holding part: acquiring a first program curve corresponding to displacement, and taking the first function value corresponding to the current moment in the first program curve as the displacement in that reference direction; and acquiring a second program curve corresponding to rotation, and taking the second function value corresponding to the current moment in the second program curve as the rotation in that reference direction.
In the above scheme, the determination module is further configured to perform the following processing for each reference direction of the holding part: adding the displacement in that reference direction included in the shake data determined from the curve resource to the corresponding displacement included in the shake data determined from the program curve, to obtain a displacement sum; adding the corresponding rotations to obtain a rotation sum; and updating the shake data corresponding to the current shooting stage based on the displacement sum and the rotation sum.
In the above scheme, the acquisition module is further configured to acquire the type of the virtual shooting prop, and the determination module is further configured to determine a first adjustment coefficient corresponding to the type of the virtual shooting prop, and to take the product of the first adjustment coefficient and the shake data corresponding to the current shooting stage as updated shake data, wherein the first adjustment coefficient is positively correlated with the recoil or killing power of that type of virtual shooting prop.
In the above scheme, the acquisition module is further configured to acquire the shooting mode of the virtual character in the current shooting stage, and the determination module is further configured to determine a second adjustment coefficient corresponding to the shooting mode, and to take the product of the second adjustment coefficient and the shake data corresponding to the current shooting stage as updated shake data, wherein the accuracy of the shooting mode is inversely correlated with the second adjustment coefficient.
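The two adjustment coefficients of the preceding paragraphs could be applied as in the following sketch; the table values are invented for illustration only.

```python
# Hypothetical tables: the first coefficient grows with the weapon
# type's recoil/killing power, the second shrinks as the shooting
# mode's accuracy grows.
FIRST_COEFF = {"heavy_machine_gun": 3.0, "shotgun": 2.0, "light_machine_gun": 1.0}
SECOND_COEFF = {"hip_fire": 1.5, "aim_down_sights": 0.6}

def update_shake(shake_value: float, weapon_type: str, shooting_mode: str) -> float:
    """Multiply the current stage's shake data by both adjustment
    coefficients to obtain the updated shake data."""
    return shake_value * FIRST_COEFF[weapon_type] * SECOND_COEFF[shooting_mode]
```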
In the above scheme, the control module is further configured to perform the following processing when each frame of the virtual scene is updated and displayed in the current shooting stage: superimposing the displacement and rotation in each reference direction included in the frame's shake data onto the position and rotation components of the holding part for that reference direction, so that the holding part drives the virtual shooting prop to shake correspondingly.
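A per-frame application of the shake might look like the sketch below (component-wise addition onto the hand transform; all names are assumptions).

```python
def apply_shake_to_frame(hand_pos, hand_rot, shake_disp, shake_rot):
    """For one frame of the current shooting stage, superimpose the
    frame's shake displacement/rotation onto the position/rotation
    components of the holding part in each reference direction, so
    the hand drives the held shooting prop to shake accordingly."""
    new_pos = tuple(p + d for p, d in zip(hand_pos, shake_disp))
    new_rot = tuple(r + q for r, q in zip(hand_rot, shake_rot))
    return new_pos, new_rot

# Usage: pull the hand back 1 cm and pitch it up 0.5 degrees.
pos, rot = apply_shake_to_frame((0, 0, 0), (0, 0, 0),
                                (0.0, -0.01, 0.0), (0.5, 0.0, 0.0))
```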
In the above scheme, the acquisition module is further configured to acquire the real-time shake direction of the virtual shooting prop in the current shooting stage, the determination module is further configured to determine a bullet landing point offset in synchronization with the real-time shake direction, and the control module is further configured to control the virtual bullet fired by the virtual shooting prop to hit that landing point.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
a processor, configured to implement, when executing the executable instructions stored in the memory, the shooting control method for a virtual character provided by the embodiments of the application.
The embodiments of the application provide a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to perform the shooting control method for a virtual character described above.
The embodiments of the application provide a computer program product comprising computer-executable instructions which, when executed by a processor, implement the shooting control method for a virtual character provided by the embodiments of the application.
The embodiment of the application has the following beneficial effects:
Shake data corresponding to each shooting stage is acquired separately for the different shooting stages, and in each stage the holding part of the virtual character is controlled to drive the virtual shooting prop to shake correspondingly based on that stage's shake data. Because the shooting performances of different shooting stages differ from one another, rather than repeatedly playing the same animation resource as in the related art, the visual performance of the virtual character when shooting is significantly improved, giving the user a realistic shooting experience.
Drawings
Fig. 1 is a schematic diagram of an application mode of a virtual character shooting control method according to an embodiment of the present application;
Fig. 2 is a schematic diagram of an application mode of a virtual character shooting control method according to an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application;
Fig. 4 is a flow chart of a virtual character shooting control method according to an embodiment of the present application;
Fig. 5 is a flow chart of a virtual character shooting control method according to an embodiment of the present application;
Fig. 6 is a flow chart of a virtual character shooting control method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the shooting action performance of a first-person character in different shooting stages as provided in the related art;
Fig. 8 is a schematic diagram of the shooting action performance of a first-person character in different shooting stages according to an embodiment of the present application;
Fig. 9 is an overlay diagram at the moment of maximum hand pull-back during first-person continuous shooting according to an embodiment of the present application;
Fig. 10 is a schematic diagram of the shooting action performances corresponding to different shooting stages when a first-person character is in an aiming state according to an embodiment of the present application;
Fig. 11 is an overlay diagram at the moment of maximum hand pull-back during continuous shooting when a first-person character is in an aiming state according to an embodiment of the present application;
Fig. 12 is a flow chart of a virtual character shooting control method according to an embodiment of the present application;
Fig. 13 is a schematic diagram of the configuration of a weapon shooting shake slot according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is to be understood that "some embodiments" can be the same subset or different subsets of all possible embodiments and can be combined with one another without conflict.
In the following description, the terms "first", "second", and the like are merely used to distinguish between similar objects and do not represent a particular ordering of the objects, it being understood that the "first", "second", or the like may be interchanged with one another, if permitted, to enable embodiments of the application described herein to be practiced otherwise than as illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before describing embodiments of the present application in further detail, the terms and terminology involved in the embodiments of the present application will be described, and the terms and terminology involved in the embodiments of the present application will be used in the following explanation.
1) Client: an application program running on the terminal device to provide various services, such as a video playing client or a game client.
2) In response to: indicates the condition or state on which a performed operation depends; when the dependent condition or state is satisfied, the one or more operations performed may be executed in real time or with a set delay, and unless otherwise specified there is no restriction on their execution order.
3) The virtual scene is a virtual scene that an application program displays (or provides) when running on the terminal device. The virtual scene may be a simulation environment for the real world, a semi-simulation and semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, sea, etc., the land may include environmental elements of a desert, city, etc., and a user may control a virtual character to move in the virtual scene.
4) Virtual character: the image of any person or object that can interact in a virtual scene, or a movable object in a virtual scene. The movable object may be a virtual person, a virtual animal, a cartoon character, etc., such as a person or an animal displayed in a virtual scene. The virtual character may be an avatar representing the user in the virtual scene. A virtual scene may include multiple virtual characters, each having its own shape and volume in the virtual scene and occupying a part of its space.
For example, the virtual character may be drawn by 3D graphics modeling and rendering techniques through a 3D game engine or digital content creation (DCC, Digital Content Creation) software, where the virtual character data may include character model data and character skeleton data.
5) Scene data: feature data of the virtual scene, for example the area of a building region in the virtual scene or the current architectural style of the virtual scene; it may also include the position of a virtual building in the virtual scene, the floor area of the virtual building, and the like.
Taking virtual scenes of games as an example, in various shooting games the related art generally simulates shooting performance with a single simple shooting animation: each time the user controls the virtual character to perform a shooting operation, the same animation is played repeatedly during continuous shooting. The shooting performance presented by the related art is therefore monotonous and inconsistent with how an actual firearm behaves when fired in the real world, resulting in unrealistic shooting performance and a poor visual experience for the user.
In view of the above technical problems, the embodiments of the present application provide a shooting control method and apparatus for a virtual character, an electronic device, and a computer-readable storage medium, which can simulate real shooting performance and thereby improve the visual experience of a user controlling the virtual character to shoot. To make the method easier to understand, exemplary implementation scenarios are described first; the virtual scene in the virtual character shooting control method provided by the embodiments of the present application may be output entirely by a terminal device, or output cooperatively by a terminal device and a server.
In other embodiments, the virtual scene may also be an environment for interaction of game characters, for example, the game characters may fight in the virtual scene, and both parties may interact in the virtual scene by controlling actions of the game characters, so that a user can relax life pressure in the game process.
In one implementation scenario, referring to fig. 1, fig. 1 is a schematic diagram of an application mode of the virtual character shooting control method according to an embodiment of the present application. This mode is suitable for applications in which the calculation of the relevant data of the virtual scene 100 can be completed entirely by the computing capability of the graphics processing hardware of the terminal device 400, for example a game in stand-alone/offline mode, with the virtual scene output through various types of terminal devices 400 such as smartphones, tablet computers and virtual reality/augmented reality devices.
By way of example, the types of graphics processing hardware include central processing units (CPU, Central Processing Unit) and graphics processors (GPU, Graphics Processing Unit).
When forming the visual perception of the virtual scene 100, the terminal device 400 calculates the data required for display through the graphics computing hardware, completes the loading, parsing and rendering of the display data, and outputs video frames capable of forming visual perception of the virtual scene on the graphics output hardware; for example, two-dimensional video frames are presented on the display screen of a smartphone, or video frames realizing a three-dimensional display effect are projected onto the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the terminal device 400 can also form one or more of auditory, tactile, motion and taste perception by means of different hardware.
As an example, the terminal device 400 runs a client 410 (e.g., a stand-alone game application) and outputs a virtual scene including role playing during its running. The virtual scene may be an environment for game characters to interact in, for example a plain, a street or a valley for game characters to fight in. Taking display of the virtual scene 100 from the first-person perspective as an example, the virtual scene 100 displays a virtual character 101 and a virtual shooting prop 102 (e.g., a virtual submachine gun, a virtual sniper rifle or a virtual shotgun) held by the virtual character 101 through a holding part (e.g., a hand). The virtual character 101 may be a game character controlled by a user; that is, the virtual character 101 is controlled by a real user and moves in the virtual scene 100 in response to the real user's operation of a controller (e.g., a touch screen, a voice-operated switch, a keyboard, a mouse or a joystick). For example, when the real user moves the joystick to the right, the virtual character 101 moves to the right in the virtual scene 100; the user can also control the virtual character 101 to stay still, jump, shoot, and so on.
For example, when the client 410 receives a user-triggered shooting operation based on the virtual shooting prop 102, it acquires the shake configuration information corresponding to the current shooting stage of the virtual shooting prop 102, acquires the shake data corresponding to the current shooting stage according to that configuration information, and, in the current shooting stage, controls the holding part of the virtual character 101 to drive the virtual shooting prop 102 to shake correspondingly based on that shake data. In other words, during shooting, shake data is acquired separately for each shooting stage, and in each stage the holding part of the virtual character 101 is controlled to drive the virtual shooting prop 102 to shake based on that stage's shake data. Because the shooting performances of different shooting stages differ from one another, rather than repeatedly playing the same animation resource as in the related art, the visual performance of the virtual character when shooting is significantly improved, giving the user a realistic shooting experience.
In another implementation scenario, referring to fig. 2, fig. 2 is a schematic application mode diagram of a shooting control method of a virtual character according to an embodiment of the present application, which is applied to a terminal device 400 and a server 200, and is applicable to an application mode that completes virtual scene calculation depending on a computing capability of the server 200 and outputs a virtual scene at the terminal device 400.
Taking the formation of the visual perception of the virtual scene 100 as an example, the server 200 calculates the display data related to the virtual scene (e.g., scene data) and sends it to the terminal device 400 through the network 300; the terminal device 400 then relies on its graphics computing hardware to load, parse and render the calculated display data, and relies on its graphics output hardware to output the virtual scene to form visual perception. For example, two-dimensional video frames may be presented on the display screen of a smartphone, or video frames realizing a three-dimensional display effect may be projected onto the lenses of augmented reality/virtual reality glasses. For perception of other forms of the virtual scene, the corresponding hardware output of the terminal device 400 may be used, for example a speaker to form auditory perception or a vibrator to form tactile perception.
As an example, the terminal device 400 runs a client 410 (e.g., a network-based game application) that connects to the server 200 (e.g., a game server), and the terminal device 400 outputs the virtual scene 100 of the client 410. Taking display of the virtual scene 100 from the first-person perspective as an example, the virtual scene 100 displays a virtual character 101 and a virtual shooting prop 102 (e.g., a virtual submachine gun, a virtual sniper rifle or a virtual shotgun) held by the virtual character 101 through a holding part (e.g., a hand). The virtual character 101 may be a game character controlled by a user; it moves in the virtual scene 100 in response to the real user's operation of a controller (e.g., a touch screen, a voice-operated switch, a keyboard, a mouse or a joystick). For example, when the real user moves the joystick to the right, the virtual character 101 moves to the right in the virtual scene 100; the user can also control the virtual character 101 to stay still, jump, perform shooting operations, and so on.
For example, when the client 410 receives a user-triggered shooting operation based on the virtual shooting prop 102, it acquires the shake configuration information corresponding to the current shooting stage of the virtual shooting prop 102 (for example, from the server 200), acquires the shake data corresponding to the current shooting stage according to that configuration information, and, in the current shooting stage, controls the holding part of the virtual character 101 to drive the virtual shooting prop 102 to shake correspondingly based on that shake data. That is, during shooting, shake data is acquired separately for each shooting stage, and in each stage the holding part of the virtual character 101 is controlled to drive the virtual shooting prop 102 to shake based on that stage's shake data.
In some embodiments, the terminal device 400 may implement the virtual character shooting control method provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or software module in an operating system; a native application (APP, APPlication) that must be installed in the operating system to run, such as a game APP (i.e., the client 410 described above); an applet that only needs to be downloaded into a browser environment to run; or a game applet that can be embedded in any APP. In general, the computer program may be any form of application, module or plug-in.
Taking the computer program as an application program for example, in actual implementation the terminal device 400 installs and runs an application supporting virtual scenes. The application may be any of a first-person shooting game (FPS), a third-person shooting game, a virtual reality application, a three-dimensional map program, or a multiplayer gunfight survival game. The user uses the terminal device 400 to operate a virtual object located in the virtual scene to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, throwing, and building virtual structures. Illustratively, the virtual object may be a virtual character, such as a simulated person or a cartoon character.
In other embodiments, the embodiments of the present application may also be implemented by means of cloud technology, which refers to a hosting technology that unifies a series of resources such as hardware, software and networks in a wide area network or a local area network to realize the calculation, storage, processing and sharing of data.
Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, application technology and the like applied under the cloud computing business model; it can form a resource pool that is used on demand in a flexible and convenient way. Cloud computing technology will become an important support, since the background services of technical network systems require a large amount of computing and storage resources.
For example, the server 200 in fig. 2 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server that provides cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs, and basic cloud computing services such as big data and artificial intelligence platforms. The terminal device 400 may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc. The terminal device 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, which is not limited in the embodiment of the present application.
In other embodiments, the shooting control method for virtual characters provided in the embodiments of the present application may also be implemented in combination with a blockchain technique.
Blockchains are novel application modes of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, encryption algorithms, and the like. The blockchain (Blockchain), essentially a de-centralized database, is a string of data blocks that are generated in association using cryptographic methods, each of which contains information from a batch of network transactions for verifying the validity (anti-counterfeit) of its information and generating the next block. The blockchain may include a blockchain underlying platform, a platform product services layer, and an application services layer.
For example, the shake configuration information pre-configured for the different shooting stages of the virtual shooting prop 102 may be stored in a blockchain network. When the terminal device 400 has the authority to request shake configuration information and receives a shooting trigger operation based on the virtual shooting prop 102, it may generate and submit to the blockchain network a transaction for querying the shake configuration information corresponding to the current shooting stage of the virtual shooting prop 102, the query carrying a key name (i.e., the current shooting stage), so that a consensus node in the blockchain network executes the transaction and queries the state database for the data corresponding to that key name (i.e., the shake configuration information corresponding to the current shooting stage). The blockchain network then sends the queried shake configuration information to the terminal device 400, which acquires the shake data corresponding to the current shooting stage according to it and, in the current shooting stage, controls the holding part of the virtual character 101 to drive the virtual shooting prop 102 to shake correspondingly. In this way, the shake configuration information corresponding to the different shooting stages is stored in the blockchain network, and its security and reliability are guaranteed by the decentralized, distributed and tamper-proof characteristics of the blockchain network.
The structure of the terminal device 400 shown in fig. 1 is explained below. Referring to fig. 3, fig. 3 is a schematic structural diagram of a terminal device 400 according to an embodiment of the present application; the terminal device 400 shown in fig. 3 includes at least one processor 420, a memory 460, at least one network interface 430, and a user interface 440. The various components in the terminal device 400 are coupled together by a bus system 450. It is understood that the bus system 450 is used to enable connection and communication between these components. In addition to a data bus, the bus system 450 includes a power bus, a control bus, and a status signal bus; for clarity of illustration, however, the various buses are all labeled as bus system 450 in fig. 3.
The processor 420 may be an integrated circuit chip with signal processing capability, such as a general-purpose processor (for example a microprocessor or any conventional processor), a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
The user interface 440 includes one or more output devices 441 that enable presentation of media content, including one or more speakers and/or one or more visual displays. The user interface 440 also includes one or more input devices 442, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
Memory 460 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard drives, optical drives, and the like. Memory 460 optionally includes one or more storage devices physically remote from processor 420.
Memory 460 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The non-volatile memory may be a read-only memory (ROM, Read Only Memory), and the volatile memory may be a random access memory (RAM, Random Access Memory). The memory 460 described in the embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 460 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 461 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
A network communication module 462 for accessing other computing devices via one or more (wired or wireless) network interfaces 430; exemplary network interfaces 430 include Bluetooth, Wireless Fidelity (WiFi), and Universal Serial Bus (USB, Universal Serial Bus), among others;
A presentation module 463 for enabling presentation of information (e.g., a user interface for operating peripheral devices and displaying content and information) via one or more output devices 441 (e.g., a display screen, speakers, etc.) associated with the user interface 440;
An input processing module 464 for detecting one or more user inputs or interactions from one of the one or more input devices 442 and translating the detected inputs or interactions.
In some embodiments, the virtual character shooting control apparatus provided by the present application may be implemented in software. Fig. 3 shows the virtual character shooting control apparatus 465 stored in the memory 460, which may be software in the form of a program, a plug-in, or the like, and includes the following software modules: a display module 4651, an acquisition module 4652, a control module 4653 and a determination module 4654. These modules are logical, so they may be arbitrarily combined or further split according to the functions implemented. It should be noted that all of the above modules are shown at once in fig. 3 for convenience of presentation, but this should not be taken as excluding implementations of the virtual character shooting control apparatus 465 that include only the display module 4651, the acquisition module 4652 and the control module 4653. The functions of each module are described below.
In other embodiments, the virtual character shooting control apparatus provided by the embodiments of the present application may be implemented in hardware. As an example, it may be a processor in the form of a hardware decoding processor programmed to perform the virtual character shooting control method provided by the embodiments of the present application; for example, the processor in the form of a hardware decoding processor may use one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), or other electronic components.
The method for controlling shooting of virtual characters according to the embodiment of the present application will be specifically described below with reference to the accompanying drawings. The shooting control method of the virtual character provided by the embodiment of the application can be independently executed by the terminal equipment 400 in fig. 1, or can be cooperatively executed by the terminal equipment 400 and the server 200 in fig. 2.
Next, the virtual character shooting control method provided by the embodiments of the present application is described by taking as an example its execution by the terminal device 400 in fig. 1 alone. Referring to fig. 4, fig. 4 is a flow chart of a virtual character shooting control method according to an embodiment of the present application, described with reference to the steps shown in fig. 4.
It should be noted that the method shown in fig. 4 may be executed by various computer programs running on the terminal device 400 and is not limited to the above-mentioned client 410; it may also be the operating system 461, a software module or a script described above. The client should therefore not be considered as limiting the embodiments of the present application.
In step S101, a virtual character and a virtual shooting prop held by the virtual character through a holding portion are displayed in a virtual scene.
In some embodiments, a client supporting the virtual scene is installed on the terminal device (for example, when the virtual scene is a game, the corresponding client may be a shooting game APP). When the user opens the client installed on the terminal device (for example, clicks the icon of the shooting game APP presented on the user interface of the terminal device) and the terminal device runs the client, a virtual character (e.g., a user-controlled virtual character A) and a virtual shooting prop (e.g., a virtual heavy machine gun, a virtual shotgun or a virtual sniper rifle) held by the virtual character through a holding part (e.g., a hand) may be displayed in the virtual scene presented on the human-computer interaction interface of the client.
In other embodiments, displaying the virtual character and the virtual shooting prop held through the holding part may also be implemented as follows: in response to a virtual shooting prop selection operation, displaying in the virtual scene the virtual character and the selected target virtual shooting prop held by the virtual character through the holding part.
By way of example, taking the virtual scene of a game, the game provides multiple virtual weapons for the user to choose from, for example a virtual heavy machine gun, a virtual shotgun and a virtual sniper rifle, and a corresponding icon is displayed in the game screen for each virtual weapon. When the user clicks the icon of the virtual heavy machine gun displayed in the game screen, a game picture is displayed in which the user-controlled virtual character holds the virtual heavy machine gun in its hand.
In some embodiments, the virtual scene may be displayed from a first-person perspective (for example, the user plays the virtual character in the game from the character's own point of view), from a third-person perspective (for example, the camera follows the virtual character as the user plays), or from a bird's-eye view; these perspectives may be switched arbitrarily.
By way of example, the virtual character may be an object controlled by a current user in the game, although other virtual characters may also be included in the virtual scene, such as may be controlled by other users or by a robot program. The virtual roles may be partitioned into any of a plurality of teams, the teams may be hostile or collaborative, and the teams in the virtual scenario may include one or all of the above.
Taking display from the first-person perspective as an example, the virtual scene displayed in the human-computer interaction interface may be determined as follows: according to the viewing position and field angle of the virtual character in the complete virtual scene, the field-of-view region of the virtual character is determined, and the portion of the complete virtual scene within that field-of-view region is presented; that is, the displayed virtual scene may be a part of the panoramic virtual scene. Because the first-person perspective is the viewing perspective that gives the user the greatest impact, an immersive perception can be achieved during operation.
Taking the bird's-eye view as an example, displaying the virtual scene in the human-computer interaction interface may include: in response to a zoom operation on the panoramic virtual scene, presenting in the human-computer interaction interface the portion of the virtual scene corresponding to the zoom operation; that is, the displayed virtual scene may be a part of the panoramic virtual scene. This improves the user's operability during operation and the efficiency of human-computer interaction.
In step S102, in response to a shooting trigger operation based on the virtual shooting prop, shake configuration information corresponding to the current shooting stage of the virtual shooting prop is acquired.
In some embodiments, different shooting stages may be distinguished by the number of shots. In that case, before the shake configuration information corresponding to the current shooting stage of the virtual shooting prop is acquired, the following processing may be performed: acquiring the number of shots fired from the starting shooting moment (i.e., the moment at which the shooting trigger operation was received) to the current moment, and determining the shooting stage corresponding to that number of shots as the current shooting stage of the virtual shooting prop, wherein each shooting stage comprises a fixed number of shots (for example, each shooting stage may comprise 1 shot or several shots).
For example, suppose the moment at which the shooting trigger operation is received (i.e., the starting shooting moment) is 00:00, the current moment is 00:01, and the shooting interval of the virtual shooting prop is 0.1 seconds (i.e., the virtual shooting prop fires once every 0.1 seconds); then the number of shots fired from the starting shooting moment to the current moment is 10. If each shooting stage comprises 2 shots, the current shooting stage of the virtual shooting prop is the 5th shooting stage of the shooting process.
It should be noted that in practical applications the number of shots per shooting stage can be adjusted flexibly. For example, when the performance of the terminal device is better, the number of shots per stage can be smaller (e.g., 1 shot per stage, so that the shooting performance of every single shot can differ), presenting the shooting effect more finely; when the performance of the terminal device is worse, the number of shots per stage can be larger (e.g., 5 consecutive shots per stage, all sharing the same shooting performance), avoiding unsmooth game running (e.g., a stuttering game picture) caused by overly fine-grained shooting performance.
In other embodiments, different shooting stages may instead be distinguished by fixed time slices. In that case, before the shake configuration information corresponding to the current shooting stage of the virtual shooting prop is acquired, the time difference between the current moment and the starting shooting moment is determined, and the shooting stage corresponding to that time difference is determined as the current shooting stage of the virtual shooting prop, wherein each shooting stage comprises a fixed duration (e.g., 0.1 seconds or 0.5 seconds).
For example, if the starting shooting moment is 00:00 and the current moment is 00:01, the time difference between them is 1 second; if the fixed duration of each shooting stage is 0.1 seconds, the current shooting stage of the virtual shooting prop is the 10th shooting stage of the shooting process.
It should be noted that in practical applications the fixed duration of each shooting stage can be adjusted flexibly. For example, when the performance of the user's terminal device is better and the user's requirements on shooting action performance are higher, the fixed duration can be smaller; when the performance of the terminal device is worse or the user's requirements are lower, the fixed duration can be larger. That is, the value of the fixed duration may be determined according to the user's settings in the game.
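Both ways of indexing the current shooting stage reduce to simple arithmetic; the following is a minimal sketch (stage numbering starts at 1, matching the worked examples above).

```python
import math

def stage_by_shots(shots_fired: int, shots_per_stage: int) -> int:
    """Stage index when each stage spans a fixed number of shots,
    e.g. 10 shots fired with 2 shots per stage -> 5th stage."""
    return max(1, math.ceil(shots_fired / shots_per_stage))

def stage_by_duration(start_time: float, now: float, stage_duration: float) -> int:
    """Stage index when each stage spans a fixed duration,
    e.g. a 1.0 s time difference with 0.1 s stages -> 10th stage."""
    return max(1, math.ceil((now - start_time) / stage_duration))
```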
In some embodiments, a game user or game developer may pre-configure, in a configuration interface, the shake configuration information corresponding to each of the different shooting stages of the virtual shooting prop during the shooting process, so that after the current shooting stage of the virtual shooting prop is determined, the shake configuration information corresponding to the current shooting stage can be obtained from the plurality of pieces of shake configuration information.
In step S103, shake data corresponding to the current shooting stage is acquired according to the shake configuration information.
In some embodiments, the jitter configuration information may include a configured animated squat style, where the animated squat style includes at least one of an overlay style, and a curve style, and the above-described obtaining jitter data corresponding to a current shooting stage according to the jitter configuration information may be implemented by obtaining jitter data corresponding to the current shooting stage according to the animated squat style.
For example, when the animated squat style includes an overlay (which may also be referred to as a break-in style, i.e., the change in squat force corresponding to each firing phase is not accumulated, but the change in squat force of the virtual shooting prop corresponding to each firing phase is used to obtain the shake data corresponding to the current firing phase), the above-described obtaining shake data corresponding to the current firing phase according to the animated squat style may be implemented by obtaining the change in squat force of the virtual shooting prop corresponding to the current firing phase, and determining shake data corresponding to the current firing phase according to the change in squat force, where the shake data includes displacement and rotation of the holding part of the virtual character relative to different reference directions.
For example, taking the displacement of the holding part (e.g., the hand) of the virtual character with respect to the X axis in the current shooting stage as an example, the change value of the recoil corresponding to the virtual shooting stage (e.g., the virtual light machine gun) is first obtained, then the obtained change value is decomposed to obtain a change value component corresponding to the X axis (when the change value component is positive, the positive direction corresponding to the X axis, i.e., the right direction; when the change value component is negative, the negative direction corresponding to the X axis, i.e., the left direction), and then the displacement of the hand position of the virtual character with respect to the X axis is determined according to the change value component corresponding to the X axis (e.g., 1% of the change value component corresponding to the X axis may be regarded as the displacement of the hand position of the virtual character with respect to the X axis).
In practical application, this percentage may be adjusted according to the specific situation and may be positively correlated with the recoil or lethality of the virtual shooting prop. For example, when the virtual shooting prop is a virtual heavy machine gun (large recoil), the corresponding shake is large, so the percentage may be larger (e.g., 3% of the X-axis component taken as the displacement); when it is a virtual light machine gun (small recoil), the shake is small, so the percentage may be smaller (e.g., 1% of the X-axis component taken as the displacement).
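A minimal Python sketch of this override-mode calculation follows; the scale factors and axis components are illustrative assumptions for this sketch, not values taken from the patent's configuration.

```python
# Hypothetical percentage mapping recoil change to displacement; the patent
# only requires it to grow with the prop's recoil (e.g. ~3% heavy, ~1% light).
RECOIL_SCALE = {"heavy_mg": 0.03, "light_mg": 0.01}

def override_shake(recoil_delta, axis_components, scale):
    """Override (interrupt) mode: only the current shooting stage's recoil
    change value is used; nothing is accumulated across stages."""
    # Decompose the change value onto the reference axes; a positive X
    # component points right, a negative one points left.
    return {axis: recoil_delta * c * scale for axis, c in axis_components.items()}

# E.g. a virtual light machine gun whose recoil change decomposes onto X/Y/Z:
shake = override_shake(120.0, {"x": 0.8, "y": 0.5, "z": 0.2},
                       RECOIL_SCALE["light_mg"])
```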
For example, when the animation recoil mode includes the additive mode (which may also be called a cumulative mode: the recoil change values of the virtual shooting prop for the successive shooting stages are accumulated, so the shake data corresponding to the current shooting stage is determined from the accumulated change values from the starting shooting stage up to the current shooting stage), the shake data corresponding to the current shooting stage may be obtained by acquiring the recoil change values corresponding to each shooting stage from the starting shooting stage to the current shooting stage, accumulating them, and determining the shake data from the accumulated result, where the shake data includes displacement and rotation of the holding part of the virtual character relative to different reference directions.
For example, taking the rotation of the hand of the virtual character around the X axis in the current shooting stage: first, the recoil change values of the virtual shooting prop (e.g., a virtual light machine gun) for each shooting stage from the starting shooting stage to the current shooting stage are obtained; these values are then accumulated, and the accumulated result is decomposed to obtain its component about the X axis; finally, the rotation of the hand around the X axis is determined from that component (for example, the rotation might be determined as 0.5°). That is, for the additive animation recoil mode, the rotation of the hand around the X axis is the sum of the rotation changes from the starting shooting stage to the current shooting stage.
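A corresponding additive-mode sketch in Python is shown below; the change values and the rotation scale are illustrative assumptions chosen so the result roughly matches the 0.5° of the example above.

```python
def additive_shake(recoil_deltas, axis_components, scale):
    """Additive (cumulative) mode: recoil change values from the starting
    shooting stage up to the current one are accumulated first."""
    total = sum(recoil_deltas)  # accumulation across stages
    return {axis: total * c * scale for axis, c in axis_components.items()}

# Illustrative change values for stages 1..3; 280 * 0.0018 ~= 0.5 degrees.
rotation = additive_shake([120.0, 90.0, 70.0], {"x": 1.0}, 0.0018)
print(rotation["x"])  # ~0.5 (degrees of rotation about the X axis)
```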
For example, when the animation recoil mode includes the curve mode (which covers both user- or developer-customized curve resources and fixed-type program curves provided by the system, e.g., trigonometric-function or decay-function curves), the shake data corresponding to the current shooting stage may be obtained either by determining a curve resource (i.e., a custom curve) from the correspondence between different shooting stages and offset ranges and deriving the shake data from it, or by deriving the shake data from a program curve (i.e., a fixed-type curve provided by the system), where the program curve includes at least one of a trigonometric-function-type program curve determined by a period, an amplitude, and an initial value, and a decay-function-type program curve determined by a decay base, a decay frequency, a fade-in time, and a fade-out time.
For example, the configuration interface for the shake configuration information provides a plurality of animation recoil modes for the game user or developer to select. When the curve mode is selected, a curve resource may further be configured (e.g., the horizontal axis of the curve being the successive shooting stages and the vertical axis being the offset range corresponding to each stage), or a curve type (e.g., sine or Perlin noise) together with a period, an amplitude, a random initial value, a decay base, a decay frequency, a fade-in time, a fade-out time, and so on may be filled in, thereby obtaining the curve resource or program curve used to determine the shake data for the current shooting stage.
For example, the shake data corresponding to the current shooting stage may be determined from the curve resource as follows: determine, according to the current shooting stage, the offset range corresponding to that stage in the curve resource; determine a corresponding offset value from the offset range; determine the time difference between the current moment and the shooting moment corresponding to the current shooting stage; determine, based on that time difference, the corresponding value on the interpolation curve scaled to the shooting interval time of the virtual shooting prop; and take the product of the offset value and that curve value as the shake data for the current shooting stage.
For example, taking the displacement of the hand of the virtual character relative to the X axis in the current shooting stage, and assuming the current shooting stage is the 3rd: first, the offset range corresponding to the 3rd shooting stage is read from the curve resource (say an offset upper limit of 1 and a lower limit of 0.2); a corresponding offset value is then determined from this range (e.g., randomly sampled from it, say 0.5); next, the time difference between the current moment and the shooting moment of the 3rd stage is determined (say 0.05 seconds). The interpolation curve is scaled to the shooting interval time of the virtual shooting prop (e.g., with a shooting interval of 0.1 seconds and an interpolation-curve abscissa interval of 0 to 1, 0 seconds corresponds to the curve's ordinate at abscissa 0, and 0.1 seconds to its ordinate at abscissa 1), so the time difference of 0.05 seconds corresponds to abscissa 0.5; the ordinate of the interpolation curve at abscissa 0.5 is read, and the product of the offset value 0.5 and that ordinate is taken as the displacement of the hand of the virtual character relative to the X axis.
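The following Python sketch follows this worked example; the shape of `offset_ranges` and the identity interpolation curve are assumptions for illustration, not the patent's asset format.

```python
import random

def curve_resource_shake(offset_ranges, interp_curve, stage, dt, fire_interval):
    """Curve-resource mode: offset_ranges maps a shooting stage to its
    (lower, upper) offset range; interp_curve maps an abscissa in [0, 1]
    to an ordinate."""
    lower, upper = offset_ranges[stage]        # e.g. (0.2, 1.0) for stage 3
    offset = random.uniform(lower, upper)      # e.g. 0.5
    # Scale the interpolation curve to the fire interval:
    # 0 s -> abscissa 0, fire_interval s -> abscissa 1.
    alpha = min(dt / fire_interval, 1.0)       # 0.05 s / 0.1 s -> 0.5
    return offset * interp_curve(alpha)

offset_ranges = {1: (0.1, 0.4), 2: (0.15, 0.7), 3: (0.2, 1.0)}  # illustrative
shake_x = curve_resource_shake(offset_ranges, lambda a: a, 3, 0.05, 0.1)
```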
For example, in connection with the above, the determination of shake data corresponding to the current shooting stage from program curves may be implemented by performing, for each reference direction corresponding to the holding part (e.g., the hand) of the virtual character, the following processing: acquiring a first program curve corresponding to displacement and taking the function value of the first program curve at the current moment as the displacement in that reference direction; and acquiring a second program curve corresponding to rotation and taking the function value of the second program curve at the current moment as the rotation in that reference direction.
For example, taking the X axis as the reference direction, the displacement of the hand of the virtual character relative to the X axis may be obtained by acquiring a first program curve corresponding to displacement (e.g., a trigonometric-function-type program curve whose abscissa represents time and ordinate represents displacement) and taking its function value at the current moment as the X-axis displacement. Similarly, the rotation of the hand around the X axis may be obtained by acquiring a second program curve corresponding to rotation (e.g., a decay-function-type program curve whose abscissa represents time and ordinate represents rotation) and taking its function value at the current moment as the rotation around the X axis.
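A sketch of the two program-curve types in Python follows; the exact functional forms and all parameter values are assumptions, since the patent only specifies the parameters (period, amplitude, initial value, decay base, decay frequency) and not the formulas.

```python
import math

def sine_curve(t, period=1.0, amplitude=1.0, initial=0.0):
    """Trigonometric-type program curve: one full period per `period` s."""
    return initial + amplitude * math.sin(2.0 * math.pi * t / period)

def decay_curve(t, base=0.5, frequency=8.0, amplitude=1.0):
    """Decay-type program curve: an oscillation damped by base**(freq * t)."""
    return amplitude * (base ** (frequency * t)) * math.cos(
        2.0 * math.pi * frequency * t)

t = 0.05                                                      # s since the shot
dx = sine_curve(t, period=0.2, amplitude=0.8)                 # X displacement
rx = decay_curve(t, base=0.5, frequency=10.0, amplitude=2.0)  # X rotation
```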
It should be noted that, in practical application, the first program curve and the second program curve may be of the same type; for example, both may be trigonometric-function-type program curves, or both decay-function-type program curves. When both are trigonometric-function-type curves, parameters such as their period, amplitude, and initial value may still differ.
In addition, it should be noted that, in practical application, when a curve resource and a program curve are both configured in the shake configuration information corresponding to the current shooting stage, the following processing may be performed for each reference direction corresponding to the holding part of the virtual character after the shake data is determined: the displacement determined from the curve resource and the displacement determined from the program curve in that reference direction are added to obtain a displacement sum, and the corresponding rotations are added to obtain a rotation sum; the shake data corresponding to the current shooting stage is then updated with the displacement sum and the rotation sum, and the integrated shake data is superimposed on the position of the holding part of the virtual character. In this way, the shake effect presented while the user controls the virtual character to shoot is smoother and more natural, further improving the user's shooting experience.
For example, in determining the displacement of the holding part of the virtual character relative to the X axis in the current shooting stage, when the curve resource and the program curve are both configured in the shake configuration information, assuming the displacement determined from the curve resource is a and the displacement determined from the program curve is b, the sum c = a + b of the two displacements may be taken as the final displacement of the holding part relative to the X axis and superimposed on the X-axis position component of the holding part.
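A short sketch of this combination step (Python, with placeholder values for a and b):

```python
def combined_shake(curve_data, program_data):
    """Sum per-axis values from the curve resource and the program curve."""
    axes = set(curve_data) | set(program_data)
    return {a: curve_data.get(a, 0.0) + program_data.get(a, 0.0) for a in axes}

# Displacement along X: a from the curve resource, b from the program curve,
# final value c = a + b (0.3 and 0.2 are placeholders).
c = combined_shake({"x": 0.3}, {"x": 0.2})["x"]   # 0.5
```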
In some embodiments, step S103 shown in fig. 4 may be implemented by steps S1031A to S1032A shown in fig. 5, and will be described in connection with the steps shown in fig. 5.
In step S1031A, the type of virtual shooting prop is acquired.
In some embodiments, since different real firearms shake differently when fired, in order to make the shooting performance of the virtual shooting prop better match that of real firearms, the shake data corresponding to the current shooting stage may also be updated according to the type of the virtual shooting prop, where the type may include a virtual heavy machine gun, a virtual light machine gun, a virtual shotgun, a virtual sniper rifle, and so on.
In step S1032A, a first adjustment coefficient corresponding to the virtual shooting prop of this type is determined, and the product of the first adjustment coefficient and the shake data corresponding to the current shooting stage is used as the updated shake data.
In some embodiments, the shake configuration information may further include a correspondence between different types of virtual shooting props and adjustment coefficients. After the type of the virtual shooting prop is acquired, the first adjustment coefficient corresponding to that type may be determined from this correspondence, and the product of the first adjustment coefficient and the shake data corresponding to the current shooting stage may be used as the updated shake data, where the first adjustment coefficient may be positively correlated with the recoil or lethality of that type of virtual shooting prop.
For example, if the acquired type of the virtual shooting prop is a virtual heavy machine gun, the adjustment coefficient corresponding to the virtual heavy machine gun may be obtained from the correspondence between prop types and adjustment coefficients included in the shake configuration information (say the obtained coefficient is a), and the product of a and the shake data corresponding to the current shooting stage is used as the updated shake data (e.g., if the displacement of the holding part relative to the X axis included in the shake data is b, the updated displacement is a×b). Updating the shake data with a type-specific adjustment coefficient makes the shooting performance of different virtual shooting props differ, so that the shooting behavior of different real firearms can be simulated, further improving the user's visual experience when controlling the virtual character to shoot.
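A minimal sketch of this per-type adjustment follows; the coefficient values are hypothetical, since the patent only requires positive correlation with the prop type's recoil or lethality.

```python
# Hypothetical first adjustment coefficients per prop type.
PROP_COEFF = {"heavy_mg": 1.5, "light_mg": 1.0, "shotgun": 1.3, "sniper": 1.4}

def apply_prop_coefficient(shake, prop_type):
    """Scale each displacement/rotation by the first adjustment coefficient."""
    a = PROP_COEFF[prop_type]
    return {axis: a * v for axis, v in shake.items()}

updated = apply_prop_coefficient({"x": 0.2, "y": 0.1}, "heavy_mg")
```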
In other embodiments, step S103 shown in fig. 4 may also be implemented by steps S1031B to S1032B shown in fig. 6, and will be described in connection with the steps shown in fig. 6.
In step S1031B, a shooting mode corresponding to the virtual character in the current shooting phase is acquired.
In some embodiments, since a real shooter experiences different shake in different shooting modes (such as hip fire and aiming through a scope), in order to make the shooting performance of the virtual shooting prop better match these real-life differences, an adjustment coefficient may be determined according to the shooting mode of the virtual character in the current shooting stage, and the shake data corresponding to the current shooting stage may be updated with the determined coefficient.
In step S1032B, a second adjustment coefficient corresponding to the shooting mode is determined, and the product of the second adjustment coefficient and the shake data corresponding to the current shooting stage is used as updated shake data.
In some embodiments, the shake configuration information may further include a correspondence between different shooting modes and adjustment coefficients. After the shooting mode of the virtual character in the current shooting stage is acquired, the second adjustment coefficient corresponding to that mode may be obtained from this correspondence, and the product of the second adjustment coefficient and the shake data corresponding to the current shooting stage may be used as the updated shake data, where the second adjustment coefficient may be inversely correlated with the accuracy of the shooting mode.
For example, assuming the shooting mode of the virtual character in the current shooting stage is hip fire (i.e., shooting without aiming through the scope, with correspondingly low accuracy), the adjustment coefficient corresponding to hip fire may be obtained from the correspondence between shooting modes and adjustment coefficients included in the shake configuration information (say the obtained coefficient is c), and the product of c and the shake data corresponding to the current shooting stage is used as the updated shake data (e.g., if the displacement of the holding part relative to the X axis included in the shake data is d, the updated displacement is c×d). Adjusting the shake data with a mode-specific coefficient makes the shooting performance of different shooting modes differ, so as to simulate the shake of a real shooter in different shooting modes, further improving the user's visual experience when controlling the virtual character to shoot.
It should be noted that in practical application, the shake data may be adjusted by combining the first adjustment coefficient corresponding to the type of the virtual shooting prop with the second adjustment coefficient corresponding to the current shooting mode of the virtual character, so that, by jointly considering the prop type and the shooting mode, the presented shooting performance better matches that of real firearms, improving the user's visual experience.
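The combined adjustment can be sketched as below; both coefficient tables are hypothetical illustrations (the patent only fixes their correlations, not their values).

```python
PROP_COEFF = {"heavy_mg": 1.5, "light_mg": 1.0}          # as sketched above
MODE_COEFF = {"hip_fire": 1.2, "aim_down_sights": 0.6}   # accuracy inverse

def apply_both_coefficients(shake, prop_type, mode):
    """Combine the prop-type coefficient with the shooting-mode coefficient."""
    k = PROP_COEFF[prop_type] * MODE_COEFF[mode]
    return {axis: k * v for axis, v in shake.items()}

updated = apply_both_coefficients({"x": 0.2}, "light_mg", "hip_fire")
```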
In step S104, in the current shooting stage, the holding portion is controlled to drive the virtual shooting prop to perform corresponding shake based on shake data corresponding to the current shooting stage.
In some embodiments, controlling the holding part to drive the virtual shooting prop to shake correspondingly based on the shake data of the current shooting stage may be implemented as follows: whenever a frame of the virtual scene is updated and displayed during the current shooting stage, the displacement and rotation for each reference direction included in that frame's shake data are superimposed on the position components of the holding part of the virtual character in the corresponding reference directions, where these displacements and rotations cause the holding part to drive the virtual shooting prop to shake accordingly.
Taking a game as the virtual scene as an example, when each frame of the game is updated and displayed in the current shooting stage, the following processing may be performed: the position of the hand of the virtual character is first decomposed into position components in the different reference directions (for example, into components along the X, Y, and Z axes); the displacement and rotation for each reference direction included in the frame's shake data are then superimposed on the corresponding components, i.e., the X-axis displacement and rotation onto the X-axis component, the Y-axis displacement and rotation onto the Y-axis component, and the Z-axis displacement and rotation onto the Z-axis component. Superimposing the shake data on the hand position in this way makes the hand skeleton of the virtual character shake, which in turn drives the held virtual shooting prop to shake, simulating real-life shooting behavior and improving the user's experience when controlling the virtual character to shoot.
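A per-frame sketch of this superposition in Python; the dictionary-based transform is an assumed representation, not an engine type.

```python
def apply_shake_to_hand(hand_transform, frame_shake):
    """Per-frame update: superimpose the frame's shake displacement and
    rotation onto the hand bone's components for each reference axis."""
    for axis in ("x", "y", "z"):
        hand_transform["loc"][axis] += frame_shake["loc"].get(axis, 0.0)
        hand_transform["rot"][axis] += frame_shake["rot"].get(axis, 0.0)
    return hand_transform  # the held prop follows the hand bone

hand = {"loc": {"x": 0.0, "y": 0.0, "z": 0.0},
        "rot": {"x": 0.0, "y": 0.0, "z": 0.0}}
frame_shake = {"loc": {"x": 0.12}, "rot": {"x": 0.5}}
hand = apply_shake_to_hand(hand, frame_shake)
```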
In other embodiments, after the holding part is controlled to drive the virtual shooting prop to shake based on the shake data of the current shooting stage, the real-time shake direction of the virtual shooting prop in the current shooting stage may be acquired, a bullet drop point offset in synchrony with that shake direction may be determined, and the virtual bullet fired by the virtual shooting prop may be controlled to hit the determined drop point, achieving the effect that the drop point of the fired bullet stays consistent with the real-time shake direction of the virtual shooting prop.
Taking a game as the virtual scene as an example, after the hand of the virtual character is controlled to drive the virtual weapon to shake based on the shake data of the current shooting stage, the real-time shake direction of the virtual weapon in the current shooting stage may be acquired (for example, the virtual weapon shakes to the right), and a synchronously offset drop point may then be determined from that direction (when the weapon shakes to the right, the drop point of the virtual bullet shifts to the right as well), so as to control the virtual bullet fired by the weapon to hit the determined drop point.
For example, assume the drop point of the virtual shooting prop is A when no shake occurs. When the hand of the virtual character drives the prop to shake to the right, the virtual bullet fired at that moment falls on a point B located some distance to the right of A, where the distance between A and B is positively correlated with the degree of shake of the prop: the greater the shake, the farther B deviates from the no-shake drop point A. By keeping the drop point consistent with the shake direction in this way, the fidelity and operability of shooting can be further improved, giving the user a more realistic feel.
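A sketch of the drop-point offset; the proportionality constant k is an assumption standing in for the positive correlation described above.

```python
def offset_drop_point(base_point, shake_dir, shake_magnitude, k=1.0):
    """Shift the no-shake drop point A along the prop's real-time shake
    direction; the A->B distance grows with the shake magnitude."""
    return tuple(p + k * shake_magnitude * d
                 for p, d in zip(base_point, shake_dir))

# Prop shakes right -> drop point B lands right of the no-shake point A.
B = offset_drop_point((10.0, 0.0, 2.0), (1.0, 0.0, 0.0), 0.35)
```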
With the shooting control method for a virtual character provided by the embodiments of the application, shake data is obtained separately for each shooting stage, and in each stage the holding part of the virtual character is controlled to drive the virtual shooting prop to shake based on that stage's shake data. The shooting performance therefore differs across shooting stages, instead of repeatedly playing the same animation resource as in the related art, which markedly improves the visual performance of the virtual character when shooting and provides the user with a realistic shooting experience.
In the following, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
In shooting games, the related art usually has an artist produce a single animation resource for the character's shooting (also called firing) action. Because there is only this one shooting animation resource, the shooting action appears monotonous throughout the shooting process and is inconsistent with the recoil shake a real person experiences when firing a real firearm, resulting in unrealistic shooting performance and easily causing visual fatigue for the user.
For example, referring to fig. 7, fig. 7 is a schematic diagram of the shooting actions of a first-person character in different shooting stages in the related art; from top to bottom, fig. 7 shows the character not shooting, at the first shot, and at the second shot. As shown in fig. 7, since there is only one shooting animation resource, the action of the first shot and the action of the second shot are identical; that is, the same animation resource is replayed every time the user controls the virtual character to shoot, and this simple, monotonous presentation easily causes visual fatigue.
To address these technical problems, an embodiment of the application provides a shooting control method for a virtual character that offers game users or developers a configuration system: at runtime, hand shake data for the character's shooting action is generated in real time from the parameter configuration (such as trigonometric functions, decay functions, and shake curves) and added to the current position of the virtual character's hand, so that the hand skeleton shakes and in turn drives the virtual weapon to shake with it, achieving the effect of simulating real-weapon firing. At the same time, separate shake configurations may be provided for different weapons and for different shooting modes (e.g., hip fire and aiming through a scope).
For example, referring to fig. 8, fig. 8 is a schematic diagram of the shooting performance of a first-person character in different shooting stages according to an embodiment of the application. As shown in fig. 8, the game displays a virtual character 801 and a virtual weapon 802 held in the hand of the virtual character 801. From top to bottom, fig. 8 shows the first-person view when not shooting (the distance from the virtual weapon 802 to the top of the screen is A, and to the right of the screen is B), at the moment of maximum hand pull-back of the first shot (distances C and D, respectively), at the moment of maximum hand pull-back of the second shot (distances E and F), and at the moment of maximum hand pull-back of the third shot (distances G and H), where the values of A, C, E, G all differ from one another, as do the values of B, D, F, H.
For example, referring to fig. 9, fig. 9 is a superimposed view of the moments of maximum hand pull-back during continuous first-person shooting according to an embodiment of the application. As shown in fig. 9, the game displays a virtual character 901 and a virtual weapon 902 held in the hand of the virtual character 901; fig. 9 also shows the different positions and rotations of the virtual weapon 902 in different shooting stages (for example, the position and rotation 903 of the virtual weapon 902 at the first shot and the position and rotation 904 at the second shot). Taken together, figs. 8 and 9 show that the position and deflection of the virtual character's hand and the virtual weapon differ at the corresponding moment of each shot (the ghost images are rendered with reduced opacity); that is, when the character starts shooting, the program generates displacement and rotation according to the weapon's shooting configuration, producing a shake effect so that each shot looks different.
For example, referring to fig. 10, fig. 10 is a schematic diagram of the shooting performance of a first-person character in different shooting stages while in the aiming state. As shown in fig. 10, the game displays a virtual character 1001 and a virtual weapon 1002 held in the hand of the virtual character 1001. From top to bottom, fig. 10 shows the first-person aiming state when not shooting (the distance from the virtual weapon 1002 to the bottom of the screen is I, and to the right of the screen is J), at the moment of maximum hand pull-back of the first shot in the aiming state (distances K and L), at the moment of maximum hand pull-back of the second shot (distances M and N), and at the moment of maximum hand pull-back of the third shot (distances O and P), where the values of I, K, M, O all differ from one another, as do the values of J, L, N, P.
For example, referring to fig. 11, fig. 11 is a superimposed view of the moments of maximum hand pull-back during continuous shooting after the first-person character enters the aiming state. As shown in fig. 11, the game displays a virtual character 1101 and a virtual weapon 1102 held in the hand of the virtual character 1101; fig. 11 also shows the different positions and rotations of the virtual weapon 1102 in different shooting stages (for example, the position and rotation 1103 of the virtual weapon 1102 at the first shot and the position and rotation 1104 at the second shot). Taken together, figs. 10 and 11 show that the position and deflection of the hand and the virtual weapon differ at the corresponding moment of each shot (the ghost images are rendered with reduced opacity); that is, when the character starts shooting, the program generates displacement and rotation according to the weapon's shooting configuration, producing a shake effect so that each shot looks different. Moreover, comparing figs. 9 and 11 shows that the shake effect in the aiming mode differs from that in the non-aiming mode.
The method for controlling shooting of the virtual character provided by the embodiment of the application is specifically described below.
For example, referring to fig. 12, fig. 12 is a flowchart of a shooting control method for a virtual character according to an embodiment of the application. As shown in fig. 12, the method mainly includes five steps: receiving a shooting instruction input by the user, updating the shooting logic, generating hand animation shake data according to the weapon's shooting configuration, having the animation update logic obtain the shake data, and applying the shake data in the animation skeleton calculation. Each step is described below.
(I) Receiving a shooting instruction input by the user
In some embodiments, when the user (or player) presses the fire button displayed on the screen, this corresponds to the user inputting a shooting instruction; the virtual weapon (corresponding to the virtual shooting prop) receives the instruction and starts firing, shooting at fixed intervals under timer-like logic.
(II) Updating the shooting logic
In some embodiments, when updating the shooting logic, the weapon module also accumulates the current shot count under timer-like logic based on the shooting interval time. For example, assuming the shooting interval of the virtual weapon is 0.1 seconds and 1 second has elapsed since the user's instruction to start firing was received, the current shot is the 10th.
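The shot-count bookkeeping reduces to a one-line computation, sketched here in Python:

```python
def current_shot_number(elapsed_seconds, fire_interval):
    """Timer-like firing logic: number of completed firing ticks since the
    fire instruction was received."""
    return round(elapsed_seconds / fire_interval)

assert current_shot_number(1.0, 0.1) == 10  # the example above: 10th shot
```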
(III) Generating hand animation shake data according to the weapon shooting configuration
In some embodiments, after the current shot count is determined, the shake data corresponding to the current shot (e.g., hand shake data) may be obtained from the weapon shooting configuration shown in fig. 13 according to the current shot count.
The meaning of the various parameters included in the weapon firing configuration shown in FIG. 13 is described below.
The animation recoil mode (Weap Anim Recoil Type) provides three different types: the override mode (Override), also called the interrupt mode, which does not accumulate the weapon's recoil across shots but processes the animation using the per-frame change value of the weapon's recoil; the additive mode (Additive), also called the cumulative mode, which accumulates the per-frame recoil changes, amounting to the sum of the rotation changes from the start of shooting to the current moment; and the curve mode (Wave), of which two variants are provided: configuring a curve resource, or filling in a curve type (sine (Sine) or Perlin noise), period, amplitude, random initial value, decay base, decay frequency, fade-in time, fade-out time, and so on.
Animation offset types (Anim Offset Type) are divided into rotation (Rot) and displacement (Loc), depending on whether the current sub-configuration is to process rotation or displacement.
The axis (Axis) specifies which of the X, Y, Z axes is processed.
The curve resource (Weapon Anim Recoil Curve) is a three-in-one X, Y, Z curve asset; the Z curve is unused, and only the X and Y curves are used, where X is the lower limit of the random value, Y is the upper limit, and the horizontal axis is the shot count.
The initial offset type (Initial Offset Type) is classified into no offset (Zero) and Random offset (Random), and only takes effect on the program curve.
Duration (Duration): can be set flexibly by the game user or developer; for example, the duration can be set to 5 seconds.
Program curve type (Waveform) including trigonometric functions (e.g., sine) or decay functions (e.g., perlin Noise).
Amplitude (Wave Amplitude) may be filled with positive or negative values.
Period (Wave Period): for a sine function (Sine), 1 second can denote one complete sine period, i.e., 1 second corresponds to 2π, and a half period of 0.5 seconds corresponds to π.
The Base Value of the exponent (Power Base Value) is the Base for the decay function.
Exponential update frequency (Power Frequency): controls the speed of decay.
The fade In Time (Blend In Time) is flexibly set by the game user or developer, for example, the fade In Time can be set to 0.5 seconds.
A fade-in process curve (Blend in Progress Curve) is a curve set based on the time of the fade-in for representing the fade-in process.
The fade Out Time (Blend Out Time) is flexibly set by the game user or developer, for example, the fade Out Time can be set to 0.4 seconds.
A fade-out process curve (Blend Out Progress Curve) is a curve set based on the time of the fade-out for representing the fade-out process.
Scaling curve over different fields of view (Curve FOV Weight): X is the field of view (FOV), and Y is the scale (Weight).
Interpolation speed (Interp Speed): used for the Additive and Override types.
Interpolation curve (Curve Interp): used for curve-type changes; it can configure the speed, start, and end of interpolation, with the full length of the interpolation curve scaled to the shooting interval time. For example, assuming the interpolation curve's X interval is 0-1 and the shooting interval is 0.1 seconds, 0 seconds corresponds to the curve's value at X = 0 and 0.1 seconds corresponds to its value at X = 1.
For example, about 6 shake slots may be configured for a virtual weapon to compute, from the configuration parameters of each slot, the displacement of the virtual character's hand skeleton along the X, Y, Z axes and its three rotations around the X, Y, Z axes. For instance, an offset value is obtained from the curve resource (horizontal axis: shot count; vertical axis: offset value), and once obtained, the offset value is smoothly interpolated according to the interpolation curve using the time difference between the current moment and the shooting moment. This is the full calculation process for one slot; after all slots have been computed, the complete shake data for the shot (displacements along the three axes and rotations around the three axes) is obtained.
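The slot structure and per-slot calculation can be sketched as follows; the field names loosely mirror the configuration table above, but the types and the dictionary-based curve asset are assumptions, not the engine's actual data structures.

```python
import random
from dataclasses import dataclass

@dataclass
class RecoilSlot:
    """One of the ~6 shake slots (3 displacement axes + 3 rotation axes)."""
    offset_type: str     # Anim Offset Type: "Loc" or "Rot"
    axis: str            # Axis: "X", "Y" or "Z"
    offset_ranges: dict  # shot number -> (lower, upper), from the curve asset

def evaluate_slot(slot, shot_number, dt, fire_interval, interp_curve):
    """Per-slot calculation: read the offset range for the current shot
    number, pick a random value inside it, then smooth it with the
    interpolation curve scaled to the fire interval."""
    lower, upper = slot.offset_ranges[shot_number]
    offset = random.uniform(lower, upper)
    alpha = min(dt / fire_interval, 1.0)
    return offset * interp_curve(alpha)

def evaluate_all_slots(slots, shot_number, dt, fire_interval, interp_curve):
    # Full shake data: three displacements (Loc) and three rotations (Rot).
    return {(s.offset_type, s.axis):
            evaluate_slot(s, shot_number, dt, fire_interval, interp_curve)
            for s in slots}
```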
In addition, it should be noted that, because different real firearms shake differently when fired, the weapon configuration in the embodiments of the application can be set independently per virtual weapon. When the shooting performance is presented, the shake of a virtual light machine gun is therefore small while that of a virtual heavy machine gun is large, which better matches the recoil shake of real firearms and further improves the visual performance when the user controls the virtual character to shoot.
(IV) The animation update logic obtains the shake data
In some embodiments, the animation module updates the animation of each game frame under timer-like logic. On update, the program reads the shake data generated in the weapon module and saves it to the animation module for use in the subsequent calculation of bone position and rotation.
(V) Applying the shake data in the animation skeleton calculation
In some embodiments, after updating, the animation module calculates the position and rotation data of the virtual character's skeleton; when the hand skeleton is calculated, the hand animation shake data saved in the animation module is read and superimposed on the current position of the hand skeleton, so that the hand skeleton shakes and in turn drives the virtual weapon to shake with it.
In other embodiments, to further improve the fidelity and operability of shooting, the drop point of the virtual bullet fired by the virtual weapon may be kept consistent with the weapon's current shake direction: when the shooting moment arrives, the virtual weapon is controlled to fire, and the drop point of the fired bullet is shifted in the weapon's current shake direction. For example, when the weapon's shake direction is to the left, the drop point shifts left correspondingly: if the drop point is A when the virtual weapon does not shake, then when the hand of the virtual character drives the weapon to shake to the left, the drop point of the fired bullet is a point B located some distance to the left of A, where the distance between A and B is positively correlated with the degree of shake, i.e., the greater the shake, the farther B deviates from A. In this way, the drop point matches the presented shake during shooting, giving the user a more realistic shooting experience.
With the shooting control method for a virtual character provided by the embodiments of the application, hand shake data for the character's shooting action is generated in real time from the parameter configuration while the game runs, and the generated shake data is superimposed on the current hand skeleton of the virtual character. This achieves the effect of simulating real-weapon firing, markedly improves the visual performance of the character when shooting, and provides the user with a realistic shooting experience and better feel.
Continuing with the description of an exemplary structure of the shooting control apparatus 465 for a virtual character provided by an embodiment of the application implemented as software modules: in some embodiments, as shown in fig. 3, the software modules stored in the shooting control apparatus 465 of the memory 460 may include a display module 4651, an acquisition module 4652, and a control module 4653.
The display module 4651 is configured to display, in a virtual scene, a virtual character and a virtual shooting prop held by its holding part; the acquisition module 4652 is configured to acquire, in response to a shooting trigger operation based on the virtual shooting prop, shake configuration information corresponding to the current shooting stage of the virtual shooting prop, and to acquire shake data corresponding to the current shooting stage according to the shake configuration information; and the control module 4653 is configured to control, in the current shooting stage, the holding part to drive the virtual shooting prop to shake correspondingly based on the shake data of the current shooting stage.
In some embodiments, the acquisition module 4652 is further configured to acquire the number of shots fired from the starting shooting moment to the current moment; the shooting control apparatus 465 further includes a determination module 4654 configured to determine the shooting stage corresponding to that number of shots and use it as the current shooting stage of the virtual shooting prop, where each shooting stage includes a fixed number of shots; the determination module 4654 is further configured to determine the time difference between the current moment and the starting shooting moment, determine the shooting stage corresponding to that time difference, and use it as the current shooting stage of the virtual shooting prop, where each shooting stage includes a fixed duration.
In some embodiments, the acquisition module 4652 is further configured to acquire shake data corresponding to the current shooting stage according to the animation recoil mode.
In some embodiments, the acquisition module 4652 is further configured to acquire the change value of the recoil of the virtual shooting prop in the current shooting stage, and the determination module 4654 is further configured to determine the shake data corresponding to the current shooting stage according to that change value, where the shake data includes displacement and rotation of the holding part relative to different reference directions.
In some embodiments, the acquisition module 4652 is further configured to acquire the change values of the recoil of the virtual shooting prop corresponding to each shooting stage from the starting shooting stage to the current shooting stage, and the determination module 4654 is further configured to accumulate the plurality of change values and determine the shake data corresponding to the current shooting stage according to the accumulated result, where the shake data includes displacement and rotation of the holding part relative to different reference directions.
In some embodiments, the determination module 4654 is further configured to determine a curve resource according to the correspondence between different shooting stages and offset ranges and determine shake data corresponding to the current shooting stage according to the curve resource, or to determine shake data corresponding to the current shooting stage according to a program curve, where the program curve includes at least one of a trigonometric-function-type program curve determined by a period, an amplitude, and an initial value, and a decay-function-type program curve determined by a decay base, a decay frequency, a fade-in time, and a fade-out time.
In some embodiments, the determination module 4654 is further configured to determine the offset range corresponding to the current shooting stage in the curve resource according to the current shooting stage, determine a corresponding offset value from the offset range, determine the time difference between the current moment and the shooting moment corresponding to the current shooting stage, determine, based on that time difference, the corresponding value on the interpolation curve scaled to the shooting interval time of the virtual shooting prop, and take the product of the offset value and that value as the shake data corresponding to the current shooting stage.
In some embodiments, the determination module 4654 is further configured to perform, for each reference direction corresponding to the holding part, the following processing: acquiring a first program curve corresponding to displacement and taking its function value at the current moment as the displacement in that reference direction; and acquiring a second program curve corresponding to rotation and taking its function value at the current moment as the rotation in that reference direction.
In some embodiments, the determination module 4654 is further configured to perform, for each reference direction of the holding part, the following processing: adding the displacement in that reference direction included in the shake data determined from the curve resource to the corresponding displacement included in the shake data determined from the program curve to obtain a displacement sum; adding the corresponding rotations to obtain a rotation sum; and updating the shake data corresponding to the current shooting stage based on the displacement sum and the rotation sum.
In some embodiments, the acquisition module 4652 is further configured to acquire a type of virtual shooting prop, and the determination module 4654 is further configured to determine a first adjustment coefficient corresponding to the type of virtual shooting prop, and take a product of the first adjustment coefficient and shake data corresponding to a current shooting stage as updated shake data, wherein the first adjustment coefficient is positively correlated with recoil or killing power of the type of virtual shooting prop.
In some embodiments, the acquisition module 4652 is further configured to acquire the shooting mode corresponding to the virtual character in the current shooting stage, and the determination module 4654 is further configured to determine a second adjustment coefficient corresponding to that shooting mode and take the product of the second adjustment coefficient and the shake data corresponding to the current shooting stage as the updated shake data, where the accuracy of the shooting mode is inversely related to the second adjustment coefficient.
In some embodiments, the control module 4653 is further configured to, when updating each frame of image of the virtual scene in the current shooting stage, perform a process of respectively superimposing displacements and rotations of different reference directions included in the shake data of each frame of image on a position component of the holding portion corresponding to the reference direction, where the displacements and rotations of the different reference directions are used to make the holding portion drive the virtual shooting prop to perform corresponding shake.
In some embodiments, the acquisition module 4652 is further configured to acquire the real-time shake direction of the virtual shooting prop in the current shooting stage, the determination module 4654 is further configured to determine a drop point synchronously offset with the real-time shake direction, and the control module 4653 is further configured to control the virtual bullet fired by the virtual shooting prop to hit the drop point.
It should be noted that, in the embodiment of the present application, the description of the device is similar to the implementation of the shooting control method of the virtual character, and has similar beneficial effects, so that the description is omitted. The technical details of the shooting control apparatus for virtual characters according to the embodiment of the present application, which are not described in detail, can be understood from the description of any one of fig. 4 to 6, or fig. 12.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the shooting control method of the virtual character according to the embodiment of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions that, when executed by a processor, cause the processor to perform a method provided by embodiments of the present application, for example, a shooting control method of a virtual character as shown in fig. 4 to 6, or fig. 12.
In some embodiments, the computer readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM, or various devices including one or any combination of the above.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, such as in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or distributed across multiple sites and interconnected by a communication network.
In summary, according to the embodiment of the application, for different shooting stages, the shake data corresponding to each shooting stage is obtained respectively, and in each shooting stage, the holding part of the virtual character is controlled to drive the virtual shooting prop to shake correspondingly based on the shake data corresponding to the shooting stage, so that the shooting performance of each shooting stage can be different from that of a simple single animation, thereby remarkably improving the visual performance of the virtual character when shooting and providing a real shooting experience for users.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and scope of the present application are included in the protection scope of the present application.
Claims (15)
1. A method of controlling shooting of a virtual character, the method comprising:
displaying a virtual character in a virtual scene and a virtual shooting prop held by the virtual character through a holding part;
In response to a shooting trigger operation based on the virtual shooting prop, obtaining shake configuration information corresponding to a current shooting stage of the virtual shooting prop, and obtaining shake data corresponding to the current shooting stage according to the shake configuration information; and
In the current shooting stage, based on the shake data corresponding to the current shooting stage, controlling the holding part to drive the virtual shooting prop to shake correspondingly;
When the shake configuration information includes a correspondence between different shooting modes and adjustment coefficients, the obtaining of shake data corresponding to the current shooting stage according to the shake configuration information includes:
determining a second adjustment coefficient corresponding to the shooting mode, and taking the product of the second adjustment coefficient and the shake data corresponding to the current shooting stage as the updated shake data, wherein the accuracy of the shooting mode is inversely related to the second adjustment coefficient.
2. The method of claim 1, wherein, before the obtaining of shake configuration information corresponding to the current shooting stage of the virtual shooting prop, the method further comprises:
acquiring the number of times of shooting corresponding to the time from the starting shooting time to the current time, determining shooting stages corresponding to the number of times of shooting, and taking the shooting stages as the current shooting stages of the virtual shooting prop, wherein each shooting stage comprises fixed shooting times, or
And determining a time difference between the current moment and the moment of starting shooting, determining a shooting stage corresponding to the time difference, and taking the shooting stage as the current shooting stage of the virtual shooting prop, wherein each shooting stage comprises a fixed duration.
3. The method of claim 1, wherein,
When the shake configuration information includes a configured animation recoil mode, the obtaining of shake data corresponding to the current shooting stage according to the shake configuration information includes:
obtaining shake data corresponding to the current shooting stage according to the animation recoil mode;
wherein the animation recoil mode includes at least one of an override mode, an additive mode, and a curve mode.
4. The method of claim 3, wherein, when the animation jitter mode comprises the overlay mode, the obtaining jitter data corresponding to the current shooting stage according to the animation jitter mode comprises:
acquiring a change value of the recoil corresponding to the virtual shooting prop in the current shooting stage; and
determining jitter data corresponding to the current shooting stage according to the change value, wherein the jitter data comprises displacement and rotation of the holding part relative to different reference directions.
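An illustrative reading of the overlay mode in claim 4: the current stage's recoil change value alone is mapped to displacement and rotation. The axis mapping and scale factors below are assumptions.

```python
def overlay_jitter(recoil_delta: float):
    # Only the current stage's recoil change value is used (no history).
    displacement = (0.0, 0.1 * recoil_delta, 0.0)  # e.g. muzzle climb along Y
    rotation = (0.5 * recoil_delta, 0.0, 0.0)      # e.g. pitch about X
    return displacement, rotation

print(overlay_jitter(1.5))
```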
5. The method of claim 3, wherein, when the animation jitter mode comprises the superposition mode, the obtaining jitter data corresponding to the current shooting stage according to the animation jitter mode comprises:
obtaining change values of the recoil respectively corresponding to the virtual shooting prop in each shooting stage from the initial shooting stage to the current shooting stage; and
accumulating the plurality of change values, and determining jitter data corresponding to the current shooting stage according to the accumulation result, wherein the jitter data comprises displacement and rotation of the holding part relative to different reference directions.
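By contrast, a sketch of the superposition mode in claim 5: the recoil change values of all stages so far are accumulated, so later stages jitter harder. Values and scaling are invented.

```python
def superposed_jitter(recoil_deltas):
    # Accumulate change values from the initial stage through the current one.
    total = sum(recoil_deltas)
    displacement = (0.0, 0.1 * total, 0.0)
    rotation = (0.5 * total, 0.0, 0.0)
    return displacement, rotation

# Stage 3 inherits the deltas of stages 1 and 2, unlike the overlay mode.
print(superposed_jitter([1.0, 1.2, 1.5]))
```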
6. The method of claim 3, wherein, when the animation jitter mode comprises the curve mode, the obtaining jitter data corresponding to the current shooting stage according to the animation jitter mode comprises:
determining a curve resource according to correspondences between different shooting stages and offset ranges, and determining jitter data corresponding to the current shooting stage according to the curve resource; or
determining jitter data corresponding to the current shooting stage according to a program curve, wherein the program curve comprises at least one of a trigonometric-function-type program curve determined by a period, an amplitude, and an initial value, and an attenuation-function-type program curve determined by an attenuation base, an attenuation frequency, a fade-in time, and a fade-out time.
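A sketch of the two program-curve families named in claim 6. The exact functional forms and parameter values are assumptions, since the claim only names the parameters.

```python
import math

def trig_curve(t, period=0.5, amplitude=1.0, initial=0.0):
    # Trigonometric-function-type curve: periodic jitter from period/amplitude/initial value.
    return initial + amplitude * math.sin(2 * math.pi * t / period)

def attenuation_curve(t, base=0.5, frequency=8.0, fade_in=0.05, fade_out=0.3):
    # Attenuation-function-type curve: oscillation that fades in, then decays.
    envelope = min(t / fade_in, 1.0) * (base ** (t / fade_out))
    return envelope * math.sin(2 * math.pi * frequency * t)

print(trig_curve(0.1), attenuation_curve(0.1))
```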
7. The method of claim 6, wherein the determining jitter data corresponding to the current shooting stage according to the curve resource comprises:
determining, in the curve resource, an offset range corresponding to the current shooting stage;
determining a corresponding offset value according to the offset range, and determining a time difference between the current moment and the shooting moment corresponding to the current shooting stage;
determining, based on the time difference, a value corresponding to the time difference in an interpolation curve associated with the shooting interval time of the virtual shooting prop; and
taking the product of the offset value and that value as the jitter data corresponding to the current shooting stage.
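An illustrative sketch of claim 7's computation: pick an offset from the stage's range, sample an interpolation curve at the time since the stage's shot, and multiply. The ranges, fire interval, and linear ease-out shape are assumptions.

```python
STAGE_OFFSET_RANGE = {0: (0.0, 0.2), 1: (0.2, 0.6)}  # invented curve resource
FIRE_INTERVAL = 0.1  # assumed shooting interval time of the prop, in seconds

def curve_resource_jitter(stage, now, shot_time):
    lo, hi = STAGE_OFFSET_RANGE[stage]
    offset = (lo + hi) / 2.0                    # offset value from the stage's range
    t = now - shot_time                         # time difference since this stage's shot
    interp = max(0.0, 1.0 - t / FIRE_INTERVAL)  # assumed linear interpolation curve
    return offset * interp                      # product = jitter datum for the stage

print(curve_resource_jitter(stage=1, now=0.05, shot_time=0.0))
```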
8. The method of claim 6, wherein the determining jitter data corresponding to the current shooting stage according to a program curve comprises:
performing the following processing for each reference direction corresponding to the holding part:
acquiring a first program curve corresponding to displacement, and taking a first function value corresponding to the current moment in the first program curve as the displacement in the reference direction; and
acquiring a second program curve corresponding to rotation, and taking a second function value corresponding to the current moment in the second program curve as the rotation in the reference direction.
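A sketch of claim 8's per-direction sampling: one program curve yields displacement and another yields rotation for each reference direction. The sinusoids and per-axis phase offsets are invented.

```python
import math

AXES = ("x", "y", "z")  # assumed reference directions of the holding part

def sample_axis_jitter(now: float):
    jitter = {}
    for i, axis in enumerate(AXES):
        phase = i * math.pi / 3  # invented per-axis phase offset
        jitter[axis] = {
            "displacement": 0.02 * math.sin(8.0 * now + phase),  # first program curve
            "rotation": 1.5 * math.sin(8.0 * now + phase),       # second program curve
        }
    return jitter

print(sample_axis_jitter(0.25)["x"])
```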
9. The method of claim 1, wherein, when the jitter configuration information includes correspondences between different types of virtual shooting props and adjustment coefficients, the obtaining jitter data corresponding to the current shooting stage according to the jitter configuration information comprises:
obtaining the type of the virtual shooting prop; and
determining a first adjustment coefficient corresponding to the type of the virtual shooting prop, and taking the product of the first adjustment coefficient and the jitter data corresponding to the current shooting stage as updated jitter data, wherein the first adjustment coefficient is positively correlated with the recoil or lethality of that type of virtual shooting prop.
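An illustrative counterpart to the shooting-mode coefficient, per claim 9: a first coefficient per prop type, larger for types with stronger recoil or lethality. The type names and values are invented.

```python
TYPE_COEFF = {"pistol": 0.6, "rifle": 1.0, "sniper_rifle": 1.8}  # invented

def adjust_for_type(jitter: float, prop_type: str) -> float:
    # Positively correlated: heavier recoil/lethality -> larger coefficient -> more jitter.
    return TYPE_COEFF[prop_type] * jitter  # updated jitter data

print(adjust_for_type(2.0, "sniper_rifle"))  # 3.6
```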
10. The method of claim 1, wherein
the jitter data comprises displacement and rotation of the holding part relative to different reference directions; and
the controlling, based on the jitter data corresponding to the current shooting stage, the holding part to drive the virtual shooting prop to perform corresponding jitter comprises:
performing the following processing each time a frame of image of the virtual scene is updated and displayed in the current shooting stage:
superposing the displacement and rotation of each reference direction included in the jitter data of the frame onto the components of the holding part corresponding to that reference direction, wherein the displacement and rotation of the different reference directions are used to cause the holding part to drive the virtual shooting prop to perform corresponding jitter.
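A sketch of the per-frame application in claim 10: on each displayed frame, that frame's jitter is superposed onto the holding part's position and rotation components per reference direction. The pose representation is an assumption.

```python
def apply_frame_jitter(pose, frame_jitter):
    # Superpose this frame's jitter onto the matching components of the holding part.
    for axis in ("x", "y", "z"):
        pose["position"][axis] += frame_jitter[axis]["displacement"]
        pose["rotation"][axis] += frame_jitter[axis]["rotation"]
    return pose

pose = {"position": {"x": 0.0, "y": 1.2, "z": 0.3},
        "rotation": {"x": 0.0, "y": 0.0, "z": 0.0}}
frame = {a: {"displacement": 0.01, "rotation": 0.5} for a in ("x", "y", "z")}
print(apply_frame_jitter(pose, frame))  # the grip pose carries the prop's shake
```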
11. The method of claim 1, further comprising:
acquiring a real-time jitter direction of the virtual shooting prop in the current shooting stage, and determining a bullet drop point offset in synchronization with the real-time jitter direction; and
controlling a virtual bullet launched by the virtual shooting prop to hit the bullet drop point.
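Finally, a sketch of claim 11's ballistic coupling: the bullet's drop point is offset in step with the prop's real-time jitter direction, so the spread follows the visible shake. The offset scale is invented.

```python
def bullet_drop_point(aim_point, jitter_direction, scale=0.05):
    # Shift the landing point synchronously with the real-time jitter direction.
    return tuple(a + scale * j for a, j in zip(aim_point, jitter_direction))

print(bullet_drop_point((10.0, 2.0, 0.0), (0.0, 1.0, 0.2)))
```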
12. A shooting control apparatus for a virtual character, the apparatus comprising:
a display module, configured to display, in a virtual scene, the virtual character and a virtual shooting prop held by the virtual character through a holding part;
an acquisition module, configured to obtain, in response to a shooting trigger operation based on the virtual shooting prop, jitter configuration information corresponding to a current shooting stage of the virtual shooting prop;
the acquisition module being further configured to obtain jitter data corresponding to the current shooting stage according to the jitter configuration information;
a control module, configured to control, in the current shooting stage and based on the jitter data corresponding to the current shooting stage, the holding part to drive the virtual shooting prop to perform corresponding jitter;
the acquisition module being further configured to obtain a shooting mode corresponding to the virtual character in the current shooting stage; and
a determination module, configured to determine a second adjustment coefficient corresponding to the shooting mode and take the product of the second adjustment coefficient and the jitter data corresponding to the current shooting stage as updated jitter data, wherein the accuracy of the shooting mode is inversely correlated with the second adjustment coefficient.
13. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor, configured to implement the shooting control method of a virtual character according to any one of claims 1 to 11 when executing the executable instructions stored in the memory.
14. A computer-readable storage medium storing executable instructions which, when executed by a processor, implement the shooting control method of a virtual character according to any one of claims 1 to 11.
15. A computer program product comprising a computer program or instructions which, when executed by a processor, implement the shooting control method of a virtual character according to any one of claims 1 to 11.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2021106467065 | 2021-06-10 | | |
| CN202110646706 | 2021-06-10 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN114191817A (en) | 2022-03-18 |
| CN114191817B (en) | 2025-05-30 |
Family
ID=80657419
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202111648792.XA (CN114191817B, active) | Virtual character shooting control method, device, electronic device and storage medium | 2021-06-10 | 2021-12-30 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114191817B (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112121424A (en) * | 2020-09-18 | 2020-12-25 | 网易(杭州)网络有限公司 | Shooting control method, device, equipment and storage medium |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20140112117A (en) * | 2012-12-20 | 2014-09-23 | (주)창진인터내셔널 | Wireless indoor shooting simulation system |
| CN110075521B (en) * | 2019-05-22 | 2025-02-21 | 努比亚技术有限公司 | Gun-control assisting method, device, mobile terminal and storage medium for shooting games |
| CN110548288B (en) * | 2019-09-05 | 2020-11-10 | 腾讯科技(深圳)有限公司 | Virtual object hit prompting method and device, terminal and storage medium |
| CN111388993A (en) * | 2020-03-16 | 2020-07-10 | 网易(杭州)网络有限公司 | Control method and device for virtual reality shooting game |
| CN112169325B (en) * | 2020-09-25 | 2021-12-10 | 腾讯科技(深圳)有限公司 | Virtual prop control method and device, computer equipment and storage medium |
| CN112107856B (en) * | 2020-09-30 | 2022-10-28 | 腾讯科技(深圳)有限公司 | Hit feedback method and device, storage medium and electronic equipment |
| CN112156472B (en) * | 2020-10-23 | 2023-03-10 | 腾讯科技(深圳)有限公司 | Control method, device and equipment of virtual prop and computer readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| CN114191817A (en) | 2022-03-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR102794785B1 (en) | | Methods, devices, electronic devices and readable storage media for handling interaction with virtual props |
| TWI818343B (en) | | Method of presenting virtual scene, device, electrical equipment, storage medium, and computer program product |
| CN112691377A (en) | | Control method and device of virtual role, electronic equipment and storage medium |
| CN112057860B (en) | | Method, device, equipment and storage medium for activating operation control in virtual scene |
| TWI793837B (en) | | Method of controlling virtual object, device, electrical equipment, storage medium, and computer program product |
| CN112076473B (en) | | Control method and device of virtual prop, electronic equipment and storage medium |
| CN112156472B (en) | | Control method, device and equipment of virtual prop and computer readable storage medium |
| CN112138385B (en) | | Virtual shooting prop aiming method and device, electronic equipment and storage medium |
| JP7697034B2 (en) | | Method and apparatus for displaying virtual gun shooting, computer device, and computer program |
| JP7761644B2 (en) | | Method, device, electronic device, and computer program for controlling virtual objects |
| CN112800252B (en) | | Method, device, equipment and storage medium for playing media files in virtual scene |
| CN113440852A (en) | | Control method, device, equipment and storage medium for virtual skill in virtual scene |
| JP2023541697A (en) | | Position acquisition method, device, electronic device, storage medium and computer program in virtual scene |
| CN114146414B (en) | | Virtual skill control method, device, apparatus, storage medium, and program product |
| CN113633968A (en) | | A method, device, electronic device and storage medium for displaying information in a game |
| CN113663329B (en) | | Shooting control method and device for virtual character, electronic equipment and storage medium |
| CN114130006B (en) | | Virtual prop control method, device, equipment, storage medium and program product |
| CN114219924B (en) | | Adaptive display method, device, equipment, medium and program product for virtual scene |
| CN114191817B (en) | | Virtual character shooting control method, device, electronic device and storage medium |
| CN114210046B (en) | | Virtual skill control method, device, equipment, storage medium and program product |
| CN112891930B (en) | | Information display method, device, equipment and storage medium in virtual scene |
| CN120837933A (en) * | | A shooting control method and related device for a virtual character |
| WO2025102919A1 (en) | | Virtual scene interaction processing method and apparatus, electronic device, computer-readable storage medium and computer program product |
| HK40055279A (en) | | Shooting control method of virtual character, device, electronic equipment and storage medium |
| WO2024032176A1 (en) | | Virtual item processing method and apparatus, electronic device, storage medium, and program product |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |