CN113888679B - Animation generation method, device, equipment and medium - Google Patents
- Publication number
- CN113888679B (application CN202111055114.2A)
- Authority
- CN
- China
- Prior art keywords
- animation
- skeleton
- bone
- model
- sub
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
In the animation generation method, device, equipment and medium provided by the present application, the electronic device obtains a spliced animation by reusing a first partial animation of a first animation and a second partial animation of a second animation. Since the existing animation is reused, the animation production efficiency is improved.
Description
Technical Field
The present application relates to the field of computers, and in particular, to an animation generating method, apparatus, device, and medium.
Background
Three-dimensional games contain a large number of static models and skinned models, and the dynamic behavior of each model must be represented by a large amount of animation data. As a result, a large number of motion designers are needed to produce a large number of animation data files.
The inventor has found through research that a motion designer must produce each type of motion individually, and that when the number of motion types reaches a certain scale, this creates an enormous workload for the motion designer.
Disclosure of Invention
To overcome at least one of the disadvantages in the prior art, in a first aspect, an embodiment of the present application provides an animation generation method, applied to an electronic device, including:
acquiring a first local animation of a first animation and a second local animation of a second animation, wherein the first animation corresponds to a first action of a target object, and the second animation corresponds to a second action of the target object;
and splicing the first local animation and the second local animation to obtain a spliced animation corresponding to the third action of the target object.
In a second aspect, an embodiment of the present application provides an animation generating apparatus, applied to an electronic device, including:
The apparatus comprises an obtaining module and a splicing module, wherein the obtaining module is configured to obtain a first local animation of a first animation and a second local animation of a second animation, the first animation corresponds to a first action of a target object, and the second animation corresponds to a second action of the target object;
and the splicing module is used for splicing the first local animation and the second local animation to obtain a spliced animation corresponding to the third action of the target object.
In a third aspect, an embodiment of the present application provides an electronic device, where the electronic device includes a processor and a memory, where the memory stores computer-executable instructions that, when executed by the processor, implement the animation generation method.
In a fourth aspect, an embodiment of the present application provides a storage medium storing a computer program, which when executed by a processor, implements the animation generation method.
Compared with the prior art, the application has the following beneficial effects:
In the animation generation method, device, equipment, and medium provided by the embodiments of the present application, the electronic device performs splicing by multiplexing the first local animation of the first animation and the second local animation of the second animation to obtain a spliced animation. Because existing animations are multiplexed, animation production efficiency is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of the scope, and that a person skilled in the art may obtain other related drawings from these drawings without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating steps of an animation generation method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a bone parent-child relationship provided by an embodiment of the present application;
FIGS. 4A-4B are schematic illustrations of a center bone split provided in an embodiment of the present application;
FIG. 5 is a schematic view of a virtual skeleton provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of a virtual bone insertion method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of animation splitting provided by an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an animation generating device according to an embodiment of the present application.
Reference numerals: 120-memory; 130-processor; 201-thigh; 202-calf; 301-target bone; 302-first sub-model; 303-second sub-model; 400-central bone; 401-first bone; 402-second bone; 501-upper body; 502-lower body; 601-obtaining module; 602-splicing module.
Detailed Description
For the purpose of making the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application as claimed, but merely represents selected embodiments of the application. All other embodiments obtained by a person skilled in the art based on the embodiments of the present application without inventive effort fall within the scope of protection of the present application.
It should be noted that like reference numerals and letters refer to like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
In the description of the present application, it should be noted that directions or positional relationships indicated by terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, or are the directions or positional relationships in which the product of the application is conventionally placed in use. They are used merely for convenience of describing the present application and simplifying the description, and do not indicate or imply that the apparatus or element referred to must have a specific orientation or be constructed and operated in a specific orientation; therefore, they should not be construed as limiting the present application. Furthermore, the terms "first", "second", "third", and the like are used merely to distinguish between descriptions and should not be construed as indicating or implying relative importance.
In the related art, an animator must individually produce corresponding animation data for each type of motion. When the number of motion types reaches a certain scale, this creates an enormous workload for motion designers.
Illustratively, when creating a game, it is desirable to create game animations that are as rich as possible in order to enhance the player's gaming experience. If the game scenario requires an archery action, a running action, and an archery-while-running action, then an animation producer needs to produce a separate set of animations for each of the archery action, the running action, and the archery-while-running action.
Therefore, as the types of motion increase, the workload of animators increases.
In view of this, an embodiment of the present application provides an animation generation method applied to an electronic device, which splices a new animation by multiplexing existing animations of a target object, thereby improving animation production efficiency.
The electronic device may be, but is not limited to, a server, a smart phone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), a mobile internet device (MID), and the like.
The operating system of the electronic device may be, but is not limited to, an Android system, an iOS (iPhone operating system) system, a Windows Phone system, a Windows system, and the like.
For the above electronic device, an embodiment of the present application further provides a schematic structural diagram. As shown in fig. 1, the electronic device includes a memory 120 and a processor 130.
The memory 120 and the processor 130 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 120 may be, but is not limited to, a random access memory (RAM), a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), and the like. The memory 120 is used to store computer-executable instructions, and the processor 130 implements the animation generation method when executing the computer-executable instructions in the memory.
The processor 130 may be an integrated circuit chip with signal processing capabilities. The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, which may implement or execute the methods, steps, and logical blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
For the above animation generation method, an embodiment of the present application provides a schematic diagram of its steps, which are described in detail below in conjunction with fig. 2. As shown in fig. 2, the animation generation method includes:
Step S101, a first partial animation of the first animation and a second partial animation of the second animation are acquired.
Wherein the first animation corresponds to a first action of the target object and the second animation corresponds to a second action of the target object.
Step S102, the first local animation and the second local animation are spliced to obtain a spliced animation corresponding to the third action of the target object.
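Illustratively, a minimal sketch of these two steps in Python is given below; the data shapes and function names are assumptions made for illustration only and are not part of the claimed method.

```python
from typing import Dict, List

# Assumed data shape: a local animation is a list of frames, and each frame maps
# the bone names of the covered sub-skeleton (e.g. the upper body) to a pose.
Frame = Dict[str, tuple]
LocalAnimation = List[Frame]

def generate_spliced_animation(first_local: LocalAnimation,
                               second_local: LocalAnimation) -> LocalAnimation:
    """Step S102: splice the first local animation (from the first action) with
    the second local animation (from the second action) frame by frame to obtain
    the spliced animation corresponding to the third action."""
    spliced: LocalAnimation = []
    for first_frame, second_frame in zip(first_local, second_local):
        # The two local animations cover disjoint sub-skeletons,
        # so their per-frame poses can simply be merged.
        spliced.append({**first_frame, **second_frame})
    return spliced
```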
In addition, the target object may be, but is not limited to, a human object, an animal object, a fictional object (e.g., monster in a game), and the like.
Illustratively, taking a human object as an example, the first action may be an archery action, and the second action may be a running action. Of course, the first action and the second action may be other types of actions depending on the specific scenario; the embodiment of the present application is not limited in this respect, and the archery action and the running action described above are only examples provided by the embodiment of the present application.
Those skilled in the art can also extend this to jumping, martial arts, climbing, and other actions as required; doing so does not require inventive effort beyond the embodiments of the present application.
Correspondingly, the first partial animation may be an upper body of the animation corresponding to the archery action, and the second partial animation may be a lower body of the animation corresponding to the running action.
The electronic device splices the upper half of the animation corresponding to the archery action with the lower half of the animation corresponding to the running action, so that the animation corresponding to the archery-while-running action can be obtained.
Therefore, compared with the related art, in which the animation corresponding to the shooting-while-running action must be produced separately, the animation generation method provided by the embodiment of the present application obtains the spliced animation by multiplexing the first partial animation of the first animation and the second partial animation of the second animation. Because existing animations are multiplexed, animation production efficiency is improved.
In addition, in the embodiment of the present application, in order for the obtained spliced animation to be coordinated in body proportions, the first animation and the second animation need to be animations of the same target object. Likewise, in order for the obtained spliced animation to be coordinated in motion, the first animation and the second animation also need to have the same frame sequence.
Continuing with the archery and running example above, in order for the resulting spliced animation to be coordinated in body proportions and in motion, the animation corresponding to the archery action and the animation corresponding to the running action must first belong to the same character object.
In contrast, if the animation corresponding to the archery action belongs to a 1.8 m character object and the animation corresponding to the running action belongs to a 1.5 m character object, splicing the animation of the 1.8 m character object with the animation of the 1.5 m character object would result in inconsistent body proportions.
Therefore, in the embodiment of the present application, the first local animation and the second local animation both carry the serial number of the target object, so that when an animation producer performs splicing, whether the two local animations belong to the same object can be determined from the serial number.
Meanwhile, if the animation corresponding to the archery action has 30 frames, the animation corresponding to the running action also has 30 frames. In order for the spliced animation to be coordinated in motion, the local animation of the first frame of the archery animation is spliced with the local animation of the first frame of the running animation, the local animation of the second frame of the archery animation is spliced with the local animation of the second frame of the running animation, and so on, until the 30-frame spliced animation of shooting while running is obtained.
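Continuing with the hypothetical Frame/LocalAnimation shapes from the earlier sketch, the two consistency checks described here (same target-object serial number, same frame sequence) could be expressed as follows; the 30-frame example data are invented purely for illustration.

```python
def check_splice_inputs(first_object_id: str, second_object_id: str,
                        first_local: LocalAnimation,
                        second_local: LocalAnimation) -> None:
    """Both local animations must carry the same target-object serial number and
    must have the same frame sequence before they are spliced."""
    if first_object_id != second_object_id:
        raise ValueError("local animations belong to different target objects")
    if len(first_local) != len(second_local):
        raise ValueError("local animations have different frame counts")

# Hypothetical 30-frame example: frame i of the archery upper body is spliced
# with frame i of the running lower body, giving a 30-frame spliced animation.
archery_upper = [{"spine": (0.0, 0.0, float(i))} for i in range(30)]
running_lower = [{"thigh": (0.0, float(i), 0.0)} for i in range(30)]
check_splice_inputs("character_01", "character_01", archery_upper, running_lower)
spliced = generate_spliced_animation(archery_upper, running_lower)
assert len(spliced) == 30
```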
In one implementation, the first animation and the second animation are skeletal animations. It should be understood that skeletal animation is one type of model animation: the model has a skeletal structure of interconnected bones, and the model is animated by changing the orientation and position of these bones.
There is a parent-child relationship between the bones that make up this skeletal structure, which produces linkage during motion: the movement of a bone serving as a parent node drives the overall movement of the bones serving as its child nodes, whereas the movement of a bone serving as a child node does not affect the bone serving as its parent node.
Illustratively, in the human skeleton shown in fig. 3, the thigh 201 serves as the parent node of the calf 202, so when the thigh 201 moves, the calf 202 moves together with it; in contrast, the thigh 201 is not affected by the movement of the calf 202.
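A minimal sketch of such a parent-child linkage is given below; the Bone class and the use of a single rotation angle per bone are simplifying assumptions made for illustration.

```python
from typing import List, Optional

class Bone:
    """A bone whose world rotation is its own local rotation plus its parent's
    world rotation, so moving a parent drives its children but not vice versa."""
    def __init__(self, name: str, parent: Optional["Bone"] = None):
        self.name = name
        self.parent = parent
        self.local_rotation_deg = 0.0
        self.children: List["Bone"] = []
        if parent is not None:
            parent.children.append(self)

    def world_rotation_deg(self) -> float:
        if self.parent is None:
            return self.local_rotation_deg
        return self.parent.world_rotation_deg() + self.local_rotation_deg

thigh = Bone("thigh_201")
calf = Bone("calf_202", parent=thigh)

thigh.local_rotation_deg = 30.0            # moving the thigh (parent)...
assert calf.world_rotation_deg() == 30.0   # ...carries the calf (child) with it
calf.local_rotation_deg = 10.0             # moving the calf (child)...
assert thigh.world_rotation_deg() == 30.0  # ...does not affect the thigh (parent)
```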
Because the bones in a skeletal animation are linked, if the first local animation and the second local animation are directly spliced together, they will influence each other, and the spliced animation cannot achieve the desired animation effect.
For example, take the archery animation described above again, and assume that the upper half of the animation corresponding to the archery action is spliced with the lower half of the animation corresponding to the running action to obtain the spliced animation of shooting while running.
However, because of the linkage between bones, the swaying of the lower body during running would also affect the upper body, whereas the upper body needs to remain stable while shooting an arrow. If the upper-body local animation and the lower-body local animation are directly spliced, the upper body will sway left and right while shooting, and the animation effect of aiming at a target cannot be achieved.
Therefore, in the embodiment of the present application, for skeletal animations, the offset introduced by the lower body needs to be automatically cancelled out in the spliced animation. To this end, the electronic device takes the first animation and the second animation as animations to be split and, for each animation to be split, obtains the bone model corresponding to that animation to be split.
Wherein the target bone at the splitting position splits the bone model into a first sub-model and a second sub-model.
The electronic device constructs a virtual skeleton, wherein the virtual skeleton comprises a central bone, a first bone, and a second bone, and the central bone is the parent node of the first bone and the second bone.
Further, the electronic device connects the first sub-model with the target bone through the first bone, wherein the first bone serves as the parent node of the first sub-model, and connects the second sub-model with the target bone through the central bone and the second bone, wherein the central bone serves as the parent node of the second sub-model.
Finally, according to each bone model into which the virtual skeleton has been inserted, the electronic device splits each animation to be split from the connection position between the target bone and the first bone, thereby obtaining the first local animation and the second local animation.
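As a hedged sketch of this insertion of the virtual skeleton, reusing the hypothetical Bone class from the sketch above; the exact wiring of the sub-models onto the virtual bones is one plausible reading of the description and figures, not a definitive implementation.

```python
def reparent(bone: Bone, new_parent: Bone) -> None:
    """Detach a bone from its current parent and attach it under a new parent."""
    if bone.parent is not None:
        bone.parent.children.remove(bone)
    bone.parent = new_parent
    new_parent.children.append(bone)

def insert_virtual_skeleton(target_bone: Bone,
                            first_submodel_root: Bone,
                            second_submodel_root: Bone):
    """Build the virtual skeleton (a central bone as parent of a first and a
    second bone), hang it under the target bone, and re-attach the two
    sub-models so that the first sub-model connects to the target bone through
    the first bone and the second sub-model connects through the central bone
    and the second bone."""
    central_bone = Bone("central_bone_400", parent=target_bone)
    first_bone = Bone("first_bone_401", parent=central_bone)
    second_bone = Bone("second_bone_402", parent=central_bone)

    reparent(first_submodel_root, first_bone)
    reparent(second_submodel_root, second_bone)
    return central_bone, first_bone, second_bone

# The split for each animation to be split is then made at the connection
# between the target bone and the first bone, yielding the two local animations.
```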
The effect of the virtual bone will be exemplarily described below using the bone model shown in fig. 4A-4B. As shown in fig. 4A, a target bone 301 is determined from the bone model, the connection between the target bone 301 and other bones is broken, and the bone model shown in fig. 4A is split into a first sub-model 302 and a second sub-model 303 shown in fig. 4B.
Then, a virtual skeleton as shown in fig. 5 is constructed. As shown in fig. 5, the virtual skeleton includes a central skeleton 400, a first skeleton 401, and a second skeleton 402, where the central skeleton 400 is a parent node of the first skeleton 401 and the second skeleton 402, that is, a parent-child relationship exists between the central skeleton 400 and the first skeleton 401 and the second skeleton 402.
Next, as shown in fig. 6, the first sub-model 302 is connected to the target bone 301 through the first bone 401, and the target bone 301 is connected to the second sub-model 303 through the central bone 400 and the second bone 402. In this way, a bone model into which the virtual skeleton has been inserted is obtained.
As shown in fig. 7, according to the bone model into which the virtual skeleton has been inserted, the animation to be split is split from the connection position between the target bone 301 and the first bone 401 into local animations of the upper body 501 and the lower body 502. Then, according to the splicing requirement, the first local animation and the second local animation are selected from all of the split local animations and spliced.
As shown in fig. 7, assuming that the target bone 301 is rotated 30 ° clockwise, the central bone 400 and the first bone 401 are rotated 30 ° clockwise together by the bone linkage.
If the first bone 401 were directly connected to the central bone 400 in the manner shown in fig. 5, the 30° rotation of the central bone 400 would cause the first bone 401 to also rotate 30° clockwise.
However, at this time the first bone 401 is connected with the first sub-model 302, while a parent-child relationship still exists between the first bone 401 and the central bone 400. The first bone 401 therefore rotates 30° anticlockwise, so that its rotation and that of the central bone cancel each other out, and the first sub-model 302 in the spliced animation is not affected by the second sub-model 303.
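Using the hypothetical Bone sketch above, this cancellation can be checked numerically; applying the 30° counter-rotation to the first bone is an assumption about how the offset from the lower body is cancelled out.

```python
# Hierarchy per FIG. 6: target bone -> central bone -> first bone -> first sub-model.
target_301 = Bone("target_301")
central_400 = Bone("central_400", parent=target_301)
first_401 = Bone("first_401", parent=central_400)
upper_body_302 = Bone("first_submodel_302", parent=first_401)

target_301.local_rotation_deg = 30.0   # lower body rotates the target bone 30° clockwise
first_401.local_rotation_deg = -30.0   # the first bone counter-rotates 30° anticlockwise

# The two rotations cancel, so the upper body keeps its original orientation
# and is not affected by the lower body in the spliced animation.
assert upper_body_302.world_rotation_deg() == 0.0
```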
In addition, the splitting position shown in fig. 7 is only an example provided by the embodiment of the present application; those skilled in the art may insert the virtual skeleton at other positions of the bone model as required for splitting, such as the neck, shoulders, or waist.
Referring to fig. 8, an embodiment of the present application further provides an animation generating apparatus. The animation generating apparatus comprises at least one functional module that may be stored in a memory in the form of software. Divided by function, the animation generating apparatus may include:
The obtaining module 601 is configured to obtain a first partial animation of a first animation and a second partial animation of a second animation, where the first animation corresponds to a first action of a target object, and the second animation corresponds to a second action of the target object.
In the embodiment of the present application, when the computer executable code corresponding to the obtaining module 601 is executed by the processor, the step S101 shown in fig. 2 is implemented, and for the detailed description of the obtaining module 601, reference may be made to the detailed description of the step S101.
And the splicing module 602 is configured to splice the first partial animation and the second partial animation to obtain a spliced animation corresponding to the third action of the target object.
In the embodiment of the present application, when the computer executable code corresponding to the splicing module 602 is executed by the processor, the step S102 shown in fig. 2 is implemented, and for the detailed description of the splicing module 602, reference may be made to the detailed description of the step S102.
In one possible implementation, the first animation and the second animation have the same frame sequence.
In one possible implementation, the split position of the first partial animation in the first animation is the same as the split position of the second partial animation in the second animation.
In one possible implementation manner, the first animation and the second animation are bone animations, and the obtaining module 601 is specifically configured to:
Taking the first animation and the second animation as the animation to be split;
For each animation to be split, acquiring a skeleton model corresponding to the animation to be split, wherein a target skeleton positioned at the splitting position splits the skeleton model into a first sub-model and a second sub-model;
constructing a virtual skeleton, wherein the virtual skeleton comprises a central bone, a first bone, and a second bone, and the central bone is the parent node of the first bone and the second bone;
connecting the first sub-model with the target bone through the first bone, wherein the first bone serves as the parent node of the first sub-model;
connecting the second sub-model with the target bone through the central bone and the second bone, wherein the central bone serves as a parent node of the second sub-model;
and splitting each animation to be split from the connection position between the target skeleton and the first skeleton according to all skeleton models inserted into the virtual skeleton, so as to obtain the first local animation and the second local animation.
The embodiment of the application also provides a storage medium. Wherein the storage medium stores a computer program which, when executed by a processor, implements the animation generation method described above.
In summary, in the animation generation method, device, equipment and medium provided by the embodiments of the present application, the electronic device performs stitching by multiplexing the first partial animation of the first animation and the second partial animation of the second animation, so as to obtain the stitched animation. The existing animation is multiplexed, so that the animation production efficiency is improved.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in a block may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a special-purpose hardware-based system that performs the specified functions or acts, or by a combination of special-purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
If the functions are implemented in the form of software functional modules and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and comprises several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises that element.
The above description is merely of specific embodiments of the present application, but the scope of protection of the present application is not limited thereto. Any variation or substitution that can readily be conceived by a person skilled in the art within the technical scope disclosed by the present application shall be covered by the scope of protection of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (6)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111055114.2A CN113888679B (en) | 2021-09-09 | 2021-09-09 | Animation generation method, device, equipment and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111055114.2A CN113888679B (en) | 2021-09-09 | 2021-09-09 | Animation generation method, device, equipment and medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113888679A CN113888679A (en) | 2022-01-04 |
CN113888679B true CN113888679B (en) | 2025-02-21 |
Family
ID=79008884
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111055114.2A Active CN113888679B (en) | 2021-09-09 | 2021-09-09 | Animation generation method, device, equipment and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113888679B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118212327A (en) * | 2022-12-15 | 2024-06-18 | 完美世界(北京)软件科技发展有限公司 | Character local animation realization method and device, storage medium and electronic device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111773686A (en) * | 2020-06-30 | 2020-10-16 | 完美世界(北京)软件科技发展有限公司 | Animation generation method and device, storage medium, electronic device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105976417B (en) * | 2016-05-27 | 2020-06-12 | 腾讯科技(深圳)有限公司 | Animation generation method and device |
CN111161427A (en) * | 2019-12-04 | 2020-05-15 | 北京代码乾坤科技有限公司 | Self-adaptive adjustment method and device of virtual skeleton model and electronic device |
CN112070868B (en) * | 2020-09-08 | 2024-04-30 | 北京默契破冰科技有限公司 | Animation playing method based on iOS system, electronic equipment and medium |
- 2021-09-09: CN application CN202111055114.2A filed; granted as patent CN113888679B (Active)
Also Published As
Publication number | Publication date |
---|---|
CN113888679A (en) | 2022-01-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110517337B (en) | Animation character expression generation method, animation production method and electronic equipment | |
US20090091563A1 (en) | Character animation framework | |
CN111773686A (en) | Animation generation method and device, storage medium, electronic device | |
CN111714880B (en) | Picture display method and device, storage medium and electronic device | |
CN112598773B (en) | Method and device for realizing bone skin animation | |
CN111773688B (en) | Flexible object rendering method and device, storage medium and electronic device | |
CN112927331B (en) | Character model animation generation method and device, storage medium and electronic equipment | |
CN113313796B (en) | Scene generation method, device, computer equipment and storage medium | |
CN112669414A (en) | Animation data processing method and device, storage medium and computer equipment | |
CN113888679B (en) | Animation generation method, device, equipment and medium | |
CN112184862A (en) | Control method and device of virtual object and electronic equipment | |
US20180144531A1 (en) | Animating a virtual object in a virtual world | |
CN115115753B (en) | Animation video processing method, device, equipment and storage medium | |
CN112819931B (en) | Animation generation method, device, terminal and storage medium | |
JP7654905B2 (en) | Inferred Skeleton Structures for Practical 3D Assets | |
CN116934913A (en) | Animation generation method, device, equipment and storage medium | |
CN116805344B (en) | Digital human action redirection method and device | |
CN115841500B (en) | Object dynamic solution method, device, equipment and storage medium | |
CN117876551A (en) | Redirecting method, system and device for two-dimensional animation and electronic equipment | |
CN116681808A (en) | Method and device for generating model animation, electronic equipment and storage medium | |
CN115937371A (en) | Character model generation method and system | |
CN114596394A (en) | Method, device, system and storage medium for generating bone animation | |
US20250061670A1 (en) | Determination and display of inverse kinematic poses of virtual characters in a virtual environment | |
CN110120092B (en) | Three-dimensional head data acquisition method and device and electronic equipment | |
CN119425091A (en) | Fabric processing method, device, computer equipment, storage medium and product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||