
HK40073651A - Animation data processing method, device, equipment and computer readable storage medium - Google Patents


Info

Publication number
HK40073651A
HK40073651A (application HK42022062747.5A)
Authority
HK
Hong Kong
Prior art keywords
movement
information
determining
virtual object
movement attribute
Prior art date
Application number
HK42022062747.5A
Other languages
Chinese (zh)
Other versions
HK40073651B (en)
Inventor
章文涵
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd. (腾讯科技(深圳)有限公司)
Publication of HK40073651A publication Critical patent/HK40073651A/en
Publication of HK40073651B publication Critical patent/HK40073651B/en


Description

Animation data processing method, device, equipment and computer readable storage medium
Technical Field
The present application relates to animation technologies, and in particular, to an animation data processing method, apparatus, device, and computer-readable storage medium.
Background
In current animation production, an animator typically designs multiple animation segments, which an engine then blends and switches between to produce the final animation effect. Animation is a way of representing character behavior: the actions of a character object over a period of time are recorded and played back to form a complete animation segment. In the related art, when a virtual object is controlled to move with different movement parameters, either an in-place animation is used and the movement is driven by different code paths, or a separate animation file is produced for each set of movement parameters. Both approaches are severely limited, costly to produce, and difficult to maintain.
Disclosure of Invention
The embodiments of the application provide an animation data processing method, apparatus, device, and computer-readable storage medium, which can realize movement of a virtual object at different angles and different displacement distances using a single animation file, thereby reducing the package size and lowering the production difficulty.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an animation data processing method, which comprises the following steps:
acquiring a reference animation file, reference movement information of a virtual object in the reference animation file and a current destination of the virtual object, wherein the reference movement information at least comprises a reference destination and a reference displacement distance;
creating a first movement attribute, a second movement attribute and a third movement attribute on the skeleton of the virtual object, and acquiring setting information aiming at the first movement attribute, the second movement attribute and the third movement attribute to obtain an updated animation file;
importing the updated animation file into an engine, and determining superimposed movement information of the virtual object relative to the reference movement information based on the spacing distance between the reference destination and the current destination, the reference displacement distance, and the setting information;
determining target movement information of the virtual object based on the overlay movement information and the reference movement information, and controlling the virtual object to move based on the target movement information.
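The final step above combines the reference movement with the superimposed offsets. As a minimal sketch (not the patented implementation itself), assuming movement information is represented as a mapping of named components to numeric values, the combination could look like:

```python
def target_movement(reference, superimposed):
    """Combine reference movement info with superimposed offsets.

    `reference` and `superimposed` are hypothetical dicts mapping movement
    components (e.g. "displacement", "rotation") to numeric values; the
    target movement is simply the reference plus the superimposed offset.
    """
    return {key: reference[key] + superimposed.get(key, 0.0) for key in reference}
```

Any component with no superimposed offset passes through unchanged, which matches the idea that the reference animation plays as-is when nothing needs to be overridden.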
An embodiment of the present application provides an animation data processing apparatus, including:
a first acquisition module, configured to acquire a reference animation file, reference movement information of a virtual object in the reference animation file, and a current destination of the virtual object, wherein the reference movement information includes at least a reference destination and a reference displacement distance;
the second acquisition module is used for creating a first movement attribute, a second movement attribute and a third movement attribute on the skeleton of the virtual object, and acquiring setting information aiming at the first movement attribute, the second movement attribute and the third movement attribute to obtain an updated animation file;
a first determining module, configured to import the updated animation file into an engine, and determine superimposed movement information of a virtual object with respect to the reference movement information based on the distance between the reference destination and the destination, the reference displacement distance, and the setting information;
a second determination module for determining target movement information of the virtual object based on the overlay movement information and the reference movement information, and controlling the virtual object to move based on the target movement information.
In some embodiments, the second obtaining module is further configured to:
creating a first movement attribute, a second movement attribute and a third movement attribute through a plug-in;
acquiring a first key frame corresponding to starting departure, a second key frame corresponding to reaching the destination, a third key frame corresponding to starting the return, and a fourth key frame corresponding to reaching the original position;
determining the first key frame, the second key frame, the third key frame, and the fourth key frame as the setting information.
In some embodiments, the apparatus further comprises:
and the third determining module is used for importing the updated animation file into an engine, and determining a first movement attribute value of a first movement curve corresponding to the first movement attribute, a second movement attribute value of a second movement curve corresponding to the second movement attribute and a third movement attribute value of a third movement curve corresponding to the third movement attribute based on the setting information.
In some embodiments, the first determining module is configured to:
determining an included angle between a first movement track and a second movement track based on the spacing distance and the reference displacement distance, wherein the first movement track is the track along which the virtual object reaches the current destination, and the second movement track is the track along which the virtual object reaches the reference destination;
determining superimposed displacement information of the skeleton of the virtual object based on the spacing distance, the reference displacement distance, the first movement attribute value, and the included angle;
acquiring a reference rotation angle in the reference movement information, and determining first superimposed rotation information of a capsule body of the virtual object based on the reference rotation angle, the included angle, and the second movement attribute value;
determining second superimposed rotation information of the skeleton based on the first superimposed rotation information of the capsule body and the third movement attribute value.
In some embodiments, the first determining module is further configured to:
determining the ratio of the spacing distance to the reference displacement distance, wherein the line connecting the initial position of the virtual object to the reference destination is perpendicular to the line connecting the reference destination to the current destination;
determining the arctangent of the ratio as the included angle between the first movement track and the second movement track.
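Because the start-to-reference-destination line is perpendicular to the line joining the two destinations, the two distances form the legs of a right triangle and the included angle reduces to an arctangent. A hedged sketch (the function name and the use of radians are assumptions, not part of the patent text):

```python
import math

def included_angle(spacing_distance, ref_displacement_distance):
    """Included angle (radians) between the track to the current destination
    and the track to the reference destination: atan(spacing / reference)."""
    return math.atan(spacing_distance / ref_displacement_distance)
```

For example, when the spacing distance equals the reference displacement distance, the tracks diverge by 45 degrees (pi/4 radians).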
In some embodiments, the first determining module is further configured to:
determining a first length based on a product of the reference displacement distance and a cosine value of the included angle;
determining a second length based on the reference displacement distance and the first length, wherein the reference displacement distance is the length of the hypotenuse of a right triangle and the first length and the second length are the lengths of its two legs;
determining superimposed displacement information of the skeleton in the roll-angle direction based on the reference displacement distance, the first length, and the first movement attribute value;
determining superimposed displacement information of the skeleton in the pitch-angle direction based on the spacing distance, the second length, and the first movement attribute value.
In some embodiments, the first determining module is further configured to:
determining a difference between the spacing distance and the second length;
determining a superimposed displacement value of the skeleton in the pitch-angle direction based on the difference and the first movement attribute value;
determining a first direction coefficient corresponding to the offset direction of the current destination relative to the reference destination;
determining the superimposed displacement information of the skeleton in the pitch-angle direction based on the superimposed displacement value and the first direction coefficient.
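Putting the triangle relations above into code: the first length is the reference distance times the cosine of the included angle, the second length follows from the Pythagorean relation, and the pitch-direction offset is scaled by the difference, the first movement attribute value, and a direction coefficient. This is a sketch under those stated assumptions — the exact sign conventions are not given in the text and are assumed here:

```python
import math

def superimposed_displacement(spacing, ref_distance, angle, attr1, direction=1.0):
    """Superimposed skeleton displacement in the roll and pitch directions.

    `ref_distance` is the hypotenuse of a right triangle; first_length and
    second_length are its legs. Sign conventions are assumptions.
    """
    first_length = ref_distance * math.cos(angle)
    second_length = math.sqrt(max(ref_distance ** 2 - first_length ** 2, 0.0))
    # roll direction: shortfall of the projected length versus the full reference distance
    roll = (first_length - ref_distance) * attr1
    # pitch direction: difference between the spacing distance and the second leg,
    # signed by the offset direction of the current destination
    pitch = (spacing - second_length) * attr1 * direction
    return roll, pitch
```

With a zero included angle the roll offset vanishes and the pitch offset is just the spacing distance scaled by the attribute value, which is a useful sanity check.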
In some embodiments, the first determining module is further configured to:
combining the reference rotation angle and the included angle to obtain a combined rotation angle;
determining a first direction coefficient corresponding to the offset direction of the current destination relative to the reference destination;
determining the product of the combined rotation angle, the first direction coefficient, and the second movement attribute value as the first superimposed rotation information of the capsule body.
In some embodiments, the first determining module is further configured to:
acquiring a preset second direction coefficient;
determining the product of the first superimposed rotation information, the second direction coefficient, and the third movement attribute value as the second superimposed rotation information.
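Per the two claim paragraphs above, the rotation quantities compose as simple products: the capsule rotation from the combined angle, and the skeleton rotation from the capsule rotation. A sketch with assumed names (the text does not fix the angle units):

```python
def superimposed_rotations(ref_angle, included, attr2, attr3,
                           first_dir=1.0, second_dir=1.0):
    """First (capsule) and second (skeleton) superimposed rotation values.

    capsule  = (reference angle + included angle) * first direction coeff * attr2
    skeleton = capsule * second direction coeff * attr3
    """
    capsule = (ref_angle + included) * first_dir * attr2
    skeleton = capsule * second_dir * attr3
    return capsule, skeleton
```

Because the second rotation is derived from the first, the second and third movement attribute values act as successive gains on the same combined angle.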
In some embodiments, the apparatus further comprises:
the third acquisition module is used for acquiring the size information of the virtual object;
the fourth acquisition module is used for acquiring the moving speed corresponding to the size information;
correspondingly, the second determining module is further configured to:
control the virtual object to move based on the moving speed and the target movement information.
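The text only states that the moving speed corresponds to the virtual object's size information, without giving the mapping. A hypothetical lookup illustrating the idea — every threshold and value below is invented for illustration:

```python
def movement_speed(size):
    """Hypothetical mapping from a virtual object's size to a movement speed.

    The patent only says the speed corresponds to the size information;
    the thresholds and speeds here are invented for illustration.
    """
    if size < 1.0:       # small characters/pets move briskly
        return 600.0
    if size < 2.0:       # medium body types
        return 450.0
    return 300.0         # large characters move more slowly
```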
An embodiment of the present application provides an animation data processing apparatus, including:
a memory for storing executable instructions;
and the processor is used for realizing the method provided by the embodiment of the application when executing the executable instructions stored in the memory.
Embodiments of the present application provide a computer-readable storage medium, which stores executable instructions for causing a processor to implement the method provided by the embodiments of the present application when the processor executes the executable instructions.
The embodiment of the application has the following beneficial effects:
First, a reference animation file, reference movement information of a virtual object in the reference animation file, and a current destination of the virtual object are obtained, where the reference movement information includes at least a reference destination and a reference displacement distance. Next, a first movement attribute, a second movement attribute, and a third movement attribute are created on the skeleton of the virtual object, and setting information for these three movement attributes is obtained, yielding an updated animation file. The updated animation file is imported into an engine, and superimposed movement information of the virtual object relative to the reference movement information is determined based on the spacing distance between the reference destination and the current destination, the reference displacement distance, and the setting information. Finally, target movement information of the virtual object is determined from the superimposed movement information and the reference movement information, and the virtual object is controlled to move according to the target movement information. As a result, when animations with different movement parameters, such as different angles and different displacement distances, are needed, only the three movement attributes and their setting information need to be added on top of the reference animation file; importing the updated file into the engine then yields the target movement information that drives the virtual object. This reduces production difficulty, shrinks the package size, and improves production efficiency.
Drawings
FIG. 1 is a schematic diagram of a network architecture of an animation data processing system 100 according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an animation terminal 400 according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart of an implementation of an animation data processing method according to an embodiment of the present application;
FIG. 4 is a schematic flowchart of another implementation of an animation data processing method according to an embodiment of the present application;
fig. 5 is a schematic flow chart of implementation of determining superimposed movement information of a virtual object relative to the reference movement information according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of still another implementation of the animation data processing method according to the embodiment of the present application;
FIG. 7 is a schematic diagram of a plug-in interface for adding custom properties to a bone according to an embodiment of the present application;
fig. 8A is a schematic diagram of a plug-in interface for setting a key frame according to an embodiment of the present application;
fig. 8B is a schematic diagram of three animation curves obtained after setting a key frame according to an embodiment of the present application;
fig. 9 is a schematic diagram of three actual animation curves obtained after a key frame is set according to an embodiment of the present application;
FIG. 10 is a schematic diagram of an animation blueprint for obtaining curve values according to an embodiment of the present application;
FIG. 11 is a schematic view of an animation blueprint for obtaining initial rotation information of a character/pet according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an animation blueprint for calculating an angle required to rotate according to an embodiment of the present application;
FIG. 13 is a schematic diagram illustrating a principle of calculating an angle required to rotate according to an embodiment of the present disclosure;
FIG. 14 is a schematic diagram of an animation blueprint for calculating a displacement value to be superimposed on a skeleton according to an embodiment of the present disclosure;
FIG. 15 is a schematic diagram of an animation blueprint for obtaining a rotation angle required to be superimposed on a capsule body according to an embodiment of the present application;
FIG. 16 is a schematic diagram of an animation blueprint for calculating a displacement value to be actually superimposed on a skeleton according to an embodiment of the present disclosure;
FIG. 17 is a schematic diagram of an animation blueprint for calculating a rotation value that actually needs to be superimposed on a skeleton according to an embodiment of the present disclosure;
FIG. 18 is a schematic illustration of an animation blueprint for applying calculated displacement values and rotation values to a virtual character according to an embodiment of the present disclosure;
FIG. 19 is a schematic diagram of an interface for applying the calculated displacement value and the rotation value to a virtual character according to an embodiment of the present disclosure;
FIG. 20 is a diagram illustrating the effect of normal displacement using a created animation file;
fig. 21 is a schematic diagram illustrating an effect of character movement achieved by using the animation data processing method according to the embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered limiting; all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first", "second", and "third" are used only to distinguish similar objects and do not denote a particular order; where permitted, the specific order or sequence may be interchanged so that the embodiments of the application described herein can be practiced in an order other than that shown or described.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) Skeleton (Skeleton): comprises bones and joints. Bones are coordinate spaces, and the bone hierarchy is a set of nested coordinate spaces. A joint only describes the position of a bone, i.e., the position of the origin of the bone's own coordinate space within its parent space; rotating around a joint means rotating the bone's own coordinate space (including all of its subspaces).
2) Skinning (Skin): attaching (binding) the vertices of a mesh to bones, where each vertex can be controlled by multiple bones. Vertices at joints are then pulled simultaneously by parent and child bones and change position accordingly, which eliminates cracks.
3) Skeletal Animation (also known as Skeleton Animation): divides a three-dimensional model into two parts, a skin (Skin) used for drawing the model and a skeleton used for controlling actions.
4) Integrated Development Environment (IDE): an application program that provides a program development environment, generally including tools such as a code editor, a compiler, a debugger, and a graphical user interface. It is an integrated software service set combining code writing, analysis, compilation, and debugging functions; any software or software suite with these characteristics can be called an integrated development environment.
5) 3D Studio Max: often abbreviated as 3d Max or 3ds MAX, is three-dimensional animation rendering and production software for PC systems developed by Discreet (later acquired by Autodesk).
6) Blueprint: a special type of asset in the Unreal Engine that provides an intuitive, node-based interface for creating new types of Actors and level script events. It gives level designers and game developers a tool to quickly create and iterate on gameplay in the Unreal Editor without writing a line of code.
7) Animation blueprint: performs animation blending, directly controls the bones of a skeleton, or sets up logic that ultimately defines the final animation pose used for the skeletal mesh object in each frame.
In a turn-based combat scenario, when a combat unit of one side attacks the other, it moves toward the opponent and returns after the attack; in such an application scene, the displacement distance and direction of the character must be determined first. In the related art, there are at least two implementations:
the first scheme is as follows: using in-place animation, the displacement distance is controlled by the code. That is, the animation is always in place, including actions such as jumping, attacking, returning, turning around, and so on. When to attack, when to jump, these times need to be tailored in advance. And moving the animation per se at a constant speed within a fixed time point according to a time point established in advance in the engine.
The second scheme is implemented entirely with animation files: each angle uses its own animation file, which only needs to be imported into the engine.
The first scheme cannot distinguish different rhythms for characters of different body types: the movement is at a uniform speed, the expressive effect is poor, and the limitations are large, since the time points (when to move, when to attack, and so on) must be fixed. Displacement at different angles cannot be achieved.
The second scheme requires a large number of animation files and is not easy to maintain; if the animation files need to be modified or more are required, the production cost is huge.
Based on this, the embodiments of the present application provide an animation data processing method that computes the required distance with trigonometric formulas, multiplies by a coefficient to obtain the rotation and displacement differences, and superimposes those differences onto a default animation file, so that a single animation file with a fixed displacement distance can realize movement at different angles and displacement distances.
An exemplary application of the animation data processing device provided in the embodiments of the present application is described below. The device may be implemented as various types of user terminal, such as a notebook computer, a tablet computer, a desktop computer, a set-top box, or a mobile device (e.g., a mobile phone, portable music player, personal digital assistant, dedicated messaging device, or portable game device), and may also be implemented as a server. An exemplary application with the device implemented as a terminal is described below.
Referring to fig. 1, fig. 1 is a schematic diagram of a network architecture of an animation data processing system 100 according to an embodiment of the present application, and as shown in fig. 1, the animation data processing system includes: animation terminal 400, network 300 and development terminal 200, animation terminal 400 connects development terminal 200 through network 300, and network 300 can be wide area network or local area network, or the combination of the two.
An artist creates a reference animation file through the animation production terminal 400. In the reference animation file, the virtual object directly faces a reference destination, and the reference displacement distance between the virtual object and the reference destination is known. The artist then adds custom attributes on the skeleton of the virtual object through a plug-in and sets those attributes through the plug-in to obtain an updated animation file. The updated animation file is imported into an engine for rendering; the engine identifies the created custom attributes as curves and determines the curve values (i.e., the custom attribute values) from the attributes' setting information. The engine then determines the target movement information of the virtual object from the spacing distance between the reference destination and the current destination, the reference displacement distance, and each custom attribute value, and controls the virtual object to move differently from the reference animation file. After obtaining the updated animation file through the animation production terminal 400, the artist may send it to the development terminal 200 so that developers can apply it to the actually required application scenario.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an animation terminal 400 according to an embodiment of the present application, and the animation terminal 400 shown in fig. 2 includes: at least one processor 410, memory 450, at least one network interface 420, and a user interface 430. The various components in animation terminal 400 are coupled together by a bus system 440. It is understood that the bus system 440 is used to enable communications among the components. The bus system 440 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 440 in fig. 2.
The Processor 410 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components; the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 430 includes one or more output devices 431, including one or more speakers and/or one or more visual displays, that enable the presentation of media content. The user interface 430 also includes one or more input devices 432, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display screen, camera, other input buttons and controls.
The memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 450 optionally includes one or more storage devices physically located remote from processor 410.
The memory 450 includes either volatile memory or nonvolatile memory, and may include both volatile and nonvolatile memory. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 450 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data, examples of which include programs, modules, and data structures, or a subset or superset thereof, to support various operations, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 452 for communicating with other computing devices via one or more (wired or wireless) network interfaces 420; exemplary network interfaces 420 include Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like;
a presentation module 453 for enabling presentation of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices 431 (e.g., display screens, speakers, etc.) associated with user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices 432 and translating the detected inputs or interactions.
In some embodiments, the apparatus provided by the embodiments of the present application may be implemented in software, and fig. 2 shows an animation data processing apparatus 455 stored in the memory 450, which may be software in the form of programs and plug-ins, and includes the following software modules: a first obtaining module 4551, a second obtaining module 4552, a first determining module 4553 and a second determining module 4554, which are logical and thus may be arbitrarily combined or further divided according to the functions implemented.
The functions of the respective modules will be explained below.
In other embodiments, the apparatus provided in the embodiments of the present Application may be implemented in hardware, and as an example, the apparatus provided in the embodiments of the present Application may be a processor in the form of a hardware decoding processor, which is programmed to perform the animation data processing method provided in the embodiments of the present Application, for example, the processor in the form of the hardware decoding processor may be one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components.
The animation data processing method provided by the embodiment of the present application will be described with reference to an exemplary application and implementation of the terminal provided by the embodiment of the present application, and the animation data processing method is applied to an animation production terminal.
Referring to fig. 3, fig. 3 is a schematic diagram of an implementation flow of an animation data processing method provided in an embodiment of the present application, and will be described with reference to steps shown in fig. 3.
Step S101, obtaining a reference animation file, reference movement information of a virtual object in the reference animation file, and a current destination of the virtual object.
Here, the reference movement information includes at least a reference destination and a reference displacement distance, and in some embodiments, the reference movement information may further include a reference movement direction. The virtual object may be a virtual game character or a virtual pet provided in the application.
The reference animation file may be created in advance by the artist through the animation production terminal. In the reference animation file, the virtual object moves toward the reference destination, performs some action after reaching it, and then returns from the reference destination to the start point; the virtual object directly faces the reference destination, as shown in fig. 20. In the embodiments of the present application, a movement different from the reference animation file, for example at a different angle and/or a different displacement distance, can be implemented on top of the reference animation file by setting the current destination, creating movement attributes, and so on. In an actual implementation, the line connecting the current destination and the reference destination is perpendicular to the line connecting the virtual object's start point and the reference destination.
Step S102, creating a first movement attribute, a second movement attribute, and a third movement attribute on the skeleton of the virtual object, and acquiring setting information for the first movement attribute, the second movement attribute, and the third movement attribute to obtain an updated animation file.
In implementation, the first movement attribute, the second movement attribute, and the third movement attribute may be created on a bone of the virtual object through a plug-in, where the bone may be the highest-level "Root" bone of the skeleton of the virtual object, or a bone at another level of the skeleton. Creating a customized movement attribute on a bone can be understood as creating or adding a parameter on the bone; that is, a movement attribute may refer to a movement parameter of the bone. The setting of the first movement attribute, the second movement attribute, and the third movement attribute may also be implemented by the plug-in. In some embodiments, the setting for the first movement attribute, the second movement attribute, and the third movement attribute may be implemented by setting corresponding key frames at the respective times of start of departure, arrival at the current destination, start of return, and arrival at the home position.
Step S103, importing the updated animation file into an engine, and determining superimposed movement information of the virtual object relative to the reference movement information based on the distance between the reference destination and the current destination, the reference displacement distance, and the setting information.
In some embodiments, after the updated animation file is imported into the engine, the engine may identify the three created movement attributes as three curves, and determine a curve value (i.e., a movement attribute value) of each curve based on the setting information of the three movement attributes, so that the three curves can be presented; the abscissa of each curve may be represented by key frames, and the ordinate is the movement attribute value corresponding to the key frame. The curve value of the first movement curve (i.e., the first movement attribute value) serves as the coefficient of the displacement value and the capsule body rotation value to be superimposed; the curve value of the second movement curve (i.e., the second movement attribute value) represents the coefficient of the rotation angle of the skeleton in the two intervals from "not yet departed" to "start of departure" and from "arrival at the home position" to "end of animation"; the curve value of the third movement curve (i.e., the third movement attribute value) represents the coefficient of the rotation angle of the skeleton in the interval from "arrival at the current destination" to "start of return".
When step S103 is implemented, the included angle between a first movement track, along which the virtual object moves from the starting point to the current destination, and a second movement track, along which the virtual object moves from the starting point to the reference destination, may first be determined. Then, the superimposed movement information of the virtual object relative to the reference movement information may be determined through trigonometric functions based on the calculated included angle, the separation distance between the reference destination and the current destination, the reference displacement distance, and the movement attribute values corresponding to the three movement attributes, where the superimposed movement information includes superimposed displacement information and superimposed rotation angle information.
Step S104, determining target movement information of the virtual object based on the overlay movement information and the reference movement information, and controlling the virtual object to move based on the target movement information.
The superimposed movement information is superimposed on the reference movement information to obtain the target movement information of the virtual object. The virtual object is then controlled to move from the starting point based on the target movement information until it reaches the current destination, performs a certain action after reaching the current destination, and returns from the current destination to the starting point, after which the animation ends. In the embodiment of the present application, the distance and direction of the current destination relative to the starting point of the virtual object differ from those of the reference destination, so some self-defined movement attributes are created and set on the skeleton of the virtual object of the reference animation file; then, through a series of calculations, the target movement parameters for the virtual object to move from the starting point to the current destination can be determined, thereby realizing movements at different angles and with different displacements.
In the animation data processing method provided in the embodiment of the present application, a reference animation file, reference movement information of a virtual object in the reference animation file, and a current destination of the virtual object are first obtained, the reference movement information including at least a reference destination and a reference displacement distance. Then, a first movement attribute, a second movement attribute, and a third movement attribute are created on the skeleton of the virtual object of the animation file, and setting information for the first movement attribute, the second movement attribute, and the third movement attribute is acquired, yielding an updated animation file. After the updated animation file is imported into an engine, superimposed movement information of the virtual object relative to the reference movement information is determined based on the separation distance between the reference destination and the current destination, the reference displacement distance, and the setting information. Finally, target movement information of the virtual object is determined based on the superimposed movement information and the reference movement information, and the virtual object is controlled to move based on the target movement information. In this way, when animation files with different movement parameters, such as different angles and different displacement distances, need to be produced, it is sufficient to add the three movement attributes and the setting information of each movement attribute on the basis of the reference animation file and import the updated animation file into the engine to determine the target movement information and control the movement of the virtual object, which can reduce the production difficulty, reduce the package size, and improve the production efficiency.
In some embodiments, the rhythm of the displacement, that is, the moving speed of the virtual object, may be controlled by the animation file instead of by code-controlled uniform motion. As shown in fig. 4, before step S104, the art producer may produce animations with different displacement rhythms for characters of different body types through the following steps:
step S201, size information of the virtual object is acquired.
In the embodiment of the present application, the size information of the virtual object may include the height, width, and build (fat or thin) information of the virtual object.
Step S202, obtaining the moving speed corresponding to the size information.
When step S202 is implemented, a correspondence between size information and moving speed may be pre-established, and the moving speed corresponding to the size information of the virtual object is then determined based on this correspondence. For example, in the correspondence, a virtual object with a larger size may correspond to a lower moving speed and a virtual object with a smaller size to a higher moving speed; or, depending on the actual requirements of the animation, a taller and thinner virtual object may correspond to a higher moving speed and a shorter and fatter virtual object to a lower moving speed.
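As a sketch of the correspondence described above, the lookup could be implemented as follows; all thresholds and speed values here are illustrative assumptions, not values from this application:

```python
# Illustrative sketch: map a virtual object's size information to a moving
# speed through a pre-established correspondence. Thresholds and speeds
# are assumptions for demonstration only.
def get_moving_speed(height: float, girth: float) -> float:
    if height >= 180 and girth < 90:
        return 600.0  # tall and thin: higher moving speed (assumed units/s)
    if height < 150 or girth >= 120:
        return 300.0  # short or fat: lower moving speed
    return 450.0      # medium body type: default speed

print(get_moving_speed(185, 80))
```

The returned speed would then be combined with the target movement information in step S104 to control the displacement rhythm.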
Correspondingly, as shown in fig. 4, controlling the virtual object to move based on the target movement information in step S104 may be implemented by controlling the virtual object to move based on the moving speed and the target movement information. That is, the moving speed of the virtual object can be set by art personnel based on the size information of the virtual object, thereby realizing control of the displacement rhythm and increasing the diversity of the virtual objects in the animation file.
In some embodiments, the step S102 "creating the first movement attribute, the second movement attribute, and the third movement attribute on the skeleton of the virtual object and acquiring the setting information for the first movement attribute, the second movement attribute, and the third movement attribute" shown in fig. 3 may be implemented by:
step S1021, a first movement attribute, a second movement attribute and a third movement attribute are created on the skeleton of the virtual object through a plug-in.
In implementation, the movement attributes can be created through a plug-in on the highest-level "Root" bone of the virtual object, or on bones at other levels of the virtual object. Creating a custom movement attribute amounts, in implementation, to adding a new movement parameter to the bone. In the embodiment of the present application, three movement attributes are created, namely a first movement attribute, a second movement attribute, and a third movement attribute; in other embodiments, the three movement attributes may be named, respectively, button_1, button_2, and button_3.
Step S1022, acquiring a first key frame corresponding to the start of departure, a second key frame corresponding to the arrival at the current destination, a third key frame corresponding to the start of return, and a fourth key frame corresponding to the arrival at the home position, all set through the plug-in.
In some embodiments, a plug-in button control may be clicked on a certain key frame of the animation file; at this time, a display interface for selecting start of departure, arrival at the current destination, start of return, and arrival at the home position is presented, as shown in fig. 8A. The "3" in the upper left corner indicates the third key frame; if "start of departure" is clicked at this time, the 3rd key frame is set as the key frame for "start of departure". Similarly, if the 9th frame is the key frame corresponding to arrival at the current destination, the timeline is slid to the 9th key frame, the button control corresponding to the plug-in is clicked or touched, and "arrival at the current destination" is clicked. The key frames corresponding to "start of return" and "arrival at the home position" may be set in the same way.
Step S1023, determining the first key frame, the second key frame, the third key frame, and the fourth key frame as the setting information.
In the embodiment of the present application, the first key frame, the second key frame, the third key frame, and the fourth key frame are determined as setting information for the first movement attribute, the second movement attribute, and the third movement attribute, and the movement attribute values corresponding to the first movement attribute, the second movement attribute, and the third movement attribute are determined based on the setting information.
In some embodiments, when the updated animation file is imported into the engine, the engine may identify the first movement attribute, the second movement attribute, and the third movement attribute created in the above steps as three curves, that is, a first movement curve, a second movement curve, and a third movement curve, and then determine, based on the setting information, a first movement attribute value of the first movement curve corresponding to the first movement attribute, a second movement attribute value of the second movement curve corresponding to the second movement attribute, and a third movement attribute value of the third movement curve corresponding to the third movement attribute.
The engine respectively identifies the three created movement attributes as three curves, and determines the curve value (i.e., movement attribute value) of each curve based on the setting information of the three movement attributes, so that three curves can be presented; the abscissa of each curve may be represented by key frames, and the ordinate is the movement attribute value corresponding to the key frame. The curve value of the second movement curve (i.e., the second movement attribute value) represents the coefficient of the rotation angle of the skeleton in the two intervals from "not yet departed" to "start of departure" and from "arrival at the home position" to "end of animation"; the curve value of the third movement curve (i.e., the third movement attribute value) represents the coefficient of the rotation angle of the skeleton in the interval from "arrival at the current destination" to "start of return". These movement attribute values are used in the calculation of the displacement distance and the rotation angle in the subsequent steps.
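As an illustration of how such a curve might be evaluated, the sketch below treats a movement attribute as a piecewise-constant curve keyed at the four key frames; the frame numbers and values are hypothetical, not the application's actual data:

```python
# Sketch: a movement attribute as a piecewise-constant curve keyed at the
# four key frames ("start of departure", "arrival at the current
# destination", "start of return", "arrival at the home position").
# Frame numbers and values are illustrative assumptions.
def make_curve(keys):
    """keys: list of (frame, value) pairs sorted by frame number."""
    def evaluate(frame):
        value = keys[0][1]
        for f, v in keys:
            if frame >= f:
                value = v   # hold the most recent key's value
            else:
                break
        return value
    return evaluate

# e.g. a first movement curve that is 1 while the object is being displaced
curve0 = make_curve([(0, 0.0), (3, 1.0), (20, 0.0)])
print(curve0(10))
```

An engine would typically interpolate between keys rather than hold them; the step-wise form here simply mirrors how the key frames assign coefficient values over the animation's intervals.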
In some embodiments, step S103 shown in fig. 3, "determining the superimposed movement information of the virtual object relative to the reference movement information based on the separation distance between the reference destination and the current destination, the reference displacement distance, and the setting information", may be implemented as steps S1031 to S1034 shown in fig. 5; each step is described below with reference to fig. 5.
And step S1031, determining an included angle between the first moving track and the second moving track based on the spacing distance and the reference displacement distance.
Here, the first movement track is the movement track along which the virtual object reaches the current destination, and the second movement track is the movement track along which the virtual object reaches the reference destination. Taking fig. 13 as an example, the first movement track is OD₂ and the second movement track is OD₁. In the embodiment of the present application, the line connecting the current destination and the reference destination is perpendicular to the line connecting the starting point of the virtual object and the reference destination; that is, the line connecting the current destination and the reference destination is perpendicular to the second movement track, and the current destination, the reference destination, and the starting point of the virtual object form a right triangle, namely ΔD₁OD₂ in fig. 13. The line connecting the current destination and the reference destination and the second movement track are the two legs of this right triangle, and the first movement track is its hypotenuse.
The distance between the current destination and the reference destination is the separation distance, and the length of the second movement track is the reference displacement distance; therefore, the included angle between the first movement track and the second movement track is obtained by taking the arctangent of the ratio of the separation distance to the reference displacement distance. The arctangent yields a radian value between 0 and π/2, which may be converted into a degree value between 0 and 90 degrees for subsequent calculation.
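A minimal sketch of this angle calculation, using Python's math module (variable names are illustrative):

```python
import math

# Sketch: the included angle between the two movement tracks is the
# arctangent of the ratio of the separation distance to the reference
# displacement distance, converted here to degrees.
def included_angle_deg(spacing: float, reference_distance: float) -> float:
    return math.degrees(math.atan(spacing / reference_distance))

print(included_angle_deg(20.0, 20.0))  # equal legs: prints 45.0
```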
Step S1032, determining superimposed displacement information of the skeleton of the virtual object based on the separation distance, the reference displacement distance, the first movement attribute value, and the included angle.
In this step, the included angle is ∠D₁OD₂ in fig. 13, i.e., ∠B; the separation distance is D₁D₂; and the reference displacement distance is OD₁. The superimposed displacement information may include a superimposed displacement value in the pitch angle direction and a superimposed displacement value in the roll angle direction. When step S1032 is implemented, OD₂ can be calculated by formula (1-1):
OD₂ = OD₁/cosB (1-1);
where cos is the cosine function. Then ED₂ = OD₂ − OE; since OE = OD₁, ED₂ can be derived from formula (1-2):
ED₂ = OD₁/cosB − OD₁ (1-2);
then, EQ can be derived from formula (1-3):
EQ = ED₂*cosB = OD₁ − OD₁*cosB (1-3);
in some embodiments, since ∠A + ∠B = 90°, cosB = sinA, and therefore EQ can also be derived using formula (1-4):
EQ = OD₁ − OD₁*sinA (1-4);
EP can be calculated by formula (1-5):
EP = ED₂*sinB = OD₁*tanB − OD₁*sinB (1-5);
as can be seen from fig. 13, OD₁*tanB is the separation distance D₁D₂. Thus, in some embodiments, EP may also be calculated as EP = D₁D₂ − LineB, where LineA = OD₁*cosB and LineB = OD₁*sinB.
In the embodiment of the present application, after the values of EP and EQ are obtained, the first movement attribute value is multiplied by EP and EQ, respectively, so that superimposed displacement information of the skeleton of the virtual object in the pitch angle direction and superimposed displacement information of the skeleton of the virtual object in the roll angle direction are obtained.
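The calculation above can be sketched as a direct transcription of formulas (1-2) through (1-5), with the products by the first movement attribute value applied at the end (names are illustrative):

```python
import math

# Sketch transcribing formulas (1-2) to (1-5): superimposed displacement of
# the skeleton in the roll (EQ) and pitch (EP) directions, each scaled by
# the first movement attribute value curve0.
def superimposed_displacement(od1: float, angle_b_rad: float, curve0: float):
    """od1: reference displacement distance; angle_b_rad: included angle B."""
    ed2 = od1 / math.cos(angle_b_rad) - od1   # formula (1-2)
    eq = ed2 * math.cos(angle_b_rad)          # formula (1-3): OD1 - OD1*cosB
    ep = ed2 * math.sin(angle_b_rad)          # formula (1-5): OD1*tanB - OD1*sinB
    return ep * curve0, eq * curve0           # pitch direction, roll direction

ep, eq = superimposed_displacement(20.0, math.radians(45.0), 1.0)
```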
Step S1033, obtaining a reference rotation angle in the reference movement information, and determining first superimposed rotation information of the capsule body of the virtual object based on the reference rotation angle, the included angle, and the second movement attribute value.
Here, the first superimposed rotation information is the superimposed rotation information applied to the capsule body. The capsule body, which may also be called a capsule collider, consists of a cylinder with a hemisphere at each end. Typically, in game animation, the capsule (or other shape) of the virtual object is driven through the scene by a controller, and the data of the capsule is then used to drive the animation. For example, if the capsule is moving forward, the system knows to play a running or walking animation on the character, producing the effect that the character is moving under its own power.
Step S1034, determining second superimposed rotation information of the skeleton based on the first superimposed rotation information of the capsule body and the third movement attribute value.
The second superimposed rotation information is also the superimposed rotation information applied to the skeleton.
Through the above steps S1031 to S1034, the superimposed displacement information of the skeleton of the virtual object, the first superimposed rotation information of the capsule body, and the second superimposed rotation information of the skeleton are determined respectively; that is, the superimposed movement parameter values of the virtual object for the current movement are obtained, so that movements at different angles and/or with different displacements can be performed based on these superimposed movement parameter values.
The step S1031 "determining the included angle between the first movement trajectory and the second movement trajectory based on the separation distance and the reference displacement distance" may be implemented by the following steps:
step S311, determining a ratio of the separation distance and the reference displacement distance.
Here, the line connecting the initial position of the virtual object and the reference destination is perpendicular to the line connecting the reference destination and the current destination. Assuming that the separation distance is 20 and the reference displacement distance is also 20, the ratio of the separation distance to the reference displacement distance is 20/20 = 1.
Step S312, determining the arctan value of the ratio as an included angle between the first moving trajectory and the second moving trajectory.
Here, since the line connecting the initial position of the virtual object and the reference destination is perpendicular to the line connecting the reference destination and the current destination, the ratio of the separation distance to the reference displacement distance is the tangent of the included angle; taking the arctangent of this ratio thus yields the included angle between the first movement track and the second movement track. After the included angle between the first movement track and the second movement track is calculated, the superimposed displacement information and the superimposed rotation information can be determined through trigonometric functions.
In some embodiments, the step S1032 "of determining the superimposed displacement information of the skeleton of the virtual object based on the separation distance, the reference displacement distance, the first movement attribute value, and the included angle" may be implemented by the following steps S321 to S324:
step S321, determining a first length based on a product of the reference displacement distance and the cosine value of the included angle.
Here, the first length is the product of the reference displacement distance and the cosine of the included angle, i.e., OD₁*cosB in the above steps; that is, the first length LineA = OD₁*cosB.
Step S322, determining a second length based on the reference displacement distance and the first length.
Here, the reference displacement distance is the hypotenuse of a right triangle (ΔCOD₁ in fig. 13) whose two legs are the first length and the second length, so step S322 may, when implemented, determine the second length LineB using the Pythagorean theorem, i.e., formula (1-6):
LineB = √(OD₁² − LineA²) (1-6);
where LineB is the length of CD₁ in fig. 13.
Step S323, determining superimposed displacement information of the skeleton in the roll angle direction based on the reference displacement distance, the first length, and the first movement attribute value.
As can be seen from the above formula (1-3), EQ = OD₁ − OD₁*cosB, i.e., EQ = OD₁ − LineA; the superimposed displacement information of the skeleton in the roll angle direction is therefore (OD₁ − LineA)*curve0, where curve0 is the first movement attribute value.
Step S324, determining superimposed displacement information of the skeleton in the pitch angle direction based on the separation distance, the second length, and the first movement attribute value.
Step S324 may be implemented by:
step S3241, determining a difference between the separation distance and the second length.
As can be seen from fig. 13, ΔCED₁ and ΔEQD₁ are congruent triangles, so CD₁ = QD₁, and CD₁ is LineB; therefore formula (1-7) holds:
QD₂ = D₁D₂ − QD₁ = D₁D₂ − LineB (1-7);
where D₁D₂ is the separation distance and LineB is the second length; that is, step S3241 determines QD₂ by formula (1-7).
Step S3242, determining a superimposed displacement value of the skeleton in the pitch angle direction based on the difference value and the first movement attribute value.
On the basis of formula (1-7), the superimposed displacement value of the skeleton in the pitch angle direction is (D₁D₂ − LineB)*curve0.
Step S3243 determines a first direction coefficient corresponding to a direction of deviation of the current destination from the reference destination.
In the embodiment of the present application, when the offset direction of the current destination with respect to the reference destination is the right direction, the first direction coefficient is 1, and when the offset direction is the left direction, the first direction coefficient is -1.
Step S3244, determining the superimposed displacement information of the framework in the pitch angle direction based on the superimposed displacement value of the framework in the pitch angle direction and the first direction coefficient.
When step S3244 is implemented, the superimposed displacement value of the skeleton in the pitch angle direction may be multiplied by the first direction coefficient to obtain the actual superimposed displacement information of the skeleton in the pitch angle direction.
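Steps S321 through S3244 can be sketched together as follows, transcribing formulas (1-6) and (1-7) and the products with the first movement attribute value and the direction coefficient (names are illustrative):

```python
import math

# Sketch of steps S321 to S3244: LineA, LineB via formula (1-6), and the
# superimposed roll/pitch displacement of the skeleton. lr is the first
# direction coefficient: 1 for a rightward offset, -1 for leftward.
def skeleton_superimposed_displacement(spacing, od1, angle_b_rad, curve0, lr):
    line_a = od1 * math.cos(angle_b_rad)          # step S321
    line_b = math.sqrt(od1 ** 2 - line_a ** 2)    # step S322, formula (1-6)
    roll = (od1 - line_a) * curve0                # step S323
    pitch = (spacing - line_b) * curve0 * lr      # steps S3241 to S3244
    return roll, pitch

roll, pitch = skeleton_superimposed_displacement(20.0, 20.0, math.atan(1.0), 1.0, 1)
```

With equal separation and reference distances the included angle is 45 degrees, and the roll and pitch superimposed values coincide, consistent with EQ and EP in formulas (1-3) and (1-5).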
In some embodiments, the above step S1033 "determining the first superimposed rotation information of the capsule body of the virtual object based on the reference rotation angle, the included angle and the second movement attribute value" may be implemented by:
and step S331, merging the reference rotation angle and the included angle to obtain a merged rotation angle.
Here, as shown in fig. 15, the reference rotation angle and the included angle between the first movement track and the second movement track may be merged through a "merge rotators" node on the animation blueprint.
Step S332 is to determine a first direction coefficient corresponding to the offset direction of the current destination with respect to the reference destination.
In the embodiment of the present application, when the offset direction of the current destination with respect to the reference destination is the right direction, the first direction coefficient is 1, and when the offset direction is the left direction, the first direction coefficient is -1.
Step S333, determining a product of the merged rotation angle, the first direction coefficient, and the second movement attribute value as first superimposed rotation information of the capsule body.
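Steps S331 through S333 can be sketched as follows, under the assumption that the merge operation reduces to adding the two yaw angles (names are illustrative):

```python
# Sketch of steps S331 to S333: merge the reference rotation angle and the
# included angle, then multiply by the first direction coefficient lr
# (1: rightward offset, -1: leftward) and the second movement attribute
# value curve1. Merging is assumed to be simple addition about the yaw axis.
def capsule_superimposed_rotation(reference_yaw_deg, included_angle_deg, lr, curve1):
    merged = reference_yaw_deg + included_angle_deg   # step S331
    return merged * lr * curve1                       # steps S332 and S333

print(capsule_superimposed_rotation(0.0, 45.0, -1, 1.0))  # prints -45.0
```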
In some embodiments, the above step S1034 "determining the second superimposed rotation information of the skeleton based on the first superimposed rotation information of the capsule body and the third movement attribute value" may be implemented by:
in step S341, a preset second direction coefficient is obtained.
Here, when the capsule body of the virtual object reaches the current destination after rotating by a certain angle, the skeleton needs to be rotated in the direction opposite to the capsule body in order to ensure that the skeleton directly faces the current destination; the second direction coefficient is therefore -1, representing that the rotation direction of the skeleton is opposite to that of the capsule body.
In step S342, a product of the first superimposition rotation information, the second direction coefficient, and the third movement attribute value is determined as the second superimposition rotation information.
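Steps S341 and S342 reduce to a single product, which can be sketched as follows (curve2 stands for the third movement attribute value; names are illustrative):

```python
# Sketch of steps S341 and S342: the skeleton's superimposed rotation is
# the capsule body's superimposed rotation multiplied by the preset second
# direction coefficient (-1, so the skeleton counter-rotates) and by the
# third movement attribute value curve2.
def skeleton_superimposed_rotation(capsule_rotation_deg, curve2, second_direction=-1):
    return capsule_rotation_deg * second_direction * curve2

print(skeleton_superimposed_rotation(45.0, 1.0))  # prints -45.0
```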
In some embodiments, after the superimposed displacement information of the skeleton, the second superimposed rotation information of the skeleton, and the first superimposed rotation information of the capsule body of the virtual object are determined, a "transform (modify) bone" node may be used in the animation blueprint to transform the highest-level "Root" bone of the skeleton, where the "translation" mode of the node needs to be changed to "add to existing" and the "translation space" needs to be changed to "world scene space". In this way, the superimposed displacement information and the superimposed rotation information are applied to the skeleton and the capsule body to obtain the target movement information of the virtual object, and the virtual object is then moved based on the target movement information.
Next, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
In the embodiment of the application, only one animation file including a displacement distance (i.e., the reference animation file in other embodiments) needs to be produced; the movement coefficients are generated and set through the plug-in, the differences in rotation and displacement are then calculated through trigonometric formulas and superimposed on the animation file, and movements at different angles and with different displacements can thereby be realized.
Fig. 6 is a schematic diagram of a further implementation flow of the animation data processing method according to the embodiment of the present application, and as shown in fig. 6, the flow includes:
step S601, generating a custom attribute (i.e. a mobile attribute in other embodiments) through a plug-in.
In implementation, the produced animation file with a displacement distance is opened and the custom attributes are added to the skeleton; the plug-in interface shown in fig. 7 is presented at this time. As shown in fig. 7, the custom attributes are created using the plug-in, and three attributes are automatically created, namely "button_1" 701, "button_2" 702, and "button_3" 703.
This is done so that when an animation file is imported into the engine, the engine will automatically identify the custom properties as animation curves, resulting in three animation curves.
Step S602, the displacement information is recorded.
In some embodiments, after the displacement information is recorded, the custom attributes are also set. In the animation file, key frames are set for the custom attributes at the positions of "start of departure", "arrival at the current destination", "start of return", and "arrival at the home position". This step is also implemented through the plug-in; the plug-in interface is shown in fig. 8A, and the effect after the key frames are set is shown in fig. 8B, where the curve corresponding to button_1 is 811, the curve corresponding to button_2 is 812, and the curve corresponding to button_3 is 813.
In this step, the set key frames are used to assign values to the three animation curves respectively: the first animation curve (i.e., the first movement curve in other embodiments) is used as the coefficient of the displacement distance value and the character capsule body rotation value to be superimposed; the second animation curve (i.e., the second movement curve in other embodiments) is used as the coefficient of the angle by which the skeleton needs to rotate in the two intervals from "not yet departed" to "start of departure" and from "arrival at the home position" to "end of animation"; the third animation curve (i.e., the third movement curve in other embodiments) is used as the coefficient of the angle by which the skeleton needs to rotate in the interval from "arrival at the current destination" to "start of return".
In step S603, a curve value is obtained.
In this step, the animation is imported into the engine, and the animation sequence displays three curves as shown in fig. 9, which are "button _ 1" 901, "button _ 2" 902, and "button _ 3" 903, respectively.
In actual implementation, as shown in fig. 10, the values of the three curves can be obtained using an "obtain Curve value" node 1001 and assigned through "SET" nodes 1002 to three variables, namely "Curve_0", "Curve_1", and "Curve_2". The curve values obtained in this step are prepared for the subsequent calculation of the superimposed values.
Step S604, the initial rotation information of the character/pet is obtained.
As shown in fig. 11, in implementation, an "acquire scene rotation" node 1101 acquires the initial rotation information and assigns it to the variable "Default Rotate" through a "SET" node 1102. The initial rotation information obtained in this step also prepares for the subsequent calculation of the superimposed values.
In step S605, rotation information and displacement information are calculated.
Assume that the displacement distance in the original animation file is DefaultDistance (i.e., the reference displacement distance in other embodiments), and the distance between the second destination to be reached and the first destination reached in the original animation is Spacing (i.e., the separation distance in other embodiments). As shown in fig. 12, the angle that needs to be rotated can then be calculated as arctan(Spacing/DefaultDistance). It should be noted that the current values are only values for rotating to the right.
The operation of this step calculates the size of the included angle between the movement tracks of the normal animation and the offset animation. This rotation value is not yet the value that finally needs to be superimposed; subsequent processing will continue on it in order to add the distinction between the left and right directions, that is, the division into positive and negative.
As shown in FIG. 13, assume that the first destination of the original animation is D1 and the second destination of the offset animation is D2. When DefaultDistance is Y and Spacing is 4x, as shown in FIG. 14, AngleA can be calculated according to formula (2-1):
AngleA = 90 - arctan(Spacing/DefaultDistance) (2-1);
as shown in FIG. 14, LineA and LineB can be determined according to equations (2-2) and (2-3):
LineA = Sin(AngleA) * DefaultDistance (2-2);
LineB = Cos(AngleA) * DefaultDistance (2-3);
the value to be superimposed on the Roll axis is: (DefaultDistance - LineA) * Curve_0, and the value to be superimposed on the Pitch axis is: (Spacing - LineB) * Curve_0. The displacement values to be superimposed on the skeleton are thus calculated.
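As a concrete illustration of formulas (2-1) to (2-3) and the superposition values of step S605, the calculation can be sketched in Python. The function and variable names below are illustrative only and are not part of the engine or of the claimed method:

```python
import math

def superposed_bone_offsets(default_distance, spacing, curve_0):
    """Illustrative sketch of step S605: offsets to superimpose on the skeleton.

    default_distance: displacement distance baked into the reference animation.
    spacing: lateral distance between the reference destination and the
             current destination.
    curve_0: current value of the first movement curve (typically 0..1).
    """
    # Included angle between the normal and offset movement tracks.
    included = math.atan2(spacing, default_distance)
    angle_a = math.radians(90.0) - included            # formula (2-1)
    line_a = math.sin(angle_a) * default_distance      # formula (2-2)
    line_b = math.cos(angle_a) * default_distance      # formula (2-3)
    roll_offset = (default_distance - line_a) * curve_0
    pitch_offset = (spacing - line_b) * curve_0
    return math.degrees(included), roll_offset, pitch_offset
```

With spacing = 0 the included angle and both offsets are zero, so the reference animation plays back unchanged; the curve value then scales the offsets in over the course of the clip.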
Step S606, calculating and applying the rotation value that actually needs to be superimposed on the capsule body.
As shown in fig. 15, the rotation value calculated in step S605 and the default rotation are merged using the "combine rotation" node 1501 to obtain a new rotation variable. The Yaw axis output is multiplied by an "LR" variable 1502: the offset is to the right when "LR" is 1 and to the left when "LR" is -1. The result is then multiplied by the value of the second curve to obtain the value that actually needs to be superimposed on the capsule body.
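Under the same illustrative assumptions (the function name, and the representation of the merged rotation as a simple yaw sum, are hypothetical and not engine API), the capsule value of step S606 can be sketched as:

```python
def capsule_yaw_overlay(default_yaw, included_angle, lr, curve_1):
    """Illustrative sketch of step S606: yaw to superimpose on the capsule body.

    lr is the direction coefficient: 1 for an offset to the right,
    -1 for an offset to the left; curve_1 is the second curve's value.
    """
    merged = default_yaw + included_angle  # merge with the default rotation
    return merged * lr * curve_1           # sign the yaw, then scale by the curve
```

The "LR" coefficient is what restores the left/right distinction that was deliberately dropped when the included angle was computed as a right-only value in step S605.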
Step S607, calculating the displacement value and the rotation information that actually need to be superimposed on the skeleton.
Since there is a distinction between the left and right directions, the displacement value calculated in step S605 needs to be input into a new vector. As shown in fig. 16, the Pitch axis output is multiplied by the variable "LR" 1601: the displacement is to the right when "LR" equals 1 and to the left when "LR" equals -1. At this point, the displacement value that actually needs to be superimposed on the skeleton is obtained.
Since the rotation on the skeleton must be exactly opposite to the rotation of the capsule body (otherwise, after the capsule body rotates by a certain degree, the character arriving at the destination would no longer face the target), the skeleton needs to be rotated back. As shown in fig. 17, the rotation value calculated in step S606 is multiplied by "-1", so that the rotation value of the skeleton is exactly opposite to the rotation value of the capsule body. This inverted value is then multiplied by a coefficient, i.e., the value of the third curve (Curve_2). Finally, the rotation value that actually needs to be superimposed on the skeleton is obtained.
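The back-rotation of the skeleton described above can be sketched in the same illustrative style (the function name is hypothetical):

```python
def skeleton_yaw_overlay(capsule_overlay, curve_2):
    """Illustrative sketch of step S607: the bone rotates opposite to the
    capsule, scaled by the third curve, so the character keeps facing the
    target while the capsule body carries the course change."""
    return capsule_overlay * -1.0 * curve_2
```

Because the skeleton value is derived from the capsule value with only a sign flip and a curve scale, the two rotations cancel whenever both curves evaluate to the same weight, which is what keeps the character oriented toward the destination.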
After the rotation value and the displacement value of the skeleton are obtained, they need to be applied to the skeleton. As shown in fig. 18, in the animation blueprint, the top-level "Root" bone of the skeleton is transformed using a "Transform (Modify) Bone" node 1801. As shown in fig. 19, the "Translation Mode" of this node needs to be changed to "Add to Existing", and the "Translation Space" needs to be changed to "World Space".
Fig. 20 is a schematic view showing the effect of normal displacement using an animation file produced in the conventional manner; as shown in fig. 20, the character/pet moves to a destination directly in front of it. Fig. 21 is a schematic diagram illustrating the effect of character movement achieved by the animation data processing method according to the embodiment of the present application; as shown in fig. 21, the character/pet moves to a destination on its front right.
In the animation data processing method provided by the embodiment of the application, only one animation file with one displacement distance needs to be produced, and displacement over different distances at different angles can still be achieved, which reduces the package size. When movement over different displacement distances at different angles is performed, parameters are added through the plug-in, and setting these parameters in the engine is not complex, so production efficiency is improved and production cost is reduced. Moreover, the rhythm of the displacement is controlled by the animation file rather than being a uniform motion controlled by code, so art producers can produce animations with different displacement rhythms for characters of different body types, and the animation expressiveness can be improved.
Continuing with the exemplary structure of the animation data processing device 455 provided in the embodiments of the present application implemented as software modules, in some embodiments, as shown in fig. 2, the software modules stored in the animation data processing device 455 of the memory 440 may include:
a first obtaining module 4551, configured to obtain a reference animation file, reference movement information of a virtual object in the reference animation file, and a current destination of the virtual object, where the reference movement information at least includes a reference destination and a reference displacement distance;
a second obtaining module 4552, configured to create a first movement attribute, a second movement attribute, and a third movement attribute on a skeleton of the virtual object, and obtain setting information for the first movement attribute, the second movement attribute, and the third movement attribute, so as to obtain an updated animation file;
a first determining module 4553, configured to import the updated animation file into an engine, and determine overlay movement information of the virtual object with respect to the reference movement information based on the separation distance between the reference destination and the current destination, the reference displacement distance, and the setting information;
a second determining module 4554, configured to determine target movement information of the virtual object based on the overlay movement information and the reference movement information, and control the virtual object to move based on the target movement information.
In some embodiments, the second obtaining module is further configured to:
creating a first movement attribute, a second movement attribute and a third movement attribute through a plug-in;
acquiring a first key frame corresponding to the starting of the plug-in, a second key frame corresponding to the arrival at the destination, a third key frame corresponding to the starting of the plug-in and a fourth key frame corresponding to the return to the original position;
determining the first key frame, the second key frame, the third key frame, and the fourth key frame as the setting information.
In some embodiments, the apparatus further comprises:
and a third determining module, configured to import the updated animation file into an engine, and determine, based on the setting information, a first movement attribute value of a first movement curve corresponding to the first movement attribute, a second movement attribute value of a second movement curve corresponding to the second movement attribute, and a third movement attribute value of a third movement curve corresponding to the third movement attribute.
In some embodiments, the first determining module is to:
determining an included angle between a first moving track and a second moving track based on the interval distance and the reference displacement distance, wherein the first moving track is a moving track corresponding to the virtual object reaching the destination, and the second moving track is a moving track corresponding to the virtual object reaching the reference destination;
determining superimposed displacement information of a skeleton of the virtual object based on the separation distance, the reference displacement distance, the first movement attribute value and the included angle;
acquiring a reference rotation angle in the reference movement information, and determining first superposed rotation information of a capsule body of the virtual object based on the reference rotation angle, the included angle and a second movement attribute value;
determining second superimposed rotation information of the skeleton based on the first superimposed rotation information of the capsule body and the third movement attribute value.
In some embodiments, the first determining module is further configured to:
determining the ratio of the spacing distance to the reference displacement distance, wherein a connecting line of the initial position of the virtual object and the reference destination is perpendicular to a connecting line of the reference destination and the current destination;
and determining the arc tangent value of the ratio as an included angle between the first moving track and the second moving track.
In some embodiments, the first determining module is further configured to:
determining a first length based on a product of the reference displacement distance and a cosine value of the included angle;
determining a second length based on the reference displacement distance and the first length, wherein the reference displacement distance is a hypotenuse length of a right triangle, and the first length and the second length are cathetus lengths;
determining superimposed displacement information of the skeleton in a roll angle direction based on the reference displacement distance, the first length and the first movement attribute value;
and determining the superposition displacement information of the framework in the pitch angle direction based on the spacing distance, the second length and the first movement attribute value.
In some embodiments, the first determining module is further configured to:
determining a difference between the separation distance and the second length;
determining a superposition displacement value of the framework in the pitch angle direction based on the difference value and the first movement attribute value;
determining a first direction coefficient corresponding to the offset direction of the current destination relative to the reference destination;
and determining the superposition displacement information of the framework in the pitch angle direction based on the superposition displacement value and the first direction coefficient.
In some embodiments, the first determining module is further configured to:
merging the reference rotation angle and the included angle to obtain a merged rotation angle;
determining a first direction coefficient corresponding to the offset direction of the destination relative to the reference destination;
determining a product of the merged rotation angle, the first direction coefficient and the second movement attribute value as first superimposed rotation information of the capsule body.
In some embodiments, the first determining module is further configured to:
acquiring a preset second direction coefficient;
determining a product of the first overlay rotation information, a second direction coefficient, and the third movement attribute value as the second overlay rotation information.
In some embodiments, the apparatus further comprises:
the third acquisition module is used for acquiring the size information of the virtual object;
the fourth acquisition module is used for acquiring the moving speed corresponding to the size information;
correspondingly, the second determining module further includes:
controlling the virtual object to move based on the moving speed and the target movement information.
Here, it should be noted that: the above description of the embodiment of the animation data processing apparatus is similar to the above description of the method, and has the same advantageous effects as the embodiment of the method. For technical details not disclosed in the embodiments of the animation data processing apparatus of the present application, those skilled in the art should understand with reference to the description of the embodiments of the method of the present application.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions, so that the computer device executes the animation data processing method according to the embodiment of the application.
Embodiments of the present application provide a computer-readable storage medium having stored therein executable instructions, which when executed by a processor, will cause the processor to perform a method provided by embodiments of the present application, for example, the method as illustrated in fig. 3, fig. 4, fig. 5, and fig. 6.
In some embodiments, the computer-readable storage medium may be a memory such as an FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one of or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of a program, software module, script, or code written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, for example, in one or more scripts in a HyperText Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices located at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (13)

1. An animation data processing method, comprising:
acquiring a reference animation file, reference movement information of a virtual object in the reference animation file and a current destination of the virtual object, wherein the reference movement information at least comprises a reference destination and a reference displacement distance;
creating a first movement attribute, a second movement attribute and a third movement attribute on the skeleton of the virtual object, and acquiring setting information aiming at the first movement attribute, the second movement attribute and the third movement attribute to obtain an updated animation file;
importing the updated animation file into an engine, and determining superposition movement information of the virtual object relative to the reference movement information based on the spacing distance between the reference destination and the current destination, the reference displacement distance and the setting information;
determining target movement information of the virtual object based on the overlay movement information and the reference movement information, and controlling the virtual object to move based on the target movement information.
2. The method of claim 1, wherein creating the first movement attribute, the second movement attribute, and the third movement attribute on the skeleton of the virtual object and obtaining the setting information for the first movement attribute, the second movement attribute, and the third movement attribute comprises:
creating, by a plug-in, a first movement attribute, a second movement attribute, and a third movement attribute on a skeleton of the virtual object;
acquiring a first key frame corresponding to the starting of the plug-in, a second key frame corresponding to the arrival at the destination, a third key frame corresponding to the starting of the plug-in and a fourth key frame corresponding to the return to the original position;
determining the first key frame, the second key frame, the third key frame, and the fourth key frame as the setting information.
3. The method of claim 2, further comprising:
and importing the updated animation file into an engine, and determining a first movement attribute value of a first movement curve corresponding to the first movement attribute, a second movement attribute value of a second movement curve corresponding to the second movement attribute and a third movement attribute value of a third movement curve corresponding to the third movement attribute based on the setting information.
4. The method according to claim 3, wherein the determining superimposed movement information of the virtual object with respect to the reference movement information based on the separation distance between the reference destination and the current destination, the reference displacement distance, and the setting information comprises:
determining an included angle between a first moving track and a second moving track based on the interval distance and the reference displacement distance, wherein the first moving track is a moving track corresponding to the virtual object reaching the destination, and the second moving track is a moving track corresponding to the virtual object reaching the reference destination;
determining superimposed displacement information of a skeleton of the virtual object based on the separation distance, the reference displacement distance, the first movement attribute value and the included angle;
acquiring a reference rotation angle in the reference movement information, and determining first superposed rotation information of a capsule body of the virtual object based on the reference rotation angle, the included angle and a second movement attribute value;
determining second superimposed rotation information of the skeleton based on the first superimposed rotation information of the capsule body and the third movement attribute value.
5. The method of claim 4, wherein determining an angle between the first movement path and the second movement path based on the separation distance and the reference displacement distance comprises:
determining the ratio of the spacing distance to the reference displacement distance, wherein a connecting line of the initial position of the virtual object and the reference destination is perpendicular to a connecting line of the reference destination and the current destination;
and determining the arctangent value of the ratio as an included angle between the first moving track and the second moving track.
6. The method of claim 4, wherein determining overlay displacement information for a skeleton of the virtual object based on the separation distance, the reference displacement distance, the first movement attribute value, and the included angle comprises:
determining a first length based on a product of the reference displacement distance and a cosine value of the included angle;
determining a second length based on the reference displacement distance and the first length, wherein the reference displacement distance is the length of the hypotenuse of the right triangle, and the first length and the second length are the lengths of the catheti;
determining superimposed displacement information of the skeleton in a roll angle direction based on the reference displacement distance, the first length and the first movement attribute value;
and determining the superposition displacement information of the framework in the pitch angle direction based on the interval distance, the second length and the first movement attribute value.
7. The method of claim 6, wherein determining superimposed displacement information of the frame in a pitch direction based on the separation distance, the second length, and the first movement property value comprises:
determining a difference between the separation distance and the second length;
determining a superposition displacement value of the framework in the pitch angle direction based on the difference value and the first movement attribute value;
determining a first direction coefficient corresponding to the offset direction of the current destination relative to the reference destination;
and determining the superposition displacement information of the framework in the pitch angle direction based on the superposition displacement value and the first direction coefficient.
8. The method according to claim 4, wherein the determining first superimposed rotation information of the capsule body of the virtual object based on the reference rotation angle, the included angle and the second movement attribute value comprises:
combining the reference rotation angle and the included angle to obtain a combined rotation angle;
determining a first direction coefficient corresponding to the offset direction of the current destination relative to the reference destination;
determining a product of the merged rotation angle, the first direction coefficient and the second movement attribute value as first superimposed rotation information of the capsule body.
9. The method of claim 4, wherein determining the second overlay rotation information for the skeleton based on the first overlay rotation information for the capsule body and the third movement attribute value comprises:
acquiring a preset second direction coefficient;
determining a product of the first overlay rotation information, a second direction coefficient, and the third movement attribute value as the second overlay rotation information.
10. The method of claim 1, further comprising:
acquiring size information of the virtual object;
acquiring a moving speed corresponding to the size information;
correspondingly, the controlling the virtual object to move based on the target movement information includes:
controlling the virtual object to move based on the moving speed and the target movement information.
11. An animation data processing apparatus, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a reference animation file, reference movement information of a virtual object in the reference animation file and a current destination of the virtual object, and the reference movement information at least comprises a reference destination and a reference displacement distance;
a second obtaining module, configured to create a first movement attribute, a second movement attribute, and a third movement attribute on a skeleton of the virtual object, and obtain setting information for the first movement attribute, the second movement attribute, and the third movement attribute, so as to obtain an updated animation file;
a first determining module, configured to import the updated animation file into an engine, and determine superimposed movement information of a virtual object relative to the reference movement information based on the distance between the reference destination and the current destination, the reference displacement distance, and the setting information;
a second determination module for determining target movement information of the virtual object based on the overlay movement information and the reference movement information, and controlling the virtual object to move based on the target movement information.
12. An animation data processing apparatus characterized by comprising:
a memory for storing executable instructions;
a processor for implementing the method of any one of claims 1 to 10 when executing executable instructions stored in the memory.
13. A computer-readable storage medium having stored thereon executable instructions for, when executed by a processor, implementing the method of any one of claims 1 to 10.
HK42022062747.5A 2022-10-26 Animation data processing method, device, equipment and computer readable storage medium HK40073651B (en)

Publications (2)

Publication Number Publication Date
HK40073651A true HK40073651A (en) 2022-12-23
HK40073651B HK40073651B (en) 2025-09-26

Similar Documents

Publication Publication Date Title
US10803647B1 (en) Generating animation rigs using scriptable reference modules
Hartmann et al. Design as exploration: creating interface alternatives through parallel authoring and runtime tuning
US7336280B2 (en) Coordinating animations and media in computer display output
CN112927331B (en) Character model animation generation method and device, storage medium and electronic equipment
US9159168B2 (en) Methods and systems for generating a dynamic multimodal and multidimensional presentation
Alatalo An entity-component model for extensible virtual worlds
KR102618644B1 (en) Method and apparatus for generating composite image using 3d model
Dörner et al. Content creation and authoring challenges for virtual environments: from user interfaces to autonomous virtual characters
CN116452786A (en) Virtual reality content generation method, system, computer device and storage medium
US11625900B2 (en) Broker for instancing
HK40073651A (en) Animation data processing method, device, equipment and computer readable storage medium
CN115006845B (en) Animation data processing method, device, equipment and computer readable storage medium
Bao et al. A toolkit for automatically generating and modifying VR hierarchy tile menus
CN115794979A (en) GIS digital twinning system based on Unity3D
CN202771416U (en) Visual three-dimensional (3D) programming device
US8682464B2 (en) System and method for generating a three-dimensional image
Bull Integrating dynamic views using model driven development
Perevalov et al. openFrameworks Essentials
KR102874054B1 (en) 3D animation production system using animation library tools
Trindade et al. LVRL: Reducing the gap between immersive VR and desktop graphical applications
Kovács et al. The Theatre Metaphor for Spatial Computing in Architectural Design
Pirchheim et al. Visual Programming for Hybrid User Interfaces
CN119127022A (en) A three-dimensional presentation type intelligent operating system and its interactive method
CN121147424A (en) Virtual scene dynamic adaptation system, method, media, program products and terminals based on spatial computing
Shangguan et al. Implementation of a simulation platform for BCM with xPC target