CN117011426A - Object processing method, apparatus, computer device, storage medium, and program product


Info

Publication number
CN117011426A
Authority
CN
China
Prior art keywords
target
object data
data
feather
processing
Prior art date
Legal status
Pending
Application number
CN202211287170.3A
Other languages
Chinese (zh)
Inventor
韩宏
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202211287170.3A priority Critical patent/CN117011426A/en
Publication of CN117011426A publication Critical patent/CN117011426A/en
Pending legal-status Critical Current

Classifications

    • G06T13/20: 3D [Three Dimensional] animation
    • G06T13/40: 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G06T15/005: General purpose rendering architectures (3D [Three Dimensional] image rendering)
    • G06T17/20: Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2200/04: Indexing scheme for image data processing or generation, in general, involving 3D image data
    • G06T2219/2021: Indexing scheme for editing of 3D models; shape modification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application relates to an object processing method, apparatus, computer device, storage medium, and program product. The method comprises the following steps: acquiring a basic object frame, extracting each object attribute parameter value corresponding to the basic object frame, and generating object data based on the basic object frame and each object attribute parameter value, wherein the object data comprises an object hair curve, an object grid body and an object map; performing format matching processing on the object data according to a target engine, determining target object data matched with the target engine, and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect. The method can flexibly and rapidly generate object data corresponding to different object attribute parameter values, generate object data in various forms that better adapt to different processing engines or application programs, improve the production efficiency of target objects with a realistic effect, and reduce resource consumption.

Description

Object processing method, apparatus, computer device, storage medium, and program product
Technical Field
The present application relates to the field of artificial intelligence, and in particular, to an object processing method, an object processing apparatus, a computer device, a storage medium, and a computer program product.
Background
With the development of artificial intelligence technology and the gradual popularization of virtual elements in different technical fields such as games, in order to obtain a more realistic effect in large games and provide a high-quality virtual presentation for game users, virtual objects such as various game elements and game characters in different game scenes need to be constructed; for example, feathers with a realistic effect need to be constructed in various forms for some specific game scenes.
In conventional methods for generating feathers with different effects, feather patch models are usually placed manually during modeling and a pre-made feather map is assigned to each patch; to produce the feather map, suitable models and materials must first be found and the feather materials then converted into the required map types. In actual project production, several rounds of modification and adjustment are usually needed to reach the final feather effect. However, because the conventional feather production process requires switching between multiple software packages, and most production steps are linear and irreversible throughout the making and modifying process, a large change of effect means the work must be restarted according to the modification requirement and previous results cannot be reused, so the production efficiency of virtual objects such as feathers with different effects remains low.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an object processing method, apparatus, computer device, computer-readable storage medium, and computer program product that can improve production efficiency of virtual objects such as feathers with different effects.
In a first aspect, the present application provides an object processing method. The method comprises the following steps:
acquiring a basic object frame, and extracting each object attribute parameter value corresponding to the basic object frame;
generating object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
performing format matching processing on the object data according to a target engine, and determining target object data matched with the target engine;
and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect.
In one embodiment, the performing, according to a target engine, a format matching process on the object data, and determining target object data matched with the target engine includes:
acquiring a target object format supported by a target engine; and carrying out format matching processing on the object data according to the target object format, and determining target object data matched with the target object format.
In one embodiment, the type of the proxy mesh is used to represent the proxy mesh appearance, including a rectangular appearance and an object frame appearance; generating an object mesh body based on the object updated with the attribute information, the type of the proxy mesh, and the number of mesh segments includes:
determining a shape adjustment strength according to the type of the proxy mesh; performing shape adjustment on the proxy mesh according to the shape adjustment strength, using the object updated with the attribute information, to obtain a shape-adjusted proxy mesh; and generating the object mesh body based on the shape-adjusted proxy mesh, the number of mesh segments, and the object updated with the attribute information.
In one embodiment, the object attribute parameters include a feather shaft length, a feather shaft width, a feather shaft shape, a feather color, a feather width on each of the two sides of the feather shaft, and a feather shape.
In a second aspect, the application further provides an object processing device. The device comprises:
The object attribute parameter value extraction module is used for obtaining a basic object frame and extracting each object attribute parameter value corresponding to the basic object frame;
an object data generating module, configured to generate object data based on the basic object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
the target object data determining module is used for carrying out format matching processing on the object data according to a target engine and determining target object data matched with the target engine;
and the target object generation module is used for calling the target engine, rendering the target object data and generating a target object with a realistic effect.
In a third aspect, the present application also provides a computer device. The computer device comprises a memory storing a computer program and a processor which when executing the computer program performs the steps of:
acquiring a basic object frame, and extracting each object attribute parameter value corresponding to the basic object frame;
generating object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
performing format matching processing on the object data according to a target engine, and determining target object data matched with the target engine;
and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect.
In a fourth aspect, the present application also provides a computer-readable storage medium. The computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of:
acquiring a basic object frame, and extracting each object attribute parameter value corresponding to the basic object frame;
generating object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
performing format matching processing on the object data according to a target engine, and determining target object data matched with the target engine;
and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect.
In a fifth aspect, the present application also provides a computer program product. The computer program product comprises a computer program which, when executed by a processor, implements the steps of:
acquiring a basic object frame, and extracting each object attribute parameter value corresponding to the basic object frame;
generating object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
performing format matching processing on the object data according to a target engine, and determining target object data matched with the target engine;
and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect.
In the object processing method, apparatus, computer device, storage medium and program product, a basic object frame is acquired and each object attribute parameter value corresponding to the basic object frame is extracted, so that object data corresponding to different object attribute parameter values can be flexibly and rapidly generated based on the basic object frame and each object attribute parameter value. The object data comprises an object hair curve, an object grid body and an object map, and format matching processing is performed on the object data according to the target engine to determine the target object data matched with the target engine; by generating object data in various forms, the data can better adapt to different processing engines or application programs, avoiding format mismatches that would prevent the data from being processed. Further, the target engine is called to render the target object data to generate a target object with a realistic effect, so that the production efficiency of target objects with a realistic effect is improved and the resource consumption in the production process is reduced.
Drawings
FIG. 1 is a diagram of an application environment for an object processing method in one embodiment;
FIG. 2 is a flow diagram of a method of object processing in one embodiment;
FIG. 3 is a schematic view of the construction of a base feather in one embodiment;
FIG. 4 is a schematic diagram of object attribute parameters corresponding to a basic object framework in one embodiment;
FIG. 5 is a schematic diagram of object attribute parameters corresponding to a basic object framework according to another embodiment;
FIG. 6 is a schematic diagram of different object data in one embodiment;
FIG. 7 is a schematic diagram of a feather map file adjusted according to the shape of the feathers in one embodiment;
FIG. 8 is a schematic diagram of a feather map file according to a feather shape adjustment in another embodiment;
FIG. 9 is a schematic diagram of a feather patch, feather map file obtained by parameter adjustment of feathers in one embodiment;
FIG. 10 is a schematic view of feathers with a realistic effect in one embodiment;
FIG. 11 is a flowchart of an object processing method according to another embodiment;
FIG. 12 is a schematic diagram of a base object framework and a guide curve of an object handling method in one embodiment;
FIG. 13 is a schematic illustration of a process flow of a feather generator and a feather deformer in one embodiment;
FIG. 14 is a schematic diagram of deformed object data and deformed object mesh bodies in one embodiment;
FIG. 15 is a flow diagram of obtaining deformation object data in one embodiment;
FIG. 16 is a flow chart of an object processing method according to yet another embodiment;
FIG. 17 is a diagram of basic parameters of a basic object framework of an object processing method in one embodiment;
FIG. 18 is a diagram of replaced object data in one embodiment;
FIG. 19 is a flow chart of an object processing method according to another embodiment;
FIG. 20 is a block diagram of an object processing apparatus in one embodiment;
fig. 21 is an internal structural view of a computer device in one embodiment.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
The object processing method provided by the embodiment of the application relates to an artificial intelligence technology, wherein artificial intelligence (Artificial Intelligence, AI) is a theory, a method, a technology and an application system which simulate, extend and expand human intelligence by using a digital computer or a machine controlled by the digital computer, sense environment, acquire knowledge and acquire an optimal result by using the knowledge. In other words, artificial intelligence is a comprehensive technology of computer science, which attempts to understand the essence of intelligence and produce a new intelligent machine that can react in a similar way to human intelligence, i.e., artificial intelligence, i.e., research on the design principles and implementation methods of various intelligent machines, so that the machine has the functions of sensing, reasoning and decision. The artificial intelligence technology is used as a comprehensive discipline, and relates to a technology with a wide field range and a technology with a hardware level and a technology with a software level, wherein the artificial intelligence basic technology generally comprises technologies such as a sensor, a special artificial intelligence chip, cloud computing, distributed storage, big data processing technology, an operation/interaction system, electromechanical integration and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and other directions.
Computer vision (CV) technology is a science that studies how to make machines "see"; more specifically, it replaces human eyes with cameras and computers to perform machine vision tasks such as recognition, detection and measurement on a target, and further performs graphics processing so that the result becomes an image more suitable for human observation or for transmission to an instrument for detection. As a scientific discipline, computer vision studies related theories and technologies and attempts to build artificial intelligence systems that can acquire information from images or multidimensional data. Computer vision techniques typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D techniques, virtual reality, augmented reality, and simultaneous localization and mapping, as well as common biometric recognition techniques such as face recognition and fingerprint recognition. Machine learning (ML) is a multi-domain interdisciplinary subject involving probability theory, statistics, approximation theory, convex analysis, algorithm complexity theory and other disciplines. It studies how a computer simulates or implements human learning behavior to acquire new knowledge or skills and to reorganize existing knowledge structures so as to continuously improve its own performance. Machine learning is the core of artificial intelligence and the fundamental way of giving computers intelligence, and it is applied throughout the various areas of artificial intelligence. Machine learning and deep learning typically include techniques such as artificial neural networks, belief networks, reinforcement learning, transfer learning, inductive learning, and learning from demonstration.
With research and advancement of artificial intelligence technology, research and application of artificial intelligence technology is being developed in various fields, such as common smart home, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned, automatic driving, unmanned aerial vehicles, robots, smart medical treatment, smart customer service, etc., and it is believed that with the development of technology, artificial intelligence technology will be applied in more fields and with increasing importance value.
The object processing method provided by the embodiment of the application particularly relates to the technologies such as a computer vision technology, a machine learning technology and the like in an artificial intelligence technology, and can be applied to an application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104 or may be located on a cloud or other network server. The terminal 102 may be, but not limited to, various personal computers, notebook computers, smart phones, tablet computers, internet of things devices and portable wearable devices, and the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle devices, aircrafts, etc. The portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
Further, both the terminal 102 and the server 104 may be used separately to perform the object processing method provided in the embodiment of the present application, and the terminal 102 and the server 104 may also cooperate to perform the object processing method provided in the embodiment of the present application. For example, taking the case where the terminal 102 and the server 104 cooperatively execute the object processing method provided in the embodiment of the present application, the server 104 obtains a basic object frame, extracts each object attribute parameter value corresponding to the basic object frame, and generates object data based on the basic object frame and each object attribute parameter value. The object data specifically comprises an object hair curve, an object grid body and an object map. The basic object framework and the object attribute parameter values corresponding to the basic object framework may be stored in the local storage of the terminal 102, in the data storage system, or in the cloud storage of the server 104, and when the object is processed they may be obtained from the local storage of the terminal 102, the data storage system, or the cloud storage of the server 104. Further, the server 104 performs format matching processing on the object data according to the target engine, determines target object data matched with the target engine, then invokes the target engine and performs rendering processing on the target object data to generate a target object with a realistic effect. The server 104 may feed the obtained target object with the realistic effect back to the actual application scenario of a specific application program on the terminal 102, for example the game environment, game characters, game clothing, and game props in an actual game scenario of a game application program installed on the terminal 102, so as to provide the game user with a high-quality game picture with a realistic effect.
In one embodiment, as shown in fig. 2, an object processing method is provided, which is illustrated by way of example as being executed by a computer device. It is understood that the computer device may be the terminal 102 shown in fig. 1, or the server 104, or a system formed by the terminal 102 and the server 104 and implemented through interaction between the terminal 102 and the server 104. In this embodiment, the object processing method specifically includes the following steps:
step S202, a basic object framework is obtained, and each object attribute parameter value corresponding to the basic object framework is extracted.
In the development process of a game application program, in order to obtain a more realistic effect and provide a high-quality virtual presentation for game users, virtual objects such as various game elements and game characters in different game scenes need to be constructed; for example, feathers with a real effect in various forms need to be constructed for some specific game scenes.
For example, when feathers with a realistic effect are to be constructed, the basic object frame is a basic feather structure, where feathers can be divided by structure into contour feathers, down feathers, semiplumes, filoplumes, and powder down. As shown in fig. 3, a schematic structure of a basic feather is provided. Referring to fig. 3, taking a contour feather as an example, the structure of the contour feather shown in fig. 3 is divided into a plurality of parts such as the feather shaft, the feather root (quill), the barbs, and the barbules with hooklets, and the feathers of different parts present different shapes. The curve and the attribute parameters of each part are determined according to the different shapes of each part of the feather, and can be adjusted and controlled according to actual requirements so as to simulate the growth and modeling of a real feather, thereby generating a feather with a realistic effect.
Specifically, a basic object framework, such as a basic feather structure, is acquired, and the object attribute parameters corresponding to the basic feather structure are further acquired. The object attribute parameters specifically comprise the feather shaft length, the feather shaft width, the feather shaft shape, the feather color, the feather width on the left and right sides of the feather shaft, and the feather shape.
In one embodiment, as shown in fig. 4, object attribute parameters corresponding to the basic object frame are provided. Referring to fig. 4, when the basic object frame is a basic feather structure, the object attribute parameters corresponding to the basic feather structure include the feather shaft length, the feather shaft shape, and the feather shaft radius (i.e., the feather shaft width) shown in fig. 4.
Further, as shown in fig. 5, another set of object attribute parameters corresponding to the basic object frame is provided. Referring to fig. 5, when the basic object frame is a basic feather structure, the object attribute parameters corresponding to the basic feather structure include the feather color, the feather thickness (i.e., the feather width), and the feather shape shown in fig. 5. The feather shape and feather width can be set separately for the feathers on the two sides of the feather shaft, and can take the same parameter values or different parameter values.
If symmetrical feathers are needed, the width and shape of the feathers at two sides of the feather shaft can be set to be consistent, and a mirror image copying mode can be adopted specifically, specific attribute parameter values at one side are copied to the other side, for example, specific parameter values of the width and shape of the feathers at the left side are copied to the right side in a mirror image mode, so that complicated operation is reduced, and functions of one-key copying and rapid configuration at two sides are realized. Similarly, if a feather with a changeable shape and asymmetric left and right sides is required, the feather width and the feather shape of the feather on both sides of the feather shaft can be set respectively.
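By way of illustration only, the following Python sketch shows one way the per-side feather parameters and the one-key mirror copy described above could be organized; the class and method names (FeatherParams, VaneParams, mirror_left_to_right) are hypothetical and are not part of the disclosed tool.

```python
from dataclasses import dataclass, field, replace

@dataclass
class VaneParams:
    """Per-side vane parameters; one instance per side of the feather shaft."""
    width: float = 1.0            # vane width relative to the shaft length
    shape: str = "straight"       # e.g. "straight", "curved", "curled"
    color: tuple = (1.0, 1.0, 1.0)

@dataclass
class FeatherParams:
    """Attribute parameter values extracted for one basic feather structure."""
    shaft_length: float = 10.0
    shaft_width: float = 0.1
    shaft_shape: str = "straight"
    left: VaneParams = field(default_factory=VaneParams)
    right: VaneParams = field(default_factory=VaneParams)

    def mirror_left_to_right(self) -> None:
        """One-key mirror copy: reuse the left-side settings on the right side."""
        self.right = replace(self.left)

# usage: configure an asymmetric feather, then mirror it to make it symmetric
params = FeatherParams(shaft_length=12.0)
params.left = VaneParams(width=1.4, shape="curved", color=(0.3, 0.25, 0.2))
params.mirror_left_to_right()
assert params.right == params.left
```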
In step S204, object data including an object hair curve, an object mesh body, and an object map is generated based on the basic object frame and the respective object attribute parameter values.
The supported data formats or file formats differ among applications and processing engines, so different object data comprising an object hair curve, an object mesh body and an object map needs to be generated in order to adapt to the different processing engines or applications. For example, the object hair curve may be understood as the hair file of the feather, specifically a file in abc format; the hair file of the feather can be output by combining a preset abstract model of the feather with the object attribute parameters and parameter-curve control.
Similarly, the object grid body can be understood as a feather patch file, specifically a file in fbx format, which is obtained by acquiring the type of the proxy grid and the number of grid segments, and then outputting the feather patch file according to the shape and size of the feather, the type of the proxy grid and the number of grid segments. The proxy grid is used for displaying objects in a lighter-weight form, namely by combining an object grid body with an object map, such as combining a feather patch file with a feather map to display the feather.
The object map may be understood as the map file of the feather, specifically a file in tga format; a polygon corresponding to the object hair curve is obtained by converting the object hair curve, and baking processing is then performed on the polygon and the object mesh body (i.e., the feather patch file) to obtain the feather map file.
In one embodiment, as shown in fig. 6, a schematic diagram of different object data is provided, wherein fig. 6 (a) is the hair file of the feather, that is, the groom file shown in fig. (a), which describes the feather shape using a plurality of curves. Fig. 6 (b) shows the feather patch file (i.e., the object grid body), that is, the feather card file shown in fig. (b); the feather patch file can take different shapes, such as the rectangular frame shape shown in fig. (b), or other shapes that fit the feather outline more closely. The card & tex file shown in fig. 6 (c) is the feather effect displayed after the baked feather map file is combined with the feather patch file; the feather map file and the feather patch file can be directly loaded into an application program or a processing engine, rendered and then used, and similarly the groom file shown in fig. (a) can also be directly loaded into the application program or processing engine, rendered and then used.
Specifically, according to the attribute values of each object, the original attribute information of the basic object framework is modified, so that the object with updated attribute information can be obtained. Further, the object grid body is generated by acquiring the type of the proxy grid and the number of grid segments and based on the updated object of the attribute information, the type of the proxy grid and the number of grid segments. The object hair curve matched with the object with updated attribute is determined, and the object hair curve can be further subjected to conversion processing to obtain an object polygon corresponding to the object hair curve, so that the object map is obtained based on the object polygon and the object grid body by baking processing.
The object hair curve comprises a plurality of curves describing the modeling of the attribute-updated object, such as the hair curves of the feather, and these curves can be directly loaded into a program or a processing engine, rendered and then used. The object hair curve is obtained by modifying the original attribute information of the basic object frame according to each object attribute value to obtain the object with updated attribute information, and then determining the curves based on the updated object. Specifically, the abstract model of the object is modified and controlled by combining the updated attribute information, the object attribute parameters and the parameter curves, and the object hair curves are then output.
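As a minimal sketch of how hair curves could be produced from shaft parameters and a parameter curve, the following illustrative Python code generates alternating barb polylines, assuming a straight shaft along the Y axis and a callable width profile; the function name and the layout are assumptions and do not reproduce the disclosed abstract feather model.

```python
import numpy as np

def generate_barb_curves(shaft_length, vane_width, n_barbs=40, samples=8, curl=0.0):
    """Sample barb curves along a straight shaft lying on the +Y axis.

    Returns a list of (samples, 3) arrays, one polyline per barb, alternating
    left/right of the shaft. `vane_width` may be a constant or a callable
    profile f(t) giving the vane width at normalized shaft position t.
    """
    width_at = vane_width if callable(vane_width) else (lambda t: vane_width)
    curves = []
    for i in range(n_barbs):
        t = (i + 1) / (n_barbs + 1)            # position along the shaft (0..1)
        side = -1.0 if i % 2 == 0 else 1.0     # alternate left / right vane
        root_y = t * shaft_length
        u = np.linspace(0.0, 1.0, samples)     # parameter along the barb
        x = side * u * width_at(t)             # grow outward from the shaft
        y = root_y + 0.3 * u * width_at(t)     # sweep slightly toward the tip
        z = curl * u ** 2                      # optional curl out of the plane
        curves.append(np.stack([x, y, z], axis=1))
    return curves

# usage: a profile that tapers the vane toward the root and the tip
profile = lambda t: 2.0 * np.sin(np.pi * t)
barbs = generate_barb_curves(shaft_length=10.0, vane_width=profile, curl=0.2)
```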
Likewise, the object mesh body may be understood as an object patch file, such as a feather patch file; specifically, the object mesh body may be generated by acquiring the type of the proxy mesh and the number of mesh segments, and then generating it based on the object updated with the attribute information, the type of the proxy mesh, and the number of mesh segments.
The type of the proxy grid is used for representing the appearance of the proxy grid, and can comprise a rectangular appearance and an object frame appearance. The number of grid segments can be set and adjusted according to the requirements of the actual application scene and is not limited to specific values; the more grid segments, the more finely the shape of the object can be represented, providing a more realistic effect. For example, if a feather patch file needs to be output, the updated attribute information, such as the feather shape and the feather width, can be acquired from the object updated with the attribute information, and the object grid body is then generated by combining the type of the proxy grid, the number of grid segments, and information such as the shape and width of the feather.
In one embodiment, as shown in fig. 7, a schematic diagram of a feather map file obtained by adjusting the feather shape is provided. Referring to fig. 7, the type of the proxy grid may be a proxy grid with a rectangular appearance, as in fig. 7 (a); referring to fig. 7 (a), the initial state of the feather map file is obtained from the proxy grid and the attribute-updated object, and according to actual requirements the proxy grid is further adjusted according to the attribute-updated feather so that it fits the feather outline more closely, obtaining a feather map file whose shape fits the feather outline as shown in fig. 7 (b).
Further, as shown in fig. 8, another schematic diagram of a feather map file obtained according to feather shape adjustment is provided. Referring to fig. 8, for example, when the type of the proxy mesh shown in fig. 8 (a) is rectangular, if the feather shape is adjusted according to actual requirements, the proxy mesh is adjusted to a curved feather shape (as shown in fig. 8 (b)), so that the feather map file with the curved feather shape is obtained from the curved feather hair curves and the proxy mesh.
In one embodiment, the step of generating the object mesh body based on the object updated by the attribute information, the type of the proxy mesh and the number of mesh segments specifically includes:
determining a shape adjustment strength according to the type of the proxy grid; performing shape adjustment on the proxy grid according to the shape adjustment strength, using the object updated with the attribute information, to obtain a shape-adjusted proxy grid; and generating an object mesh body based on the shape-adjusted proxy mesh, the number of mesh segments, and the object updated with the attribute information.
Specifically, the type of the proxy grid is used for representing the appearance of the proxy grid, and specifically comprises a rectangular appearance and an object frame appearance. If the appearance is rectangular, for example when the object is a feather, the feather updated with the attribute information is used to adjust the shape of the proxy grid, and the determined shape adjustment strength is larger because the proxy grid needs to be brought close to the shape of the feather.
Similarly, if the type is the object frame appearance, for example when the object is a feather, the appearance of the proxy mesh is already close to the shape of the feather; when the shape of the proxy mesh is adjusted according to the feather updated with the attribute information, the proxy mesh therefore does not need to be adjusted greatly, and the determined shape adjustment strength is smaller, i.e., smaller than the adjustment strength used for the rectangular appearance.
Further, based on the shape adjusted proxy mesh, the determined number of mesh segments, and the updated object of the attribute information, an object mesh body may be generated.
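The following Python sketch illustrates, under simplified assumptions, how a shape adjustment strength chosen by proxy type could blend a proxy card toward the feather outline before the mesh body is produced; the strength values and the build_feather_card function are illustrative only, not the disclosed implementation.

```python
import numpy as np

def build_feather_card(feather_outline, proxy_type="rectangle", segments=(8, 4)):
    """Build a proxy card (grid of mesh vertices) for a feather.

    feather_outline: callable t -> half-width of the feather at shaft position t.
    proxy_type:      "rectangle" starts from a flat rectangle and needs a strong
                     shape adjustment; "outline" already follows the feather shape
                     and only needs a light adjustment.
    segments:        (rows, cols) number of mesh segments along and across.
    """
    rows, cols = segments
    strength = 1.0 if proxy_type == "rectangle" else 0.25   # shape adjustment strength
    max_w = max(feather_outline(t) for t in np.linspace(0.0, 1.0, 32))
    verts = np.zeros((rows + 1, cols + 1, 3))
    for r in range(rows + 1):
        t = r / rows
        target_w = feather_outline(t)                        # width the card should approach
        base_w = max_w if proxy_type == "rectangle" else target_w
        w = (1.0 - strength) * base_w + strength * target_w  # blend toward the outline
        for c in range(cols + 1):
            s = c / cols
            verts[r, c] = ((s - 0.5) * 2.0 * w, t, 0.0)
    return verts

# usage: a rectangular proxy pulled toward a tapered feather outline
card = build_feather_card(lambda t: 2.0 * np.sin(np.pi * t), proxy_type="rectangle")
```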
For example, as shown in fig. 9, a schematic diagram of a feather patch and a feather map file obtained by adjusting the feather parameters is provided; it can be understood that feathers with different shapes and different effects, such as straight, curved, or curled feathers, can be produced by adjusting the feather attribute parameter values. As can be seen from fig. 9, by adjusting the feather shape, it is possible to output a curved feather as shown in fig. 9 (a) (i.e., the hair curves of the curved feather), a curved object mesh body as shown in fig. 9 (b) (i.e., the feather patch file), and a curved feather map file as shown in fig. 9 (c). The feather map file shown in fig. 9 (c) is further combined with the feather patch file to display the feather effect obtained after the various parameter adjustments.
Step S206, according to the target engine, carrying out format matching processing on the object data, and determining target object data matched with the target engine.
The target engine may be understood as the processing engine applied in the actual project development process, for example the game engine of a certain game application program; the file formats or data types supported by different game application programs differ, so the object data needs to undergo format matching processing according to the target engine applied in the actual project development process to determine the target object data, and the target engine is then called to render the target object data, obtaining a target object with a realistic effect in the actual application program.
Specifically, the target object format supported by the target engine is acquired, and format matching processing is performed on the object data according to the target object format to determine the target object data matching that format. The object data includes an object hair curve, an object mesh body and an object map: the object hair curve can be understood as the feather hair file, specifically a file in abc format; the object mesh body can be understood as the feather patch file, specifically a file in fbx format; and the object map can be understood as the feather map file, specifically a file in tga format.
For example, if a game engine of a game application program supports a file in an abc format, according to a target object format supported by the game engine, that is, the abc format, the object data is subjected to format matching processing, and a file in the abc format, that is, a feather hair file, is obtained by matching, and then the feather hair file in the abc format is used as target object data, and is subjected to rendering processing to obtain a target object with a realistic effect.
Similarly, if the game engine of the game application program supports files in tga format, format matching processing is performed on the object data according to the target object format supported by the game engine, namely the tga format, and the tga file, namely the feather map file, is obtained by matching, so the feather map file in tga format is determined to be the target object data. In the subsequent actual rendering, the feather map file combined with the feather patch file needs to be rendered.
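A minimal sketch of the format matching step, assuming the three file formats mentioned above (abc, fbx, tga) and hypothetical file names; the mapping and the function name are illustrative and not the disclosed implementation.

```python
# A hypothetical mapping from file formats to the generated object data files.
OBJECT_DATA = {
    ".abc": "feather_groom.abc",   # object hair curve (groom file)
    ".fbx": "feather_card.fbx",    # object mesh body (feather patch)
    ".tga": "feather_map.tga",     # object map baked from the hair curves
}

def match_object_data(engine_formats):
    """Return only the object data whose formats the target engine supports."""
    return {ext: path for ext, path in OBJECT_DATA.items() if ext in engine_formats}

# usage: an engine that renders cards plus textures rather than raw groom curves
target_data = match_object_data(engine_formats={".fbx", ".tga"})
# -> {'.fbx': 'feather_card.fbx', '.tga': 'feather_map.tga'}
```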
The feather patch file and the feather map file aim to reflect the feather effect in a light-weight mode, namely, the feather effect of a real feather hair curve (namely a groom file) can be restored by combining the feather patch file and the feather map file.
Step S208, a target engine is called, rendering processing is carried out on target object data, and a target object with a realistic effect is generated.
The target engine may be understood as a processing engine applied in the actual project development process, for example, a game engine in a certain game application program, and the target engine is called to perform rendering processing on target object data matched with a supported target object format of the target game engine, so as to generate a target object with a realistic effect.
For example, if the target object data is a hair curve of a feather or a mapping file of a feather, the target game engine is called to render the hair curve of the feather or the mapping file of the feather, so that the feather with the realistic effect can be generated.
Specifically, when the target object data is a target object hair curve, the target object with a realistic effect is generated by calling a target engine to render the target object hair curve.
Similarly, when the target object data is not the target object hair curve, the target object mesh body and the target object map are required to be combined to obtain target light-weighted object data, and further the target engine is called to perform rendering processing on the target light-weighted object data, so as to generate the target object with the realistic effect.
The purpose of the combination processing of the target object mesh body and the target object map is to embody the feather effect in a light-weight mode, that is, the feather effect of the real feather hair curve (namely, a groom file) can be restored and obtained by combining the feather patch file and the feather map file. Meanwhile, the subsequent rendering processing is specifically performed on the combined obtained target light-weight object data, so that a target object with a realistic effect is obtained.
In one embodiment, as shown in fig. 10, a schematic diagram of a feather with a realistic effect is provided, referring to fig. 10, where fig. 10 (a), (b), and (c) show that the feather with a realistic effect is provided for the same modeling object at different angles, such as the feather effect of the modeling object (such as hawk) in fig. 10 (a) when viewed from the front, and such as the feather effect of the modeling object (such as hawk) in fig. 10 (b) when viewed from the back, and such as the feather effect of the modeling object (such as hawk) in fig. 10 (c) when viewed from the bottom upwards.
It can be understood that, through actual requirements in different project development processes, the shape, size, width and the like of the object (such as feather) can be flexibly adjusted to obtain a target object (such as feather with a realistic effect) meeting the requirements, and meanwhile, a plurality of target objects can be further processed in a combined manner or angle adjustment manner to obtain a display object (such as a modeling object in a game application program) with realistic effects at different angles.
In the object processing method, the basic object framework is obtained and each object attribute parameter value corresponding to the basic object framework is extracted, so that object data corresponding to different object attribute parameter values is flexibly and rapidly generated based on the basic object framework and each object attribute parameter value. The object data comprises an object hair curve, an object grid body and an object map, and when format matching processing is performed on the object data according to the target engine, the target object data matched with the target engine can be determined; by generating object data in various forms, the data better adapts to different processing engines or application programs and avoids format mismatches that would prevent the data from being processed. Further, the target engine is called to render the target object data to generate a target object with a realistic effect, so that the production efficiency of target objects with a realistic effect is improved and the resource consumption in the production process is reduced.
In one embodiment, as shown in fig. 11, there is provided an object processing method, which specifically includes the following steps:
In step S1102, a basic object frame is acquired, and each object attribute parameter value corresponding to the basic object frame is extracted.
In the development process of a game application program, in order to obtain a more realistic effect and provide a high-quality virtual presentation for game users, virtual objects such as various game elements and game characters in different game scenes need to be constructed; for example, feathers with a real effect in various forms need to be constructed for some specific game scenes.
Specifically, taking the example of building the feather with the realistic effect, the basic object frame, such as the basic feather structure, is obtained, and the object attribute parameters corresponding to the basic feather structure are further obtained. The object attribute parameters specifically comprise a feather shaft length, a feather shaft width, a feather shaft shape, a feather color, a feather width at the left side and the right side of the feather shaft and a feather shape.
In step S1104, object data is generated based on the basic object framework and the object attribute parameter values.
In particular, the supported data formats or file formats are different due to the different applications or processing engines, whereas different object data comprising object hair curves, object mesh volumes and object maps need to be generated in order to adapt to the different processing engines or applications.
In this embodiment, the object hair curve and the object mesh body are subjected to synchronous deformation processing, that is, the deformed object hair curve and the deformed object mesh body are obtained. Likewise, the generated deformed object hair curve, and the object mesh body, may be adapted to support applications or processing engines of different formats, and the object mesh body is generally required to be used in combination with the object map file, so as to achieve an object effect that is represented in a lighter form and is close to that which can be represented by the real object hair curve.
The object hair curve can be understood as a feather hair file, and can be specifically an abc-format file, and the feather hair file can be obtained by combining object attribute parameters and parameter curve control through preset abstract modeling of the feather. The object grid body can be understood as a feather patch file, specifically, a file in the fbx format, and the feather patch file is obtained by obtaining the type of the proxy grid and the number of grid segments, and further outputting the feather patch file according to the shape, the size and the like of the feather, the type of the proxy grid and the number of grid segments.
Step S1106, the guide curve is invoked to perform deformation processing on the object data, so as to obtain deformed object data.
The guide curve is used for controlling the length and the growth direction of the object, namely, the object data can be subjected to deformation processing by calling the guide curve, so that deformed object data can be obtained. The deformation processing includes various processing means such as rotation processing, scaling processing, and orientation adjustment processing. For example, when the object is a feather, the input basic feather and the object mesh body corresponding to the feather are deformed to the corresponding positions and directions according to the guiding curves for controlling the directions and positions of the feather, that is, the object mesh body corresponding to the feather and the feather are deformed at the same time, so that the object mesh body can adapt to the new shape of the deformed feather.
Specifically, the deformed object data is obtained by determining a transformation matrix associated with the guide curve and transforming the object data onto the guide curve according to the transformation matrix. The transformation matrix is usually a vector transformation matrix; specifically, a transformation vector is obtained by performing vector calculation on the position vectors of consecutive data points randomly extracted from the guide curve, and further coordinate conversion processing is then performed on this vector.
In one embodiment, determining the transformation matrix associated with the guide curve includes: randomly extracting consecutive first and second data points from the guide curve; acquiring a first position vector of the first data point and a second position vector of the second data point; determining a transformation vector based on the first and second position vectors; and calling a preset objective function, and performing coordinate conversion processing on the transformation vector to obtain the transformation matrix associated with the guide curve.
Specifically, a plurality of data points are generally disposed on the guide curve, for example, a first data point and a second data point are selected randomly, and the position of the first data point on the guide curve is obtained and expressed as a first position vector, and the position of the second data point is obtained and expressed as a second position vector, and a new transformation vector can be obtained by subtracting the first position vector and the second position vector.
Further, the transformation matrix associated with the guide curve may be obtained by acquiring a preset objective function, such as a look-at function (used to convert the world-space coordinates of vertices into viewing-space coordinates, i.e., coordinates in a three-dimensional space redefined with the camera as the origin), and invoking the preset objective function to process the transformation vector, specifically to perform coordinate conversion processing on the transformation vector. When the preset objective function (i.e., the look-at function) is called to process the transformation vector, a vector matrix (for example a 3×3 matrix) of the first data point looking at the second data point can be obtained, which is the transformation matrix associated with the guide curve.
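A minimal numpy sketch of the look-at construction described above, assuming a fixed up vector: the transformation vector is the difference of the two position vectors, and the remaining axes are completed with cross products to give a 3×3 matrix. The function name and the up-vector choice are assumptions, not the disclosed implementation.

```python
import numpy as np

def look_at_matrix(p_first, p_second, up=(0.0, 0.0, 1.0)):
    """Build a 3x3 rotation matrix orienting an object from p_first toward p_second."""
    forward = np.asarray(p_second, float) - np.asarray(p_first, float)  # transformation vector
    forward /= np.linalg.norm(forward)
    right = np.cross(np.asarray(up, float), forward)
    right /= np.linalg.norm(right)
    true_up = np.cross(forward, right)
    return np.stack([right, true_up, forward], axis=1)   # columns are the x, y, z axes

# usage: pick two consecutive data points on the guide curve
guide_curve = np.array([[0.0, 0.0, 0.0], [0.1, 1.0, 0.2], [0.3, 2.0, 0.5]])
i = 1                                                    # index of a chosen point
R = look_at_matrix(guide_curve[i], guide_curve[i + 1])
```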
In one embodiment, as shown in fig. 12, a schematic diagram of a basic object frame and a guiding curve of an object processing method is provided, and referring to fig. 12, it can be seen that fig. 12 (a) is a basic object frame, for example, 3 basic feathers with different shapes and sizes, and fig. 12 (b) is a guiding curve, for example, a guiding curve including a plurality of guiding curves with different shapes and trend trends, and is used for performing deformation processing on each basic feather to obtain deformed feathers.
In one embodiment, before determining the transformation matrix associated with the guide curve, the method further comprises:
determining a scaling ratio based on the size information of the object data and the size information of the guide curve; and performing scaling processing on the size information of the object data according to the scaling ratio to obtain the object data after the scaling processing.
Specifically, the size information of the object data and the size information of the guide curve are obtained, and a scaling ratio is determined according to the size information of the object data and the size information of the guide curve, wherein the scaling ratio is used for scaling the size information of the object data, so that the size of the object data can be matched with the size of the guide curve, and the scaled object data is obtained.
For example, taking the case that the feather needs to be deformed, the object data is a hair curve of the feather and a patch file of the feather (i.e., an object grid body), the hair curve of the feather may be obtained specifically, the basic structure of the feather may be determined, including information about the length, width, shape, symmetry of both sides of the feather shaft, etc., and the hair curve of the feather and the patch file of the feather may be synchronously deformed by using the guiding curve, so as to obtain deformed object data including the hair curve of the feather after deformation and the patch file of the feather after deformation.
Further, determining a scaling ratio based on the size information of the object data and the size information of the guide curve, scaling the size information of the object data according to the scaling ratio, and obtaining the scaled object data includes:
according to the scaling ratio, scaling the transformation matrix to obtain a scaled transformation matrix; and according to the transformation matrix after the scaling processing, performing rotation processing and offset processing on the object data after the scaling processing, and transforming the object data onto a guide curve to obtain deformed object data.
Specifically, the scaling ratio is determined based on the size information of the object data and the size information of the guide curve; then, on the basis of the determined scaling ratio and according to the actual requirements in the project development process, scaling processing is performed on the transformation matrix according to the scaling ratio to obtain the scaled transformation matrix. If no scaling requirement is detected, the size information of the object data and the transformation matrix are not scaled, and the intrinsic size of the object data (such as the feather) is used.
The transformation matrix after the scaling treatment can be obtained by scaling the transformation matrix, and then the object data after the scaling treatment is subjected to rotation treatment and offset treatment according to the transformation matrix after the scaling treatment, so that the object data is transformed onto the guide curve, and deformed object data is obtained. The scaling processing is performed on both the transformation matrix and the object data, and the scaling ratio may be added to the transformation matrix, so that the transformation matrix may embody parameters required for the entire transformation processing, including the scaling processing, rotation processing, offset processing, and the like.
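The following sketch, under the same simplified assumptions, derives the scaling ratio from the guide-curve length and the object length and folds it into the transformation matrix before rotating and offsetting the object data onto the guide curve; deform_to_guide is a hypothetical name and the object is assumed to grow along +Y.

```python
import numpy as np

def deform_to_guide(points, guide_curve, rotation):
    """Scale, rotate, and offset object points so they land on the guide curve.

    points:      (N, 3) vertices of the feather hair curves or mesh body,
                 assumed to grow along +Y from the origin.
    guide_curve: (M, 3) data points of the guide curve.
    rotation:    3x3 matrix associated with the guide curve (e.g. a look-at matrix).
    """
    guide = np.asarray(guide_curve, float)
    pts = np.asarray(points, float)
    guide_len = np.linalg.norm(np.diff(guide, axis=0), axis=1).sum()
    object_len = pts[:, 1].max() - pts[:, 1].min()
    scale = guide_len / object_len            # scaling ratio from the two sizes
    transform = rotation * scale              # fold the scaling into the matrix
    return pts @ transform.T + guide[0]       # rotate and scale, then offset to the root

# usage with a tiny example: a straight object along +Y and a 3-point guide curve
obj = np.array([[0.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 2.0, 0.0]])
guide = np.array([[1.0, 0.0, 0.0], [1.2, 1.5, 0.3], [1.5, 3.0, 0.8]])
R = np.eye(3)                                 # identity stands in for the look-at matrix
deformed = deform_to_guide(obj, guide, R)
```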
In one embodiment, as shown in fig. 13, a processing flow of a feather generator and a feather deformer is provided, and referring to fig. 13, the generating and deforming processes of the feathers are respectively implemented by the feather generator and the feather deformer, wherein:
SOP_FeatherGenerator is the feather generator used for generating feathers, and Set Parameter represents setting different object attribute parameter values. The feather generator outputs Output Feather Card (in fbx format), which is the feather patch file, Output Texture (in tga format), which is the map file of the feather, and Output Groom File (in abc format), which is the hair curve of the feather.
Similarly, SOP_FeatherDeformer is used for deforming the feathers to obtain the deformed feathers, where Guide Curve represents the guide curve used to control the length and growth direction of the feather, i.e., the object. SOP_FeatherDeform (Sync Material) is used for synchronously deforming the feather patch file (namely the object grid body) and synchronizing the materials during the feather deformation process. Generate StaticMesh represents the static mesh of the feathers produced by the deformation processing; since a plurality of feathers usually need to be deformed, a set of static meshes of the deformed feathers can be obtained.
Furthermore, the feather generating tool is provided with a plurality of object attribute parameters. On the basis of the set object attribute parameters, the appearance, density, length, curl strength and clump size of the feathers can be controlled by modifying the specific values of the object parameters and the control curves corresponding to those values according to the actual requirements. Shape gradients and color gradients from the root to the tip of each hair are supported, so the processing has good controllability and flexibility.
In order to facilitate flexible control and more accurate modeling, the left side and the right side of the feather are decoupled, so different parameter settings can be applied to the two sides respectively. A left-to-right parameter mirroring function is also provided, that is, the parameter effect on the left side can be mirrored to the right side with one click, and the parameters can then be adjusted separately again on the basis of the mirrored parameters if further modification is required, which improves the efficiency of parameter adjustment.
When the feathers are generated, the corresponding feather patch is output, the object hair curve is converted by using the bake function to obtain the object polygons corresponding to the object hair curve, and the high-precision object map is obtained by baking based on the object polygons and the object mesh body. The restored effect obtained after the feather patch and the map file are combined is very close to the feather effect of the true groom file.
Similarly, the feather deformation tool is used for deforming an input basic single feather and the object mesh body corresponding to the feather to the corresponding position and direction according to the guide curve that controls the direction and position of the feather. The guide curve and the feathers that need to be replaced can be determined according to names specified by the actual requirements, so that replacement can be performed as required.
In particular, the feather deformation tool is used to deform an already produced feather to a specified direction and position. For example, by inputting one or more feathers and guide curves, the feathers can be automatically deformed to the positions of the guide curves, and the deformed feathers adapt to the lengths of the guide curves through scaling processing. If the guide curve provides a normal attribute, the plane of each feather can be automatically aligned according to that normal. When there is a random replacement requirement, for example when all feathers to be deformed and replaced should be replaced randomly, agreed attribute names can be added on the guide curves and the feathers, and different feather models are then automatically and randomly distributed according to the values of the provided attribute names; while the feathers are deformed, the feather patches are synchronously deformed to match the feather models.
In step S1108, according to the target engine, format matching processing is performed on the deformed object data, and the target deformed object data matched with the target engine is determined.
The target engine may be understood as the processing engine applied in the actual project development process, for example a game engine in a certain game application program. Because different game application programs support different file formats or data types, the deformed object data needs to be format-matched according to the target engine applied in the actual project development process to determine the target deformed object data, so that the target engine can be called to render the target deformed object data and obtain a target deformed object with a realistic effect in the actual application program.
Specifically, the target object format supported by the target engine is acquired, and format matching processing is performed on the deformed object data according to the target object format, so as to determine target deformed object data matched with the target object format.
For example, if the game engine of a game application program supports files in the abc format, format matching processing is performed on the deformed object data according to the target object format supported by the game engine, namely the abc format, so that a file in the abc format, that is, the feather hair file, is obtained by matching. The feather hair file in the abc format is then used as the target deformed object data for subsequent rendering processing, so as to obtain a target deformed object with a realistic effect.
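A minimal sketch of such format matching, assuming the deformed object data is available as files keyed by their format suffix and that the engine-to-format table is project configuration (all names here are illustrative assumptions rather than an actual engine interface):

```python
# Hypothetical mapping from a target engine to the object format it supports.
ENGINE_FORMATS = {"engine_a": "abc", "engine_b": "fbx"}

def match_format(deformed_files: dict, target_engine: str) -> str:
    """Pick the deformed-object file whose format matches the target engine.

    deformed_files maps a format suffix (e.g. "abc", "fbx", "tga") to a file path.
    """
    wanted = ENGINE_FORMATS[target_engine]
    try:
        return deformed_files[wanted]
    except KeyError:
        raise ValueError(f"no deformed object data in {wanted} format for {target_engine}")
```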
In one embodiment, as shown in fig. 14, a schematic diagram of deformed object data and a deformed object mesh body is provided. Referring to fig. 14, the deformed object data (i.e., the deformation & variance file shown in fig. 14 (a)) and the deformed object mesh body (i.e., the card deformation file shown in fig. 14 (b)) are obtained by deforming the object data through the guide curve while synchronously deforming the object mesh body.
Step S1110, the target engine is called, and rendering processing is performed on the target deformed object data matched with the target engine, so as to obtain a target deformed object with a realistic effect.
The target engine may be understood as the processing engine applied in the actual project development process, for example a game engine in a certain game application program. A target deformed object with a realistic effect may be generated by calling the target game engine to render the target deformed object data matched with the target object format supported by the target game engine. For example, if the target deformed object data is the hair curve of a feather or the map file of a feather, the target game engine is called to render the hair curve of the feather or the map file of the feather, so that a feather with a realistic effect can be generated.
Specifically, when the target deformed object data is the target deformed object hair curve, the target deformed object with the realistic effect is generated by calling the target engine to render the target deformed object hair curve.
Similarly, when the target deformed object data is not the target deformed object hair curve, the target deformed object mesh body and the target deformed object map are combined to obtain target lightweight deformed object data, and the target engine is then called to render the target lightweight deformed object data, so as to generate the target deformed object with the realistic effect. The purpose of combining the target deformed object mesh body and the target deformed object map is to present the feather effect in a lightweight manner, that is, the feather effect of the real feather hair curve (i.e., the groom file) can be restored by combining the feather patch file and the feather map file. The combined target lightweight deformed object data are then rendered to obtain the target deformed object with the realistic effect.
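The choice between the two rendering paths can be sketched as follows; the dictionary keys and the "kind" tag are illustrative assumptions rather than an actual engine API:

```python
def select_render_data(target_data: dict) -> dict:
    """Choose what is handed to the target engine for rendering.

    If a hair curve (groom) is available it is rendered directly; otherwise
    the mesh body (card) and the map are combined into lightweight data.
    """
    if "hair_curve" in target_data:
        return {"kind": "groom", "data": target_data["hair_curve"]}
    # Lightweight path: combine the card mesh with its texture map.
    return {"kind": "lightweight",
            "data": {"mesh": target_data["mesh_body"], "map": target_data["texture_map"]}}
```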
In this embodiment, the basic object framework is acquired and each object attribute parameter value corresponding to the basic object framework is extracted, so that object data corresponding to different object attribute parameter values are generated flexibly and rapidly based on the basic object framework and each object attribute parameter value. The guide curve is called to apply various deformation processes to the object data, so that deformed object data meeting the actual requirements are obtained quickly. By performing format matching processing on the deformed object data, the target deformed object data matched with the target engine can be determined; generating object data in various forms and deforming different object data makes it possible to adapt to different processing engines or application programs and avoids situations in which the formats do not match and the processing cannot be applied. The target engine is then called to render the target deformed object data matched with the target engine, so as to obtain a target deformed object with a realistic effect, which improves the production efficiency of the target deformed object with the realistic effect and reduces the resource consumption in the production process of the target deformed object.
In one embodiment, as shown in fig. 15, the step of obtaining deformed object data, that is, performing rotation processing and offset processing on the scaled object data according to the scaled transformation matrix, and transforming the object data onto a guiding curve, to obtain deformed object data specifically includes:
In step S1502, the direction of the object data after the scaling processing is rotated, and the object data with the direction converted is obtained.
Specifically, the transformation processing mode of the image processing application program used in the actual project development process is obtained, for example the Houdini application program, that is, three-dimensional computer graphics software. The calculation and transformation convention matched with that image processing application program is obtained, and the orientation of the scaled object data is rotated according to that convention.
In the application process of the Houdini application program, before the object data is subjected to transformation processing, the orientation of the object data such as feathers is uniformly set to be in the positive Z-axis direction according to a calculation transformation mode.
In step S1504, the object data of which orientation is changed are mapped to the preset range in order of the object data from bottom to top.
Specifically, the orientation-transformed object data, i.e., the orientation-transformed feathers, are mapped to the preset ranges sequentially in the order of the object data from bottom to top, e.g., in the order of the bottom to top of the feathers.
In this embodiment, the preset range may be the range [0,1], that is, the orientation-transformed feathers are mapped to the range [0,1] in order from the bottom to the top of the feather. The preset range can be adjusted according to the actual requirements, for example to the range [0,10] or the range [0,100], and is not limited to a single specific value range or to the specific value ranges exemplified here.
Step S1506, performing matching processing on the scaled transformation matrix according to the preset range, and obtaining a target position successfully matched from the transformation matrix.
Specifically, the scaled transformation matrix is matched according to the preset range, such as the range [0,1]; in particular, the transformation matrix corresponding to any point in the range [0,1] on the corresponding guide curve can be matched according to the range [0,1].
For example, in the range [0,1], if a point of the object data, such as a feather, is mapped to the value 0.5 on the feather, it corresponds to the position with the value 0.5 on the guide curve. Because 0.5 is the middle of the range [0,1], the 0.5 position on the object data and the 0.5 position on the guide curve both correspond to their respective middle positions, so a successfully matched target position can be obtained.
In step S1508, the orientation-transformed object data are rotated based on the scaled transformation matrix to obtain object data with the standard orientation.
Specifically, the orientation-transformed object data are rotated based on the scaled transformation matrix; in particular, the object data, that is, the feathers, are multiplied by the transformation matrix, so that the object data are rotated to the correct orientation, thereby obtaining object data with the standard orientation.
In step S1510, position offset processing is performed on the object data with the standard orientation according to the target position, and the object data are deformed onto the guide curve to obtain deformed object data.
Because, after the positions of the object data such as the feather are multiplied by the transformation matrix, the feather is only rotated and scaled about the world origin and no global position offset is produced, position offset processing must be performed on the object data with the standard orientation according to the target position on the guide curve, so as to obtain the correct position offset. In this way the object data are deformed onto the guide curve and deformed object data are obtained.
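Under the assumption that the object points are already oriented along the positive Z axis and that the scaled transformation matrices have been sampled along the guide curve, steps S1502 to S1510 may be sketched as follows (array shapes and function names are illustrative assumptions, not the claimed implementation):

```python
import numpy as np

def normalize_heights(points: np.ndarray) -> np.ndarray:
    """Map object points to [0, 1] in order from the bottom to the top (step S1504)."""
    z = points[:, 2]
    span = z.max() - z.min()
    return (z - z.min()) / span if span > 0 else np.zeros_like(z)

def deform_to_guide(points: np.ndarray, guide_transforms: list,
                    guide_positions: np.ndarray) -> np.ndarray:
    """Deform +Z-oriented object points onto a guide curve (steps S1506 to S1510).

    points           -- (N, 3) object points already rotated to face the positive Z axis
    guide_transforms -- scaled 3x3 transformation matrices sampled along the guide curve
    guide_positions  -- (M, 3) positions of the corresponding curve samples
    """
    heights = normalize_heights(points)
    out = np.empty_like(points)
    m = len(guide_transforms)
    for i, (p, h) in enumerate(zip(points, heights)):
        idx = min(int(h * (m - 1)), m - 1)      # match the [0,1] parameter to a curve sample
        rotated = guide_transforms[idx] @ p      # rotate to the standard orientation
        out[i] = rotated + guide_positions[idx]  # offset to the matched target position
    return out
```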
In this embodiment, the orientation of the scaled object data is rotated to obtain orientation-transformed object data, and the orientation-transformed object data are mapped to the preset range in order from the bottom to the top of the object data. The scaled transformation matrix is then matched according to the preset range, and the successfully matched target position is obtained from the transformation matrix. The orientation-transformed object data are rotated based on the scaled transformation matrix to obtain object data with the standard orientation, and position offset processing is further performed on the object data with the standard orientation according to the target position, so that the object data are deformed onto the guide curve and deformed object data are obtained. Transforming the orientation of the object data and offsetting their position according to the transformation matrix allows the object data to be deformed onto the selected guide curve, realizes flexible control and accurate adjustment of the direction and position of the object data with the guide curve, and quickly yields deformed object data, which further improves the production efficiency of the target deformed object with the realistic effect and reduces the resource consumption in its production process.
In one embodiment, as shown in fig. 16, there is provided an object processing method, which specifically includes the following steps:
step S1602, obtaining a preset target attribute, and extracting a first attribute value and a second attribute value corresponding to the preset target attribute, where the first attribute value is associated with the object data, and the second attribute value is associated with the guide curve.
Specifically, a preset target attribute, such as the variable_name attribute, is acquired, and a first attribute value and a second attribute value corresponding to the preset target attribute are acquired, where the first attribute value is associated with the object data and the second attribute value is associated with the guide curve. The purpose of acquiring the first attribute value and the second attribute value is to compare them and determine whether they are consistent.
In step S1604, if the first attribute value corresponds to the second attribute value one by one, the object data is replaced on the guiding curve, and the replaced object data is obtained.
Specifically, when the object data is a feather, the variable_name attribute is used to determine which basic feather shape should be instanced onto the current guide curve, and a variable_name attribute is stored on each feather and on the guide curve. If the value of the variable_name attribute on the current guide curve, that is, the second attribute value, corresponds to only one basic feather with the same value, that is, the first attribute value corresponds to the second attribute value one to one, the object data is replaced onto the guide curve to obtain the replaced object data; in other words, the feather is directly placed on the guide curve, and the replaced feather is obtained.
In step S1606, if the second attribute value corresponds to the plurality of first attribute values, or if there is no first attribute value matching the second attribute value, the data to be replaced is randomly determined based on each object data, and the data to be replaced is replaced on the guiding curve, so as to obtain the replaced object data.
Specifically, when the object data is a feather, if the value of the variable_name attribute on the current guide curve corresponds to a plurality of feathers with the same value of the variable_name attribute, for example the second attribute value of one guide curve corresponds to the first attribute values of two or more feathers, so that the first attribute values and the second attribute value are not in a one-to-one correspondence, the data to be replaced is randomly determined based on each object data. That is, a target feather is randomly determined among the plurality of feathers corresponding to the guide curve, and the data to be replaced (the target feather) is placed on the guide curve to obtain the replaced object data (the replaced feather).
Similarly, if there is no first attribute value matching the second attribute value, that is, if no feather matches the guide curve, the data to be replaced is randomly determined among the existing feathers, and the data to be replaced (the target feather) is placed on the guide curve to obtain the replaced object data (the replaced feather).
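A minimal sketch of this replacement logic, assuming each feather and guide curve is represented as a dictionary carrying the variable_name attribute (the dictionary layout and the id key are illustrative assumptions):

```python
import random

def replace_on_guides(guides: list, feathers: list) -> dict:
    """Pick which base feather is instanced onto each guide curve.

    Each guide and feather carries a "variable_name" attribute; a unique match
    is used directly, several matches (or none) fall back to a random choice.
    """
    result = {}
    for guide in guides:
        matches = [f for f in feathers
                   if f["variable_name"] == guide["variable_name"]]
        if len(matches) == 1:
            result[guide["id"]] = matches[0]               # one-to-one match: replace directly
        elif matches:
            result[guide["id"]] = random.choice(matches)   # several candidates: pick randomly
        else:
            result[guide["id"]] = random.choice(feathers)  # no match: pick from all feathers
    return result
```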
In one embodiment, as shown in fig. 17, a schematic diagram of basic parameters of the basic object framework of the object processing method is provided. Referring to fig. 17, basic parameters of the basic object framework, that is, of the feathers, are provided, where the basic parameters include the type of feather (the three feathers labeled feather 1, feather 2 and feather 3 in fig. 17) and the color of the feather (the three colors red, green and blue shown in fig. 17); other parameters such as the shape of the feather, the size of the feather, the number of feathers and the guide curve (not shown in fig. 17) may further be included.
Further, as shown in fig. 18, a schematic diagram of the replaced object data is provided. As can be seen from fig. 18, the replaced object data is obtained by matching and replacing a plurality of input feathers. The replaced object data shown in fig. 18 exhibits various bending degrees, feather shapes and the like, so that diversified feather effects are presented, which can satisfy the requirements of different practical application scenarios in different project development processes and the requirements for feather effects of different complexity.
In this embodiment, by acquiring a preset target attribute, and extracting a first attribute value and a second attribute value corresponding to the preset target attribute, the first attribute value is associated with object data, and the second attribute value is associated with a guide curve. And if the first attribute value corresponds to the second attribute value one by one, replacing the object data on the guide curve to obtain the replaced object data. Similarly, if the second attribute value corresponds to a plurality of first attribute values, or if there is no first attribute value matched with the second attribute value, randomly determining data to be replaced based on each object data, and replacing the data to be replaced on the guide curve to obtain replaced object data. The flexible matching and replacement processing of the object data according to the guide curve are realized, so that diversified object effects are obtained, the method can be suitable for the requirements of different actual application scenes in different project development processes, and the requirements of the object effects with different complexity are met.
In one embodiment, as shown in fig. 19, there is provided an object processing method, which specifically includes the following steps:
in step S1901, a basic object frame is acquired, and each object attribute parameter value corresponding to the basic object frame is extracted.
Step S1902, modifying the original attribute information of the basic object framework according to each object attribute parameter value to obtain an object with updated attribute information.
Step S1903, the type of the proxy mesh and the number of mesh segments are acquired, and the object mesh body is generated based on the object updated by the attribute information, the type of the proxy mesh and the number of mesh segments.
In step S1904, an object hair curve matching the object of the attribute update is determined, the object hair curve including a plurality of curves for describing the styling of the object of the attribute update.
In step S1905, the object hair curve is converted to obtain an object polygon corresponding to the object hair curve, and the object map is obtained by baking based on the object polygon and the object mesh body.
In step S1906, the object data includes the object hair curve, the object mesh body and the object map, the target object format supported by the target engine is obtained, and the format matching process is performed on the object data according to the target object format, so as to determine the target object data matched with the target object format.
Step S1907, call the target engine, perform rendering processing on the target object hair curve to generate a target object with a realistic effect, or perform combination processing on the target object mesh body and the target object map to obtain target lightweight object data, and call the target engine, perform rendering processing on the target lightweight object data to generate a target object with a realistic effect.
Step S1908, randomly extracting consecutive first data points and second data points from the guide curve, and acquiring a first position vector of the first data points and a second position vector of the second data points.
Step S1909, determining a transformation vector based on the first position vector and the second position vector, and calling a preset objective function to perform coordinate transformation processing on the transformation vector, thereby obtaining a transformation matrix associated with the guide curve.
In step S1910, a scaling ratio is determined based on the size information of the object data and the size information of the guide curve, and scaling processing is performed on the size information of the object data according to the scaling ratio, so as to obtain scaled object data.
In step S1911, the scaling process is performed on the transformation matrix according to the scaling ratio, so as to obtain a transformation matrix after the scaling process.
In step S1912, the direction of the object data after the scaling processing is rotated, and the object data with the direction converted is obtained.
Step S1913, the orientation-transformed object data are mapped to a preset range in order from the bottom to the top of the object data, and the scaled transformation matrix is matched according to the preset range, so that a successfully matched target position is obtained from the transformation matrix.
In step S1914, rotation processing is performed on the object data subjected to the orientation conversion based on the conversion matrix after the scaling processing, and the object data subjected to the orientation standard is obtained.
In step S1915, the positional shift processing is performed on the object data oriented to the standard according to the target position, and the object data is deformed onto the guide curve, thereby obtaining deformed object data.
Step S1916, according to the target engine, performing format matching processing on the deformed object data, determining target deformed object data matched with the target engine, where the target engine is used to perform rendering processing on the target deformed data matched with the target engine, so as to obtain a target deformed object with a realistic effect.
In this embodiment, the basic object framework is acquired and each object attribute parameter value corresponding to the basic object framework is extracted, so that object data corresponding to different object attribute parameter values are generated flexibly and rapidly based on the basic object framework and each object attribute parameter value. The object data include the object hair curve, the object mesh body and the object map; when format matching processing is performed on the object data according to the target engine, the target object data matched with the target engine can be determined, so that generating object data in various forms and deforming different object data makes it possible to adapt better to different processing engines or application programs and avoids situations in which the formats do not match and the processing cannot be applied. The target engine is then called to render the target deformed object data matched with the target engine, so as to obtain a target deformed object with a realistic effect, which improves the production efficiency of the target deformed object with the realistic effect and reduces the resource consumption in the production process of the target deformed object.
In one embodiment, there is provided an object processing method, specifically including the following processing parts:
P1, feather generation: in the development process of a game application program, in order to obtain a more realistic effect and provide a high-quality virtual representation for the game user, virtual objects such as various game elements and multiple game characters in different game scenes need to be constructed; for example, for some specific game scenes, feathers with a realistic effect in various forms need to be constructed.
For example, for a feather with a realistic effect to be constructed, the basic object framework is the basic feather structure. For example, a contour feather structure is divided into parts such as the feather shaft, the feather root, the lower umbilicus and the upper umbilicus, and the feathers of different parts present different forms. The curves and attribute parameters of each part are determined according to the different forms of each part of the feather, and the curves and attribute parameters of the different parts can be adjusted and controlled according to the actual requirements so as to simulate the growth pattern of a real feather, thereby generating a feather with a realistic effect.
Specifically, a basic object framework, such as a basic feather structure, is acquired, and the object attribute parameters corresponding to the basic feather structure are further acquired. The object attribute parameters specifically include the feather shaft length, the feather shaft width, the feather shaft shape, the feather color, and the feather width and feather shape on the left and right sides of the feather shaft. If a symmetrical feather is needed, the feather width and feather shape on the two sides of the feather shaft can be set to be consistent; specifically, a mirror copy mode can be used, in which the specific attribute parameter values of one side are copied to the other side, for example the specific values of the feather width and feather shape on the left side are mirrored to the right side, which reduces repetitive operations and realizes one-click copying and rapid configuration of both sides. Similarly, if a feather with a variable shape and asymmetric left and right sides is required, the feather width and feather shape on the two sides of the feather shaft can be set separately.
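The one-click mirror copy can be sketched as follows; the left_/right_ parameter naming convention is an assumption made only for illustration:

```python
def mirror_parameters(params: dict) -> dict:
    """Copy left-side feather parameters to the right side with one click.

    Any key prefixed "left_" is mirrored to its "right_" counterpart, after
    which either side can still be edited separately.
    """
    mirrored = dict(params)
    for key, value in params.items():
        if key.startswith("left_"):
            mirrored["right_" + key[len("left_"):]] = value
    return mirrored

# Example: mirror_parameters({"left_vane_width": 0.4, "left_vane_shape": "oval"})
# also sets right_vane_width and right_vane_shape.
```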
Further, since different application programs or processing engines support different data formats or file formats, different object data need to be generated to adapt to different processing engines or application programs. The object data include the object hair curve, the object mesh body and the object map. The object hair curve can be understood as the hair file of the feather, specifically a file in the abc format, and the hair file of the feather can be output by combining the preset abstract shape of the feather itself with the object attribute parameters and the parameter curve control. The object mesh body can be understood as the feather patch file, specifically a file in the fbx format; the feather patch file is output according to the shape and size of the feather, the type of the proxy mesh and the number of mesh segments, which are acquired beforehand. The object map can be understood as the map file of the feather, specifically a file in the tga format; the polygons corresponding to the object hair curve can be obtained by converting the object hair curve, and the feather map file is then obtained by baking the polygons with the object mesh body (i.e., the feather patch file).
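As a purely illustrative sketch, the three outputs can be grouped in a small container such as the following; the class and field names are assumptions, and only the three file formats come from the description above:

```python
from dataclasses import dataclass

@dataclass
class FeatherOutputs:
    """Illustrative container for the three files produced for one feather."""
    groom_abc: str    # object hair curve (groom file, abc format)
    card_fbx: str     # object mesh body (feather patch file, fbx format)
    texture_tga: str  # object map baked from the hair polygons and the patch (tga format)
```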
In one embodiment, the target engine may be understood as the processing engine applied in the actual project development process, for example a game engine in a certain game application program. Because different game application programs support different file formats or data types, format matching processing needs to be performed on the object data according to the target engine applied in the actual project development process to determine the target object data, so that the target engine can be called to render the target object data and obtain a target object with a realistic effect in the actual application program.
Specifically, the target object with the realistic effect can be generated by acquiring the target object format supported by the target engine, performing format matching processing on the object data according to the target object format to determine the target object data matched with the target object format, and then calling the target game engine to render the target object data matched with the target object format supported by the target game engine. For example, if the target object data is the hair curve of a feather or the map file of a feather, the target game engine is called to render the hair curve of the feather or the map file of the feather, so that a feather with a realistic effect can be generated.
P2, feather deformation: the method comprises the steps of obtaining a basic object frame, extracting each object attribute parameter value corresponding to the basic object frame, generating object data based on the basic object frame and each object attribute parameter value, further obtaining a guide curve for controlling the length and the growth direction of an object, and carrying out deformation processing on the object data by calling the guide curve to obtain deformed object data. The deformation processing includes various processing means such as rotation processing, scaling processing, and orientation adjustment processing.
For example, when the object is a feather, the input basic feather and the object mesh body corresponding to the feather are deformed to the corresponding positions and directions according to the guiding curves for controlling the directions and positions of the feather, that is, the object mesh body corresponding to the feather and the feather are deformed at the same time, so that the object mesh body can adapt to the new shape of the deformed feather.
Specifically, the deformed object data is obtained by determining a transformation matrix associated with the guide curve and transforming the object data onto the guide curve according to the transformation matrix. The transformation matrix is usually a vector transformation matrix; specifically, a transformation vector is obtained by performing vector calculation on the position vectors of consecutive data points randomly extracted from the guide curve, and a further coordinate conversion is then applied. Before the transformation matrix associated with the guide curve is determined, the method further includes: determining a scaling ratio based on the size information of the object data and the size information of the guide curve; and scaling the size information of the object data according to the scaling ratio to obtain scaled object data.
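A hedged sketch of how such a transformation matrix might be built from two consecutive guide-curve points; completing the tangent direction into an orthonormal frame stands in for the coordinate conversion performed by the preset objective function, and the up-vector choice is an assumption:

```python
import numpy as np

def guide_frame(p0: np.ndarray, p1: np.ndarray,
                up: np.ndarray = np.array([0.0, 0.0, 1.0])) -> np.ndarray:
    """Build a 3x3 transformation matrix from two consecutive guide-curve points.

    The transformation vector is the direction from p0 to p1; it is completed
    into an orthonormal frame whose columns are (side, normal, tangent).
    """
    tangent = p1 - p0
    tangent = tangent / np.linalg.norm(tangent)
    side = np.cross(up, tangent)
    if np.linalg.norm(side) < 1e-8:           # tangent parallel to up: pick another reference axis
        side = np.cross(np.array([1.0, 0.0, 0.0]), tangent)
    side = side / np.linalg.norm(side)
    normal = np.cross(tangent, side)
    return np.column_stack([side, normal, tangent])
```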
Similarly, scaling is determined based on size information of the object data and size information of the guide curve, scaling is performed on the size information of the object data according to the scaling to obtain scaled object data, scaling is performed on the transformation matrix according to the scaling to obtain scaled transformation matrix, rotation processing and offset processing are performed on the scaled object data according to the scaled transformation matrix, and the object data is transformed onto the guide curve to obtain deformed object data.
Specifically, the object data subjected to the scaling processing is rotated in the direction to obtain object data subjected to the direction conversion, and the object data subjected to the direction conversion is sequentially mapped to a predetermined range in the order of the object data from the bottom to the top. And carrying out matching processing on the scaled transformation matrix according to a preset range, and obtaining a target position successfully matched from the transformation matrix. And performing rotation processing on the object data subjected to the orientation transformation based on the transformation matrix subjected to the scaling processing to obtain object data subjected to the orientation standard, performing position offset processing on the object data subjected to the orientation standard according to the target position, and deforming the object data onto the guide curve to obtain deformed object data.
Further, after the object data is transformed onto the guide curve according to the transformation matrix to obtain the deformed object data, the target object format supported by the target engine is acquired, format matching processing is performed on the deformed object data according to the target object format to determine the target deformed object data matched with the target object format, and the target engine is called to render the target deformed object data matched with the target engine, so as to obtain a target deformed object with a realistic effect.
P3, feather replacement: by acquiring a preset target attribute and extracting a first attribute value and a second attribute value corresponding to the preset target attribute, wherein the first attribute value is associated with object data, and the second attribute value is associated with a guide curve. Further, by judging whether the first attribute value and the second attribute value are in one-to-one correspondence, different replacing operations are performed. And if the first attribute value corresponds to the second attribute value one by one, replacing the object data on the guide curve to obtain the replaced object data. Conversely, if the second attribute value corresponds to the plurality of first attribute values, or if there is no first attribute value matching the second attribute value, the data to be replaced is randomly determined based on each object data, and the data to be replaced is replaced on the guide curve, so as to obtain the replaced object data.
Specifically, by storing a variable_name attribute on each feather and the guide curve, when the values of the two attributes are the same, the feather can be replaced on the corresponding guide curve, when the variable_name of one curve corresponds to the variable_names of two or more feathers, one feather is randomly replaced on the guide curve, and when no feather matches the guide curve, one feather is randomly replaced on the guide curve in the existing feathers. The variable_name attribute is used for determining which feather should be replaced on the current guiding curve, if the value of the variable_name on the current guiding curve corresponds to only one feather with the same value, the feather is directly replaced, and if the value corresponds to a plurality of feathers with the same value, one feather is randomly selected from the corresponding plurality of feathers to be replaced.
In this embodiment, the basic object framework is acquired and each object attribute parameter value corresponding to the basic object framework is extracted, so that object data corresponding to different object attribute parameter values are generated flexibly and rapidly based on the basic object framework and each object attribute parameter value. The object data include the object hair curve, the object mesh body and the object map; when format matching processing is performed on the object data according to the target engine, the target object data matched with the target engine can be determined, so that generating object data in various forms and deforming different object data makes it possible to adapt better to different processing engines or application programs and avoids situations in which the formats do not match and the processing cannot be applied. The target engine is then called to render the target deformed object data matched with the target engine, so as to obtain a target deformed object with a realistic effect, which improves the production efficiency of the target deformed object with the realistic effect and reduces the resource consumption in the production process of the target deformed object.
It should be understood that, although the steps in the flowcharts related to the above embodiments are sequentially shown as indicated by arrows, these steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or a plurality of stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of the steps or stages is not necessarily performed sequentially, but may be performed alternately or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an object processing device for realizing the above-mentioned object processing method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation of one or more embodiments of the object processing device provided below may refer to the limitation of the object processing method hereinabove, and will not be repeated herein.
In one embodiment, as shown in fig. 20, there is provided an object processing apparatus including: an object attribute parameter value extraction module 2002, an object data generation module 2004, a target object data determination module 2006, and a target object generation module 2008, wherein:
the object attribute parameter value extraction module 2002 is configured to obtain a basic object frame, and extract each object attribute parameter value corresponding to the basic object frame.
The object data generating module 2004 is configured to generate object data based on the basic object framework and the object attribute parameter values, where the object data includes an object hair curve, an object mesh body, and an object map.
The target object data determining module 2006 is configured to perform format matching processing on the object data according to the target engine, and determine target object data matched with the target engine.
The target object generating module 2008 is configured to invoke a target engine, perform rendering processing on target object data, and generate a target object with a realistic effect.
In the object processing device, the basic object framework is acquired and each object attribute parameter value corresponding to the basic object framework is extracted, so that object data corresponding to different object attribute parameter values are generated flexibly and rapidly based on the basic object framework and each object attribute parameter value. The object data include the object hair curve, the object mesh body and the object map; when format matching processing is performed on the object data according to the target engine, the target object data matched with the target engine can be determined, so that generating object data in various forms makes it possible to adapt better to different processing engines or application programs and avoids situations in which the formats do not match and the processing cannot be applied. Further, the target engine is called to render the target object data so as to generate a target object with a realistic effect, which improves the production efficiency of the target object with the realistic effect and reduces the resource consumption in the production process of the target object.
In one embodiment, the object data generation module is further configured to: modifying original attribute information of the basic object framework according to the attribute values of each object to obtain an object with updated attribute information; acquiring the type of the proxy grid and the number of grid segments, and generating an object grid body based on the updated object of the attribute information, the type of the proxy grid and the number of grid segments; determining an object hair curve matching the object with the updated attribute; the object hair curve includes a plurality of curves for describing the shape of the object of the attribute update; performing conversion treatment on the object hair curve to obtain an object polygon corresponding to the object hair curve; and baking based on the object polygon and the object grid body to obtain the object map.
In one embodiment, the target object data includes a target object hair curve, a target object mesh volume, and a target object map; the target object generation module is further used for: invoking a target engine, and performing rendering treatment on a target object hair curve to generate a target object with a realistic effect; or combining the target object grid body and the target object mapping to obtain target light-weight object data; and calling a target engine to render the target light-weight object data, and generating a target object with a realistic effect.
In one embodiment, there is provided an object processing apparatus including: an object attribute parameter value extraction module, used for acquiring the basic object framework and extracting each object attribute parameter value corresponding to the basic object framework; an object data generation module, used for generating object data based on the basic object framework and each object attribute parameter value, where the object data include an object hair curve, an object mesh body and an object map; a deformation processing module, used for calling a guide curve to deform the object data to obtain deformed object data, where the guide curve is used for controlling the length and growth direction of the object; and a target deformed object generation module, used for performing format matching processing on the deformed object data according to a target engine and determining target deformed object data matched with the target engine, where the target engine is used for rendering the target deformed object data matched with the target engine to obtain a target deformed object with a realistic effect.
In one embodiment, the deformation processing module is further configured to: determining a transformation matrix associated with the guide curve; and according to the transformation matrix, transforming the object data onto the guide curve to obtain deformed object data.
In one embodiment, the deformation processing module is further configured to: randomly extracting successive first and second data points from the guide curve; acquiring a first position vector of a first data point and a second position vector of a second data point; determining a transformation vector based on the first and second position vectors; and calling a preset objective function, and performing coordinate conversion processing on the transformation vector to obtain a transformation matrix associated with the guide curve.
In one embodiment, there is provided an object processing apparatus, further including an object data scaling processing module configured to: determining a scaling ratio based on the size information of the object data and the size information of the guide curve; and performing scaling processing on the size information of the object data according to the scaling ratio to obtain the object data after the scaling processing.
In one embodiment, the deformation processing module is further configured to: according to the scaling ratio, scaling the transformation matrix to obtain a scaled transformation matrix; and according to the transformation matrix after the scaling processing, performing rotation processing and offset processing on the object data after the scaling processing, and transforming the object data onto a guide curve to obtain deformed object data.
In one embodiment, the deformation processing module is further configured to: performing rotation processing on the direction of the object data after the scaling processing to obtain object data with direction converted; sequentially mapping the object data with the direction of transformation to a preset range according to the sequence from the bottom to the top of the object data; according to a preset range, matching the scaled transformation matrix, and obtaining a target position successfully matched from the transformation matrix; performing rotation processing on the object data subjected to orientation transformation based on the transformation matrix subjected to the scaling processing to obtain object data subjected to orientation standard; and performing position offset processing on the object data facing the standard according to the target position, and deforming the object data onto the guide curve to obtain deformed object data.
In one embodiment, there is provided an object processing apparatus, further comprising: the attribute value extraction module is used for acquiring a preset target attribute and extracting a first attribute value and a second attribute value corresponding to the preset target attribute; the first attribute value is associated with the object data, and the second attribute value is associated with the guide curve; the first replacement processing module is used for replacing the object data onto the guide curve if the first attribute value corresponds to the second attribute value one by one to obtain replaced object data; and the second replacement processing module is used for randomly determining data to be replaced based on each object data if the second attribute value corresponds to a plurality of first attribute values or if the first attribute value matched with the second attribute value does not exist, and replacing the data to be replaced on the guide curve to obtain the replaced object data.
The respective modules in the above-described object processing apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 21. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing data such as a basic object framework, each object attribute parameter value corresponding to the basic object framework, an object hair curve, an object grid body, an object map, a target engine, target object data, a target object with a realistic effect and the like. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an object handling method.
It will be appreciated by persons skilled in the art that the architecture shown in fig. 21 is merely a block diagram of some of the architecture relevant to the present inventive arrangements and is not limiting as to the computer device to which the present inventive arrangements are applicable, and that a particular computer device may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, carries out the steps of the method embodiments described above.
In an embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the embodiments provided herein may include at least one of a relational database and a non-relational database. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processor referred to in the embodiments provided in the present application may be a general-purpose processor, a central processing unit, a graphics processor, a digital signal processor, a programmable logic unit, a data processing logic unit based on quantum computing, or the like, but is not limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The foregoing examples illustrate only a few embodiments of the application and are described in detail herein without thereby limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of the application should be assessed as that of the appended claims.

Claims (13)

1. An object processing method, the method comprising:
acquiring a basic object frame, and extracting each object attribute parameter value corresponding to the basic object frame;
generating object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object grid body and an object map;
Performing format matching processing on the object data according to a target engine, and determining target object data matched with the target engine;
and calling the target engine to perform rendering processing on the target object data to generate a target object with a realistic effect.
2. The method according to claim 1, wherein the method further comprises:
invoking a guide curve to deform the object data to obtain deformed object data; the guide curve is used for controlling the length and the growth direction of the object;
performing format matching processing on the deformed object data according to the target engine, and determining target deformed object data matched with the target engine; the target engine is used for rendering the target deformed object data matched with the target engine to obtain a target deformed object with a realistic effect.
3. The method of claim 2, wherein the invoking the guide curve to deform the object data to obtain deformed object data comprises:
determining a transformation matrix associated with the guide curve;
and according to the transformation matrix, transforming the object data onto the guide curve to obtain deformed object data.
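A minimal sketch of the deformation step in claim 3, assuming the transformation matrix is an ordinary 4x4 homogeneous transform (the sketch after claim 6 shows one way such a matrix might be derived from the guide curve). The NumPy representation is an illustrative choice, not the patent's implementation.

```python
import numpy as np

def transform_points(points: np.ndarray, matrix: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transformation matrix to an (N, 3) array of points."""
    homogeneous = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homogeneous @ matrix.T)[:, :3]

strand = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]])   # a two-point hair strand
matrix = np.eye(4)
matrix[:3, 3] = [1.0, 2.0, 0.0]                          # translation toward a guide-curve point
print(transform_points(strand, matrix))
```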
4. The method according to claim 3, further comprising, prior to said determining a transformation matrix associated with said guide curve: determining a scaling ratio based on size information of the object data and size information of the guide curve; and scaling the size information of the object data according to the scaling ratio to obtain scaled object data;
wherein the transforming the object data onto the guide curve according to the transformation matrix to obtain deformed object data comprises:
scaling the transformation matrix according to the scaling ratio to obtain a scaled transformation matrix;
and performing rotation processing and offset processing on the scaled object data according to the scaled transformation matrix, thereby transforming the object data onto the guide curve to obtain deformed object data.
5. The method according to claim 4, wherein the performing rotation processing and offset processing on the scaled object data according to the scaled transformation matrix to transform the object data onto the guide curve and obtain deformed object data comprises:
performing rotation processing on the orientation of the scaled object data to obtain orientation-converted object data;
sequentially mapping the orientation-converted object data to a preset range in order from the bottom to the top of the object data;
matching the scaled transformation matrix according to the preset range, and obtaining a successfully matched target position from the transformation matrix;
performing rotation processing on the orientation-converted object data based on the scaled transformation matrix to obtain orientation-normalized object data;
and performing position offset processing on the orientation-normalized object data according to the target position, and deforming the object data onto the guide curve to obtain deformed object data.
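The NumPy sketch below combines the scaling, mapping-to-a-preset-range, rotation, and offset steps of claims 4 and 5 into one illustrative function. The bottom-to-top parameterization, the Rodrigues-style alignment of the strand's up axis with the local curve tangent, and all variable names are assumptions about how such a step could look, not the patent's actual algorithm.

```python
import numpy as np

def deform_onto_curve(points: np.ndarray, curve: np.ndarray) -> np.ndarray:
    # scaling ratio from the object's size and the guide curve's size (claim 4)
    object_height = points[:, 2].max() - points[:, 2].min()
    curve_length = np.linalg.norm(np.diff(curve, axis=0), axis=1).sum()
    scale = curve_length / max(object_height, 1e-8)
    scaled = points * scale

    deformed = np.empty_like(scaled)
    z_min, z_max = scaled[:, 2].min(), scaled[:, 2].max()
    up = np.array([0.0, 0.0, 1.0])
    for i, p in enumerate(scaled):
        # map the point's height into [0, 1] bottom-to-top (the "preset range")
        t = (p[2] - z_min) / max(z_max - z_min, 1e-8)
        idx = min(int(t * (len(curve) - 1)), len(curve) - 2)
        tangent = curve[idx + 1] - curve[idx]
        tangent = tangent / np.linalg.norm(tangent)
        # rotation aligning the object's up axis with the curve tangent
        # (degenerate antiparallel case not handled in this sketch)
        v = np.cross(up, tangent)
        c = float(np.dot(up, tangent))
        vx = np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])
        rot = np.eye(3) + vx + vx @ vx / max(1.0 + c, 1e-8)
        # offset: place the rotated cross-section at the matched curve position
        deformed[i] = rot @ np.array([p[0], p[1], 0.0]) + curve[idx]
    return deformed

strand = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 1.0, 5)])
guide = np.array([[0.0, 0.0, 0.0], [0.0, 0.5, 1.0], [0.0, 1.5, 1.5]])
print(deform_onto_curve(strand, guide))
```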
6. The method according to claim 3, wherein said determining a transformation matrix associated with said guide curve comprises:
randomly extracting a consecutive first data point and second data point from the guide curve;
acquiring a first position vector of the first data point and a second position vector of the second data point;
determining a transformation vector based on the first position vector and the second position vector;
and invoking a preset objective function to perform coordinate conversion processing on the transformation vector to obtain the transformation matrix associated with the guide curve.
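One plausible reading of claim 6, sketched in NumPy: the two consecutive position vectors give a segment direction (the transformation vector), which is converted into a rotation-plus-translation matrix. Interpreting the "preset objective function" as this vector-to-matrix conversion is an assumption; the claim does not define it.

```python
import numpy as np

def matrix_from_curve_segment(p0: np.ndarray, p1: np.ndarray) -> np.ndarray:
    """Build a 4x4 transform from two consecutive guide-curve points."""
    direction = p1 - p0                               # the "transformation vector"
    direction = direction / np.linalg.norm(direction)
    up = np.array([0.0, 0.0, 1.0])
    v = np.cross(up, direction)
    c = float(np.dot(up, direction))
    vx = np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])
    rotation = np.eye(3) + vx + vx @ vx / max(1.0 + c, 1e-8)  # align +z with the segment

    matrix = np.eye(4)
    matrix[:3, :3] = rotation
    matrix[:3, 3] = p0                                # translate onto the first curve point
    return matrix

first_point = np.array([0.0, 0.0, 0.0])
second_point = np.array([0.0, 1.0, 1.0])
print(matrix_from_curve_segment(first_point, second_point))
```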
7. The method of claim 1, wherein the generating object data based on the base object framework and each of the object attribute parameter values comprises:
modifying original attribute information of the base object framework according to each object attribute parameter value to obtain an attribute-updated object;
acquiring a type and a number of mesh segments of a proxy mesh, and generating an object mesh based on the attribute-updated object and the type and number of mesh segments of the proxy mesh;
determining an object hair curve matching the attribute-updated object; the object hair curve comprises a plurality of curves describing the styling of the attribute-updated object;
performing conversion processing on the object hair curve to obtain an object polygon corresponding to the object hair curve;
and performing baking based on the object polygon and the object mesh to obtain an object map.
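A sketch of this claim's curve-to-polygon conversion: each hair curve (a polyline) is expanded into a thin ribbon of quads, which could then be baked against the object mesh to produce the object map. The ribbon width, the fixed sideways axis, and the omission of the actual bake are all simplifying assumptions.

```python
import numpy as np

def curve_to_ribbon(curve: np.ndarray, width: float = 0.02):
    """Turn an (N, 3) polyline into ribbon vertices and quad faces."""
    side = np.array([width / 2.0, 0.0, 0.0])           # assumed constant ribbon orientation
    vertices = np.vstack([np.concatenate([p - side, p + side]) for p in curve]).reshape(-1, 3)
    faces = [(2 * i, 2 * i + 1, 2 * i + 3, 2 * i + 2) for i in range(len(curve) - 1)]
    return vertices, faces

curve = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 1.0, 4)])
verts, quads = curve_to_ribbon(curve)
print(len(verts), "vertices,", len(quads), "quads")     # -> 8 vertices, 3 quads
# Baking (object polygon + object mesh -> object map) would project these ribbons
# into the mesh's UV space; that step is engine/DCC-specific and omitted here.
```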
8. The method according to claim 2, wherein the method further comprises:
acquiring a preset target attribute, and extracting a first attribute value and a second attribute value corresponding to the preset target attribute; the first attribute value is associated with the object data, and the second attribute value is associated with the guide curve;
if the first attribute value corresponds to the second attribute value one to one, replacing the object data onto the guide curve to obtain replaced object data;
and if the second attribute value corresponds to a plurality of first attribute values, or if no first attribute value matches the second attribute value, randomly determining data to be replaced based on each object data, and replacing the data to be replaced onto the guide curve to obtain replaced object data.
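A small sketch of this replacement rule: the preset target attribute (modelled here by the hypothetical key "variant_id") is compared between the guide curve and the candidate object data; a unique match is used directly, otherwise an object is chosen at random. The key name and the dictionary representation are assumptions.

```python
import random

def pick_object_for_curve(curve_attr, objects, attr_key="variant_id"):
    """Return the object data to place on a guide curve, per claim 8's matching rule."""
    matches = [o for o in objects if o.get(attr_key) == curve_attr]
    if len(matches) == 1:
        return matches[0]                    # one-to-one correspondence: direct replacement
    # several matching objects, or none at all: fall back to a random candidate
    return random.choice(objects)

objects = [{"variant_id": 1, "name": "feather_a"},
           {"variant_id": 2, "name": "feather_b"},
           {"variant_id": 2, "name": "feather_c"}]
print(pick_object_for_curve(1, objects)["name"])   # unique match -> feather_a
print(pick_object_for_curve(3, objects)["name"])   # no match -> random pick
```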
9. The method according to any one of claims 1 to 8, wherein the target object data comprises a target object hair curve, a target object mesh, and a target object map; and the invoking the target engine to perform rendering processing on the target object data and generate a target object with a realistic effect comprises:
invoking the target engine to render the target object hair curve to generate a target object with a realistic effect;
or
combining the target object mesh and the target object map to obtain target lightweight object data; and invoking the target engine to render the target lightweight object data to generate a target object with a realistic effect.
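Claim 9 offers two alternative rendering inputs: the full set of hair curves, or lightweight data built from the mesh and the baked map. The sketch below picks between them with a simple strand-count threshold; that selection criterion, and every identifier in it, is an assumption, since the claim only states that either path may be used.

```python
def choose_render_payload(target_data: dict, max_curves: int = 100_000) -> dict:
    """Select between the full-strand path and the lightweight mesh+map path."""
    if len(target_data["hair_curves"]) <= max_curves:
        return {"mode": "strands", "curves": target_data["hair_curves"]}
    # lightweight path: the mesh combined with the baked map stands in for individual strands
    return {"mode": "lightweight",
            "mesh": target_data["mesh"],
            "map": target_data["map"]}

payload = choose_render_payload({"hair_curves": [None] * 250_000,
                                 "mesh": {"vertices": []},
                                 "map": "baked_feather_map.png"})
print(payload["mode"])   # -> lightweight
```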
10. An object processing apparatus, the apparatus comprising:
an object attribute parameter value extraction module, configured to acquire a base object framework and extract each object attribute parameter value corresponding to the base object framework;
an object data generating module, configured to generate object data based on the base object framework and each of the object attribute parameter values; the object data comprises an object hair curve, an object mesh and an object map;
a target object data determining module, configured to perform format matching processing on the object data according to a target engine and determine target object data matched with the target engine;
and a target object generation module, configured to invoke the target engine to perform rendering processing on the target object data and generate a target object with a realistic effect.
11. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 9 when the computer program is executed.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 9.
13. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any one of claims 1 to 9.
CN202211287170.3A 2022-10-20 2022-10-20 Object processing method, apparatus, computer device, storage medium, and program product Pending CN117011426A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211287170.3A CN117011426A (en) 2022-10-20 2022-10-20 Object processing method, apparatus, computer device, storage medium, and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211287170.3A CN117011426A (en) 2022-10-20 2022-10-20 Object processing method, apparatus, computer device, storage medium, and program product

Publications (1)

Publication Number Publication Date
CN117011426A 2023-11-07

Family

ID=88560671

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211287170.3A Pending CN117011426A (en) 2022-10-20 2022-10-20 Object processing method, apparatus, computer device, storage medium, and program product

Country Status (1)

Country Link
CN (1) CN117011426A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination