CN111714883B - Mapping processing method and device and electronic equipment - Google Patents

Mapping processing method and device and electronic equipment

Info

Publication number
CN111714883B
CN111714883B (application CN202010568630.4A)
Authority
CN
China
Prior art keywords
mapping
target object
map
point
details
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010568630.4A
Other languages
Chinese (zh)
Other versions
CN111714883A (en)
Inventor
蒋旭毅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010568630.4A priority Critical patent/CN111714883B/en
Publication of CN111714883A publication Critical patent/CN111714883A/en
Application granted granted Critical
Publication of CN111714883B publication Critical patent/CN111714883B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/90: Dynamic range modification of images or parts thereof
    • G06T5/92: Dynamic range modification of images or parts thereof based on global image properties

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a map processing method and apparatus and an electronic device, relating to the technical field of map processing. The method includes: copying the seam-repair map of a target object to obtain a first map and a second map; processing the first map to obtain a map containing the high-point details of the target object; processing the second map to obtain a map containing the low-point details of the target object; performing color repair on the seam-repair map to obtain a color-repair map of the target object; and generating an optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map. The method and apparatus can quickly locate the high-point and low-point details of a map and improve map precision.

Description

Mapping processing method and device and electronic equipment
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method and an apparatus for processing a map, and an electronic device.
Background
Compared with traditional online games, next-generation online games incorporate next-generation game development techniques, improving the picture quality of the game by increasing the data volume of models and maps and by using a next-generation game engine.
In existing next-generation game models, a realistic skin map is usually produced as follows: real photographs taken from multiple angles are selected and projected onto the model. Seams appear between the projected images, so the projected maps must be seam-repaired, which reduces map precision. In addition, because of natural shadow information, odd colors or dark shadows often appear in the maps. The maps then need color processing to remove the dirty colors and shadows, which causes large-area loss of texture detail and leaves map precision low.
Disclosure of Invention
The application aims to provide a map processing method, a map processing apparatus and an electronic device, so as to at least partially solve the technical problems of texture-detail loss and low map precision in existing map processing.
In a first aspect, an embodiment of the present application provides a map processing method, including: copying the seam-repair map of a target object to obtain a first map and a second map; processing the first map to obtain a map containing the high-point details of the target object; processing the second map to obtain a map containing the low-point details of the target object; performing color repair on the seam-repair map to obtain a color-repair map of the target object; and generating an optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map.
Further, processing the first map to obtain a map containing the high-point details of the target object includes: de-coloring the first map; applying high-contrast retention to the de-colored first map; and adjusting the tone-scale parameters of the first map after high-contrast retention to obtain the map containing the high-point details of the target object.
Further, processing the second map to obtain a map containing the low-point details of the target object includes: de-coloring the second map; inverting the de-colored second map; applying high-contrast retention to the inverted second map; adjusting the tone-scale parameters of the second map after high-contrast retention; and inverting the second map after the tone-scale adjustment to obtain the map containing the low-point details of the target object.
Further, generating the optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map includes: superimposing the map containing the high-point details and the map containing the low-point details onto the color-repair map; and taking the superimposed map as the optimized map of the target object.
Further, superimposing the map containing the high-point details, the map containing the low-point details and the color-repair map includes: computing, with a preset algorithm, the gray values of the same pixel in the map containing the high-point details, the map containing the low-point details and the color-repair map, to obtain the gray value of each pixel of the superimposed map.
Further, during the superposition of the map containing the high-point details, the map containing the low-point details and the color-repair map, the method further includes: adjusting the transparency of the map containing the high-point details and of the map containing the low-point details, so as to separately control how strongly the high-point and low-point details show in the superimposed map.
Further, after taking the superimposed map as the optimized map of the target object, the method further includes: saving the optimized map of the target object.
Further, the method also includes: acquiring a UV map of the target object; and performing seam repair on the UV map to obtain the seam-repair map of the target object.
Further, the target object is human skin.
In a second aspect, an embodiment of the present application further provides a map processing apparatus, including: a map copying module, configured to copy the seam-repair map of a target object to obtain a first map and a second map; a first processing module, configured to process the first map to obtain a map containing the high-point details of the target object; a second processing module, configured to process the second map to obtain a map containing the low-point details of the target object; a color repair module, configured to perform color repair on the seam-repair map to obtain a color-repair map of the target object; and a map generation module, configured to generate an optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map.
In a third aspect, an embodiment of the present application further provides an electronic device, including a processor and a memory, where the memory stores computer-executable instructions executable by the processor, and the processor executes the computer-executable instructions to implement the above method.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium storing computer-executable instructions that, when invoked and executed by a processor, cause the processor to implement the above method.
In the map processing method provided by the embodiments of the application, two preset processing procedures are applied separately to the seam-repair map of the target object, generating a map containing the high-point details and a map containing the low-point details of the target object. These two maps are then superimposed onto the color-repair map obtained in advance by color-repairing the seam-repair map, yielding the optimized map of the target object. Because the detail maps are extracted before color repair, the texture detail lost during color repair is restored by the superposition, so the optimized map has higher precision.
Drawings
To describe the technical solutions of the embodiments of the present application or of the prior art more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. The drawings described below show some embodiments of the present application; a person skilled in the art can derive other drawings from them without inventive effort.
FIG. 1 is a flowchart of a method for processing a map according to an embodiment of the present application;
FIG. 2 is a flowchart of a method for processing a first map according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a high-point detail extraction plug-in provided by an embodiment of the present application;
FIG. 4 is a flowchart of a method for processing the second map according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a low-point detail extraction plug-in provided by an embodiment of the present application;
FIG. 6 is a flowchart of a method for generating a map according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a mapping processing apparatus according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions of the present application will be described clearly and completely with reference to the embodiments. The described embodiments are evidently some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
When a game is developed, modeling is performed with three-dimensional software, and the most important task after modeling is to produce maps for the characters: map precision directly determines how well a character is displayed. In existing map workflows, skin maps are mostly made by projecting real photographs onto the skin, and the projected maps are then seam-repaired. Seam repair causes some texture detail to be lost, reducing precision. Moreover, under projection, natural shadow information introduces odd colors or dark shadows, so the dirty colors and shadows of the skin map must be removed manually; this manual repair can cause large-area loss of skin texture detail, leaving map precision low.
Based on the above, embodiments of the present application provide a map processing method and apparatus and an electronic device, which can quickly locate the high-point and low-point details in a map and improve map precision.
Fig. 1 is a flowchart of a map processing method according to an embodiment of the present application. The method runs on an electronic device and completes the map processing automatically; it includes the following steps:
step S102, performing copy operation on the seam repair map of the target object to obtain a first map and a second map.
The target object may be human skin, or a complex-textured object such as hair or clothing. In practice, images of the target object taken from multiple angles are first projected onto the model to obtain several maps; these maps are then stitched together, seams form along the stitch lines, and the seams are repaired to obtain the seam-repair map of the target object. Copying this map yields two identical seam-repair maps: the first map and the second map.
Step S104, the first map is processed to obtain a map containing high-point details of the target object.
Step S106, the second mapping is processed to obtain a mapping containing the low-point details of the target object.
The processing of the first map and of the second map can be completed by plug-ins, which automatically generate the high-point and low-point details of the target object and output the corresponding maps. Steps S104 and S106 are independent of each other and may be performed in either order or simultaneously.
Step S108, perform color repair on the seam-repair map to obtain the color-repair map of the target object.
Performing color repair on the seam-repair map removes the dirty colors and shadows in the map and yields the color-repair map of the target object. Color repair can be done in various ways, for example whitening, skin smoothing or applying filters, which are not detailed here. This step is independent of step S102 and may be performed simultaneously with it.
Step S110, generating an optimized map of the target object according to the map containing the high-point details of the target object, the map containing the low-point details of the target object and the color restoration map.
After the map containing the high-point details, the map containing the low-point details and the color-repair map of the target object have been obtained, the optimized map is produced by superimposing them. The optimized map has relatively high precision and clear, vivid texture.
In the map processing method provided by the embodiments of the application, two preset processing procedures are applied separately to the seam-repair map of the target object, generating a map containing the high-point details and a map containing the low-point details of the target object. These two maps are then superimposed onto the color-repair map obtained in advance by color-repairing the seam-repair map, yielding the optimized map of the target object.
To generate the high-point details quickly and efficiently and obtain the map containing them, step S104 (processing the first map to obtain a map containing the high-point details of the target object) may be implemented by the steps of the flowchart shown in fig. 2:
in step S202, the first map is subjected to a color removal process.
De-coloring removes the color from the first map so that only black and white are displayed: the RGB value of each pixel is adjusted so that its red, green and blue components are equal, and the image is displayed in gray scale. The de-colored first map is a grayscale image.
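As an illustrative sketch (not part of the patent), the de-coloring step can be written in a few lines of NumPy. The Rec.601 luma weights used here are an assumption of this sketch; the patent only requires that each pixel end up with equal red, green and blue values:

```python
import numpy as np

def desaturate(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 uint8 RGB map to grayscale.

    Each output pixel gets equal R, G and B values, matching the
    "equal red, green and blue values" description. The Rec.601
    luma weights below are an assumed choice; the patent does not
    specify a particular weighting.
    """
    weights = np.array([0.299, 0.587, 0.114])
    gray = (rgb.astype(np.float64) @ weights).round().astype(np.uint8)
    # replicate the single gray channel back into R, G and B
    return np.repeat(gray[..., None], 3, axis=2)
```
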
In step S204, the first map after the color removal process is subjected to a high contrast retaining process.
High-contrast retention keeps the boundaries where color or brightness changes sharply: clearly visible lines are preserved, while areas without obvious lines are adjusted to gray, which enhances the high-contrast parts of the image. More precisely, edge details are preserved within a specified radius wherever a strong color transition occurs, and the rest of the image is suppressed. For example, a radius of 0.1 pixels preserves only edge pixels.
For example, in a face map, a person's eyes, eyebrows, mouth and hair lie in relatively high-contrast regions. After high-contrast retention with a preset radius, the high-contrast information remains while the rest, such as large areas of facial skin, is adjusted to gray. The eyes, eyebrows, mouth and hair then stand out sharply against the gray face; that is, the texture details of the face map are emphasized.
In this step, the preset radius for high-contrast retention can be tuned against the actual map effect (mainly by checking whether the facial pores in a face map are clearly visible) to find the radius that gives the best result.
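The high-contrast retention described above behaves like Photoshop's High Pass filter: subtract a blurred copy of the image and re-center the result on mid-gray. The following NumPy sketch is illustrative only; the box blur is an assumed stand-in for the Gaussian blur a production filter would use:

```python
import numpy as np

def box_blur(gray: np.ndarray, radius: int) -> np.ndarray:
    """Simple box blur; an assumed stand-in for a Gaussian blur."""
    k = 2 * radius + 1
    pad = np.pad(gray.astype(np.float64), radius, mode="edge")
    out = np.zeros(gray.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    return out / (k * k)

def high_pass(gray: np.ndarray, radius: int) -> np.ndarray:
    """Original minus blurred copy, re-centred on mid-gray (128):
    flat regions come out gray, edges come out darker or lighter."""
    detail = gray.astype(np.float64) - box_blur(gray, radius) + 128.0
    return np.clip(detail, 0, 255).round().astype(np.uint8)
```

The `radius` argument plays the role of the preset radius tuned in this step.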
Step S206, adjust the tone-scale parameters of the first map after high-contrast retention to obtain the map containing the high-point details of the target object.
The tone scale is an index of the brightness intensity of an image; in digital image processing it refers to the gray-level resolution (also called the tonal or amplitude resolution). The fullness and definition of an image's color are determined by its tone scale. The tone scale is defined independently of hue: the brightest level is white and the darkest level is black.
In this embodiment, the tone-scale parameter is set to the fixed value 128, which makes the high-contrast parts of the high-contrast-retained map, that is, the texture details, more prominent, so the high-point details in the map become more obvious.
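The patent fixes the tone-scale parameter at 128 but does not spell out which levels control that value maps to. One plausible reading, sketched below purely as an assumption, is a linear levels remap whose input range is tightened symmetrically around mid-gray 128, boosting the contrast of the retained detail:

```python
import numpy as np

def adjust_levels(gray: np.ndarray, black: int, white: int) -> np.ndarray:
    """Linear levels remap: input range [black, white] -> output [0, 255].

    Values at or below `black` clip to 0; values at or above `white`
    clip to 255. Calling it as adjust_levels(img, 128 - w, 128 + w)
    stretches contrast around the mid-gray 128 that a high-contrast
    retention result is centred on (an assumed interpretation).
    """
    g = gray.astype(np.float64)
    out = (g - black) / max(white - black, 1) * 255.0
    return np.clip(out, 0, 255).round().astype(np.uint8)
```
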
As shown in FIG. 3, the generation of the map containing the high-point details can be completed by a plug-in that automatically performs the three operations of de-coloring, high-contrast retention and tone-scale adjustment, thereby processing the first map and obtaining the map containing the high-point details of the target object.
To generate the low-point details quickly and efficiently and obtain the map containing them, step S106 (processing the second map to obtain a map containing the low-point details of the target object) may be implemented by the steps of the flowchart shown in fig. 4:
In step S402, the second map is subjected to a color removal process.
The de-coloring process is the same as in step S202 and is not repeated here. The de-colored second map is a grayscale image.
Step S404, invert the de-colored second map.
Inversion flips the colors of an image, for example black to white, blue to yellow and red to green; that is, each color in the image is replaced by its complement. For the de-colored second map, white areas of the grayscale image become black, black areas become white, and intermediate grays are inverted correspondingly. For example, the nostrils in the de-colored second map are black, so after inversion they are white. The inversion in this step lets the low-point details be picked up by the subsequent high-contrast retention.
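For an 8-bit grayscale map, the inversion described above is just the per-pixel complement; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def invert(gray: np.ndarray) -> np.ndarray:
    """Replace every 8-bit value with its complement: black (0)
    becomes white (255), white becomes black, and intermediate
    grays swap around the midpoint."""
    return (255 - gray.astype(np.int16)).astype(np.uint8)
```

Applying `invert` twice returns the original map, which is why the low-point pipeline can safely invert once before and once after its filtering steps.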
Step S406, apply high-contrast retention to the inverted second map.
The high-contrast retention is the same as in step S204. In step S204 it is applied to the de-colored map, giving clearer high-point details in the facial contours and hair texture, which appear black or dark gray. In this step it is applied to the map after de-coloring and inversion; in the resulting map, the contour details of the facial features and the low-point details in the hair texture are clear and appear white or light gray.
In this step, the preset radius for high-contrast retention can be tuned against the actual map effect (mainly by checking whether the facial pores in a face map are clearly visible) to find the radius that gives the best result.
Step S408, adjust the tone-scale parameters of the second map after high-contrast retention.
The tone-scale adjustment is the same as in step S206: it makes the high-contrast parts of the map, that is, the low-point texture details, more prominent, so that a better low-point effect shows when the maps are later superimposed. In this embodiment the tone-scale value is again set to the fixed value 128, yielding a map containing the low-point details. In the adjusted map, the low-point details, such as the contours of the facial features, the hair texture and the facial pores, are clearer and appear black or dark gray.
In step S410, the second map after the tone-scale adjustment is inverted again to obtain the map containing the low-point details of the target object.
After the low-point details of the target object have been located by the preceding steps, inverting the map once more makes the low-point details appear white or light gray, so that they read as recesses after the subsequent map superposition, giving a better low-point display effect.
As shown in FIG. 5, the generation of the map containing the low-point details can likewise be completed by a plug-in that automatically performs the five operations of de-coloring, inversion, high-contrast retention, tone-scale adjustment and inversion, thereby processing the second map automatically and obtaining the map containing the low-point details of the target object.
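The five plug-in operations of steps S402 to S410 can be chained into one function. This sketch is self-contained and illustrative: the box blur, the symmetric levels range and the default `radius`/`width` values are all assumptions, since the patent fixes only the order of the operations and the tone-scale value 128:

```python
import numpy as np

def box_blur(gray, radius):
    # box blur, an assumed stand-in for a Gaussian blur
    k = 2 * radius + 1
    pad = np.pad(gray.astype(np.float64), radius, mode="edge")
    out = np.zeros(gray.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + gray.shape[0], dx:dx + gray.shape[1]]
    return out / (k * k)

def high_pass(gray, radius):
    # keep only edge detail, centred on mid-gray 128
    d = gray.astype(np.float64) - box_blur(gray, radius) + 128.0
    return np.clip(d, 0, 255).round().astype(np.uint8)

def adjust_levels(gray, black, white):
    # linear levels remap of [black, white] onto [0, 255]
    out = (gray.astype(np.float64) - black) / max(white - black, 1) * 255.0
    return np.clip(out, 0, 255).round().astype(np.uint8)

def low_point_detail(gray, radius=2, width=64):
    """Steps S404-S410 on an already de-coloured map:
    invert -> high-contrast retention -> tone-scale -> invert."""
    inverted = 255 - gray                              # S404: complement
    hp = high_pass(inverted, radius)                   # S406
    lv = adjust_levels(hp, 128 - width, 128 + width)   # S408
    return (255 - lv).astype(np.uint8)                 # S410: invert back
```
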
In another embodiment, the seam-repair map of the target object may first be de-colored, the de-colored map may then be copied to obtain two identical de-colored maps, and the two de-colored maps may then undergo the two different processing procedures to obtain the map containing the high-point details and the map containing the low-point details.
To improve map precision, in the embodiment of the present application the step of generating the optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map may be implemented by the steps of the flowchart shown in fig. 6:
in step S602, the map containing the high-point details of the target object, the map containing the low-point details of the target object and the color-repair map are superimposed.
In a specific implementation, a preset algorithm can be applied to the gray values of the same pixel in the map containing the high-point details, the map containing the low-point details and the color-repair map, to obtain the gray value of each pixel of the superimposed map. The preset algorithm may be whichever of the algorithms corresponding to the different superposition modes gives the best detail display for this embodiment.
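The patent only states that a "preset algorithm" combines the per-pixel values. In Photoshop-style detail workflows, a detail layer centred on mid-gray 128 is conventionally applied with a linear-light-style blend; the sketch below uses that blend as an assumed stand-in, not as the patent's actual algorithm:

```python
import numpy as np

def linear_light(base: np.ndarray, detail: np.ndarray) -> np.ndarray:
    """Apply a mid-gray-centred detail layer onto a base map.

    Pixels where detail == 128 leave the base unchanged; lighter
    detail pixels brighten the base and darker ones darken it.
    """
    out = base.astype(np.float64) + 2.0 * (detail.astype(np.float64) - 128.0)
    return np.clip(out, 0, 255).round().astype(np.uint8)

# The high-point and low-point layers would be applied in turn, e.g.
# optimized = linear_light(linear_light(color_map, high_detail), low_detail)
```
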
In step S604, the superimposed map is taken as the optimized map of the target object. The optimized map of the target object may further be saved.
To further improve how the high-point and low-point details display in the optimized map, the embodiment of the present application can also adjust the transparency of the map containing the high-point details and of the map containing the low-point details, so as to separately control how strongly the high-point and low-point details show in the superimposed map.
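The transparency adjustment described here can be sketched as a simple opacity mix between the base map and the fully blended result. The function name and the [0, 1] opacity convention are illustrative assumptions:

```python
import numpy as np

def apply_opacity(base: np.ndarray, blended: np.ndarray,
                  opacity: float) -> np.ndarray:
    """Scale a detail layer's effect: 0.0 hides it, 1.0 shows it fully.

    Lowering the opacity of the high-point or low-point layer weakens
    how strongly its details show in the superimposed map.
    """
    out = base.astype(np.float64) * (1.0 - opacity) \
        + blended.astype(np.float64) * opacity
    return np.clip(out, 0, 255).round().astype(np.uint8)
```
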
In another implementation, the embodiment of the present application may obtain the seam-repair map of the target object as follows:
(1) Acquire a UV map of the target object. In practice, real images of the target object taken from multiple angles are first projected to obtain several maps, and the maps are then stitched together to obtain the UV map of the target object.
(2) Perform seam repair on the UV map of the target object to obtain the seam-repair map of the target object.
Starting from the seam-repair map of the target object, the map processing method provided by the embodiments of the application can quickly locate the high-point and low-point details in the map and superimpose them onto the color-repair map of the target object to generate an optimized map of higher precision, which eases the management of map production for art vendors and improves the efficiency of art production.
Based on the above method embodiments, an embodiment of the present application further provides a map processing apparatus, as shown in fig. 7. The apparatus includes:
a map copying module 71, configured to copy the seam-repair map of the target object to obtain a first map and a second map; a first processing module 72, configured to process the first map to obtain a map containing the high-point details of the target object; a second processing module 73, configured to process the second map to obtain a map containing the low-point details of the target object; a color repair module 74, configured to perform color repair on the seam-repair map to obtain a color-repair map of the target object; and a map generation module 75, configured to generate an optimized map of the target object from the map containing the high-point details, the map containing the low-point details and the color-repair map.
In another possible embodiment, the first processing module 72 is further configured to: de-color the first map; apply high-contrast retention to the de-colored map; and adjust the tone-scale parameters of the high-contrast-retained map to obtain the map containing the high-point details of the target object.
In another possible embodiment, the second processing module 73 is further configured to: de-color the second map; invert the de-colored map; apply high-contrast retention to the inverted map; adjust the tone-scale parameters of the high-contrast-retained map; and invert the map after the tone-scale adjustment to obtain the map containing the low-point details of the target object.
In another possible implementation, the map generation module 75 is further configured to: superposing the mapping containing the high-point details of the target object and the mapping containing the low-point details of the target object with the color restoration mapping; and taking the overlaid mapping as an optimized mapping of the target object.
In another possible implementation, the map generation module 75 is further configured to: and calculating the gray value of the same pixel point in the mapping containing the high-point details of the target object, the mapping containing the low-point details of the target object and the color restoration mapping by a preset algorithm to obtain the gray value of each pixel point in the mapping after superposition processing.
In another possible implementation, the map generation module 75 is further configured to: adjust the transparency of the map containing the high-point details of the target object and of the map containing the low-point details of the target object, so as to respectively control the display intensity of the high-point details and of the low-point details in the superimposed map.
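The superposition and transparency control described above amount to a per-pixel blend. The patent does not disclose its "preset algorithm"; the sketch below assumes a linear-light-style blend in which each detail map contributes its deviation from mid-gray 128, and the hypothetical `alpha_hi` / `alpha_lo` parameters stand in for the transparency adjustment that controls display intensity.

```python
import numpy as np

def superimpose(high_detail, low_detail, color_map, alpha_hi=1.0, alpha_lo=1.0):
    """Blend grayscale detail maps onto an RGB color repair map.

    The gray value of each output pixel combines the gray values of the
    same pixel in the three inputs; alpha_hi / alpha_lo scale how strongly
    the high-point and low-point details show in the result.
    """
    out = color_map.astype(float).copy()
    # Pixels above 128 in a detail map brighten the result, below 128 darken it
    out += alpha_hi * (high_detail.astype(float) - 128.0)[..., None]
    out += alpha_lo * (low_detail.astype(float) - 128.0)[..., None]
    return np.clip(out, 0.0, 255.0)
```

Setting `alpha_hi` or `alpha_lo` to 0 removes the corresponding detail layer entirely, which matches the described use of transparency to control the display intensity of each kind of detail independently.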
In another possible implementation, the map generation module 75 is further configured to store the optimized map of the target object.
In another possible implementation, the map copying module 71 is further configured to: acquire a UV map of the target object; and perform seam repair on the UV map of the target object to obtain the seam repair map of the target object.
In another possible embodiment, the target object is human skin.
The implementation principle and technical effects of the map processing apparatus provided in the embodiments of the present application are the same as those of the foregoing map processing method embodiments. For brevity, where this apparatus embodiment is silent, reference may be made to the corresponding content in the foregoing method embodiments.
An embodiment of the present application further provides an electronic device. As shown in the schematic structural diagram of fig. 8, the electronic device includes a processor 81 and a memory 80; the memory 80 stores computer-executable instructions executable by the processor 81, and the processor 81 executes these instructions to implement the method described above.
In the embodiment shown in fig. 8, the electronic device further includes a bus 82 and a communication interface 83, and the processor 81, the communication interface 83, and the memory 80 are connected by the bus 82.
The memory 80 may include high-speed random access memory (RAM) and may further include non-volatile memory, such as at least one disk memory. The communication connection between the system network element and at least one other network element is implemented via at least one communication interface 83 (which may be wired or wireless), using the Internet, a wide area network, a local area network, a metropolitan area network, or the like. The bus 82 may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be divided into an address bus, a data bus, and a control bus. For ease of illustration, only one bi-directional arrow is shown in fig. 8, but this does not mean that there is only one bus or one type of bus.
The processor 81 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 81 or by instructions in the form of software. The processor 81 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in the embodiments of the present application may be embodied as being executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or a register. The storage medium is located in the memory 80; the processor 81 reads the information in the memory and completes the steps of the method of the foregoing embodiments in combination with its hardware.
An embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which, when invoked and executed by a processor, cause the processor to implement the above method. For the specific implementation, reference may be made to the foregoing method embodiments, which will not be repeated here.
The computer program product of the map processing method, apparatus, and electronic device provided in the embodiments of the present application includes a computer-readable storage medium storing program code. The instructions included in the program code may be used to execute the method described in the foregoing method embodiments; for the specific implementation, reference may be made to the method embodiments, which will not be repeated here.
The relative arrangement of the components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present application unless specifically stated otherwise.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a non-volatile, processor-executable computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In the description of the present application, it should be noted that the orientations or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the orientations or positional relationships shown in the drawings, are merely for convenience and simplicity of description, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present application. Furthermore, the terms "first", "second", and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the above embodiments are only specific implementations of the present application, intended to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that, within the technical scope disclosed herein, anyone familiar with the art may still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or substitute equivalents for some of the technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and shall all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (8)

1. A method of processing a map, the method comprising:
copying a seam repair map of a target object to obtain a first map and a second map;
sequentially performing desaturation processing, high contrast retention processing, and tone scale parameter adjustment on the first map to obtain a map containing high-point details of the target object;
sequentially performing desaturation processing, inversion processing, high contrast retention processing, tone scale parameter adjustment, and inversion processing on the second map to obtain a map containing low-point details of the target object;
performing color repair on the seam repair map to obtain a color repair map of the target object;
computing, with a preset algorithm, gray values of the same pixel point in the map containing the high-point details of the target object, the map containing the low-point details of the target object, and the color repair map, to obtain a gray value of each pixel point in a superimposed map; and taking the superimposed map as an optimized map of the target object.
2. The method of claim 1, wherein superimposing the map containing the high-point details of the target object, the map containing the low-point details of the target object, and the color repair map further comprises:
adjusting the transparency of the map containing the high-point details of the target object and of the map containing the low-point details of the target object, so as to respectively control the display intensity of the high-point details and of the low-point details in the superimposed map.
3. The method of claim 1, further comprising, after taking the superimposed map as the optimized map of the target object:
storing the optimized map of the target object.
4. The method according to claim 1, wherein the method further comprises:
acquiring a UV map of the target object;
performing seam repair on the UV map of the target object to obtain the seam repair map of the target object.
5. The method of any one of claims 1 to 4, wherein the target object is human skin.
6. A mapping processing apparatus, the apparatus comprising:
a map copying module configured to copy a seam repair map of a target object to obtain a first map and a second map;
a first processing module configured to sequentially perform desaturation processing, high contrast retention processing, and tone scale parameter adjustment on the first map to obtain a map containing high-point details of the target object;
a second processing module configured to sequentially perform desaturation processing, inversion processing, high contrast retention processing, tone scale parameter adjustment, and inversion processing on the second map to obtain a map containing low-point details of the target object;
a color repair module configured to perform color repair on the seam repair map to obtain a color repair map of the target object;
a map generation module configured to compute, with a preset algorithm, gray values of the same pixel point in the map containing the high-point details of the target object, the map containing the low-point details of the target object, and the color repair map, to obtain a gray value of each pixel point in a superimposed map, and to take the superimposed map as an optimized map of the target object.
7. An electronic device comprising a processor and a memory, the memory storing computer-executable instructions executable by the processor, the processor executing the computer-executable instructions to implement the method of any one of claims 1 to 5.
8. A computer readable storage medium storing computer executable instructions which, when invoked and executed by a processor, cause the processor to implement the method of any one of claims 1 to 5.
CN202010568630.4A 2020-06-19 2020-06-19 Mapping processing method and device and electronic equipment Active CN111714883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010568630.4A CN111714883B (en) 2020-06-19 2020-06-19 Mapping processing method and device and electronic equipment


Publications (2)

Publication Number Publication Date
CN111714883A CN111714883A (en) 2020-09-29
CN111714883B true CN111714883B (en) 2024-06-04

Family

ID=72568530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010568630.4A Active CN111714883B (en) 2020-06-19 2020-06-19 Mapping processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN111714883B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114663296A (en) * 2022-02-17 2022-06-24 广东时谛智能科技有限公司 Interactive normal map concave-convex removing method, system, medium and equipment


Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP6776004B2 (en) * 2016-05-26 2020-10-28 キヤノン株式会社 Image processing equipment, image processing methods and programs

Patent Citations (10)

Publication number Priority date Publication date Assignee Title
CN105338332A (en) * 2014-08-15 2016-02-17 联想(北京)有限公司 Information processing method and electronic equipment
CN105744159A (en) * 2016-02-15 2016-07-06 努比亚技术有限公司 Image synthesizing method and device
WO2017140182A1 (en) * 2016-02-15 2017-08-24 努比亚技术有限公司 Image synthesis method and apparatus, and storage medium
WO2017206400A1 (en) * 2016-05-30 2017-12-07 乐视控股(北京)有限公司 Image processing method, apparatus, and electronic device
WO2018039936A1 (en) * 2016-08-30 2018-03-08 Microsoft Technology Licensing, Llc. Fast uv atlas generation and texture mapping
CN107886552A (en) * 2016-09-29 2018-04-06 网易(杭州)网络有限公司 Stick picture disposing method and apparatus
CN111052176A (en) * 2017-08-11 2020-04-21 三星电子株式会社 Seamless image stitching
CN110458932A (en) * 2018-05-07 2019-11-15 阿里巴巴集团控股有限公司 Image processing method, device, system, storage medium and image scanning apparatus
CN111031301A (en) * 2018-10-10 2020-04-17 珠海全志科技股份有限公司 Method for adjusting color gamut space, storage device and display terminal
CN111292389A (en) * 2020-02-19 2020-06-16 网易(杭州)网络有限公司 Image processing method and device

Non-Patent Citations (1)

Title
Method for achieving high-poly map effects on low-poly models using normal maps in 3ds Max; She Wei; TV Caption (Special Effects and Animation), Issue 04; full text *

Also Published As

Publication number Publication date
CN111714883A (en) 2020-09-29

Similar Documents

Publication Publication Date Title
US11257286B2 (en) Method for rendering of simulating illumination and terminal
US10403036B2 (en) Rendering glasses shadows
US8917317B1 (en) System and method for camera calibration
US20210291056A1 (en) Method and Apparatus for Generating Game Character Model, Processor, and Terminal
CN109360254B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112215934A (en) Rendering method and device of game model, storage medium and electronic device
CN107993216A (en) A kind of image interfusion method and its equipment, storage medium, terminal
CN107507217A (en) Preparation method, device and the storage medium of certificate photo
CN106447604B (en) Method and device for transforming face picture in video
CN110827371B (en) Certificate generation method and device, electronic equipment and storage medium
US10957092B2 (en) Method and apparatus for distinguishing between objects
CN111714883B (en) Mapping processing method and device and electronic equipment
JP4219521B2 (en) Matching method and apparatus, and recording medium
CN109447931B (en) Image processing method and device
CN111447428A (en) Method and device for converting plane image into three-dimensional image, computer readable storage medium and equipment
CN111383311B (en) Normal map generation method, device, equipment and storage medium
TWI462027B (en) Image processing device and image processing method thereof
CN107403448A (en) Cost function generation method and cost function generating means
CN116894911A (en) Three-dimensional reconstruction method, three-dimensional reconstruction device, electronic equipment and readable storage medium
CN116188720A (en) Digital person generation method, device, electronic equipment and storage medium
CN114155569B (en) Cosmetic progress detection method, device, equipment and storage medium
CN112053434B (en) Disparity map generation method, three-dimensional reconstruction method and related device
WO2021184303A1 (en) Video processing method and device
CN111563839A (en) Fundus image conversion method and device
CN110751078B (en) Method and equipment for determining non-skin color region of three-dimensional face

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant