CN112717391B - Method, device, equipment and medium for displaying character names of virtual characters


Info

Publication number
CN112717391B
Authority
CN
China
Prior art keywords
character
virtual
dimensional
name
pixel point
Prior art date
Legal status
Active
Application number
CN202110082965.XA
Other languages
Chinese (zh)
Other versions
CN112717391A (en)
Inventor
金雨嫣
赵宇浩
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110082965.XA priority Critical patent/CN112717391B/en
Publication of CN112717391A publication Critical patent/CN112717391A/en
Application granted granted Critical
Publication of CN112717391B publication Critical patent/CN112717391B/en
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/85: Providing additional services to players
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application discloses a method, device, equipment, and medium for displaying the character name of a virtual character, belonging to the field of human-computer interaction. The method comprises the following steps: displaying a character name creation interface of the virtual character, on which a name input control is displayed; in response to an input operation on the name input control, displaying the character name in the name input control; and displaying the character name on a virtual prop used by the virtual character located in the virtual environment. This interaction achieves an immersive display effect in which the character name passes from the two-dimensional name creation interface into the virtual world.

Description

Method, device, equipment and medium for displaying character names of virtual characters
Technical Field
The present invention relates to the field of human-computer interaction, and in particular to a method, an apparatus, a device, and a medium for displaying the character name of a virtual character.
Background
A user plays a game in a virtual environment using a virtual character. For example, the user controls virtual character 1 to race in a three-dimensional racing world, or controls virtual character 2 to hunt for treasure in a three-dimensional fantasy world.
In the related art, when a user creates a virtual character, the game program displays a character name creation interface on which an input box and a confirm button are shown. The user enters a character name, such as "Invincible Warrior", in the input box and then clicks the confirm button, and the character name of the virtual character is created.
However, the character name is displayed only on interfaces such as the game loading interface, the health bar of the virtual character, the chat interface, and the settlement interface.
Disclosure of Invention
The application provides a method, device, equipment, and medium for displaying the character name of a virtual character, which achieve an immersive display effect in which the character name passes from a two-dimensional name creation interface into the virtual world. The technical solution at least comprises the following aspects:
according to one aspect of the present application, there is provided a character name display method of a virtual character, the method including:
displaying a character name creation interface of the virtual character, wherein the character name creation interface is displayed with a name input control;
responding to the input operation on the name input control, and displaying the character name in the name input control;
A character name is displayed on a virtual prop used by a virtual character located in the virtual environment.
According to another aspect of the present application, there is provided a character name display device of a virtual character, the device including:
the display module is used for displaying a character name creation interface of the virtual character, and the character name creation interface is displayed with a name input control;
the display module is also used for responding to the input operation on the name input control and displaying the character name in the name input control;
and the display module is also used for displaying the character name on the virtual prop used by the virtual character in the virtual environment.
According to one aspect of the present application, there is provided a computer device comprising: a processor and a memory storing a computer program loaded and executed by the processor to implement the character name display method of the virtual character as described above.
According to another aspect of the present application, there is provided a computer-readable storage medium storing a computer program loaded and executed by a processor to implement the character name display method of a virtual character as described above.
According to another aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions so that the computer device performs the character name display method of the virtual character described above.
The technical solutions provided in the embodiments of the present application yield at least the following beneficial effects:
the name input control is displayed on the character name creation interface; the user then inputs and confirms the character name of the virtual character through the name input control; finally, the character name is displayed on the virtual prop used by the virtual character in the virtual environment. This interaction achieves an immersive display effect in which the character name passes from the two-dimensional name creation interface into the virtual world.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of a computer system provided by an exemplary embodiment;
FIG. 2 is a flow chart of a method for displaying character names of virtual characters provided by an exemplary embodiment;
FIG. 3 is a schematic diagram of a role name creation interface of an exemplary embodiment;
FIG. 4 is a schematic diagram of a virtual prop displaying a character name in accordance with an exemplary embodiment;
FIG. 5 is a flow chart of a method for displaying character names of virtual characters provided by another exemplary embodiment;
FIG. 6 is a schematic diagram of a terminal acquiring a two-dimensional texture image in accordance with an exemplary embodiment;
FIG. 7 is a computer code schematic diagram of a two-dimensional texture image converted to a normal map in accordance with an exemplary embodiment;
FIG. 8 is a computer code schematic diagram of a two-dimensional texture image converted to a normal map in accordance with another exemplary embodiment;
FIG. 9 is a computer code schematic of a three-dimensional model of a normal map attached to a virtual prop of an exemplary embodiment;
FIG. 10 is a schematic diagram of a virtual prop displaying a character name in accordance with another exemplary embodiment;
fig. 11 is a flow chart of a method for displaying character names of virtual characters provided in another exemplary embodiment;
FIG. 12 is a pictorial diagram of an imprint animation of an exemplary embodiment;
FIG. 13 is a pictorial illustration of an exemplary embodiment of a virtual character's motion animation using a virtual prop;
fig. 14 is a block diagram showing the configuration of a character name display device of a virtual character provided in an exemplary embodiment;
fig. 15 is a schematic diagram of a computer device according to an exemplary embodiment.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, the terms involved in the embodiments of the present application will be briefly described:
Side-scrolling game: a game in which the movement route of the game character is controlled on a horizontal screen. In a side-scrolling game, the game character moves in the horizontal direction in all or most of the pictures. By content, side-scrolling games are divided into side-scrolling level-clearing games, side-scrolling adventure games, side-scrolling racing games, side-scrolling strategy games, and the like; by technology, they are divided into two-dimensional (2D) side-scrolling games and three-dimensional (3D) side-scrolling games.
Virtual environment: the virtual environment that an application displays (or provides) while running on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
Virtual prop: a virtual object that can be used in the virtual environment, including: virtual weapons that can change the attribute values of other virtual objects; supply props such as bullets; defensive props such as shields, armor, and armored vehicles; virtual props displayed on the hands when skills such as virtual beams and virtual shock waves are released; and body parts of the virtual object, such as hands and legs. Virtual props that can change the attribute values of other virtual objects include long-range virtual props such as pistols, rifles, and sniper rifles; short-range virtual props such as daggers, knives, swords, and ropes; and throwing virtual props such as hatchets, darts, grenades, flash bombs, and smoke bombs.
In this application, a virtual prop refers to a prop that can display a character name, indicating the identity of the character. The virtual prop includes at least one of a virtual nameplate, a virtual bracelet, a virtual brooch, and a virtual belt, which is not limited in this application.
Virtual character: a movable object in the virtual environment. The movable object may be a virtual person, a virtual animal, a cartoon character, and the like, such as characters and animals displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional stereoscopic model created based on a skeletal animation technique. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a portion of the space in that environment.
Normal map: a special texture that can be applied to the surface of a three-dimensional model. The normal map gives each pixel of a two-dimensional image a height value and thus carries detailed surface information. Unlike conventional textures, which can only be applied to two-dimensional surfaces, the normal map serves as an extension of bump textures and can create many special stereoscopic visual effects on a three-dimensional model.
FIG. 1 illustrates a block diagram of a computer system provided in an exemplary embodiment of the present application. The computer system 100 includes: a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 installs and runs an application supporting a virtual environment. The application may be any one of a three-dimensional map program, a side-scrolling shooter, a side-scrolling adventure game, a side-scrolling level-clearing game, a side-scrolling strategy game, a Virtual Reality (VR) application, and an Augmented Reality (AR) program. The first terminal 120 is a terminal used by a first user, who uses the first terminal 120 to control a first virtual character located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, walking, running, jumping, riding, driving, aiming, picking up, using throwing props, and attacking other virtual characters. Illustratively, the first virtual character is a first virtual person, such as a simulated character object or a cartoon character object. Illustratively, the first user controls the first virtual character to perform activities through UI controls on the virtual environment picture.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
Server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. The server 140 includes a processor 144 and a memory 142, where the memory 142 includes a receiving module 1421, a control module 1422, and a sending module 1423, and the receiving module 1421 is configured to receive a request sent by a client, such as a team request; the control module 1422 is used for controlling the rendering of the virtual environment picture; the sending module 1423 is configured to send a response to the client, for example, sending a prompt to the client that the team formation is successful. The server 140 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 140 takes on primary computing work, and the first terminal 120 and the second terminal 160 take on secondary computing work; alternatively, the server 140 performs a secondary computing job, and the first terminal 120 and the second terminal 160 perform a primary computing job; alternatively, the server 140, the first terminal 120 and the second terminal 160 perform cooperative computing by using a distributed computing architecture.
The second terminal 160 installs and runs an application supporting a virtual environment. The application may be any one of a three-dimensional map program, a side-scrolling shooter, a side-scrolling adventure game, a side-scrolling level-clearing game, a side-scrolling strategy game, a virtual reality application, and an augmented reality program. The second terminal 160 is a terminal used by a second user, who uses the second terminal 160 to control a second virtual character located in the virtual environment to perform activities including, but not limited to: adjusting at least one of body posture, walking, running, jumping, riding, driving, aiming, picking up, using throwing props, and attacking other virtual characters. Illustratively, the second virtual character is a second virtual person, such as a simulated character object or a cartoon character object.
Optionally, the first virtual character object and the second virtual character object are in the same virtual environment. Alternatively, the first avatar object and the second avatar object may belong to the same team, the same organization, the same camp, have a friend relationship, or have temporary communication rights. Alternatively, the first avatar object and the second avatar object may belong to different camps, different teams, different organizations, or have hostile relationships.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS). The first terminal 120 may broadly refer to one of a plurality of terminals, and the second terminal 160 may broadly refer to one of a plurality of terminals; this embodiment is illustrated with only the first terminal 120 and the second terminal 160. The device types of the first terminal 120 and the second terminal 160 are the same or different and include at least one of a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer. The following embodiments are illustrated with the terminal being a smart phone.
Those skilled in the art will recognize that the number of terminals may be greater or lesser. Such as the above-mentioned terminals may be only one, or the above-mentioned terminals may be several tens or hundreds, or more. The number of terminals and the device type are not limited in the embodiment of the present application.
Fig. 2 is a flowchart of a character name display method of a virtual character according to an exemplary embodiment of the present application. For example, the method is applied to the terminal shown in fig. 1, and as shown in fig. 2, the method includes:
step 210: displaying a character name creation interface of the virtual character, wherein the character name creation interface is displayed with a name input control;
the character name creation interface is a user interface in which a user creates a name for a virtual character controlled by the user.
Part (a) of fig. 3 shows a character name creation interface. The character name creation interface 300 includes a background area 310 and a name input control 320. The background area 310 is the background of the character name creation interface 300 and includes at least one of a pattern, text, and a totem. Illustratively, the background area 310 displays the text "why we fight: for the dead, for the living, and for a peaceful future". More specifically, the text may be in artistic fonts including at least one of three-dimensional characters, projection characters, metal characters, wood-grain characters, crystal characters, flame characters, background-image relief characters, streamer characters, and mouse-drawn characters.
The name input control 320 is a control by which the user creates a name for the virtual character. Illustratively, the name input control 320 displays "Name: please input a name".
In one embodiment, in response to the user touching the name input control 320, the terminal receives the character name entered by the user using an input tool. Illustratively, in response to the user touching the name input control 320, the terminal displays a virtual keyboard on the character name creation interface 300, the virtual keyboard is used by the user to input a name, and the terminal receives the name input using the virtual keyboard. Illustratively, in response to the user touching the name input control 320, the terminal receives a name entered by the user using an input device including at least one of a keyboard and a microphone.
In one embodiment, the character name creation interface 300 also displays a random name input control, by which the terminal randomly selects a character name for the user. In response to the user touching the random name input control, the terminal receives the user's instruction to create a character name and displays a random character name on the character name creation interface 300.
Optionally, the touch operation is at least one of a click, a double click, a pressure touch, a hover touch, and a slide performed on the name input control 320.
Step 220: responding to the input operation on the name input control, and displaying the character name in the name input control;
the character names in the name input control are obtained by the input operation of the user on the name input control. Part (b) of fig. 3 shows a character name creation interface. The character name creation interface 300 shown in part (b) of fig. 3 includes a background area 310, a name input control 320, and a character name 330.
The display characteristics of the character name 330 are called presentation characteristics, which include at least one of character type, character style, size, color, and weight. In one embodiment, the character name 330 is text in regular script, size four, black, and bold; in one embodiment, the character name 330 is in an artistic font, size four, red, and bold; in one embodiment, the character name 330 is a character string comprising text, symbols, and numbers, where the text is in an artistic font, size four, red, and bold.
Illustratively, the artistic fonts include at least one of three-dimensional characters, projection characters, metal characters, wood-grain characters, crystal characters, flame characters, background-image relief characters, and streamer characters. The symbols include at least one of an exclamation mark, a question mark, a period, a comma, and a pause mark.
In some embodiments, when a character name created by a user for a virtual character exceeds a character length preset by the system, the character name cannot be created. In some embodiments, a role name created by a user for a virtual role cannot be created when that role name has been used by other users. In some embodiments, a character name created for a virtual character by a user cannot be created when the character name uses a character other than the character set preset by the system.
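As a concrete illustration of these checks, the following C# sketch shows one possible client-side validation. The 16-character limit, the allowed character set, and the way names in use are looked up are illustrative assumptions, not values specified in this application.

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

public static class NameValidator
{
    const int MaxLength = 16; // hypothetical system-preset character length

    // Hypothetical preset character set: CJK characters, letters, digits.
    static readonly Regex AllowedChars = new Regex(@"^[\u4e00-\u9fa5A-Za-z0-9]+$");

    public static bool CanCreate(string name, ISet<string> namesInUse)
    {
        if (string.IsNullOrEmpty(name) || name.Length > MaxLength)
            return false;                  // exceeds the preset character length
        if (!AllowedChars.IsMatch(name))
            return false;                  // uses characters outside the preset set
        return !namesInUse.Contains(name); // rejected if already used by another user
    }
}
```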
Step 230: a character name is displayed on a virtual prop used by a virtual character located in the virtual environment.
In response to the user inputting the character name, the terminal displays the character name on the virtual prop used by the virtual character in the virtual environment. Fig. 4 illustrates a virtual prop displaying a character name. The interface 400 includes a virtual character 410, a virtual prop 420, a character name 430, and a prompt area 440.
The virtual character 410 is an active object in the virtual environment, and in this application, the virtual character 410 is a character controlled by a user in the virtual environment. By controlling a series of actions of running, jumping, attacking, withdrawing, etc. of the virtual character, the user can interact with other users in the virtual environment. Optionally, the avatar 410 is a three-dimensional avatar.
Virtual prop 420 refers to a prop in a virtual environment. In one embodiment, virtual prop 420 is a prop in a three-dimensional virtual environment. In this application, virtual prop 420 refers to a prop that can display the name of virtual character 410, indicating the identity of virtual character 410. In some embodiments, the virtual prop comprises at least one of a virtual nameplate, a virtual bracelet, a virtual brooch, a virtual harness, which is not limited in this application. Optionally, the virtual prop is a virtual alloy army plate. Optionally, the virtual prop is a three-dimensional virtual prop.
The character name 430 refers to the character name created by the user for the virtual character. The presentation features of the character name 430 are described in detail in step 220 above and are not repeated here.
In one embodiment, character name 430 displays a concave-convex effect when the virtual environment is a three-dimensional virtual environment, the virtual character is a three-dimensional virtual character, and the virtual prop is a three-dimensional virtual prop. That is, the character name having the concave-convex effect is displayed on the virtual prop used by the virtual character located in the three-dimensional virtual environment. The concave-convex effect is a stereoscopic effect exhibited by character name 430 as compared to virtual prop 420.
The prompt area 440 is where the system prompts the user to confirm the character name of the virtual character. In one embodiment, the prompt area 440 displays "Hello, Tamma. Soldier, your code is 00972".
In summary, in the method provided by this embodiment, the name input control is displayed on the character name creation interface; the user then inputs and confirms the character name of the virtual character through the name input control; finally, the character name is displayed on the virtual prop used by the virtual character in the virtual environment. This embodiment provides an interaction in which, after the character name is input on the character name creation interface, the character name is mapped onto and displayed on a virtual prop in the virtual environment, achieving an immersive display effect in which the character name passes from the two-dimensional name creation interface into the virtual world.
Fig. 5 is a flowchart of a method for displaying a character name of a virtual character according to an exemplary embodiment of the present application. The virtual environment is a three-dimensional virtual environment, the virtual character is a three-dimensional virtual character, and the virtual prop is a three-dimensional virtual prop. For example, the method is applied to the terminal shown in fig. 1, and as shown in fig. 5, the method includes:
Step 510: displaying a role name creation interface of the virtual role;
The character name creation interface of the virtual character is in a three-dimensional virtual environment. In one embodiment, the character name creation interface 300 displays animated special effects; that is, a dynamic background area and a dynamic name input control are presented on the character name creation interface.
Step 520: responding to the input operation on the name input control, and displaying the character name in the name input control;
the name input control is a three-dimensional text input box located in the virtual environment.
The three-dimensional text input box refers to a text input box in a three-dimensional virtual environment, and in one embodiment, the three-dimensional text input box exhibits a dynamic stereoscopic effect.
In one embodiment, in response to an input operation on the name input control, the character name creation interface displays the character name and simultaneously displays a confirm name control. In response to the user touching the confirm name control, the terminal receives the user's confirmation of the character name and proceeds to the next step of creating the character name.
Step 530: acquiring a two-dimensional texture image of a role name;
based on the indication of the user confirmation character name, the terminal acquires a two-dimensional texture image of the character name of the virtual character. The two-dimensional texture image is a two-dimensional image including a character name, and contains character name information, and displays the appearance characteristics of the character name. The two-dimensional texture image is a two-dimensional image composed of pixel points, each having a pixel value.
In one embodiment, a camera model located in the virtual environment is used for photographing the character name in the three-dimensional character input box, so that a two-dimensional texture image of the character name is obtained. Wherein the two-dimensional texture image is an image having a single color channel or an image having a plurality of color channels. Fig. 6 shows a terminal acquiring a two-dimensional texture image. Included in fig. 6 are a three-dimensional text input box 610, a camera model 620, and a two-dimensional texture image 630.
The terminal creates a camera model 620 at a designated location in the three-dimensional virtual environment, the camera model 620 being dedicated to capturing character name images of the three-dimensional text input box 610. The camera model 620 captures and intercepts the character name image of the three-dimensional text input box 610, resulting in a two-dimensional texture image. Wherein the two-dimensional texture image is an image having a single color channel or an image having a plurality of color channels. Illustratively, the two-dimensional texture image 630 is a two-dimensional texture image in the R8 format. Where the R8 format refers to an image with only R (red) channel pixel values.
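As a minimal sketch of this capture step, the following Unity C# snippet (Unity is assumed here, since the application's own code in fig. 7 is C#) renders the layer containing the three-dimensional text input box into a single-channel R8 render texture; the field names are illustrative.

```csharp
using UnityEngine;

public class NameCapture : MonoBehaviour
{
    public Camera captureCamera; // camera model aimed at the 3D text input box
    public int nameBoxLayer;     // layer that contains only the text input box

    public RenderTexture CaptureNameImage(int width, int height)
    {
        // Single color channel, matching the R8 format described above.
        var rt = new RenderTexture(width, height, 0, RenderTextureFormat.R8);
        captureCamera.cullingMask = 1 << nameBoxLayer; // photograph only the name
        captureCamera.targetTexture = rt;
        captureCamera.Render();                        // capture one frame
        captureCamera.targetTexture = null;
        return rt;                                     // the 2D texture image
    }
}
```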
Step 540: converting the two-dimensional texture image into a normal map based on pixel values of pixel points in the two-dimensional texture image;
A normal map is a special texture that can be applied to the surface of a three-dimensional model. The normal map comprises the orientations of the pixels in the two-dimensional texture image; that is, it comprises the normals of all the pixels of the two-dimensional texture image, and the normals of all the pixels together form the normal map. The normal map carries detailed surface information. Unlike conventional textures, which can only be applied to two-dimensional surfaces, the normal map serves as an extension of bump textures and can create many special stereoscopic visual effects on a three-dimensional model.
The terminal photographs the character name in the three-dimensional text input box through a camera model in the virtual environment, obtaining a two-dimensional texture image of the character name.
In one embodiment, the normal map is obtained by converting the pixel values of the pixels in the two-dimensional texture image into the normals of those pixels through a height-to-normal formula.
The height-to-normal formula is used to convert the height map of the two-dimensional texture image into the normal map. The height map is an image made based on differences in pixel values between different color channels. The normal map is an image made based on the normals of the pixels.
FIG. 7 is a computer code schematic diagram of converting a two-dimensional texture image into a normal map in accordance with an exemplary embodiment of the present application. FIG. 8 is a computer code schematic diagram of converting a two-dimensional texture image into a normal map in accordance with another exemplary embodiment of the present application.
Fig. 7 shows the C# key code; the corresponding explanation of the relevant code is shown in the figure and described below.
The map corresponds to the two-dimensional texture image 630 shown in fig. 6. The render target RenderTarget refers to the three-dimensional model of the virtual prop. NormalMap refers to the normal map, BumpMap refers to the height map, NormalMapTexture refers to the normal map texture, BumpMapTexture refers to the height map texture, GlobalTexture refers to the original map texture, BlitMaterial refers to the normal-map conversion material, and Blit is the function that applies the material to the map. Converting the height map into the normal map requires setting the normal-map material to perform the map conversion, which finally yields the normal map.
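The conversion path this code describes can be sketched as follows, assuming Unity's Graphics.Blit: the height map is run through a conversion material whose shader implements the height-to-normal formula of fig. 8. The names below are illustrative, not the patent's exact identifiers.

```csharp
using UnityEngine;

public static class NormalMapBaker
{
    public static RenderTexture BakeNormalMap(Texture bumpMap, Material blitMaterial)
    {
        var normalMap = new RenderTexture(bumpMap.width, bumpMap.height, 0,
                                          RenderTextureFormat.ARGB32);
        // Blit runs blitMaterial's shader over every pixel of bumpMap
        // and writes the converted result into normalMap.
        Graphics.Blit(bumpMap, normalMap, blitMaterial);
        return normalMap;
    }
}
```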
Fig. 8 shows the code for converting the height map into the normal map. The corresponding explanation of the relevant code is given in fig. 8 and described below.
l, r, u, and d refer to the offset positions of the four pixels to the left of, to the right of, above, and below the current pixel, respectively. h_l, h_r, h_u, and h_d refer to the pixel values of the four pixels to the left of, to the right of, above, and below the current pixel, respectively. dh_dx refers to the height difference between the pixel values of the left and right pixels of the current pixel, and dh_dy refers to the height difference between the pixel values of the upper and lower pixels of the current pixel. normal refers to the normal vector.
In one embodiment, for a pixel in the two-dimensional texture image, the upper, lower, left, and right pixels relative to that pixel are determined; the difference between the lower pixel and the upper pixel is calculated to obtain a first height difference; the difference between the right pixel and the left pixel is calculated to obtain a second height difference; and the first height difference and the second height difference are converted into the normal of the pixel, giving the normal map. The first height difference and the second height difference are combined with a fixed constant of 0.1 to obtain the normal, which comprises a normal direction and a normal magnitude.
By way of example only, and not by way of limitation:

dh_dx = h_r - h_l, dh_dy = h_d - h_u.

The vector of pixel point A in the left-right direction is T_x = (dx, 0, dh_dx), and the vector in the up-down direction is T_y = (0, dy, dh_dy). The normal of pixel point A is then the cross product N = T_x × T_y = (-dy·dh_dx, -dx·dh_dy, dx·dy). dx and dy are constants; taking the constant as 0.1 gives N = (-0.1·dh_dx, -0.1·dh_dy, 0.01). Because the sign of the z-direction component of N affects the front and back sides of the normal map, the z-direction component is multiplied by -1 separately, giving N = (-0.1·dh_dx, -0.1·dh_dy, -0.01).

The normal map is composed of the normals of all pixel points of the height map. N is normalized and format-processed to eliminate magnitude differences between the normals of different pixel points.
The normal map includes the orientations of the pixels, and the orientation of a certain pixel is calculated based on the height values of surrounding pixels (up, down, left, and right). The height value comprises a pixel value under an R channel, a pixel value under a G channel and a pixel value under a B channel.
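A CPU-side C# sketch of this per-pixel computation is given below for clarity; the patent's own version in fig. 8 runs as shader code, and the single-channel array layout here is an assumption.

```csharp
using UnityEngine;

public static class HeightToNormal
{
    // h: single-channel height map as a row-major float array of size w * height.
    public static Vector3 NormalAt(float[] h, int w, int height, int x, int y)
    {
        // Clamp at the borders so edge pixels reuse their own row or column.
        float hl = h[y * w + Mathf.Max(x - 1, 0)];          // left pixel value
        float hr = h[y * w + Mathf.Min(x + 1, w - 1)];      // right pixel value
        float hu = h[Mathf.Max(y - 1, 0) * w + x];          // upper pixel value
        float hd = h[Mathf.Min(y + 1, height - 1) * w + x]; // lower pixel value

        float dhDx = hr - hl; // second height difference (right minus left)
        float dhDy = hd - hu; // first height difference (lower minus upper)

        // Cross product of the two tangent vectors with dx = dy = 0.1,
        // with the z component multiplied by -1 as described above.
        var n = new Vector3(-0.1f * dhDx, -0.1f * dhDy, -0.01f);
        return n.normalized; // normalization step
    }
}
```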
Step 550: in a three-dimensional virtual environment, a normal map is attached to a three-dimensional model of a virtual prop used by a virtual character, and a character name having a concave-convex effect is displayed.
The three-dimensional model of the virtual prop is a representation of the virtual prop at the terminal.
FIG. 9 is a computer code schematic diagram of a normal map applied to a three-dimensional model of a virtual prop according to an exemplary embodiment of the present application.
Fig. 9 shows the key code for attaching the normal map to the three-dimensional model of the virtual prop. The corresponding explanation of the relevant code is presented in fig. 9 and described below.
thisRenderer refers to the renderer of the three-dimensional model of the virtual prop, GetPropertyBlock obtains the material properties of the model, and SetTexture sets a texture into the material properties; the concave-convex strength is set in the same way.
The terminal first obtains the material properties of the three-dimensional model of the virtual prop, then sets the normal map texture into the material properties, and then sets the concave-convex strength into the material properties. With these operations, the material property setting is complete.
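A minimal sketch of these three steps using Unity's MaterialPropertyBlock is shown below; the property names "_BumpMap" and "_BumpScale" follow Unity's standard-shader conventions and are assumptions here, not names taken from the patent.

```csharp
using UnityEngine;

public static class NameplateMaterial
{
    public static void ApplyNormalMap(Renderer thisRenderer,
                                      Texture normalMap, float bumpStrength)
    {
        var block = new MaterialPropertyBlock();
        thisRenderer.GetPropertyBlock(block);       // obtain the material properties
        block.SetTexture("_BumpMap", normalMap);    // set the normal map texture
        block.SetFloat("_BumpScale", bumpStrength); // set the concave-convex strength
        thisRenderer.SetPropertyBlock(block);       // write the properties back
    }
}
```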
In response to the operation of confirming the character name by the user, in the three-dimensional virtual environment, the terminal pastes the normal map to the three-dimensional model of the virtual prop used by the virtual character, and then the terminal sets the light effect for the virtual prop, and the virtual prop displays the character name with the concave-convex effect.
The terminal sets a light effect for the virtual prop; that is, the system sets a light vector and performs a vector operation between the light vector and the normal map on the three-dimensional model of the virtual prop, thereby obtaining the pixel values of the pixels on the three-dimensional model of the virtual prop.
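This vector operation can be sketched as a simple Lambert term, where the brightness of a pixel is the dot product of the light vector and the per-pixel normal read from the normal map; this is the textbook form, offered as an assumption rather than the patent's exact shading.

```csharp
using UnityEngine;

public static class BumpShading
{
    // normal: per-pixel normal from the normal map; lightDir: direction the light travels.
    public static float PixelBrightness(Vector3 normal, Vector3 lightDir)
    {
        // Clamp to zero so pixels facing away from the light stay dark.
        return Mathf.Max(0f, Vector3.Dot(normal.normalized, -lightDir.normalized));
    }
}
```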
Illustratively, as shown in fig. 10, the virtual prop displays the character name, and the character name "Tamma" shows a concave-convex effect.
In summary, in the method provided by this embodiment, a name input control is first displayed on the character name creation interface; the user then inputs and confirms the character name of the virtual character through the name input control; the terminal then obtains a two-dimensional texture image of the character name and converts it into a normal map based on the pixel values of the pixels in the two-dimensional texture image; finally, in the three-dimensional virtual environment, the terminal attaches the normal map to the three-dimensional model of the virtual prop used by the virtual character, displaying the character name with a concave-convex effect. This embodiment provides an interaction in which, after the character name is input on the character name creation interface, the character name is mapped onto and displayed on a virtual prop in the virtual environment, achieving an immersive display effect in which the character name passes from the two-dimensional name creation interface into the virtual world.
By converting a height map into a normal map, the method provided by this embodiment turns a two-dimensional picture into a three-dimensional concave-convex effect on the virtual prop, providing a lightweight way to generate a three-dimensional effect.
Fig. 11 is a flowchart of a character name display method of a virtual character according to an exemplary embodiment of the present application. In the present exemplary embodiment, the virtual environment includes a three-dimensional virtual environment, and the virtual prop includes a three-dimensional virtual prop. The method is applied to the terminal shown in fig. 1 for illustration, and the method comprises the following steps:
step 1101: displaying a character name creation interface of the virtual character, wherein the character name creation interface is displayed with a name input control;
The character name creation interface is an interface in which the user creates a name for the character. In this embodiment, the character name creation interface of the virtual character is in a three-dimensional virtual environment. In one embodiment, the character name creation interface displays animated special effects; that is, a dynamic background area and a dynamic name input control are displayed on the character name creation interface.
Step 1102: responding to the input operation on the name input control, and displaying the character name in the name input control;
The character names in the name input control are obtained by the input operation of the user on the name input control. In an exemplary embodiment of the present application, the name input control is a three-dimensional text input box located in the virtual environment.
In one embodiment, in response to an input operation on the name input control, the character name creation interface displays the character name and simultaneously displays a confirm name control. In response to the user touching the confirm name control, the terminal receives the user's confirmation of the character name and proceeds to the next step of creating the character name.
Step 1103: acquiring a two-dimensional texture image of a role name;
Based on the user's confirmation of the character name, the terminal acquires a two-dimensional texture image of the character name of the virtual character. The two-dimensional texture image is a two-dimensional image that includes the character name, contains the character name information, and shows the presentation features of the character name. The two-dimensional texture image is composed of pixels, each having a pixel value. The name input control is a three-dimensional text input box located in the virtual environment.
The terminal photographs the character name in the three-dimensional text input box through a camera model located in the virtual environment to obtain a two-dimensional texture image of the character name, where the two-dimensional texture image is an image having a single color channel or an image having multiple color channels.
Step 1104: converting the two-dimensional texture image into a normal map based on pixel values of pixel points in the two-dimensional texture image;
A normal map is a special texture that can be applied to the surface of a three-dimensional model. In one embodiment, the normal map is obtained by converting the pixel values of the pixels in the two-dimensional texture image into the normal directions of those pixels through a height-to-normal formula.
The height-to-normal formula is used to convert the height map of the two-dimensional texture image into the normal map. The height map is an image made based on differences in pixel values between different color channels. The normal map is an image made based on the normals of the pixels.
Step 1105: in a three-dimensional virtual environment, attaching a normal map to a three-dimensional model of a virtual prop used by a virtual character;
in response to a user operation of confirming the character name, the terminal pastes a normal map to a three-dimensional model of a virtual prop used by the virtual character in the three-dimensional virtual environment.
Step 1106: displaying an imprinting animation of gradually imprinting the character name on the virtual prop;
in response to the user's operation of inputting the character name and confirming the character name, the terminal controls the display screen to display the imprint animation. The imprint animation displays the dynamic process of imprinting the character name on the virtual prop.
In one embodiment, the imprint animation includes three pictures.
Part (a) of fig. 12 is a first pictorial diagram of the imprint animation according to an exemplary embodiment of the present application. When the terminal issues an instruction to display the imprint animation on the display screen, the character name and the background area are displayed on the display screen. The character name shows the character name information of the virtual character input by the user. In one embodiment, the displayed character name has the same presentation features as the character name input by the user, i.e., the same character type, character style, size, color, and weight. In one embodiment, the displayed character name differs from the character name input by the user in at least one of character type, character style, size, color, and weight. Illustratively, the character name is displayed in an oblique artistic font, while the character name of the virtual character input by the user is displayed in upright regular script.
The background area includes at least one of a pattern, text, and a figure. Illustratively, the background area displays the text "why we fight: for the dead, for the living, and for a peaceful future".
Part (b) of fig. 12 is a second pictorial diagram of an imprint animation of an exemplary embodiment of the present application. The second picture schematic includes a character name and a virtual prop. Based on the instruction triggered by the user to display the imprint animation on the display screen, the virtual prop is displayed on the display screen.
The virtual prop refers to a prop capable of displaying a character name and indicating the identity of the character. The virtual prop includes at least one of a virtual nameplate, a virtual bracelet, a virtual brooch, and a virtual belt, which is not limited in this application.
The terminal controls the display screen to display the virtual prop. Optionally, the terminal controls the background area to gradually darken until it reaches a set brightness threshold, and then controls the virtual prop to gradually brighten until it reaches a set brightness threshold.
Alternatively, the terminal controls the display screen to remove the background area and display the virtual prop at the same time; optionally, the background area gradually darkens to the set brightness threshold while the virtual prop simultaneously brightens to the set brightness threshold.
In one embodiment, the terminal adjusts the character name size on the display screen to match the virtual prop size. In one embodiment, the terminal adjusts the size of the virtual prop on the display screen until it matches the character name size. In one embodiment, the terminal adjusts the character name and the size of the virtual prop on the display screen simultaneously until they match.
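A minimal sketch of this brightness cross-fade, assuming each of the background area and the virtual prop is driven by a Unity CanvasGroup, is given below; the duration and the use of alpha as the brightness value are illustrative assumptions.

```csharp
using System.Collections;
using UnityEngine;

public class ImprintFade : MonoBehaviour
{
    public CanvasGroup background; // background area of the imprint animation
    public CanvasGroup prop;       // virtual prop being revealed

    // Simultaneous variant: the background darkens while the prop brightens.
    public IEnumerator CrossFade(float duration)
    {
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            float k = t / duration;
            background.alpha = 1f - k; // background gradually darkens
            prop.alpha = k;            // virtual prop gradually brightens
            yield return null;         // wait one frame
        }
        background.alpha = 0f;
        prop.alpha = 1f;
    }
}
```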
Part (c) of fig. 12 is a third pictorial diagram of an imprint animation according to an exemplary embodiment of the present application. The character name in fig. 12 (c) has been matched with the virtual prop.
Step 1107: the three-dimensional model of the virtual prop displays the role name with concave-convex effect;
the three-dimensional model of the virtual prop is a representation of the virtual prop in a three-dimensional virtual environment.
The terminal has functions of receiving data, processing and transmitting data.
In response to the user's operation of confirming the character name, in the three-dimensional virtual environment, the terminal attaches the normal map to the three-dimensional model of the virtual prop used by the virtual character, and the virtual prop displays the character name with a concave-convex effect, as shown in fig. 10, where the character name "Tamma" shows the concave-convex effect.
Step 1108: and displaying the action animation of the virtual character using the virtual prop.
By way of example, FIG. 13 shows a screen of a virtual character touching a virtual prop. In various embodiments, the manner in which virtual characters use virtual props includes, but is not limited to: wearing a virtual prop, driving the virtual prop, attacking by using the virtual prop, and making appointed actions by using the virtual prop.
In summary, in the method provided by this embodiment, a name input control is displayed on the character name creation interface; the user inputs and confirms the character name of the virtual character through the name input control; the terminal then obtains a two-dimensional texture image of the character name and converts it into a normal map based on the pixel values of the pixels in the two-dimensional texture image; an imprint animation that gradually imprints the character name on the virtual prop is displayed on the display screen; the terminal then attaches the normal map to the three-dimensional model of the virtual prop used by the virtual character in the three-dimensional virtual environment, so that the virtual prop displays the character name with a concave-convex effect; finally, the terminal controls the display screen to display an action animation of the virtual character using the virtual prop. This embodiment provides an interaction in which, after the character name is input on the character name creation interface, the character name is mapped onto and displayed on a virtual prop in the virtual environment, achieving an immersive display effect in which the character name passes from the two-dimensional name creation interface into the virtual world.
By displaying the imprint animation that gradually imprints the character name on the virtual prop, the method provided by this embodiment also conveys to the user the visual process of the character name passing from the two-dimensional name creation interface into the three-dimensional virtual world, providing a novel interaction mode.
Fig. 14 is a block diagram showing the configuration of a character name display device for a virtual character according to an exemplary embodiment of the present application, and as shown in fig. 14, the device includes:
the display module 1420 is configured to display a role name creation interface of the virtual role, where the role name creation interface displays a name input control;
an interaction module 1440, configured to respond to an input operation on the name input control, and display the role name in the name input control;
the display module 1420 is further configured to display the character name on a virtual prop used by the virtual character located in the virtual environment.
In an alternative embodiment, the display module 1420 is further configured to display the character name having a concave-convex effect on the virtual prop used by the virtual character located in the three-dimensional virtual environment.
In an alternative embodiment, the display module 1420 includes an acquisition submodule 1421, a conversion submodule 1422, and a display submodule 1423:
the acquiring submodule 1421 is configured to acquire a two-dimensional texture image of the role name;
the conversion submodule 1422 is configured to convert the two-dimensional texture image into a normal map based on pixel values of pixel points in the two-dimensional texture image;
The display submodule 1423 is further configured to paste the normal map to a three-dimensional model of the virtual prop used by the virtual character in the three-dimensional virtual environment, and display the character name with the concave-convex effect.
In an optional embodiment, the obtaining submodule 1421 is further configured to take a photograph of the character name in the three-dimensional text input box through a camera model located in the virtual environment, to obtain a two-dimensional texture image of the character name;
wherein the two-dimensional texture image is an image having a single color channel or an image having a plurality of color channels.
In an alternative embodiment, the conversion submodule 1422 is further configured to convert, through a height-to-normal formula, the pixel values of the pixels in the two-dimensional texture image into the normal directions of those pixels to obtain the normal map.
In an alternative embodiment, the conversion submodule 1422 is further configured to determine, for a pixel in the two-dimensional texture image, an upper pixel, a lower pixel, a left pixel, and a right pixel with respect to the pixel.
In an alternative embodiment, the conversion submodule 1422 is further configured to calculate a difference between the lower pixel point and the upper pixel point to obtain a first height difference; and calculating the difference between the right pixel point and the left pixel point to obtain a second height difference.
In an alternative embodiment, the conversion submodule 1422 is further configured to convert the first level difference and the second level difference into normals of the pixel points, and obtain the normals map.
In an alternative embodiment, the display submodule 1423 is further configured to display an imprint animation that gradually imprints the character name on the virtual prop, so that the character name on the virtual prop has a concave-convex effect.
In an alternative embodiment, the display submodule 1423 is further configured to display an action animation of the virtual character using the virtual prop.
It should be noted that: the role name display device for virtual roles provided in the above embodiment is only exemplified by the division of the above functional modules, and in practical application, the above functional allocation may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the functions described above. In addition, the device for displaying the name of the virtual character provided in the above embodiment belongs to the same concept as the embodiment of the method for displaying the name of the virtual character, and the detailed implementation process of the device is shown in the method embodiment, which is not repeated here.
Fig. 15 shows a block diagram of an electronic device 1500 provided in an exemplary embodiment of the present application. The electronic device 1500 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The electronic device 1500 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
Generally, the electronic device 1500 includes: a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor; the main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed by the display screen. In some embodiments, the processor 1501 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1502 is configured to store at least one instruction, which is executed by the processor 1501 to implement the character name display method of the virtual character provided by the method embodiments of the present application.
In some embodiments, the electronic device 1500 may further optionally include: a peripheral interface 1503 and at least one peripheral device. The processor 1501, memory 1502 and peripheral interface 1503 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1503 via a bus, signal lines, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, a display 1505, a camera assembly 1506, audio circuitry 1507, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, the memory 1502, and the peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1504 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals: it converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of each generation (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, it also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, disposed on the front panel of the electronic device 1500; in other embodiments, there may be at least two display screens 1505, respectively disposed on different surfaces of the electronic device 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display screen disposed on a curved or folded surface of the electronic device 1500. The display screen 1505 may even be arranged in a non-rectangular irregular pattern, i.e., an irregularly shaped screen. The display screen 1505 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, or the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, the camera assembly 1506 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1507 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment and convert them into electrical signals, which are input to the processor 1501 for processing or to the radio frequency circuit 1504 for voice communication. For stereo acquisition or noise reduction, there may be multiple microphones, disposed at different locations on the electronic device 1500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert an electrical signal not only into sound waves audible to humans but also into sound waves inaudible to humans for ranging and other purposes. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The power supply 1509 is used to power the various components in the electronic device 1500. The power supply 1509 may use alternating current, direct current, disposable batteries, or rechargeable batteries. When the power supply 1509 includes a rechargeable battery, it may be a wired rechargeable battery (charged through a wired line) or a wireless rechargeable battery (charged through a wireless coil). The rechargeable battery may also support fast-charge technology.
In some embodiments, the electronic device 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 can detect the magnitudes of acceleration on the three coordinate axes of the coordinate system established with respect to the electronic device 1500. For example, the acceleration sensor 1511 may be used to detect the components of gravitational acceleration along the three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 1511, as sketched below. The acceleration sensor 1511 may also be used to collect motion data of a game or of the user.
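As a minimal illustration of that orientation choice (an assumption-laden sketch, not the patent's implementation), the gravity component with the larger magnitude can decide the view; gx and gy are assumed to be the gravity components along the screen's horizontal and vertical axes:

    def choose_orientation(gx: float, gy: float) -> str:
        # Gravity dominates along the screen's vertical axis when the
        # device is held upright, so a larger |gy| suggests portrait view.
        return "landscape" if abs(gx) > abs(gy) else "portrait"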
The gyro sensor 1512 may detect a body direction and a rotation angle of the electronic apparatus 1500, and the gyro sensor 1512 may collect 3D actions of the user on the electronic apparatus 1500 in cooperation with the acceleration sensor 1511. The processor 1501, based on the data collected by the gyro sensor 1512, may implement the following functions: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the electronic device 1500 and/or under the display screen 1505. When disposed on the side frame, it can detect the user's grip signal on the electronic device 1500, and the processor 1501 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1513. When disposed under the display screen 1505, the processor 1501 controls the operability controls on the UI according to the user's pressure operation on the display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the display screen 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness is turned up; when it is low, the display brightness is turned down, as in the sketch below. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
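One possible realization of that brightness control, as a minimal sketch in which the linear mapping and the 1000-lux saturation point are assumptions (the text above states only the direction of the adjustment):

    def adjust_display_brightness(ambient_lux: float,
                                  min_brightness: float = 0.1,
                                  max_brightness: float = 1.0,
                                  max_lux: float = 1000.0) -> float:
        # Clamp the measured intensity into [0, max_lux], then map it
        # linearly onto the allowed brightness range.
        ratio = min(max(ambient_lux / max_lux, 0.0), 1.0)
        return min_brightness + ratio * (max_brightness - min_brightness)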
The proximity sensor 1516, also referred to as a distance sensor, is typically provided on the front panel of the electronic device 1500 and is used to collect the distance between the user and the front of the device. In one embodiment, when the proximity sensor 1516 detects that this distance is gradually decreasing, the processor 1501 controls the display screen 1505 to switch from the bright-screen state to the off-screen state; when it detects that the distance is gradually increasing, the processor 1501 controls the display screen 1505 to switch from the off-screen state back to the bright-screen state. A sketch of this toggle follows.
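A minimal sketch of that toggle, assuming successive distance readings are compared directly (the patent does not specify thresholds or units):

    def update_screen_state(prev_distance: float, distance: float,
                            screen_on: bool) -> bool:
        # Approaching face: darken the screen; moving away: light it up.
        if distance < prev_distance:
            return False  # switch to the off-screen state
        if distance > prev_distance:
            return True   # switch back to the bright-screen state
        return screen_on  # distance unchanged, keep the current state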
Those skilled in the art will appreciate that the structure shown in fig. 15 is not limiting of the electronic device 1500 and may include more or fewer components than shown, or may combine certain components, or may employ a different arrangement of components.
The present application also provides a computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the character name display method of a virtual character provided by the above method embodiments.
The present application also provides a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the character name display method of a virtual character provided by the above method embodiments.
The foregoing embodiment numbers of the present application are for description only and do not indicate that one embodiment is better or worse than another.
Those skilled in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disc.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit it; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present application shall fall within its protection scope.

Claims (10)

1. A character name display method of a virtual character, the method comprising:
displaying a character name creation interface of the virtual character, wherein a three-dimensional character input box located in a three-dimensional virtual environment is displayed on the character name creation interface;
in response to an input operation on the three-dimensional character input box, displaying a character name in the three-dimensional character input box;
photographing and capturing a screenshot of the character name in the three-dimensional character input box through a camera model in the three-dimensional virtual environment to obtain a two-dimensional texture image of the character name;
for each pixel point in the two-dimensional texture image, determining an upper pixel point, a lower pixel point, a left pixel point, and a right pixel point relative to that pixel point;
calculating the difference between the lower pixel point and the upper pixel point to obtain a first height difference;
calculating the difference between the right pixel point and the left pixel point to obtain a second height difference;
combining the first height difference and the second height difference with a fixed constant to obtain a normal of the pixel point, and performing normalization processing and format processing on the normal to obtain a normal map;
in the three-dimensional virtual environment, acquiring the material properties of the three-dimensional model of the virtual prop used by the virtual character, and setting the concave-convex strength and the texture of the normal map into the material properties of the three-dimensional model of the virtual prop, so that the normal map is attached to the three-dimensional model of the virtual prop and the virtual prop displays the character name with a concave-convex effect (see the sketch after the claims for an illustration of this per-pixel scheme).
2. The method of claim 1, wherein the two-dimensional texture image is an image having a single color channel or an image having multiple color channels.
3. The method of claim 1, wherein before acquiring, in the three-dimensional virtual environment, the material properties of the three-dimensional model of the virtual prop used by the virtual character and setting the concave-convex strength and the texture of the normal map into the material properties of the three-dimensional model of the virtual prop, the method further comprises:
displaying an imprinting animation in which the character name is gradually imprinted on the virtual prop, so that the character name is displayed with a concave-convex effect on the virtual prop.
4. A method according to any one of claims 1 to 3, wherein the method further comprises:
displaying an action animation of the virtual character using the virtual prop.
5. A character name display device of a virtual character, the device comprising:
the display module is used for displaying a character name creation interface of the virtual character, wherein a three-dimensional character input box located in a three-dimensional virtual environment is displayed on the character name creation interface; the display module comprises an acquisition sub-module, a conversion sub-module, and a display sub-module;
the interaction module is used for displaying the character name in the three-dimensional character input box in response to an input operation on the three-dimensional character input box;
the acquisition sub-module is used for photographing and capturing a screenshot of the character name in the three-dimensional character input box through a camera model in the three-dimensional virtual environment to obtain a two-dimensional texture image of the character name;
the conversion sub-module is used for determining, for each pixel point in the two-dimensional texture image, an upper pixel point, a lower pixel point, a left pixel point, and a right pixel point relative to that pixel point; calculating the difference between the lower pixel point and the upper pixel point to obtain a first height difference; calculating the difference between the right pixel point and the left pixel point to obtain a second height difference; and combining the first height difference and the second height difference with a fixed constant to obtain a normal of the pixel point, and performing normalization processing and format processing on the normal to obtain a normal map;
the display sub-module is used for acquiring, in the three-dimensional virtual environment, the material properties of the three-dimensional model of the virtual prop used by the virtual character, and setting the concave-convex strength and the texture of the normal map into the material properties of the three-dimensional model of the virtual prop, so that the normal map is attached to the three-dimensional model of the virtual prop and the virtual prop displays the character name with a concave-convex effect.
6. The apparatus of claim 5, wherein the two-dimensional texture image is an image having a single color channel or an image having multiple color channels.
7. The apparatus of claim 5, wherein the display sub-module is further configured to display an imprint animation that gradually imprints the character name on the virtual prop such that the character name has a concave-convex effect on the virtual prop.
8. The apparatus of any of claims 5-7, wherein the display sub-module is further configured to display an animation of the action of the virtual character using the virtual prop.
9. A computer device, the computer device comprising: a processor and a memory storing a computer program loaded and executed by the processor to implement the character name display method of a virtual character according to any one of claims 1 to 4.
10. A computer-readable storage medium storing a computer program loaded and executed by a processor to implement the character name display method of the virtual character according to any one of claims 1 to 4.
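The per-pixel normal-map construction recited in claim 1 (and mirrored by the conversion sub-module of claim 5) can be illustrated with a short Python/NumPy sketch. This is a hedged illustration only: the wrap-around neighbour lookup via np.roll, the sign convention on the two height differences, the value of the fixed constant, and the 8-bit RGB packing are all assumptions, since the claims specify only the two differences, their combination with a fixed constant, the normalization, and the format processing.

    import numpy as np

    def normal_map_from_height(height: np.ndarray, strength: float = 1.0) -> np.ndarray:
        h = height.astype(np.float32)
        upper = np.roll(h, 1, axis=0)   # upper pixel point
        lower = np.roll(h, -1, axis=0)  # lower pixel point
        left = np.roll(h, 1, axis=1)    # left pixel point
        right = np.roll(h, -1, axis=1)  # right pixel point

        dy = lower - upper  # first height difference
        dx = right - left   # second height difference
        dz = np.full_like(h, 2.0 / max(strength, 1e-6))  # fixed constant

        # Combine the differences with the fixed constant into a normal,
        # then normalize each per-pixel vector to unit length.
        n = np.stack([-dx, -dy, dz], axis=-1)
        n /= np.linalg.norm(n, axis=-1, keepdims=True)

        # Format processing: remap from [-1, 1] to 8-bit RGB storage.
        return ((n * 0.5 + 0.5) * 255.0).astype(np.uint8)

In use, the grayscale screenshot of the rendered character name (the two-dimensional texture image of claim 1) would be passed in as height, and the returned map would be assigned to the normal-map texture slot of the virtual prop's material, with the concave-convex strength exposed through the strength parameter.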
CN202110082965.XA 2021-01-21 2021-01-21 Method, device, equipment and medium for displaying character names of virtual characters Active CN112717391B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110082965.XA CN112717391B (en) 2021-01-21 2021-01-21 Method, device, equipment and medium for displaying character names of virtual characters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110082965.XA CN112717391B (en) 2021-01-21 2021-01-21 Method, device, equipment and medium for displaying character names of virtual characters

Publications (2)

Publication Number Publication Date
CN112717391A CN112717391A (en) 2021-04-30
CN112717391B true CN112717391B (en) 2023-04-25

Family

ID=75594745

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110082965.XA Active CN112717391B (en) 2021-01-21 2021-01-21 Method, device, equipment and medium for displaying character names of virtual characters

Country Status (1)

Country Link
CN (1) CN112717391B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114359269A (en) * 2022-03-09 2022-04-15 广东工业大学 Virtual food box defect generation method and system based on neural network

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6243172B2 (en) * 2013-09-11 2017-12-06 株式会社カプコン GAME PROGRAM AND GAME DEVICE
CN106730846A (en) * 2016-11-10 2017-05-31 北京像素软件科技股份有限公司 The data processing method and device of one attribute stage property
CN108434736B (en) * 2018-03-23 2020-07-07 腾讯科技(深圳)有限公司 Equipment display method, device, equipment and storage medium in virtual environment battle
CN110339570A (en) * 2019-07-17 2019-10-18 网易(杭州)网络有限公司 Exchange method, device, storage medium and the electronic device of information

Also Published As

Publication number Publication date
CN112717391A (en) 2021-04-30

Similar Documents

Publication Publication Date Title
CN111589142B (en) Virtual object control method, device, equipment and medium
US11980814B2 (en) Method and apparatus for controlling virtual object to mark virtual item and medium
CN111589128B (en) Operation control display method and device based on virtual scene
CN111035918B (en) Reconnaissance interface display method and device based on virtual environment and readable storage medium
CN111420402B (en) Virtual environment picture display method, device, terminal and storage medium
CN108786110B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN111462307A (en) Virtual image display method, device, equipment and storage medium of virtual object
CN112156464B (en) Two-dimensional image display method, device and equipment of virtual object and storage medium
CN111760278B (en) Skill control display method, device, equipment and medium
CN111325822B (en) Method, device and equipment for displaying hot spot diagram and readable storage medium
CN111589141B (en) Virtual environment picture display method, device, equipment and medium
CN111672106B (en) Virtual scene display method and device, computer equipment and storage medium
CN108744511B (en) Method, device and storage medium for displaying sighting telescope in virtual environment
CN112169330B (en) Method, device, equipment and medium for displaying picture of virtual environment
CN111013137B (en) Movement control method, device, equipment and storage medium in virtual scene
CN110833695B (en) Service processing method, device, equipment and storage medium based on virtual scene
CN113134232B (en) Virtual object control method, device, equipment and computer readable storage medium
CN112755517B (en) Virtual object control method, device, terminal and storage medium
CN113289336A (en) Method, apparatus, device and medium for tagging items in a virtual environment
CN112717391B (en) Method, device, equipment and medium for displaying character names of virtual characters
CN111672115A (en) Virtual object control method and device, computer equipment and storage medium
CN113680058B (en) Use method, device, equipment and storage medium for restoring life value prop
CN112604274B (en) Virtual object display method, device, terminal and storage medium
CN111921191B (en) State icon display method and device, terminal and storage medium
CN112156463B (en) Role display method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40042557

Country of ref document: HK

GR01 Patent grant