CN111228816A - Scene layout method and device in game - Google Patents

Scene layout method and device in game

Info

Publication number
CN111228816A
CN111228816A (application CN202010084994.5A)
Authority
CN
China
Prior art keywords
entity
target
target entity
game
attribute
Prior art date
Legal status
Withdrawn
Application number
CN202010084994.5A
Other languages
Chinese (zh)
Inventor
张富存
李涛
Current Assignee
Zhengzhou Apas Digital Cloud Information Technology Co Ltd
Original Assignee
Zhengzhou Apas Digital Cloud Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhengzhou Apas Digital Cloud Information Technology Co Ltd
Priority to CN202010084994.5A
Publication of CN111228816A
Legal status: Withdrawn (current)

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/50 Features of games using an electronically generated display having two or more dimensions, characterized by details of game servers
    • A63F 2300/53 Features of games using an electronically generated display having two or more dimensions, characterized by details of game servers: details of basic data processing
    • A63F 2300/538 Features of games using an electronically generated display having two or more dimensions, characterized by details of game servers: details of basic data processing for performing operations on behalf of the game client, e.g. rendering
    • A63F 2300/60 Methods for processing data by generating or executing the game program
    • A63F 2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application provides a scene layout method and device in a game. The method comprises the following steps: extracting map data, and rendering a corresponding terrain picture in a predetermined interface to obtain a scene map; calling a target entity from a predetermined entity library, and associating the target entity with the scene map; calling the event attributes corresponding to the target entity, extracting the interface of the target entity and loading the event attributes respectively to obtain the game character corresponding to the target entity; and executing, in the scene map, the action behaviors respectively corresponding to the game characters. In the character configuration process of a game scene, the map and the related characters are constructed through a unified control interface to form the game scene, and the time-sharing coordinates, the event marks in the event library and the attribute parameters corresponding to each game character are associated in the game scene, so that unified motion control is performed accurately on each target character and the configuration efficiency of game characters is improved.

Description

Scene layout method and device in game
Technical Field
The present application relates to the field of game control, and in particular, to a method and an apparatus for scene layout in a game.
Background
With the rapid development of computer technology, computers and the internet bring convenience to people's daily life, work and entertainment. Games on mobile terminals have become part of everyday life, and players care greatly about the game experience. In existing applications (Apps), the scene layout of a game is usually designed according to the experience of the planning staff.
In a game scene, a planner usually has to position game characters and design their routes in the scene by experience. In the common approach, the planner records the positions each character must pass through and the trigger logic corresponding to each position, configures them one by one in a predetermined map according to experience, and repeatedly adjusts them according to the presented effect to determine the time-sharing coordinates and directions of a continuously moving game character, so that the character moves along the intended route in the map. In this way, the layout efficiency for each game character is low and layout errors occur easily, which makes game characters difficult to position in the scene and reduces development efficiency.
Disclosure of Invention
The embodiment of the application aims to provide a method and a device for scene layout in a game, in which the control interfaces corresponding to the target characters in the game are arranged uniformly in a predetermined control interface. The area and shape of the range covered by the game scene are read in the control interface, the target area is set accordingly, and the time-sharing coordinates of the game characters in the scene, the event marks in the event library and the attribute parameters are associated respectively, so that unified motion control is performed accurately on each target character in the game scene and the configuration efficiency of game characters is improved.
In order to solve the above technical problem, the embodiment of the present application is implemented as follows:
according to a first aspect of embodiments of the present application, there is provided a method for scene layout in a game, the method including:
extracting map data, and calling a corresponding topographic picture in a set interface to render to obtain the scene map;
calling a target entity from a given entity library, and associating the target entity to the scene map;
calling event attributes corresponding to the target entities, extracting interfaces of the target entities and respectively loading the event attributes to obtain game roles corresponding to the target entities; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map.
In an embodiment of the present application, when the corresponding terrain image is called for rendering,
rasterizing the map data to obtain raster data;
mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
rendering and coloring the terrain picture according to the pixel range corresponding to each cell to obtain the scene map.
In an embodiment of the present application, when the target entity is associated into the scene map,
acquiring an entity type corresponding to the target entity;
determining the central coordinate of the target entity according to the entity type;
and with the central coordinate as a center, arranging the target entity in a grid data corresponding to the scene map in an associated manner, and acquiring target coordinates of key points corresponding to the target entity in the grid data respectively.
In an embodiment of the present application, when the target entity association is set in the raster data corresponding to the scene map,
extracting the mapping relation of the central coordinate and the key point;
respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and respectively fitting the associated coordinates to nodes of the raster data, and taking the nodes as target coordinates.
In an embodiment of the present application, when the interface for extracting the target entity loads the event attribute respectively,
calling a list of corresponding event attributes according to the entity type corresponding to the target entity;
respectively extracting interfaces corresponding to the target entities according to the list, and respectively adapting the parameter values corresponding to the event attributes to the interfaces corresponding to the target entities;
rendering the target entity according to the adapted parameter values, and coloring and adapting the target entity to obtain the game role corresponding to the target entity.
In one embodiment of the present application, the event attributes are respectively stored in a structured manner with target entities in the entity library,
storing each target entity in the entity library in a structured form, wherein each target entity is respectively associated with at least one event attribute;
the event attribute is adapted to the target entity according to a corresponding interface, and the event attribute is used as the entity attribute of the target entity;
and rendering the target entity according to the attribute value corresponding to the entity attribute, and adjusting the style characteristic corresponding to the target entity to form the game role corresponding to the target entity.
In an embodiment of the present application, the method further includes:
the established interface is an integrated interface, and the target entity and the calling interface of the topographic picture are uniformly arranged on the integrated interface;
and calling the event attribute through the integrated interface to display the game role.
According to a second aspect of embodiments of the present application, there is provided an apparatus for scene layout in a game, the apparatus comprising:
the rendering module is used for extracting map data and calling a corresponding topographic picture in a set interface to render to obtain the scene map;
the association module is used for calling a target entity from a given entity library and associating the target entity to the scene map;
the loading module is used for calling the event attribute corresponding to the target entity, extracting the interface of the target entity and respectively loading the event attribute to obtain the game role corresponding to the target entity; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and the output module is used for respectively calling corresponding target events according to user trigger and executing respective corresponding action behaviors of the game role in the scene map.
In one embodiment of the present application, the rendering module includes,
the grid unit is used for rasterizing the map data to obtain grid data;
the mapping unit is used for mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
and the dividing unit is used for dividing the topographic picture according to the pixel range corresponding to each cell, rendering according to the divided pixels and then coloring to obtain the scene map.
In an embodiment of the present application, the association module includes,
a type unit, configured to obtain an entity type corresponding to the target entity;
the center determining unit is used for determining the center coordinates of the target entity according to the entity type;
and the setting unit is used for setting the target entity in a grid data corresponding to the scene map in an associated manner by taking the central coordinate as a center, and acquiring target coordinates of key points corresponding to the target entity in the grid data respectively.
In an embodiment of the present application, in the setting unit,
the extraction subunit is used for extracting the mapping relation between the central coordinate and the corresponding key point;
the coordinate subunit is used for respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and the fitting subunit is used for respectively fitting the associated coordinates into the nodes of the raster data, and taking the nodes as target coordinates.
In an embodiment of the present application, the loading module includes,
the list unit is used for calling a list corresponding to the event attribute according to the entity type corresponding to the target entity;
the adaptation unit is used for respectively extracting the interfaces corresponding to the target entities according to the list and respectively adapting the parameter values corresponding to the event attributes with the interfaces corresponding to the target entities;
and the role unit is used for rendering the target entity according to the adapted parameter values, coloring and adapting the target entity to obtain the game role corresponding to the target entity.
In an embodiment of the present application, the list unit includes,
the storage subunit is configured to store each target entity in the entity library in a structured form, where each target entity is associated with at least one event attribute;
the interface adapter unit is used for adapting the event attribute to the target entity according to a corresponding interface, and taking the event attribute as the entity attribute of the target entity;
and the adjusting subunit is used for rendering the target entity according to the attribute value corresponding to the entity attribute, adjusting the style characteristic corresponding to the target entity, and forming the game role corresponding to the target entity.
In an embodiment of the present application, the apparatus further includes:
the integrated module is used for uniformly setting the target entity and the calling interface of the topographic picture on the integrated interface;
and the display module is used for calling the event attribute through the integrated interface and displaying the game role.
According to the technical scheme provided by the embodiment of the application, the embodiment of the application extracts map data and calls the corresponding topographic picture to render in the established interface to obtain the scene map; calling a target entity from a given entity library, and associating the target entity to the scene map; calling event attributes corresponding to the target entities, extracting interfaces of the target entities and respectively loading the event attributes to obtain game roles corresponding to the target entities; and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map. In the role configuration process of the game scene, the map and the related roles are constructed by adopting the unified control interface to form the game scene, and the time-sharing coordinates corresponding to the game roles, the event marks in the event library and the attribute parameters are respectively associated in the game scene, so that the unified motion control is accurately performed on each target role, and the configuration efficiency of the game roles is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the following drawings show only some of the embodiments described in the present specification, and those skilled in the art can obtain other drawings from them without any creative effort.
FIG. 1 is a flow diagram of a scene layout method in a game according to an embodiment of the present application;
FIGS. 2a and 2b are schematic diagrams illustrating examples of a scene layout method in a game according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device corresponding to a scene layout device in a game according to yet another embodiment of the present application;
fig. 4 is a schematic structural diagram of a scene layout device in a game according to an embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present specification without any creative effort shall fall within the protection scope of the present specification.
The embodiment of the application provides a scene layout method and device in a game.
First, a scene layout method in a game provided by the embodiment of the present application is described below.
In this embodiment, the scene layout method in a game is described through different examples. When a scene is designed, game characters are usually positioned in the game scene one by one according to the experience of a game planner, and their time-sharing coordinates, motion tracks, activity conditions and the like are controlled one by one so that the characters keep moving in the scene. Because there is no unified control entry, the scene environment and the motion tracks of the game characters cannot be quantified, so the various attribute parameters have to be adjusted and reconfigured repeatedly to make sure the characters move in the scene as the planner intends, which greatly reduces development efficiency. In the embodiment of the application, during character configuration of a game scene, the map is constructed through a unified control interface and the related game characters are built into the map to form the game scene; the game characters are then given a unified parameterized configuration in the same interface, and the time-sharing coordinates, the event marks in the event library and the attribute parameters corresponding to each character are allocated and configured, so that unified motion control is performed accurately on the target characters and the configuration efficiency of game characters is improved.
Fig. 1 is a flowchart of a scene layout method in a game according to an embodiment of the present application, and as shown in fig. 1, the method may include the following steps:
In step 101, map data are extracted, and the corresponding terrain picture is called in a predetermined interface for rendering to obtain a scene map;
In this embodiment, the parameter interfaces corresponding to the target entities in each game scene are integrated into one control interface for input. After the control interface obtains the target parameters entered by the game developer, the parameters are passed, according to the control logic, to the corresponding map data and the called target entities; a background rendering engine then renders them into the scene map and the corresponding game characters, and each target entity is displayed in the control interface, so that the game planner can visually check the action behaviors of each target entity in the game scene.
A terrain picture is called in the predetermined interface to construct the scene map: the map area entered by the developer is obtained from the interface, and rendering is then performed according to the map area and the terrain picture. The procedure specifically comprises the following steps:
A. rasterizing the map data to obtain raster data;
when map data are preprocessed, various types of map data are rasterized in a coordinate mode to form corresponding coordinate data.
B. Mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
The given terrain picture is read, and each pixel in the picture is associated and mapped with a coordinate in the raster data, so that every element of the terrain picture can be marked by a coordinate point in the raster data, which makes it convenient for each game character in the game scene to move within the terrain picture.
C. Rendering and coloring the terrain picture according to the pixel range corresponding to each cell to obtain the scene map.
The pixel range corresponding to each cell is extracted, the terrain picture is rendered according to that pixel range, and each pixel of the terrain picture is attached to its corresponding coordinate point, so that a complete scene map is formed and displayed. When an instruction for adding a target entity is subsequently received from the game developer, the target entity is displayed at the corresponding position in the scene map according to its time-sharing coordinates.
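As a minimal illustration of this cell-to-pixel mapping (a sketch only; the class and method names, and the assumption of square cells with a fixed pixel size, are not taken from the application):

```csharp
using UnityEngine;

// Illustrative helper: maps each raster cell to the pixel rectangle it covers
// in the terrain picture, so map elements can be addressed by cell coordinates.
public static class TerrainGrid
{
    // Pixel rectangle covered by cell (cx, cy), assuming square cells of cellSize pixels.
    public static RectInt CellPixelRange(int cx, int cy, int cellSize)
    {
        return new RectInt(cx * cellSize, cy * cellSize, cellSize, cellSize);
    }

    // Reads the pixels of one cell from the terrain picture for rendering and colouring.
    public static Color[] CellPixels(Texture2D terrain, int cx, int cy, int cellSize)
    {
        RectInt r = CellPixelRange(cx, cy, cellSize);
        return terrain.GetPixels(r.x, r.y, r.width, r.height);
    }
}
```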
Step 102: calling a target entity from a given entity library, and associating the target entity to the scene map;
After the scene map is constructed, the target entity to be added is placed into the scene map, so that it can perform movement and other related behaviors in the game scene.
1) Acquiring an entity type corresponding to the target entity;
When a target entity is extracted from the entity library, the entity type corresponding to the target entity is obtained. The entity library stores the various types of target entities in a structured manner; the entity attributes corresponding to each type of target entity are developed by a third-party platform, packaged as a plug-in, and stored in the entity library in association with that type.
The entity attributes of each type of target entity include general attributes and characteristic attributes. The general attributes cover conventional characteristics such as length, width, height, type name, coordinates, transparency and zoom factor; the characteristic attributes cover head length, body length, clothing, hair accessories, weapons, vital signs, force and the like corresponding to the entity type. After a target entity is obtained, it is displayed in the interface described in step 101 according to the initial values of its entity attributes; meanwhile, the entity attributes corresponding to the entity type are read, and the attribute-value modification interface for those attributes is extracted and loaded in the interface. When the game developer modifies an attribute, the target entity is rendered and displayed on the target interface after the modified attribute value is received.
2) Determining the central coordinate of the target entity according to the entity type;
The length and width values of the entity attributes corresponding to each entity type are obtained, the central position of the target entity is determined from these values and the current coordinate, and this central position is taken as the central coordinate of the target entity. The central coordinate serves as the mark point of the target entity: the corresponding target entity is controlled in the scene map through its central coordinate, its motion in the game scene is controlled accordingly, and the developer can lay out the scene map in the predetermined interface to build a game scene composed of different game characters.
3) And with the central coordinate as a center, arranging the target entity in a grid data corresponding to the scene map in an associated manner, and acquiring target coordinates of key points corresponding to the target entity in the grid data respectively.
The central coordinate of each target entity is associated with the raster data of the terrain picture, cell by cell, so that at any moment while the target entity moves in the scene map its time-sharing coordinate is associated with a cell and the target entity is positioned accurately in the game scene.
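A minimal sketch of how a centre coordinate might be associated with a raster cell follows; the names, and the assumption that map units map linearly onto cells of an N x N grid, are illustrative rather than details from the application:

```csharp
using UnityEngine;

// Illustrative sketch: associate a target entity's centre coordinate with the
// raster cell it falls in, so its time-varying position can be tracked per cell.
public struct GridIndex { public int X; public int Y; }

public static class EntityPlacement
{
    // Map a centre coordinate (in map units) to the enclosing cell of a gridSize x gridSize grid.
    public static GridIndex CellOf(Vector2 centre, float cellSize, int gridSize)
    {
        int x = Mathf.Clamp(Mathf.FloorToInt(centre.x / cellSize), 0, gridSize - 1);
        int y = Mathf.Clamp(Mathf.FloorToInt(centre.y / cellSize), 0, gridSize - 1);
        return new GridIndex { X = x, Y = y };
    }
}
```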
In a game scene, in order to further control a target entity, it is necessary to determine the position of a key point corresponding to the target entity, such as the head edge of the target entity and the coordinates of the end portions of four limbs, so as to perform collision detection with other entities in the game scene, thereby achieving various effects such as attack, collision and the like corresponding to the target entity in the game scene.
When key points corresponding to the target entity are acquired and respectively correspond to target coordinates in the raster data:
3.1 extracting the mapping relation of the central coordinate and the key point;
and extracting entity attributes corresponding to the target entity, and acquiring mapping relations between the central coordinates and the key points according to the length, the width and the central coordinates of the entity attributes.
3.2 respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and respectively calculating the associated coordinates of the key points corresponding to the target entities in the game scene according to the mapping relation, thereby facilitating the collision detection of the target entities in the game scene.
3.3 fitting the associated coordinates to the nodes of the raster data respectively, and taking the nodes as target coordinates.
When an instruction is subsequently received from the game developer for the target entity to respond to an event, the corresponding positions of the key points can be displayed in the scene map according to their time-sharing coordinates, so that the action behaviors of the game character formed by the target entity are rendered accurately in the game scene.
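The key-point handling of steps 3.1 to 3.3 could look roughly like the sketch below, assuming the offsets of the key points are derived from the entity's length and width attributes and that "fitting" means snapping each associated coordinate to the nearest grid node; all names are illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: derive key-point coordinates (head edge, limb ends, ...)
// from the entity centre via fixed offsets, then fit each one to the nearest
// raster node so collision checks can run on grid coordinates.
public static class KeyPointFitting
{
    // offsets: centre-to-key-point vectors taken from the entity attributes
    // (length/width); the caller supplies them, they are not defined here.
    public static Vector2Int[] FitKeyPoints(Vector2 centre, Vector2[] offsets, float cellSize)
    {
        var nodes = new Vector2Int[offsets.Length];
        for (int i = 0; i < offsets.Length; i++)
        {
            Vector2 p = centre + offsets[i];          // associated coordinate of the key point
            nodes[i] = new Vector2Int(                 // nearest grid node (the target coordinate)
                Mathf.RoundToInt(p.x / cellSize),
                Mathf.RoundToInt(p.y / cellSize));
        }
        return nodes;
    }
}
```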
Step 103: calling event attributes corresponding to the target entities, extracting interfaces of the target entities and respectively loading the event attributes to obtain game roles corresponding to the target entities; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
When a target entity is called from the entity library, the event attributes corresponding to the target entity are extracted at the same time and displayed in the predetermined interface. The event attributes are the target events and entity attributes corresponding to the entity type of the target entity, and represent, respectively, the action behaviors the target entity can produce in the scene and its attribute characteristics. Once a target event occurs, the target entity responds according to the trigger corresponding to that target event, and the corresponding entity attributes are called and adjusted according to the predetermined rules and function transformations, forming the dynamic effect of the target entity in the game scene.
When the interface of the target entity is extracted to load the event attribute respectively, the method comprises the following steps:
a) calling a list of corresponding event attributes according to the entity type corresponding to the target entity;
and extracting the event attributes from the list of the event attributes corresponding to the target entity to obtain various event attributes supported by the target entity for the game personnel to call so that the target entity executes corresponding control logic in the game scene.
b) Respectively extracting interfaces corresponding to the target entities according to the list, and respectively adapting the parameter values corresponding to the event attributes to the interfaces corresponding to the target entities;
The interface corresponding to each event attribute in the list is extracted and displayed in the predetermined interface. After the game developer selects a target entity, the interface list of the event attributes corresponding to that entity is displayed; after the developer designates a target event, the interface corresponding to the target event is extracted from the list, and the corresponding parameter value is added and adapted to the interface, so that under the target event the target entity moves in the game scene according to that parameter value.
c) Rendering the target entity according to the adapted parameter values, and coloring and adapting the target entity to obtain the game role corresponding to the target entity.
In this embodiment, the event attributes are respectively structurally stored with the target entities in the entity library:
storing each target entity in the entity library in a structured form, wherein each target entity is respectively associated with at least one event attribute; the event attributes are: and sliding, clicking, turning, running and the like are defined according to the entity type corresponding to the target entity in the game scene.
And adapting the event attribute to the target entity according to a corresponding interface, and taking the event attribute as the entity attribute of the target entity. And rendering the target entity according to the attribute value corresponding to the entity attribute, and adjusting the style characteristic corresponding to the target entity to form the game role corresponding to the target entity.
According to the developer's trigger, the target entity is rendered and compiled in the third-party engine according to the adapted parameter values to form callable function logic; the target entity is then triggered according to the configured event attributes so that it moves in the game scene with the predetermined actions, forming a game character for the developer to debug and test.
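A rough sketch of the structured entity and event-attribute storage described above follows; the classes, field names and the adaptation step are assumptions meant only to make the data layout concrete:

```csharp
using System.Collections.Generic;

// Illustrative sketch of the structured entity library: each target entity is stored
// with at least one associated event attribute; adapting a parameter value through
// the attribute's interface turns it into an entity attribute of the configured character.
public class EventAttribute
{
    public string Name;     // e.g. "move", "slide", "turn" (example names only)
    public float Value;     // parameter value adapted through the corresponding interface
}

public class TargetEntity
{
    public string EntityType;
    public List<EventAttribute> EventAttributes = new List<EventAttribute>();

    // Adapt a parameter value to the matching event attribute; the attribute then
    // acts as an entity attribute used when the character is rendered.
    public void Adapt(string attributeName, float value)
    {
        foreach (var attr in EventAttributes)
            if (attr.Name == attributeName) attr.Value = value;
    }
}
```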
Step 104: and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map.
After the developer's trigger on the game character is received in the predetermined interface, the target event corresponding to the target entity of that character is triggered, the character is controlled to execute the corresponding function logic according to the target event, and the character adjusts the corresponding attribute parameters according to the configured function logic and displays them in the predetermined interface. For example, when the time-sharing coordinates of the game character change, the attribute parameters of the character are adjusted and the character's limbs are controlled synchronously to change and are displayed in the predetermined interface, which produces the running effect of the character in the scene map for the developer to inspect. If the range of limb movement is too large, the attribute parameters of the character under the target event, that is, the time-sharing coordinates of the limbs, can be adjusted directly in the predetermined interface, which improves parameter configuration efficiency.
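As an illustration of how a user trigger might dispatch a target event and update a character's time-sharing coordinates, consider the following sketch; the event name and the function logic are invented for the example:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: a user trigger looks up the target event of the selected
// character and runs its function logic, which adjusts the time-sharing coordinate
// that the predetermined interface then displays.
public class GameCharacterController
{
    private readonly Dictionary<string, Action<float>> _targetEvents =
        new Dictionary<string, Action<float>>();

    public Vector2 Position;   // current time-sharing coordinate

    public GameCharacterController()
    {
        // Example event: "run" shifts the coordinate each frame by speed * dt.
        _targetEvents["run"] = dt => Position += Vector2.right * 2f * dt;
    }

    public void Trigger(string targetEvent, float deltaTime)
    {
        if (_targetEvents.TryGetValue(targetEvent, out var logic))
            logic(deltaTime);   // execute the configured function logic
    }
}
```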
In summary, the method extracts map data and renders the corresponding terrain picture in the predetermined interface to obtain a scene map; a target entity is called from the predetermined entity library and associated with the scene map; the event attributes corresponding to the target entity are called, the interface of the target entity is extracted and the event attributes are loaded respectively to obtain the game character corresponding to the target entity; and the corresponding target events are called according to user triggers, and the action behaviors respectively corresponding to the game characters are executed in the scene map. In the character configuration process of the game scene, the map and the related characters are constructed through a unified control interface to form the game scene, and the time-sharing coordinates, the event marks in the event library and the attribute parameters corresponding to each game character are associated in the game scene, so that unified motion control is performed accurately on each target character and the configuration efficiency of game characters is improved.
A scene layout method in a game according to another embodiment of the present application may include the steps of:
step 201: extracting map data, and calling a corresponding topographic picture in a set interface to render to obtain the scene map;
After the predetermined interface is set up, a picture resource folder texPath is created in the Unity project to store all the terrain pictures drawn by the developers. Using the Unity editor tools, a create method is called to create a window in the predetermined interface, and an add method is used to attach a setup tool to the window. The tool works as follows: it first loads a map picture tex1 from texPath, extracts the picture size and all pixel points with a pixel-reading method, stores them in a two-dimensional array tps, and then reads out the colors of all pixel points in tps and copies them into the editor window.
When the terrain data are rasterized, as shown in fig. 2a, the picture is first restored completely in the editor window, and the terrain picture in the window is then rendered into raster data of size N × N by treating every 5 pixels, horizontally and vertically, as one region. After the regions are divided, counting starts from the upper left corner (0,0); the horizontal direction is divided in steps of 1 unit length (that is, 5 pixels), and the positions along the horizontal axis are indexed 1, 2 and so on, where index 2 denotes the region at a distance of 2 units from the origin. The vertical direction is divided by the same length in the same way, so (1,1) denotes the region at a distance of 1 from the origin in both the horizontal and vertical directions of the scene map. In this embodiment, if the size of the terrain picture is 1000 × 1000 and every 5 pixels form one region, the picture is finally rendered into a scene map of 200 × 200 units, since 1000 divided by 5 equals 200. Each unit in the scene map is assigned a unique identification code as the identifier of that region; the shaded region shown in fig. 2a is assigned the unique identification code key (1,1).
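A hedged sketch of such an editor window is shown below, assuming Unity editor scripting; the menu path, the resource path and the way region data are stored are assumptions, and only the 5-pixel division of a 1000 × 1000 picture into 200 × 200 regions follows the example above:

```csharp
using UnityEditor;
using UnityEngine;

// Illustrative editor sketch: load the terrain picture, divide it into 200 x 200
// regions of 5 x 5 pixels each, and give every region a (column, row) key.
public class MapRasterWindow : EditorWindow
{
    [MenuItem("Tools/Map Raster Window")]   // assumed menu location
    private static void Open() => GetWindow<MapRasterWindow>();

    private void OnGUI()
    {
        if (GUILayout.Button("Rasterize tex1"))
        {
            // Assumes the picture lives under a Resources folder and is marked readable.
            Texture2D tex1 = Resources.Load<Texture2D>("texPath/tex1");
            const int cellPixels = 5;
            int cells = tex1.width / cellPixels;   // 1000 / 5 = 200
            for (int x = 0; x < cells; x++)
                for (int y = 0; y < cells; y++)
                {
                    // key (x, y) is the unique identification code of this region.
                    Color sample = tex1.GetPixel(x * cellPixels, y * cellPixels);
                    // ... a real tool would store the region data against key (x, y) here.
                }
        }
    }
}
```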
Step 202: calling a target entity from a given entity library, and associating the target entity to the scene map;
In this embodiment, the developer builds in the scene map a map containing the picture resource tex1 (size 1000 × 1000) of mountains, roads and lakes. The tex1 picture elements are imported into the predetermined interface, divided into 200 × 200 region modules, and each module is assigned a unique identification code to form the scene map. Each cell is treated as a target entity with the following two entity attributes: attribute a, the cell cannot be passed; and attribute b, the physical strength value of another game character is deducted while that character passes through the cell. The attribute parameters corresponding to each cell are stored in a one-dimensional static array pArray.
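The per-cell attribute storage could be sketched as follows, assuming the one-dimensional array is indexed row by row and that the attribute codes a and b match the values used later in this example:

```csharp
// Illustrative sketch: each cell of the 200 x 200 scene map is a target entity
// whose attribute code lives in a one-dimensional static array pArray.
// Assumed encoding: '\0' = no attribute, 'a' = impassable, 'b' = stamina deducted on passing.
public static class CellAttributes
{
    public const int Size = 200;
    public static readonly char[] pArray = new char[Size * Size];

    public static void Set(int x, int y, char attribute) => pArray[y * Size + x] = attribute;
    public static char Get(int x, int y) => pArray[y * Size + x];
}
```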
Step 203: calling the event attributes corresponding to the target entity, extracting the interface of the target entity and loading the event attributes respectively;
As shown in fig. 2b, when the region (2,2) in the editor is clicked, a click instruction is received and the click event is processed: an attribute configuration window is created with the create method, the window reads the attribute array pArray corresponding to the cell, and all the trigger interfaces of the entity attributes are found by loop traversal and displayed in the predetermined interface.
In this embodiment, the target entity of the marked cell (1,1) is set as a mountain peak, cell (2,2) as a lake, and cell (3,3) as a road, so the cell (1,1) region should be given the impassable attribute. When the developer triggers the interface corresponding to entity attribute a, the entity attribute of that cell receives the setting in the predetermined interface. Each cell, as a target entity, is marked with braces { }, where key is the identification code of the cell region, value holds the entity attributes of the region, and "=" separates the identification code or attribute name from its value; the entity attributes of the cell are finally assembled as {key=(1,1),value=a}. The entity attributes of the two configured cells are combined as {key=(1,1),value=a}|{key=(2,2),value=b}.
Target entities of various types are set across the whole scene map, and the attribute configuration expected by the developers is completed in one pass. In the scene map of this embodiment, the mountain peak is set as a region other target entities cannot pass, the lake is set so that the physical strength of a target entity passing through it is deducted, and the other cells carry no attribute. After the developer calls the corresponding target entities and configures their entity parameters, the save button is clicked; the predetermined interface receives the parameters through a preset interface, stores the modification instruction, packages the assembled data of all regions as a whole, and finally stores it as the field Data={key=(1,1),value=a}|{key=(2,2),value=b}|...... An output operation is then performed, and the configuration file is stored under the local project Resource folder and used as data.
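A sketch of this save step, producing the Data field in the format shown above, might look like the following; the file name and output path are assumptions:

```csharp
using System.IO;
using System.Text;
using UnityEngine;

// Illustrative sketch of the save step: assemble every configured cell into the field
// Data={key=(1,1),value=a}|{key=(2,2),value=b}|... and write it under a project folder.
public static class MapConfigWriter
{
    public static void Save((int x, int y, char value)[] configuredCells)
    {
        var sb = new StringBuilder("Data=");
        for (int i = 0; i < configuredCells.Length; i++)
        {
            var c = configuredCells[i];
            if (i > 0) sb.Append('|');
            sb.Append($"{{key=({c.x},{c.y}),value={c.value}}}");
        }
        // Assumed output location; the application only says "the local project Resource folder".
        File.WriteAllText(Path.Combine(Application.dataPath, "Resources/data.txt"), sb.ToString());
    }
}
```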
Step 204: and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map.
When a game is started in the predetermined interface, the Unity engine first loads the picture resource tex1 of level 1 from the resource library corresponding to the map picture, then loads the configuration information data of the level's map from the Resource directory using an I/O operation, and then parses the text content of data.
The configuration file is first split with "|" to obtain the configuration fragment of each cell; each fragment is then split again to obtain the identification code and the attribute value of the entity attribute for that cell. Parsing finally yields 200 × 200 keys and their corresponding values, which represent the attribute information of each cell region: cell (1,1) represents a "mountain peak" and is an impassable region, and cell (2,2) represents a "lake", so that when another target entity passes through it, the physical strength value of that entity is reduced.
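The corresponding parse step could be sketched as follows; the regular expression simply mirrors the assumed {key=(x,y),value=v} fragment format:

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Illustrative sketch of the parse step: split the Data field on '|' to recover one
// fragment per configured cell, then pull the key and attribute value out of each fragment.
public static class MapConfigParser
{
    private static readonly Regex Fragment =
        new Regex(@"key=\((\d+),(\d+)\),value=(\w+)");

    public static Dictionary<(int, int), string> Parse(string data)
    {
        var result = new Dictionary<(int, int), string>();
        string body = data.StartsWith("Data=") ? data.Substring(5) : data;
        foreach (string piece in body.Split('|'))
        {
            Match m = Fragment.Match(piece);
            if (m.Success)
                result[(int.Parse(m.Groups[1].Value), int.Parse(m.Groups[2].Value))]
                    = m.Groups[3].Value;   // e.g. (1,1) -> "a" (impassable)
        }
        return result;
    }
}
```

Parsing in this way yields one attribute lookup per configured cell, which is what the runtime movement logic below would consume.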
After the scene map and the related target entities of the whole game scene are loaded, the game character corresponding to one target entity is loaded from the upper left corner (0,0) of the scene map; its entity type is a puppy (the character resource is stored under the Resource/Role directory of the project). The rendered target entity is added to the scene map as a game character, the predetermined interface of the target event "move" corresponding to the target entity in the Unity engine is loaded in the predetermined interface, and entity attributes such as direction and speed are loaded through this interface, so that the running effect of the game character is produced and displayed in the interface.
In this embodiment, in the predetermined interface, the puppy moves to the left when the left arrow key is received and to the right when the right arrow key is received; movement uses the Move method in the Unity engine API. When the game character puppy moves to cell (1,1), that cell detects that the character's time-sharing coordinate falls inside it and sends an impassable message to the character through the message mechanism, and the puppy cannot keep moving into that region unless it moves in the opposite direction and leaves the region covered by the cell. When the puppy moves into the cell region corresponding to the lake, the lake region also sends a message to the game character, and the "physical strength value" in the puppy's entity attributes keeps decreasing while its time-sharing coordinate remains there. When the physical strength value of the puppy drops to 0, the game character puppy dies and the Unity engine reclaims the target entity; if the time-sharing coordinate of the puppy stays within the lake region, the physical strength value in its entity attributes keeps decreasing. In this way, through the control and attribute calling performed by the predetermined interface, the whole workflow from editing and configuring target entities to achieving the intended effect of the game characters in the game scene is made efficient, the situation where game characters are hard to position or become disordered in the scene because of configuration errors is avoided, and development efficiency is greatly improved.
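Finally, the runtime behaviour of the puppy character described above could be sketched as follows; the component name, the per-frame stamina cost and the way cells are looked up are assumptions for illustration only:

```csharp
using UnityEngine;

// Illustrative runtime sketch: the puppy moves with the arrow keys; an 'a' (mountain)
// cell refuses the move, a 'b' (lake) cell drains stamina while the character stays in it.
// The attribute codes and the cell grid mirror the assumed encoding used earlier.
public class PuppyController : MonoBehaviour
{
    public float Speed = 2f;
    public float Stamina = 100f;
    public char[,] Cells = new char[200, 200];   // filled from the parsed map configuration

    void Update()
    {
        float dx = 0f;
        if (Input.GetKey(KeyCode.LeftArrow))  dx = -Speed * Time.deltaTime;
        if (Input.GetKey(KeyCode.RightArrow)) dx =  Speed * Time.deltaTime;

        Vector3 next = transform.position + new Vector3(dx, 0f, 0f);
        char cell = Cells[Mathf.Clamp((int)next.x, 0, 199), Mathf.Clamp((int)next.y, 0, 199)];

        if (cell != 'a')                     // 'a': impassable mountain, the move is refused
            transform.position = next;

        if (cell == 'b')                     // 'b': lake, stamina keeps dropping over time
            Stamina -= 5f * Time.deltaTime;

        if (Stamina <= 0f)
            Destroy(gameObject);             // the engine then reclaims the character entity
    }
}
```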
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. At the hardware level, the electronic device comprises a processor and, optionally, an internal bus, a network interface and a memory. The memory may include an internal memory, such as a Random-Access Memory (RAM), and may further include a non-volatile memory, such as at least one disk memory. Of course, the electronic device may also include hardware required for other services.
The processor, the network interface and the memory may be connected to each other via the internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus and so on. For ease of illustration, only one double-headed arrow is shown in fig. 3, but this does not mean there is only one bus or one type of bus.
And the memory is used for storing programs. In particular, the program may include program code comprising computer operating instructions. The memory may include both memory and non-volatile storage and provides instructions and data to the processor.
The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs the computer program to form the device of the scene layout in the game on the logic level. The processor is used for executing the program stored in the memory and is specifically used for executing the following operations:
extracting map data, and calling a corresponding topographic picture in a set interface to render to obtain the scene map;
calling a target entity from a given entity library, and associating the target entity to the scene map;
calling event attributes corresponding to the target entities, extracting interfaces of the target entities and respectively loading the event attributes to obtain game roles corresponding to the target entities; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map.
The scene layout method in the game disclosed in the embodiment of fig. 3 of the present application can be applied to or implemented by a processor. The processor may be an integrated circuit chip with signal processing capability. During implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP) and the like, or a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps and logic blocks disclosed in the embodiments of the present application may be implemented or carried out by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be executed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM or EPROM, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
The electronic device may also execute the method shown in fig. 1, and implement the functions of the scene layout apparatus in the game in the embodiment shown in fig. 1, which are not described herein again in this application embodiment.
Of course, besides the software implementation, the electronic device in this specification does not exclude other implementations, such as logic devices or a combination of software and hardware, and the like, that is, the execution subject of the following processing flow is not limited to each logic unit, and may also be hardware or logic devices.
Fig. 4 is a schematic structural diagram of a scene layout device in a game according to an embodiment of the present application. Referring to fig. 4, in a software implementation, the scene layout apparatus 400 in a game may include: a rendering module 401, an association module 402, a loading module 403 and an output module 404, wherein,
the rendering module 401 is configured to extract map data, and call a corresponding topographic image in a predetermined interface to perform rendering, so as to obtain the scene map;
an association module 402, configured to invoke a target entity from a predetermined entity library, and associate the target entity with the scene map;
a loading module 403, configured to invoke an event attribute corresponding to the target entity, extract an interface of the target entity, and load the event attribute respectively to obtain a game role corresponding to the target entity; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and the output module 404 is configured to respectively invoke corresponding target events according to user triggers, and execute respective corresponding action behaviors of the game role in the scene map.
The rendering module 401, in particular comprising,
the grid unit is used for rasterizing the map data to obtain grid data;
the mapping unit is used for mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
and the dividing unit is used for dividing the topographic picture according to the pixel range corresponding to each cell, rendering according to the divided pixels and then coloring to obtain the scene map.
The association module 402 may include, in particular,
a type unit, configured to obtain an entity type corresponding to the target entity;
the center determining unit is used for determining the center coordinates of the target entity according to the entity type;
and the setting unit is used for setting the target entity in a grid data corresponding to the scene map in an associated manner by taking the central coordinate as a center, and acquiring target coordinates of key points corresponding to the target entity in the grid data respectively.
In the setting unit, specifically include:
the extraction subunit is used for extracting the mapping relation between the central coordinate and the corresponding key point;
the coordinate subunit is used for respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and the fitting subunit is used for respectively fitting the associated coordinates into the nodes of the raster data, and taking the nodes as target coordinates.
The loading module 403 may specifically include, for example,
the list unit is used for calling a list corresponding to the event attribute according to the entity type corresponding to the target entity;
the adaptation unit is used for respectively extracting the interfaces corresponding to the target entities according to the list and respectively adapting the parameter values corresponding to the event attributes with the interfaces corresponding to the target entities;
and the role unit is used for rendering the target entity according to the adapted parameter values, coloring and adapting the target entity to obtain the game role corresponding to the target entity.
The list unit, specifically comprising,
the storage subunit is configured to store each target entity in the entity library in a structured form, where each target entity is associated with at least one event attribute;
the interface adapter unit is used for adapting the event attribute to the target entity according to a corresponding interface, and taking the event attribute as the entity attribute of the target entity;
and the adjusting subunit is used for rendering the target entity according to the attribute value corresponding to the entity attribute, adjusting the style characteristic corresponding to the target entity, and forming the game role corresponding to the target entity.
The apparatus 400 specifically further includes:
the integrated module is used for uniformly setting the target entity and the calling interface of the topographic picture on the integrated interface;
and the display module is used for calling the event attribute through the integrated interface and displaying the game role.
In the role configuration process of the game scene, the map and the related roles are constructed by adopting the unified control interface to form the game scene, and the time-sharing coordinates corresponding to the game roles, the event marks in the event library and the attribute parameters are respectively associated in the game scene, so that the unified motion control is accurately performed on each target role, and the configuration efficiency of the game roles is improved.
In short, the above description is only a preferred embodiment of the present disclosure, and is not intended to limit the scope of the present disclosure. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present specification shall be included in the protection scope of the present specification.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.

Claims (14)

1. A method of scene layout in a game, the method comprising:
extracting map data, and calling a corresponding terrain picture in a set interface for rendering to obtain a scene map;
calling a target entity from a given entity library, and associating the target entity to the scene map;
calling event attributes corresponding to the target entities, extracting interfaces of the target entities and respectively loading the event attributes to obtain game roles corresponding to the target entities; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and respectively calling corresponding target events according to user triggering, and executing respective corresponding action behaviors of the game roles in the scene map.
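As an illustrative aid only (not part of the claims), the following minimal Python sketch gives one possible reading of the flow in claim 1; every name in it (GameRole, SceneMap, layout_scene, the entity-library fields) is hypothetical and assumed rather than taken from the specification.

from dataclasses import dataclass, field

@dataclass
class GameRole:
    name: str
    position: tuple
    events: dict  # event id -> action behavior (a callable taking the role)

    def trigger(self, event_id):
        # Call the corresponding target event according to a user trigger.
        return self.events[event_id](self)

@dataclass
class SceneMap:
    terrain: list                 # "rendered" terrain, e.g. rows of tile ids
    roles: list = field(default_factory=list)

def layout_scene(map_data, entity_library, user_triggers):
    # Step 1: extract the map data and render it into a scene map (toy rendering).
    scene_map = SceneMap(terrain=[row[:] for row in map_data])
    # Steps 2-3: call each target entity, load its event attributes, obtain a game role.
    for entity in entity_library:
        events = {a["id"]: a["action"] for a in entity["event_attributes"]}
        scene_map.roles.append(GameRole(entity["name"], entity["position"], events))
    # Step 4: dispatch the corresponding target events according to user triggers,
    # where each trigger is a (role index, event id) pair.
    return [scene_map.roles[i].trigger(event_id) for i, event_id in user_triggers]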
2. The method of claim 1, wherein when the corresponding terrain picture is called for rendering,
rasterizing the map data to obtain raster data;
mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
and rendering the terrain picture according to the pixel range corresponding to each cell to obtain the scene map.
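Purely for illustration, and under the assumption that the terrain picture is stretched uniformly over the raster, the cell-to-pixel mapping of claim 2 might look like the following Python sketch; all names are hypothetical.

def cell_pixel_ranges(map_width, map_height, cell_size, picture_width, picture_height):
    # Rasterize the map into rows x cols cells, then give each cell the
    # pixel range of the terrain picture that it covers.
    cols, rows = map_width // cell_size, map_height // cell_size
    px_per_col, px_per_row = picture_width / cols, picture_height / rows
    ranges = {}
    for r in range(rows):
        for c in range(cols):
            ranges[(r, c)] = (int(c * px_per_col), int(r * px_per_row),               # top-left pixel
                              int((c + 1) * px_per_col), int((r + 1) * px_per_row))   # bottom-right pixel
    return ranges

# Example: a 64x64 map with 16-unit cells over a 256x256 terrain picture;
# cell (0, 0) maps to the pixel range (0, 0, 64, 64).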
3. The method of claim 1, wherein when associating the target entity into the scene map,
acquiring an entity type corresponding to the target entity;
determining the central coordinate of the target entity according to the entity type;
and with the central coordinate as a center, arranging the target entity in the raster data corresponding to the scene map in an associated manner, and acquiring target coordinates of key points corresponding to the target entity in the raster data respectively.
4. The method of claim 3, wherein the arranging the target entity in the raster data corresponding to the scene map in an associated manner comprises:
extracting the mapping relation of the central coordinate and the key point;
respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and respectively fitting the associated coordinates to nodes of the raster data, and taking the nodes as target coordinates.
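The center/key-point handling of claims 3 and 4 admits many implementations; the snippet below is a minimal Python sketch of one of them, in which the key points are given as offsets from the center (the mapping relation) and "fitting" is read as snapping to the nearest raster node. The names and the snapping rule are assumptions, not taken from the specification.

def fit_key_points(center, key_point_offsets, cell_size):
    # center: (x, y) of the target entity, determined from its entity type.
    # key_point_offsets: key points expressed relative to the center.
    targets = []
    for dx, dy in key_point_offsets:
        # Associated coordinate of the key point derived from the mapping relation.
        ax, ay = center[0] + dx, center[1] + dy
        # Fit the associated coordinate onto the nearest node of the raster data.
        targets.append((round(ax / cell_size) * cell_size,
                        round(ay / cell_size) * cell_size))
    return targets

# Example: a square footprint around center (33, 17) on a 16-unit raster:
# fit_key_points((33, 17), [(-8, -8), (8, -8), (-8, 8), (8, 8)], 16)
# -> [(32, 16), (48, 16), (32, 32), (48, 32)]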
5. The method of claim 1, wherein the extracting the interface of the target entity and respectively loading the event attribute comprises:
calling a list of corresponding event attributes according to the entity type corresponding to the target entity;
respectively extracting interfaces corresponding to the target entities according to the list, and respectively adapting the parameter values corresponding to the event attributes to the interfaces corresponding to the target entities;
rendering the target entity according to the adapted parameter values, and coloring and adapting the target entity to obtain the game role corresponding to the target entity.
6. The method of claim 5, wherein the event attributes are respectively stored in a structured manner with the target entities in the entity library, and the method comprises:
storing each target entity in the entity library in a structured form, wherein each target entity is respectively associated with at least one event attribute;
the event attribute is adapted to the target entity according to a corresponding interface, and the event attribute is used as the entity attribute of the target entity;
and rendering the target entity according to the attribute value corresponding to the entity attribute, and adjusting the style characteristic corresponding to the target entity to form the game role corresponding to the target entity.
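One way to picture the structured storage of claims 5 and 6 is the following hypothetical Python sketch, in which each target entity record in the entity library carries at least one event attribute and only the parameter values that the (assumed) interface understands are adapted onto the role; none of these field names come from the specification.

ENTITY_LIBRARY = {
    "npc_guard": {
        "entity_type": "npc",
        "event_attributes": [
            {"event": "patrol", "speed": 1.2, "color": "#4a6"},
            {"event": "attack", "damage": 5, "color": "#a44"},
        ],
    },
}

def build_role(entity_id, interface_params=("speed", "damage", "color")):
    # Adapt each event attribute's parameter values to the entity's interface,
    # then assemble the game role; rendering and coloring are reduced here to
    # collecting the adapted values into the role description.
    entity = ENTITY_LIBRARY[entity_id]
    adapted = [{k: v for k, v in attr.items() if k == "event" or k in interface_params}
               for attr in entity["event_attributes"]]
    return {"id": entity_id, "type": entity["entity_type"], "events": adapted}

# build_role("npc_guard") -> a role description carrying its adapted event attributes.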
7. The method of claim 1, further comprising:
the set interface is an integrated interface, and the calling interfaces of the target entity and the terrain picture are uniformly arranged on the integrated interface;
and calling the event attribute through the integrated interface to display the game role.
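The "integrated interface" of claim 7 could be pictured, purely hypothetically, as one object exposing both the terrain-picture call and the target-entity call, plus the role display; the class below is an assumed sketch (reusing the entity-record shape from the previous sketch), not the claimed apparatus.

class IntegratedInterface:
    def __init__(self, terrain_pictures, entity_library):
        # Calling interfaces for the terrain picture and the target entity
        # are uniformly arranged on this one object.
        self.terrain_pictures = terrain_pictures
        self.entity_library = entity_library

    def call_terrain(self, name):
        return self.terrain_pictures[name]

    def call_entity(self, entity_id):
        return self.entity_library[entity_id]

    def display_role(self, entity_id):
        # Call the event attributes through the integrated interface and
        # return a printable description of the game role.
        entity = self.call_entity(entity_id)
        return f"{entity_id}: {[a['event'] for a in entity['event_attributes']]}"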
8. An apparatus for arranging scenes in a game, the apparatus comprising:
the rendering module is used for extracting map data and calling a corresponding terrain picture in a set interface for rendering to obtain a scene map;
the association module is used for calling a target entity from a given entity library and associating the target entity to the scene map;
the loading module is used for calling the event attribute corresponding to the target entity, extracting the interface of the target entity and respectively loading the event attribute to obtain the game role corresponding to the target entity; the event attribute is a target event and an entity attribute corresponding to the entity type of the target entity;
and the output module is used for respectively calling corresponding target events according to user trigger and executing respective corresponding action behaviors of the game role in the scene map.
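For orientation only, the apparatus of claim 8 can be thought of as four pluggable modules wired together; the composition below is a hypothetical Python sketch in which each module is simply a callable, an assumption made for brevity rather than anything stated in the specification.

class SceneLayoutApparatus:
    def __init__(self, rendering_module, association_module, loading_module, output_module):
        self.rendering = rendering_module      # map data -> scene map
        self.association = association_module  # (scene map, entity id) -> target entity
        self.loading = loading_module          # target entity -> game role
        self.output = output_module            # (roles, trigger, scene map) -> action behavior

    def run(self, map_data, entity_ids, user_triggers):
        scene_map = self.rendering(map_data)
        entities = [self.association(scene_map, eid) for eid in entity_ids]
        roles = [self.loading(entity) for entity in entities]
        return [self.output(roles, trigger, scene_map) for trigger in user_triggers]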
9. The apparatus of claim 8, wherein the rendering module comprises,
the grid unit is used for rasterizing the map data to obtain raster data;
the mapping unit is used for mapping each cell of the raster data with a pixel point in the terrain picture to obtain a pixel range corresponding to each cell;
and the dividing unit is used for dividing the terrain picture according to the pixel range corresponding to each cell, rendering according to the divided pixels and then coloring to obtain the scene map.
10. The apparatus of claim 8, wherein the association module comprises,
a type unit, configured to obtain an entity type corresponding to the target entity;
the center determining unit is used for determining the center coordinates of the target entity according to the entity type;
and the setting unit is used for arranging the target entity in the raster data corresponding to the scene map in an associated manner by taking the central coordinate as a center, and acquiring target coordinates of key points corresponding to the target entity in the raster data respectively.
11. The apparatus according to claim 10, wherein the setting unit specifically comprises:
the extraction subunit is used for extracting the mapping relation between the central coordinate and the corresponding key point;
the coordinate subunit is used for respectively determining the associated coordinates corresponding to the key points in real time according to the mapping relation;
and the fitting subunit is used for respectively fitting the associated coordinates into the nodes of the raster data, and taking the nodes as target coordinates.
12. The apparatus of claim 8, wherein the loading module comprises,
the list unit is used for calling a list corresponding to the event attribute according to the entity type corresponding to the target entity;
the adaptation unit is used for respectively extracting the interfaces corresponding to the target entities according to the list and respectively adapting the parameter values corresponding to the event attributes with the interfaces corresponding to the target entities;
and the role unit is used for rendering the target entity according to the adapted parameter values, coloring and adapting the target entity to obtain the game role corresponding to the target entity.
13. The apparatus according to claim 12, wherein the list unit specifically comprises,
the storage subunit is configured to store each target entity in the entity library in a structured form, where each target entity is associated with at least one event attribute;
the interface adaptation subunit is used for adapting the event attribute to the target entity according to a corresponding interface, and taking the event attribute as the entity attribute of the target entity;
and the adjusting subunit is used for rendering the target entity according to the attribute value corresponding to the entity attribute, adjusting the style characteristic corresponding to the target entity, and forming the game role corresponding to the target entity.
14. The apparatus of claim 8, wherein the apparatus further comprises:
the integrated module is used for uniformly setting the calling interfaces of the target entity and the terrain picture on the integrated interface;
and the display module is used for calling the event attribute through the integrated interface and displaying the game role.
CN202010084994.5A 2020-02-10 2020-02-10 Scene layout method and device in game Withdrawn CN111228816A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010084994.5A CN111228816A (en) 2020-02-10 2020-02-10 Scene layout method and device in game

Publications (1)

Publication Number Publication Date
CN111228816A true CN111228816A (en) 2020-06-05

Family

ID=70861867

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010084994.5A Withdrawn CN111228816A (en) 2020-02-10 2020-02-10 Scene layout method and device in game

Country Status (1)

Country Link
CN (1) CN111228816A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013038979A1 (en) * 2011-09-14 2013-03-21 株式会社セガ Game program, game device, and recording medium having game program recorded therein
CN108986194A (en) * 2018-07-24 2018-12-11 合肥爱玩动漫有限公司 A kind of scene of game rendering method
CN109771943A (en) * 2019-01-04 2019-05-21 网易(杭州)网络有限公司 A kind of building method and device of scene of game
CN109675309A (en) * 2019-02-01 2019-04-26 网易(杭州)网络有限公司 A kind of building method and device of scene of game
CN109939440A (en) * 2019-04-17 2019-06-28 网易(杭州)网络有限公司 Generation method, device, processor and the terminal of 3d gaming map

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112642148A (en) * 2020-12-30 2021-04-13 北京像素软件科技股份有限公司 Game scene generation method and device and computer equipment
CN113457129A (en) * 2021-06-23 2021-10-01 深圳市瑞立视多媒体科技有限公司 Game level selection and role configuration method, system and computer equipment
CN113476848A (en) * 2021-07-08 2021-10-08 网易(杭州)网络有限公司 Method and device for generating tree chain map, storage medium and electronic equipment
CN113476848B (en) * 2021-07-08 2023-11-17 网易(杭州)网络有限公司 Tree chain map generation method and device, storage medium and electronic equipment
CN113856202A (en) * 2021-10-11 2021-12-31 北京字跳网络技术有限公司 Game data editing method, device, editor, readable medium and equipment

Similar Documents

Publication Publication Date Title
CN111228816A (en) Scene layout method and device in game
CN108010112B (en) Animation processing method, device and storage medium
US10922152B2 (en) Event handler nodes for visual scripting
KR102170620B1 (en) Method and system for generating training data to train classifiers with localizable features
CN109144649A (en) Display methods, device, terminal and the storage medium of icon
CN111930442B (en) Page view loading method and device, storage medium and electronic equipment
CN112691381B (en) Rendering method, device and equipment of virtual scene and computer readable storage medium
CN111324381B (en) Development system, development method, development apparatus, computer device, and storage medium
TW202004674A (en) Method, device and equipment for showing rich text on 3D model
CN113018870B (en) Data processing method, device and computer readable storage medium
WO2023197762A1 (en) Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product
CN110750664A (en) Picture display method and device
CN110865863B (en) Interface display method and device for fast application and storage medium
CN108845733B (en) Screen capture method, device, terminal and storage medium
WO2024131652A1 (en) Special effect processing method and apparatus, and electronic device and storage medium
CN106293658B (en) Interface component generation method and equipment
CN111526290B (en) Image processing method, device, terminal and storage medium
KR20210040305A (en) Method and apparatus for generating images
CN110865864B (en) Interface display method, device and equipment for quick application and storage medium
US10909754B1 (en) Visual scripting for multi-dimensional elements
CN110968513B (en) Recording method and device of test script
CN115018975A (en) Data set generation method and device, electronic equipment and storage medium
CN115222835A (en) Drawing suggestion generation method, device and equipment
US20240033625A1 (en) Rendering method and apparatus for virtual scene, electronic device, computer-readable storage medium, and computer program product
CN115174993B (en) Method, apparatus, device and storage medium for video production

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200605