CN111467803B - Display control method and device in game, storage medium and electronic equipment

Info

Publication number
CN111467803B
Authority
CN
China
Prior art keywords
scene
game
display
game scene
auxiliary game
Prior art date
Legal status
Active
Application number
CN202010255567.9A
Other languages
Chinese (zh)
Other versions
CN111467803A
Inventor
钱静
沈杰
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202010255567.9A
Publication of CN111467803A
Application granted
Publication of CN111467803B

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5255: Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F 2300/807: Role playing or strategy games

Abstract

The present disclosure relates to the technical field of games, and provides a display control method in a game, a display control device in a game, a computer storage medium, and an electronic device. The display control method includes: responding to a display triggering operation acting on the graphical user interface, and acquiring a pre-created auxiliary game scene; generating a display interface according to the auxiliary game scene and the virtual character in the game; and responding to a viewing angle adjustment operation acting on the display interface, adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface. The method solves the prior-art problem that two scenes cannot be opened simultaneously, enabling rapid switching of the in-battle interface; it also solves the prior-art separation between background rotation and virtual-character rotation, realizing synchronous transformation of the game scene and the virtual character.

Description

Display control method and device in game, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to a display control method in a game, a display control device in a game, a computer storage medium, and an electronic apparatus.
Background
With the development of computer technology, the game industry has advanced rapidly, and players' expectations for immersion and display quality have risen accordingly; as a result, the integration of finely rendered virtual characters with their environment has become a focus for game developers.
Currently, a 3D (three-dimensional) character is typically projected onto a UI (User Interface) layer, the character's shadow is generated procedurally, and the background remains a 2D (two-dimensional) image. Because the 2D background has no association with the 3D model, edges appear hard and inconsistent with the environment, and rotation of the character is decoupled from rotation of the background.
In view of the foregoing, there is a need in the art to develop a new method and apparatus for controlling display in a game.
It should be noted that the information disclosed in the foregoing background section is only for enhancing understanding of the background of the present disclosure.
Disclosure of Invention
The present disclosure aims to provide a display control method in a game, a display control device in a game, a computer storage medium, and an electronic device, so as to overcome, at least to some extent, the prior-art defect that a scene and a virtual character cannot be transformed synchronously.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of the present disclosure, there is provided a display control method in a game, wherein a graphical user interface is provided through a terminal device, the method comprising: responding to a display triggering operation acting on the graphical user interface, and acquiring a pre-created auxiliary game scene; generating a display interface according to the auxiliary game scene and the virtual character in the game; and responding to a viewing angle adjustment operation acting on the display interface, adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface.
In an exemplary embodiment of the present disclosure, the graphical user interface includes at least one functionality control, and the responding to the display triggering operation acting on the graphical user interface and acquiring a pre-created auxiliary game scene includes: acquiring the pre-created auxiliary game scene in response to a first touch operation acting on the functionality control.
In an exemplary embodiment of the present disclosure, the generating a presentation interface according to the auxiliary game scene and the virtual character in the game includes: and generating the display interface corresponding to the function control according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, a virtual camera is disposed in the auxiliary game scene, and the generating a presentation interface according to the auxiliary game scene and a virtual character in the game includes: acquiring scene images in the auxiliary game scene through the virtual camera; and generating the display interface according to the scene image and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the auxiliary game scenario includes at least one of: the virtual character is located in a game scene area in a main game scene of the game; a specific game scene area in the main game scene of the game; a background scene associated with the primary game scene in the game.
In an exemplary embodiment of the present disclosure, the viewing angle adjustment operation includes a second touch operation acting on the presentation interface; the adjusting, in response to the viewing angle adjustment operation acting on the presentation interface, of the display viewing angles of the auxiliary game scene and the virtual character in the presentation interface includes: acquiring operation parameters corresponding to the second touch operation acting on the presentation interface; determining viewing angle adjustment parameters according to the operation parameters; and adjusting the display viewing angles of the auxiliary game scene and the virtual character in the presentation interface according to the viewing angle adjustment parameters.
In an exemplary embodiment of the present disclosure, the determining a viewing angle adjustment parameter according to the operation parameter includes: determining a rotation parameter of a virtual camera arranged in the auxiliary game scene according to the operation parameter; and determining the rotation parameter as the visual angle adjustment parameter.
In an exemplary embodiment of the present disclosure, the second touch operation is a sliding operation, and the operation parameters include a touch point movement distance and a sliding direction of the sliding operation; determining the viewing angle adjustment parameters from the operation parameters comprises: determining a rotation angle of a virtual camera arranged in the auxiliary game scene according to the touch point movement distance; and determining the rotation direction of the virtual camera according to the sliding direction.
In an exemplary embodiment of the disclosure, the second touch operation is a click operation or a press operation (e.g., a force touch), and the operation parameters include a relative distance and a relative positional relationship between a touch point position of the second touch operation and a preset reference position; determining the viewing angle adjustment parameters from the operation parameters comprises: determining a rotation angle of a virtual camera arranged in the auxiliary game scene according to the relative distance; and determining the rotation direction of the virtual camera according to the relative positional relationship.
In an exemplary embodiment of the present disclosure, the touch point movement distance includes a touch point movement distance in a horizontal direction and a touch point movement distance in a vertical direction; the determining the rotation angle of the virtual camera arranged in the auxiliary game scene according to the touch point moving distance comprises the following steps: determining a first rotation angle corresponding to the virtual camera according to the movement distance of the touch point in the horizontal direction and the rotation radius of the virtual camera; determining a second rotation angle corresponding to the virtual camera according to the touch point moving distance in the horizontal direction, the touch point moving distance in the vertical direction and the rotation radius of the virtual camera; and determining the first rotation angle and the second rotation angle as rotation angles corresponding to virtual cameras arranged in the auxiliary game scene.
In an exemplary embodiment of the present disclosure, the generating the presentation interface according to the scene image and the virtual character in the game includes: blurring the scene image; adjusting the display parameters of the blurred scene image to target display parameters; and rendering the adjusted scene image and the virtual character in the game onto the graphical user interface to obtain the presentation interface.
In an exemplary embodiment of the present disclosure, the method further comprises: and scaling and adjusting the image size of the scene image after being adjusted to the target display parameter according to the screen size of the terminal equipment.
According to a second aspect of the present disclosure, there is provided an in-game display control apparatus for providing a graphical user interface through a terminal device, the apparatus comprising: the acquisition module is used for responding to the display triggering operation acted on the graphical user interface and acquiring a pre-created auxiliary game scene; the interface generation module is used for generating a display interface according to the auxiliary game scene and the virtual characters in the game; and the visual angle adjusting module is used for responding to the visual angle adjusting operation acted on the display interface and adjusting the display visual angles of the auxiliary game scene and the virtual character in the display interface.
According to a third aspect of the present disclosure, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements the display control method in a game described in the first aspect described above.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to execute the display control method in the game described in the first aspect described above via execution of the executable instructions.
As can be seen from the above technical solutions, the method for controlling in-game display, the device for controlling in-game display, the computer storage medium, and the electronic device according to the exemplary embodiments of the present disclosure have at least the following advantages and positive effects:
in the technical solutions provided by some embodiments of the present disclosure, on the one hand, a pre-created auxiliary game scene is obtained in response to a display trigger operation acting on a graphical user interface, and a presentation interface is generated according to the auxiliary game scene and a virtual character in the game; this solves the prior-art limitation that only a single scene can be opened at a time, enabling rapid switching of the in-battle interface. On the other hand, in response to a viewing angle adjustment operation acting on the presentation interface, the display viewing angles of the auxiliary game scene and the virtual character are adjusted together; this solves the prior-art separation between background rotation and virtual-character rotation, realizes synchronous rotation of the game scene and the virtual character, and improves both the display quality of the game and the player's sense of immersion.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 illustrates a flow diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of a primary game scenario in an exemplary embodiment of the present disclosure;
FIG. 3 illustrates a schematic diagram of an auxiliary game scenario in an exemplary embodiment of the present disclosure;
FIG. 4 illustrates a sub-flowchart diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 5 illustrates a schematic diagram of a graphical user interface in an exemplary embodiment of the present disclosure;
FIG. 6 illustrates a sub-flowchart diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 7 illustrates a schematic diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 8 illustrates a sub-flowchart diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 9 illustrates a sub-flowchart diagram of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 10 illustrates a sub-flowchart of a display control method in a game in an exemplary embodiment of the present disclosure;
FIG. 11A is a schematic diagram showing a display effect in an exemplary embodiment of the present disclosure;
fig. 11B is a diagram showing a display effect after display view angle adjustment in an exemplary embodiment of the present disclosure;
fig. 12 is a schematic diagram showing the structure of a display control device in a game in an exemplary embodiment of the present disclosure;
FIG. 13 illustrates a schematic diagram of a computer storage medium in an exemplary embodiment of the present disclosure;
fig. 14 shows a schematic structural diagram of an electronic device in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
At present, a 3D character is generally projected onto a UI interface, the character's shadow is generated procedurally, and the background still uses a 2D picture; the 2D background has no relation to the 3D model, the edges are hard and inconsistent with the environment, and rotation of the character is separated from rotation of the background. Alternatively, a 3D scene and character are used directly, in which case only a single scene can be opened, two scenes cannot be opened at the same time, and the in-battle interface cannot be switched rapidly. Alternatively, 2D character animation is produced, in which case stereoscopic effect, lighting and shadow, depth of field, and animation deformation are all limited, giving a poor user experience.
The embodiments of the present disclosure first provide a display control method in a game, which overcomes, at least to some extent, the prior-art defect that a scene and a virtual character cannot rotate synchronously.
Fig. 1 illustrates a flowchart of a display control method in a game according to an exemplary embodiment of the present disclosure; the method may be executed, for example, by a server that performs display control for the game.
Referring to fig. 1, a display control method in a game according to an embodiment of the present disclosure includes the steps of:
step S110, responding to a display triggering operation acted on a graphical user interface, and acquiring a pre-created auxiliary game scene;
step S120, generating a display interface according to the auxiliary game scene and the virtual roles in the game;
step S130, in response to the visual angle adjustment operation on the display interface, the display visual angles of the auxiliary game scene and the virtual character in the display interface are adjusted.
In the technical solution provided by the embodiment shown in fig. 1, on the one hand, a pre-created auxiliary game scene is obtained in response to a display trigger operation acting on a graphical user interface, and a presentation interface is generated according to the auxiliary game scene and the virtual character in the game; this solves the prior-art limitation that only a single scene can be opened at a time and enables rapid switching of the in-battle interface. On the other hand, in response to a viewing angle adjustment operation acting on the presentation interface, the display viewing angles of the auxiliary game scene and the virtual character are adjusted together, solving the prior-art separation between background rotation and virtual-character rotation, realizing synchronous rotation of the game scene and the virtual character, and improving the display quality and sense of immersion of the game.
The specific implementation of each step in fig. 1 is described in detail below:
in an exemplary embodiment of the present disclosure, the terminal device, i.e., a computer display terminal, is the input/output device of a computer system: it sends programs and data to the computer, or receives processed results from the computer, via a communication facility. The terminal device in the present disclosure may be a mobile phone, a computer, a tablet computer (e.g., an iPad), a vehicle-mounted computer, or the like.
A graphical user interface (Graphical User Interface, abbreviated GUI) may be provided through the above terminal device. A GUI is a user interface displayed in graphical form: it allows a user to manipulate icons or menu options on the screen with an input device such as a mouse, to select commands, open files, start programs, or perform other routine tasks. Compared with a character interface, in which routine tasks are accomplished by typing text or character commands through a keyboard, a graphical user interface offers many advantages. It is composed of windows, drop-down menus, dialog boxes, and their corresponding control mechanisms, standardized across applications so that the same operation is always performed in the same way; in a graphical user interface, the user sees and manipulates graphical objects, applying computer graphics technology.
A game scene comprises the environments, buildings, machinery, props, and so on in a game. Building a game scene can generally be understood as reconstructing, according to design requirements, the usable elements of the game, such as buildings, trees, sky, roads, weapon props, and NPCs (Non-Player Characters, i.e., game characters not controlled by players).
In an exemplary embodiment of the present disclosure, at least a portion of the main game scene and at least one functionality control may be displayed in the graphical user interface described above. The main game scene can be larger than the display area of the graphical user interface; a player can control a game character in the main game scene to fight monsters, level up, accept tasks, run, and so on, and as the game character moves, the portion of the main game scene displayed by the graphical user interface is updated accordingly. The functionality control can be one or more of a virtual prop control, a skill control, a character attribute control, and a weapon type control, and can be used to open a function interface. For example, referring to fig. 2, which illustrates an interface schematic diagram of a main game scene in an exemplary embodiment of the present disclosure: 201 denotes a part of the main game scene, 202 denotes a virtual character in the main game scene, and 203 denotes the above functionality control, such as a virtual prop control for opening the virtual prop interface.
For example, an auxiliary game scene may be created in advance, and a virtual camera may be set in the auxiliary game scene to capture scene images of the auxiliary game scene in real time. The auxiliary game scene may be the game scene area where the virtual character is located in the main game scene, for example, the same scene currently displayed for the main game scene.
The auxiliary game scene may also be a specific game scene area in the main game scene (i.e., a part of the main game scene). For example, when the virtual character is located at the training ground in the main game scene, the auxiliary game scene may use a training-ground scene; when the virtual character is located at the weapon shop, the auxiliary game scene may use a weapon-shop scene; and when the virtual character is located at the hot spring, the auxiliary game scene may use a hot-spring scene. In this way, the auxiliary game scene can be updated according to the real-time position of the virtual character, improving the visual appeal and interest of the game.
The auxiliary game scene may also be a reconstructed background scene associated with the main game scene, such as a background scene that differs from what the main game scene currently displays but is consistent with the main game scene's display attributes (such as style and type), for example a Japanese-style, Chinese-style, realistic-style, or post-apocalyptic-style game scene. For the case where the auxiliary game scene 301 is the game scene area where the virtual character 302 is located in the main game scene, reference may be made to fig. 3, which illustrates a scene diagram of the auxiliary game scene in an exemplary embodiment of the present disclosure: 301 denotes the auxiliary game scene, and 302 denotes the virtual character in the auxiliary game scene (the same virtual character as in the main game scene).
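The selection among the pre-created auxiliary scenes described above can be illustrated with a small lookup: area-specific scenes chosen from the virtual character's current position, with a style-matched background scene as the fallback. A hedged sketch; the area names and scene identifiers are assumptions, not values from the patent.

    # Hypothetical lookup from the character's current area in the main game
    # scene to a pre-created auxiliary scene, per the examples above.
    AREA_TO_AUX_SCENE = {
        "training_ground": "training_ground_scene",
        "weapon_shop": "weapon_shop_scene",
        "hot_spring": "hot_spring_scene",
    }

    def pick_auxiliary_scene(area, default="style_matched_background"):
        # Fall back to a background scene consistent with the main scene's
        # display attributes (style, type) for unmapped areas.
        return AREA_TO_AUX_SCENE.get(area, default)

    print(pick_auxiliary_scene("hot_spring"))  # hot_spring_scene
    print(pick_auxiliary_scene("wilderness"))  # style_matched_background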
With continued reference to fig. 1, in step S110, a pre-created auxiliary game scene is acquired in response to a presentation trigger operation acting on a graphical user interface.
In an exemplary embodiment of the present disclosure, following the explanation above and taking the functionality control as a virtual prop control by way of example, with continued reference to 203 in fig. 2, which illustrates the functionality control (e.g., a knapsack control BAG among the virtual prop controls): when a first touch operation (e.g., a click, long-press, drag, or slide operation) by the user on the functionality control 203 is received, the pre-created auxiliary game scene may be acquired, for example a knapsack scene corresponding to the knapsack control BAG, which may be the background shown when the virtual character's knapsack is opened in the game and the items in it are displayed.
In step S120, a presentation interface is generated from the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, a presentation interface corresponding to a functionality control may be generated from an auxiliary game scenario and a virtual character in a game.
Specifically, a scene image of the auxiliary game scene can be obtained through the virtual camera, and a presentation interface is then generated according to the scene image and the virtual character in the game. The content displayed on the presentation interface may further include virtual items (for example, in the knapsack interface corresponding to the knapsack control: virtual medicines, virtual equipment, virtual ores, virtual foods, etc.) and the display positional relationship between the virtual items and the virtual character (for example, the virtual items may be displayed at the bottom of the presentation interface and the virtual character in the middle). These may be configured according to actual needs, and all such configurations fall within the protection scope of the present disclosure.
After the scene image is acquired, reference may be made to fig. 4, which shows a sub-flowchart of the display control method in a game in an exemplary embodiment of the present disclosure, specifically a flowchart of generating the presentation interface according to the auxiliary game scene and the virtual character in the game, including steps S401 to S403; step S120 is explained below in connection with fig. 4.
In step S401, blurring processing is performed on a scene image.
In an exemplary embodiment of the present disclosure, after the scene image of the auxiliary game scene is acquired, the scene image may be blurred; specifically, a depth-of-field (DOF) blur may be applied. Depth of field refers to the range of distances in front of and behind the focal subject within which the camera lens or other imager still produces a sharp image. The blur gives the scene image a sense of depth and improves the lighting, color, and immersion of the game picture, so that the dislocation between the 3D character and the 2D background can be effectively avoided.
In step S402, the display parameters of the scene image after the blurring process are adjusted to target display parameters.
In an exemplary embodiment of the present disclosure, the display parameters of the blurred scene image may be adjusted to target display parameters, where the display parameters may be color, brightness, color temperature, and the like. For example, when the display parameter is color, it may be adjusted according to the display requirements to reach the target display parameter: when the game is a combat type, the color may be shifted toward a dark palette to reinforce the game atmosphere and sense of immersion; when the game is an educational game, the color may be shifted toward vivid tones to increase the game's appeal.
In an exemplary embodiment of the present disclosure, after the display parameters are adjusted to the target display parameters, if a full-screen image is to be displayed, the image size of the adjusted scene image may be scaled according to the type of terminal device, to ensure that the scene image matches the display screens of multiple terminal devices. For example, the Render Texture of the scene image may be set to a standard handset size of 1920×1080, or to 2340×1440, to ensure that the texture of the scene image fits both handset and iPad screens.
In an exemplary embodiment of the present disclosure, the anchor point of the scene image (the reference point used to position the image in the interface) may also be set to be centered, so as to avoid exposed edges and optimize the user's visual experience.
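To make steps S401 and S402 and the scaling step concrete, the offline sketch below approximates the pipeline with the Pillow imaging library: a depth-of-field-style Gaussian blur, a brightness and color adjustment toward target display parameters, and a resize plus centered crop to the target screen size. Pillow merely stands in for the engine's render-texture post-processing here, and every parameter value is an assumption.

    # Offline approximation of steps S401-S402 plus scaling, using Pillow;
    # in-engine this would be post-processing on the virtual camera's
    # render texture.
    from PIL import Image, ImageEnhance, ImageFilter

    def prepare_scene_image(path, target_size=(1920, 1080),
                            blur_radius=4, brightness=0.85, color=1.1):
        img = Image.open(path).convert("RGB")
        # S401: blur to mimic a depth-of-field falloff.
        img = img.filter(ImageFilter.GaussianBlur(radius=blur_radius))
        # S402: adjust display parameters, e.g. darker and slightly more
        # saturated for a combat-type game.
        img = ImageEnhance.Brightness(img).enhance(brightness)
        img = ImageEnhance.Color(img).enhance(color)
        # Scale to the terminal's screen size with a centered anchor, so
        # no edge of the screen is left exposed.
        scale = max(target_size[0] / img.width, target_size[1] / img.height)
        img = img.resize((round(img.width * scale),
                          round(img.height * scale)))
        left = (img.width - target_size[0]) // 2
        top = (img.height - target_size[1]) // 2
        return img.crop((left, top,
                         left + target_size[0], top + target_size[1]))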
In step S403, the scene image adjusted to the target display parameters and the virtual character in the game are rendered onto the graphical user interface to obtain the presentation interface.
In exemplary embodiments of the present disclosure, after adjusting the display parameters of the scene image to the target display parameters, the scene image and the virtual character in the game may be rendered onto the graphical user interface, resulting in a presentation interface.
After the presentation interface is obtained, reference may be made to fig. 5, which shows a schematic diagram of a graphical user interface in an exemplary embodiment of the disclosure, specifically a schematic diagram of displaying the main game scene and the presentation interface on the graphical user interface at the same time. As can be seen from fig. 5, the present disclosure can open multiple game interfaces simultaneously, realizing rapid switching of the in-battle interface.
In step S130, in response to the viewing angle adjustment operation applied to the display interface, the display viewing angles of the auxiliary game scene and the virtual character in the display interface are adjusted.
In exemplary embodiments of the present disclosure, in response to a viewing angle adjustment operation acting on a presentation interface, a display viewing angle of a secondary game scene and a virtual character in the presentation interface may be adjusted.
Specifically, the above viewing angle adjustment operation may be a second touch operation acting on the presentation interface. Referring to fig. 6, which shows a sub-flowchart of the display control method in a game in an exemplary embodiment of the present disclosure, specifically a flowchart of adjusting the display viewing angles of the auxiliary game scene and the virtual character in the presentation interface in response to the viewing angle adjustment operation acting on the presentation interface, including steps S601 to S603; step S130 is explained below in connection with fig. 6.
In step S601, operation parameters corresponding to the second touch operation on the display interface are obtained.
In an exemplary embodiment of the present disclosure, the operation parameters corresponding to the second touch operation acting on the presentation interface may be obtained. For example, referring to fig. 7, which illustrates a schematic diagram of the display control method in a game in an exemplary embodiment of the present disclosure, specifically the operation parameters corresponding to the second touch operation: 701 denotes the presentation interface, point O is the world center point, XOYZ is a spatial coordinate system, and OA is the rotation radius r of the virtual camera.
When the second touch operation is a sliding operation, the operation parameters may include the touch point movement distance (comprising the movement distance x in the horizontal direction and the movement distance y in the vertical direction) and the sliding direction of the sliding operation. For example, when the second touch operation is a sliding operation from point A to point B, the touch point movement distance AB and the sliding direction from point A to point B are the operation parameters described above; further, the movement distance AB may be decomposed to obtain the horizontal movement distance x and the vertical movement distance y.
When the second touch operation is a click operation or a press operation, the operation parameters include the relative distance (comprising the relative distance x in the horizontal direction and the relative distance y in the vertical direction) and the relative positional relationship between the touch point position of the second touch operation and a preset reference position, where the preset reference position may be the position of the virtual character in the game. For example, when the preset reference position is the virtual character's position (e.g., point A in fig. 7) and the player's touch point is point B, the relative distance between the touch point position and the preset reference position is AB; further, the relative distance AB may be decomposed to obtain the horizontal relative distance x and the vertical relative distance y. The relative positional relationship in this example is that the touch point lies to the right of the preset reference position.
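Both parameter extractions above (the swipe's movement distance and the tap's offset from the reference position) reduce to the same vector decomposition. A minimal sketch, assuming screen coordinates with x to the right and y upward:

    import math

    def operation_parameters(ax, ay, bx, by):
        # Decompose AB (swipe from A to B, or offset of touch point B from
        # reference position A) into horizontal and vertical components.
        x = bx - ax
        y = by - ay
        return x, y, math.hypot(x, y)  # components and |AB|

    x, y, dist = operation_parameters(100, 200, 160, 240)
    print(x, y, round(dist, 1))  # 60 40 72.1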
In step S602, a viewing angle adjustment parameter is determined according to the operation parameter.
In exemplary embodiments of the present disclosure, after the operation parameters are obtained, the viewing angle adjustment parameters may be determined according to the operation parameters. Specifically, the rotation parameter of the virtual camera set in the auxiliary game scene may be determined according to the operation parameter, and further, the rotation parameter of the virtual camera is determined as the viewing angle adjustment parameter.
For example, referring to the explanation of step S601 above, when the second touch operation is a sliding operation, the operation parameters may include the touch point movement distance (the horizontal distance x and the vertical distance y) and the sliding direction. Further, reference may be made to fig. 8, which shows a sub-flowchart of the display control method in a game in an exemplary embodiment of the present disclosure, specifically a sub-flowchart of determining the viewing angle adjustment parameters according to these operation parameters (touch point movement distance and sliding direction), including steps S801 to S802; the specific implementation is explained below in connection with fig. 8.
In step S801, a rotation angle of a virtual camera provided in the auxiliary game scene is determined according to the touch point movement distance.
For example, reference may be made to fig. 9, which shows a sub-flowchart of the display control method in a game in an exemplary embodiment of the present disclosure, specifically a sub-flowchart of determining the rotation angle of the virtual camera set in the auxiliary game scene according to the touch point movement distance, including steps S901 to S903; step S801 is explained below in connection with fig. 9.
In step S901, a first rotation angle corresponding to the virtual camera is determined according to the movement distance of the touch point in the horizontal direction and the rotation radius of the virtual camera.
In an exemplary embodiment of the present disclosure, the first rotation angle α corresponding to the virtual camera (i.e., the rotation angle of the virtual camera in the horizontal direction) may be determined according to the movement distance x of the touch point of the second touch operation in the horizontal direction and the rotation radius r of the virtual camera. Specifically, the following formula 1 may be used to determine the first rotation angle α corresponding to the virtual camera:
α = arctan(x / r)    (Formula 1)
in step S902, a second rotation angle corresponding to the virtual camera is determined according to the movement distance of the touch point in the horizontal direction of the second touch operation, the movement distance of the touch point in the vertical direction of the second touch operation, and the rotation radius of the virtual camera.
In an exemplary embodiment of the disclosure, the second rotation angle β corresponding to the virtual camera (i.e., the rotation angle of the virtual camera in the vertical direction) may be determined according to the above movement distance x of the touch point in the horizontal direction, the movement distance y of the touch point in the vertical direction, and the rotation radius r of the virtual camera; specifically, the following formula 2 may be used to determine the second rotation angle β.
β = arctan(y / √(x² + r²))    (Formula 2)
In step S903, the first rotation angle and the second rotation angle are determined as rotation angles corresponding to virtual cameras provided in the auxiliary game scene.
In an exemplary embodiment of the present disclosure, the above-described first rotation angle α and second rotation angle β may be determined as rotation angles corresponding to virtual cameras in the auxiliary game scene.
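Because the patent's original formula images are not reproduced in this text, the function below implements the arctangent reconstruction of Formulas 1 and 2 given above. The exact trigonometric form is an assumption consistent with the stated dependencies (α from x and r; β from x, y, and r), not a verbatim copy of the patent's formulas.

    import math

    def rotation_angles(x, y, r):
        # x, y: touch-point movement distances (horizontal, vertical)
        # r: rotation radius of the virtual camera (OA in fig. 7)
        alpha = math.degrees(math.atan2(x, r))                # Formula 1
        beta = math.degrees(math.atan2(y, math.hypot(x, r)))  # Formula 2
        return alpha, beta

    print(rotation_angles(60, 40, 300))  # approx. (11.31, 7.45)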
With continued reference to fig. 8, in step S802, the rotational direction of the virtual camera is determined according to the sliding direction.
For example, when the sliding direction of the sliding operation is a left slide, the rotation direction of the virtual camera may be determined to be clockwise; when the sliding direction is a right slide, the rotation direction may be determined to be counterclockwise. Thus, when the sliding direction is a right slide from point A to point B, the rotation direction of the virtual camera may be determined to be counterclockwise. It should be noted that the correspondence between sliding direction and rotation direction can be configured according to the actual situation, and all such configurations fall within the protection scope of the present disclosure.
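Step S802 is then a plain lookup; the sketch below encodes only the example convention from the text (left slide to clockwise, right slide to counterclockwise), and as noted this correspondence is configurable.

    # Example convention only; the slide-to-rotation mapping is configurable.
    SLIDE_TO_ROTATION = {"left": "clockwise", "right": "counterclockwise"}

    def rotation_direction_from_slide(direction):
        return SLIDE_TO_ROTATION[direction]

    print(rotation_direction_from_slide("right"))  # counterclockwise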
For example, referring to the explanation of step S601 above, when the second touch operation is a click operation or a press operation, the operation parameters include the relative distance (the horizontal relative distance x and the vertical relative distance y) and the relative positional relationship between the touch point position of the second touch operation and the preset reference position. Further, reference may be made to fig. 10, which shows a sub-flowchart of the display control method in a game in an exemplary embodiment of the present disclosure, specifically a sub-flowchart of determining the viewing angle adjustment parameters according to these operation parameters (relative distance and relative positional relationship), including steps S1001 to S1002; the specific implementation is explained below in connection with fig. 10.
In step S1001, the rotation angle of the virtual camera provided in the auxiliary game scene is determined according to the relative distance.
In an exemplary embodiment of the present disclosure, after the relative distance x in the horizontal direction and the relative distance y in the vertical direction are obtained, and with reference to the explanation of steps S901 to S903 above, the first rotation angle α corresponding to the virtual camera may be determined according to the horizontal relative distance x and the rotation radius r of the virtual camera; the second rotation angle β may be determined according to the horizontal relative distance x, the vertical relative distance y, and the rotation radius r; and the first rotation angle α and the second rotation angle β are determined as the rotation angles corresponding to the virtual camera set in the auxiliary game scene.
In step S1002, the rotational direction of the virtual camera is determined from the relative positional relationship.
In an exemplary embodiment of the present disclosure, when the relative positional relationship is that the touch point lies to the left of the preset reference position, the rotation direction of the virtual camera may be determined to be clockwise; when the touch point lies to the right of the preset reference position, the rotation direction may be determined to be counterclockwise. Following the example above, since the touch point lies to the right of the virtual character, the rotation direction of the virtual camera is determined to be counterclockwise.
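A matching sketch for the click or press case (steps S1001 to S1002), again assuming the example convention above: a touch point left of the reference position gives clockwise rotation, right of it gives counterclockwise.

    def rotation_direction_from_tap(touch_x, reference_x):
        # Which side of the preset reference position the touch lands on.
        return "clockwise" if touch_x < reference_x else "counterclockwise"

    print(rotation_direction_from_tap(500, 360))  # counterclockwise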
In an exemplary embodiment of the present disclosure, after the rotation parameters of the virtual camera are obtained, the rotation parameters (including the rotation angle and the rotation direction) of the virtual camera may be determined as the angle of view adjustment parameters.
With continued reference to fig. 6, in step S603, the display viewing angles of the auxiliary game scene and the virtual character in the display interface are adjusted according to the viewing angle adjustment parameters.
In an exemplary embodiment of the present disclosure, after the above-mentioned viewing angle adjustment parameter is obtained, the display viewing angles of the auxiliary game scene and the virtual character in the display interface may be adjusted according to the viewing angle adjustment parameter, and by way of example, the virtual camera may be controlled to rotate counterclockwise by an angle α in the horizontal direction and rotate counterclockwise by an angle β in the vertical direction, so as to achieve adjustment of the display viewing angles of the auxiliary game scene and the virtual character.
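Applying the viewing angle adjustment parameters then amounts to orbiting the virtual camera about the world center point O on radius r. The spherical-coordinate placement below is a standard technique offered as an illustrative sketch, not as the patent's implementation; the camera is assumed to keep looking at O.

    import math

    def orbit_camera(r, yaw_deg, pitch_deg):
        # Position on a sphere of radius r around world center O, given the
        # accumulated horizontal angle (alpha) and vertical angle (beta).
        yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
        x = r * math.cos(pitch) * math.sin(yaw)
        y = r * math.sin(pitch)
        z = r * math.cos(pitch) * math.cos(yaw)
        return x, y, z

    # Counterclockwise by alpha = 15 degrees, upward by beta = 5 degrees:
    print(tuple(round(c, 1) for c in orbit_camera(300.0, 15.0, 5.0)))
    # (77.4, 26.1, 288.7)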
In an exemplary embodiment of the present disclosure, fig. 11A and 11B show example display effects of the in-game display control method of the present disclosure in the Messiah engine, where 1101 is the auxiliary game scene and 1102 is the virtual character in the auxiliary game scene. Specifically, fig. 11A is a schematic diagram of the display effect of the presentation interface, and fig. 11B is a schematic diagram of the display effect after the display viewing angles of the auxiliary game scene and the virtual character in the presentation interface have been adjusted in response to a viewing angle adjustment operation acting on the presentation interface. As can be seen from fig. 11A and 11B, when the display viewing angle of the virtual character changes, the display viewing angle of the auxiliary game scene changes synchronously; that is, the present disclosure realizes synchronous rotation of the auxiliary game scene and the virtual character, improves the sense of immersion of the game, and solves the prior-art problem of separated rotation of scene and character.
Based on the above technical solution, on the one hand, the 3D auxiliary scene increases spatial immersion, and by adjusting camera parameters such as depth-of-field blur and color correction, the dislocation between 3D characters and 2D backgrounds can be effectively avoided. Furthermore, the virtual character is no longer limited to a single light source, so the richness and fineness of lighting and shadow can be ensured to the greatest extent. On the other hand, the game scene can use rich 3D special effects and dynamic materials to realize synchronous transformation of the virtual character and the game scene, letting players appreciate beautiful environment scenery while examining the details of the virtual character, which greatly improves the display quality of the game.
The present disclosure also provides a display control apparatus in a game, and fig. 12 is a schematic diagram showing the structure of the display control apparatus in a game in an exemplary embodiment of the present disclosure; as shown in fig. 12, the in-game display control apparatus 1200 may include an acquisition module 1201, an interface generation module 1202, and a viewing angle adjustment module 1203. Wherein:
an acquisition module 1201 is configured to acquire a pre-created auxiliary game scene in response to a presentation trigger operation acting on the graphical user interface.
In an exemplary embodiment of the present disclosure, the graphical user interface includes at least one functionality control, and the acquisition module is configured to acquire the pre-created auxiliary game scene in response to a first touch operation acting on the functionality control.
In an exemplary embodiment of the present disclosure, a virtual camera is provided in the auxiliary game scene, and generating a display interface according to the auxiliary game scene and the virtual character in the game includes: acquiring a scene image of the auxiliary game scene through the virtual camera; and generating the display interface according to the scene image and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the auxiliary game scenario includes at least one of: the virtual role is located in the game scene area of the main game scene of the game; a specific game scene area in a main game scene of the game; background scene associated with in-game main game scene.
An interface generation module 1202 is configured to generate a presentation interface according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the interface generation module is configured to generate a presentation interface corresponding to the functionality control according to the auxiliary game scene and the virtual character in the game.
In an exemplary embodiment of the present disclosure, the interface generation module is configured to blur the scene image; adjust the display parameters of the blurred scene image to target display parameters; and render the adjusted scene image and the virtual character in the game onto the graphical user interface to obtain the display interface.
In an exemplary embodiment of the present disclosure, the interface generating module is configured to scale and adjust an image size of the scene image after being adjusted to the target display parameter according to a screen size of the terminal device.
The view angle adjusting module 1203 is configured to adjust a display view angle of the auxiliary game scene and the virtual character in the display interface in response to a view angle adjusting operation applied to the display interface.
In an exemplary embodiment of the present disclosure, the view angle adjustment operation includes a second touch operation acting on the display interface; the view angle adjusting module is configured to acquire operation parameters corresponding to the second touch operation acting on the display interface, determine view angle adjustment parameters according to the operation parameters, and adjust the display view angles of the auxiliary game scene and the virtual character in the display interface according to the view angle adjustment parameters.
In an exemplary embodiment of the present disclosure, the view angle adjustment module is configured to determine a rotation parameter of a virtual camera provided in the auxiliary game scene according to the operation parameter; and determining the rotation parameter as the visual angle adjustment parameter.
In an exemplary embodiment of the present disclosure, the second touch operation is a sliding operation, and the operation parameters include a touch point movement distance and a sliding direction of the sliding operation; the visual angle adjusting module is used for determining the rotation angle of the virtual camera arranged in the auxiliary game scene according to the movement distance of the touch point; the rotation direction of the virtual camera is determined according to the sliding direction.
In an exemplary embodiment of the present disclosure, the second touch operation is a click operation or a press operation, and the operation parameters include the relative distance and the relative positional relationship between the touch point position of the second touch operation and a preset reference position; the view angle adjusting module is configured to determine the rotation angle of the virtual camera set in the auxiliary game scene according to the relative distance, and determine the rotation direction of the virtual camera according to the relative positional relationship.
In an exemplary embodiment of the present disclosure, the touch point movement distance includes a touch point movement distance in a horizontal direction and a touch point movement distance in a vertical direction; the visual angle adjusting module is used for determining a first rotation angle corresponding to the virtual camera according to the movement distance of the touch point in the horizontal direction and the rotation radius of the virtual camera; determining a second rotation angle corresponding to the virtual camera according to the touch point moving distance in the horizontal direction, the touch point moving distance in the vertical direction and the rotation radius of the virtual camera; the first rotation angle and the second rotation angle are determined as rotation angles corresponding to virtual cameras provided in the auxiliary game scene.
The specific details of each module in the above-mentioned display control device in the game have been described in detail in the corresponding display control method in the game, so that they will not be described here again.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
Furthermore, although the steps of the methods in the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all illustrated steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions causing a computing device (which may be a personal computer, a server, a mobile terminal, a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer storage medium capable of implementing the above method is also provided, on which a program product capable of implementing the method described above in this specification is stored. In some possible embodiments, the various aspects of the present disclosure may also be implemented in the form of a program product comprising program code; when the program product runs on a terminal device, the program code causes the terminal device to carry out the steps according to the various exemplary embodiments of the disclosure described in the "exemplary methods" section of this specification.
Referring to fig. 13, a program product 1300 for implementing the above method according to an embodiment of the present disclosure is described. The program product may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present disclosure is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, via the Internet using an Internet service provider).
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "system."
An electronic device 1400 according to such an embodiment of the present disclosure is described below with reference to fig. 14. The electronic device 1400 shown in fig. 14 is merely an example and should not be construed as limiting the functionality and scope of use of the disclosed embodiments.
As shown in fig. 14, the electronic device 1400 is embodied in the form of a general-purpose computing device. Components of the electronic device 1400 may include, but are not limited to: at least one processing unit 1410, at least one memory unit 1420, a bus 1430 connecting the different system components (including the memory unit 1420 and the processing unit 1410), and a display unit 1440.
The memory unit stores program code that is executable by the processing unit 1410, such that the processing unit 1410 performs the steps according to the various exemplary embodiments of the present disclosure described in the "exemplary methods" section of this specification. For example, the processing unit 1410 may perform the steps shown in fig. 1: step S110, in response to a display trigger operation acting on the graphical user interface, acquiring a pre-created auxiliary game scene; step S120, generating a display interface according to the auxiliary game scene and the virtual character in the game; and step S130, in response to a viewing angle adjustment operation acting on the display interface, adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface.
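Purely to make the three-step flow concrete, the minimal Python sketch below wires the steps to touch callbacks; the engine facade and every identifier in it are hypothetical, invented for illustration rather than taken from the disclosure.

class ShowcaseController:
    """Illustrative controller for steps S110-S130 (all names assumed)."""

    def __init__(self, engine, player_character):
        self.engine = engine            # hypothetical game-engine facade
        self.player = player_character  # the in-game virtual character
        self.view = None

    def on_display_trigger(self):
        # Step S110: acquire the pre-created auxiliary game scene.
        aux_scene = self.engine.load_scene("auxiliary_showcase")
        # Step S120: generate the display interface from the auxiliary
        # scene and the virtual character.
        self.view = self.engine.compose_view(aux_scene, self.player)

    def on_view_adjust(self, dx, dy):
        # Step S130: adjust the display viewing angles of the scene and
        # the character in response to the touch operation.
        if self.view is not None:
            self.view.orbit_camera(dx, dy)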
The memory unit 1420 may include readable media in the form of volatile memory units, such as Random Access Memory (RAM) 14201 and/or cache memory 14202, and may further include Read Only Memory (ROM) 14203.
The memory unit 1420 may also include a program/utility 14204 having a set (at least one) of program modules 14205, such program modules 14205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 1430 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 1400 may also communicate with one or more external devices 1500 (e.g., a keyboard, a pointing device, a Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 1400, and/or with any device (e.g., a router, a modem, etc.) that enables the electronic device 1400 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 1450. Also, the electronic device 1400 may communicate with one or more networks, such as a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet, through a network adapter 1460. As shown, the network adapter 1460 communicates with the other modules of the electronic device 1400 via the bus 1430. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 1400, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions causing a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (14)

1. A display control method in a game, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprises at least one function control, and at least part of a main game scene is displayed in the graphical user interface, the method comprising:
responding to a first touch operation acting on the function control on the graphical user interface, and acquiring a pre-created auxiliary game scene, wherein the auxiliary game scene includes a background scene associated with the main game scene in the game;
generating a display interface according to the auxiliary game scene and the virtual character in the game;
and responding to a viewing angle adjustment operation acting on the display interface, and adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface.
2. The method of claim 1, wherein the generating a display interface according to the auxiliary game scene and the virtual character in the game comprises:
and generating the display interface corresponding to the function control according to the auxiliary game scene and the virtual character in the game.
3. The method of claim 1, wherein a virtual camera is provided in the auxiliary game scene, and the generating a display interface according to the auxiliary game scene and the virtual character in the game comprises:
acquiring a scene image in the auxiliary game scene through the virtual camera;
and generating the display interface according to the scene image and the virtual character in the game.
4. The method of claim 1, wherein the auxiliary game scene comprises at least one of:
a game scene area where the virtual character is located in the main game scene of the game;
a specific game scene area in the main game scene of the game.
5. The method of claim 1, wherein the viewing angle adjustment operation comprises a second touch operation acting on the display interface;
the responding to the viewing angle adjustment operation acting on the display interface, and adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface comprises:
acquiring operation parameters corresponding to the second touch operation acting on the display interface;
determining viewing angle adjustment parameters according to the operation parameters;
and adjusting the display viewing angles of the auxiliary game scene and the virtual character in the display interface according to the viewing angle adjustment parameters.
6. The method of claim 5, wherein the determining viewing angle adjustment parameters according to the operation parameters comprises:
determining a rotation parameter of a virtual camera provided in the auxiliary game scene according to the operation parameters;
and determining the rotation parameter as the viewing angle adjustment parameter.
7. The method of claim 6, wherein the second touch operation is a sliding operation, and the operation parameters include a touch point movement distance and a sliding direction of the sliding operation;
the determining viewing angle adjustment parameters according to the operation parameters comprises:
determining a rotation angle of the virtual camera provided in the auxiliary game scene according to the touch point movement distance;
and determining the rotation direction of the virtual camera according to the sliding direction.
8. The method according to claim 6, wherein the second touch operation is a click operation or a heavy press operation, and the operation parameters include a relative distance and a relative positional relationship between a touch point position of the second touch operation and a preset reference position;
the determining viewing angle adjustment parameters according to the operation parameters comprises:
determining a rotation angle of the virtual camera provided in the auxiliary game scene according to the relative distance;
and determining the rotation direction of the virtual camera according to the relative positional relationship.
9. The method of claim 7, wherein the touch point movement distance comprises a touch point movement distance in a horizontal direction and a touch point movement distance in a vertical direction;
the determining a rotation angle of the virtual camera provided in the auxiliary game scene according to the touch point movement distance comprises:
determining a first rotation angle corresponding to the virtual camera according to the touch point movement distance in the horizontal direction and the rotation radius of the virtual camera;
determining a second rotation angle corresponding to the virtual camera according to the touch point movement distance in the horizontal direction, the touch point movement distance in the vertical direction, and the rotation radius of the virtual camera;
and determining the first rotation angle and the second rotation angle as the rotation angles corresponding to the virtual camera provided in the auxiliary game scene.
10. The method according to claim 3, wherein the generating the display interface according to the scene image and the virtual character in the game comprises:
blurring the scene image;
adjusting display parameters of the blurred scene image to target display parameters;
and rendering the scene image adjusted to the target display parameters and the virtual character in the game to the graphical user interface to obtain the display interface.
11. The method according to claim 10, wherein the method further comprises:
and scaling the image size of the scene image adjusted to the target display parameters according to the screen size of the terminal device.
12. A display control apparatus in a game, characterized in that a graphical user interface is provided by a terminal device, the graphical user interface comprises at least one function control, and at least part of a main game scene is displayed in the graphical user interface, the apparatus comprising:
an acquisition module, configured to acquire a pre-created auxiliary game scene in response to a first touch operation acting on the function control on the graphical user interface, wherein the auxiliary game scene includes a background scene associated with the main game scene in the game;
an interface generation module, configured to generate a display interface according to the auxiliary game scene and the virtual character in the game;
and a viewing angle adjustment module, configured to adjust the display viewing angles of the auxiliary game scene and the virtual character in the display interface in response to a viewing angle adjustment operation acting on the display interface.
13. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the display control method in a game according to any one of claims 1 to 11.
14. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the display control method in a game according to any one of claims 1 to 11.
CN202010255567.9A 2020-04-02 2020-04-02 Display control method and device in game, storage medium and electronic equipment Active CN111467803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010255567.9A CN111467803B (en) 2020-04-02 2020-04-02 Display control method and device in game, storage medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN111467803A (en) 2020-07-31
CN111467803B (en) 2023-07-14

Family

ID=71749636

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010255567.9A Active CN111467803B (en) 2020-04-02 2020-04-02 Display control method and device in game, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111467803B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113304471B (en) * 2020-08-26 2023-01-10 北京完美赤金科技有限公司 Virtual object display method, device and equipment
CN112044064A (en) * 2020-09-02 2020-12-08 完美世界(北京)软件科技发展有限公司 Game skill display method, device, equipment and storage medium
CN113750529A (en) * 2021-09-13 2021-12-07 网易(杭州)网络有限公司 Direction indicating method and device in game, electronic equipment and readable storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105068706A (en) * 2015-07-31 2015-11-18 张维谦 Slide steering method and device of shooting game
CN107977141A (en) * 2017-11-24 2018-05-01 网易(杭州)网络有限公司 Interaction control method, device, electronic equipment and storage medium
CN108854068A (en) * 2018-06-27 2018-11-23 网易(杭州)网络有限公司 Display control method and device, storage medium and terminal in game
CN110180168A (en) * 2019-05-31 2019-08-30 网易(杭州)网络有限公司 A kind of display methods and device, storage medium and processor of game picture

Also Published As

Publication number Publication date
CN111467803A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
US10863168B2 (en) 3D user interface—360-degree visualization of 2D webpage content
WO2018188499A1 (en) Image processing method and device, video processing method and device, virtual reality device and storage medium
CN111467803B (en) Display control method and device in game, storage medium and electronic equipment
US9886102B2 (en) Three dimensional display system and use
WO2021258994A1 (en) Method and apparatus for displaying virtual scene, and device and storage medium
US11003305B2 (en) 3D user interface
CN107977141B (en) Interaction control method and device, electronic equipment and storage medium
US20090251460A1 (en) Systems and methods for incorporating reflection of a user and surrounding environment into a graphical user interface
CN104915979A (en) System capable of realizing immersive virtual reality across mobile platforms
JP2024505995A (en) Special effects exhibition methods, devices, equipment and media
US9530243B1 (en) Generating virtual shadows for displayable elements
CN114327700A (en) Virtual reality equipment and screenshot picture playing method
US10623713B2 (en) 3D user interface—non-native stereoscopic image conversion
CN113066189A (en) Augmented reality equipment and virtual and real object shielding display method
CN112565883A (en) Video rendering processing system and computer equipment for virtual reality scene
CN108510433B (en) Space display method and device and terminal
KR20200103278A (en) System and method for providing virtual reality contents indicated view direction
CN112799507B (en) Human body virtual model display method and device, electronic equipment and storage medium
WO2020248682A1 (en) Display device and virtual scene generation method
CN114286077B (en) Virtual reality device and VR scene image display method
WO2023169089A1 (en) Video playing method and apparatus, electronic device, medium, and program product
CN116450017B (en) Display method and device for display object, electronic equipment and medium
Xu et al. Design and implementation of interactive game based on augmented reality
CN114286077A (en) Virtual reality equipment and VR scene image display method
CN114283055A (en) Virtual reality equipment and picture display method

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant