CN112245914A - Visual angle adjusting method and device, storage medium and computer equipment - Google Patents


Info

Publication number
CN112245914A
Authority
CN
China
Prior art keywords
visual angle, adjusting, angle, viewing angle, area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011252862.5A
Other languages
Chinese (zh)
Other versions
CN112245914B (en)
Inventor
姚舟 (Yao Zhou)
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202011252862.5A
Publication of CN112245914A
Application granted
Publication of CN112245914B
Active legal status
Anticipated expiration of legal status

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/525: Changing parameters of virtual cameras
    • A63F 13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426: Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: Features of games characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308: Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a viewing angle adjustment method and apparatus, a storage medium, and a computer device. The method includes: providing, on a graphical user interface, a first viewing angle control area and a second viewing angle control area, the second viewing angle control area including a viewing angle height control; adjusting the projection position of the viewing angle on a designated plane in response to a touch operation acting on the first viewing angle control area; adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control; and adjusting the orientation of the viewing angle in response to a second operation that starts from the viewing angle height control and acts on a designated area. In this way, performing different operations on one viewing angle height control realizes multiple different adjustment functions, reducing the complexity of viewing angle adjustment operations and improving adjustment efficiency.

Description

Visual angle adjusting method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of computers, and in particular, to a method and an apparatus for adjusting a viewing angle, a computer-readable storage medium, and a computer device.
Background
In recent years, with the development and popularization of computer device technology, more and more applications with three-dimensional virtual environments have emerged, such as virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, multiplayer online battle arena (MOBA) games, and the like.
In the prior art, a terminal displays the environment picture of a three-dimensional virtual environment from an observer (OB) viewing angle. For example, in an FPS game on a mobile terminal, the user controls a first viewing angle control area with the left hand to move the viewing angle horizontally, taps a second viewing angle control area with the right hand to adjust the viewing angle height, and rotates the viewing angle by sliding on the screen of the mobile terminal.
In the course of research into and practice with the prior art, the inventor of the present application found that viewing angle adjustment in the prior art is cumbersome to operate and insufficiently fluid, which results in low viewing angle adjustment efficiency.
Disclosure of Invention
The embodiment of the application provides a method and a device for adjusting a visual angle, which can improve the efficiency of visual angle adjustment.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
a method for adjusting a viewing angle, which provides a graphical user interface through a display component of a terminal, wherein the content displayed by the graphical user interface at least comprises part of a game scene, and the method comprises the following steps:
the graphical user interface provides a first visual angle control area and a second visual angle control area, and the second visual angle control area comprises a visual angle height control;
responding to touch operation acting on the first visual angle control area, and adjusting the projection position of the visual angle on a designated plane;
adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control;
and responding to a second operation which starts from the visual angle height control and acts on a specified area, and adjusting the orientation of the visual angle.
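The four claimed steps can be sketched as an event-style controller. This is a minimal illustrative sketch; the class, method, and attribute names (and the default values) are assumptions for illustration, not taken from the patent:

```python
class PerspectiveController:
    """Minimal sketch of the claimed control flow (illustrative names only)."""

    def __init__(self):
        self.position = [0.0, 0.0]     # projection of the viewing angle on the designated plane
        self.height = 10.0             # height perpendicular to the plane (assumed default)
        self.orientation = (0.0, 1.0)  # unit direction the viewing angle faces

    def on_first_area_touch(self, dx, dy):
        # Touch operation in the first viewing angle control area:
        # move the projection position on the designated plane.
        self.position[0] += dx
        self.position[1] += dy

    def on_height_control_press(self, dz):
        # First operation on the viewing angle height control:
        # change the height perpendicular to the plane.
        self.height += dz

    def on_slide_into_designated_area(self, direction):
        # Second operation starting from the height control and acting on the
        # designated area: change the orientation of the viewing angle.
        self.orientation = direction
```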
A viewing angle adjusting apparatus for providing a graphical user interface through a display component of a terminal, the content displayed by the graphical user interface including at least a part of a game scene, the apparatus comprising:
the display module is used for providing a first visual angle control area and a second visual angle control area for the graphical user interface, and the second visual angle control area comprises a visual angle height control;
the first adjusting module is used for responding to touch operation acting on the first visual angle control area and adjusting the projection position of the visual angle on a specified plane;
the second adjusting module is used for responding to the first operation acted on the visual angle height control and adjusting the height of the visual angle vertical to the specified plane;
and the third adjusting module is used for responding to a second operation which is started from the visual angle height control and acts on the specified area, and adjusting the orientation of the visual angle.
In some embodiments, the second operation is a slide operation from the height control to the designated area;
the second adjustment module includes:
the first obtaining submodule is used for obtaining the touch position of a second operation in the second response area and the position of a reference point in the graphical user interface;
the first determining submodule is used for determining the current sliding direction of a second operation based on the touch position and the reference point position;
and the first adjusting submodule is used for determining the target orientation of the visual angle according to the current sliding direction and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the first determining sub-module includes:
the determining unit is used for determining the current sliding direction and the current sliding distance of the second operation based on the touch position and the position of the reference point;
the first adjusting submodule includes:
and the adjusting unit is used for determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance and adjusting the orientation of the visual angle according to the target orientation.
In some embodiments, the adjusting unit is configured to:
determining the current sliding direction as a visual angle rotating direction;
acquiring a maximum angle of the visual angle in the visual angle rotating direction and a maximum distance of a second operation in the current sliding direction;
determining a view angle rotation angle based on the current sliding distance, the maximum distance and the maximum angle;
and determining the target orientation of the visual angle according to the visual angle rotating direction and the visual angle rotating angle.
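The mapping described by the adjusting unit can be sketched as follows; the function signature and the linear distance-to-angle scaling are assumptions for illustration, not taken from the patent:

```python
import math

def view_rotation(touch_pos, ref_pos, max_distance, max_angle):
    """Return (rotation direction, rotation angle) for the current slide.

    touch_pos and ref_pos are screen coordinates of the second operation and
    the reference point; max_distance and max_angle are the caps described
    above. The linear scaling is an illustrative assumption.
    """
    dx = touch_pos[0] - ref_pos[0]
    dy = touch_pos[1] - ref_pos[1]
    distance = math.hypot(dx, dy)               # current sliding distance
    if distance == 0.0:
        return None                             # no slide yet: keep current orientation
    direction = (dx / distance, dy / distance)  # viewing angle rotation direction
    ratio = min(distance / max_distance, 1.0)   # clamp at the maximum distance
    return direction, ratio * max_angle         # viewing angle rotation angle
```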
In some embodiments, the adjusting unit is further configured to:
if a sliding operation moving from the designated area to the viewing angle height control is detected, obtaining the stay duration of the second operation on the viewing angle height control;
comparing the stay duration with a preset duration to obtain a comparison result;
if the comparison result is that the stay duration is longer than the preset duration, adjusting the height of the viewing angle perpendicular to the designated plane;
if the comparison result is that the stay duration is shorter than the preset duration, determining whether the second operation is still detected;
and if the second operation is detected, returning to the step of obtaining the touch position of the second operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
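The dwell-time branching above can be summarized in a small decision helper; the parameter names and return labels are illustrative assumptions, not from the patent:

```python
def on_return_to_height_control(stay_duration, preset_duration, second_op_active):
    """Decide the next action when the slide moves from the designated area
    back onto the viewing angle height control (illustrative labels)."""
    if stay_duration > preset_duration:
        return "adjust_height"       # long dwell: switch back to height adjustment
    if second_op_active:
        return "track_orientation"   # short dwell, slide continues: keep adjusting orientation
    return "end_gesture"             # second operation no longer detected
```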
In some embodiments, the adjusting unit is further configured to:
and if the second operation can no longer be detected after the sliding operation has moved from the designated area to the viewing angle height control, hiding the designated area.
In some embodiments, the content displayed on the graphical user interface is an environment picture obtained when the target object observes the virtual environment in a preset viewing angle direction;
a first adjustment module comprising:
the first adjusting submodule is used for responding to touch operation acting on the first visual angle control area and adjusting the target object to move on the designated plane in the horizontal direction;
a second adjustment module comprising:
and the second adjusting sub-module is used for responding to the first operation acted on the second visual angle control area and adjusting the height of the target object vertical to the designated plane.
A computer-readable storage medium storing a plurality of instructions, the instructions being adapted to be loaded by a processor to perform the steps of the viewing angle adjustment method described above.
A computer device, characterized in that the computer device comprises a memory in which a computer program is stored and a processor, and the processor executes the steps in the perspective adjustment method as described above by calling the computer program stored in the memory.
In the embodiments of the present application, a first viewing angle control area and a second viewing angle control area are provided through the graphical user interface, the second viewing angle control area including a viewing angle height control; the projection position of the viewing angle on a designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts from the viewing angle height control and acts on a designated area. In this way, performing different operations on one viewing angle height control realizes multiple different adjustment functions, reducing the complexity of viewing angle adjustment operations and improving adjustment efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; a person of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1a is a system schematic diagram of a method for adjusting a viewing angle according to an embodiment of the present disclosure.
Fig. 1b is a schematic flow chart of a method for adjusting a viewing angle according to an embodiment of the present disclosure.
Fig. 1c is a schematic view of a first application scenario of a method for adjusting a viewing angle according to an embodiment of the present application.
Fig. 1d is a schematic diagram of a world coordinate system in a virtual environment according to an embodiment of the present disclosure.
Fig. 1e is a schematic view of a second application scenario of the method for adjusting a viewing angle according to the embodiment of the present application.
Fig. 1f is a schematic rotation diagram of a camera model rotation angle provided in this embodiment of the present application.
Fig. 1g is a schematic view of a third application scenario of the method for adjusting a viewing angle according to the embodiment of the present application.
Fig. 1h is a schematic view of a fourth application scenario of the method for adjusting a viewing angle according to the embodiment of the present application.
Fig. 2 is another schematic flow chart of a method for adjusting a viewing angle according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of a viewing angle adjusting apparatus according to an embodiment of the present disclosure.
fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a method and a device for adjusting a visual angle, a storage medium and computer equipment. Specifically, the method for adjusting a viewing angle according to the embodiment of the present application may be executed by a computer device, where the computer device may be a terminal or a server. The terminal may be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and the terminal device may further include a client, where the client may be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the method for adjusting the viewing angle is run on a terminal, the terminal device stores a game application program and presents a part of a game scene in a game through a display component (e.g., a touch display screen). The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the viewing angle adjustment method is executed on a server, it may be implemented as a cloud game. Cloud gaming refers to a gaming mode based on cloud computing. In this mode, the body that runs the game application is separated from the body that presents the game picture: storage and execution of the viewing angle adjustment method are completed on a cloud game server, while the game picture is presented by a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting game pictures; it may be a display device with data transmission capability near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the device that processes the game data is the cloud game server. During play, the user operates the cloud game client to send operation instructions to the cloud game server; the server runs the game according to the instructions, encodes and compresses the game pictures and other data, and returns them to the client over the network; finally, the client decodes the data and outputs the game picture.
Referring to fig. 1a, fig. 1a is a system schematic view of a viewing angle adjusting device according to an embodiment of the present disclosure. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The embodiment of the application provides a visual angle adjusting method, which can be executed by a terminal or a server. The embodiment of the present application is described by taking an example in which the viewing angle adjustment method is executed by a terminal. The terminal comprises a display component and a processor, wherein the display component is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When the user operates the graphical user interface through the display component, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operation instruction generated by the user acting on the graphical user interface comprises an instruction for starting a game application, and the processor is configured to start the game application after receiving the instruction provided by the user for starting the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role playing game, a strategy game, a sports game, a game for developing intelligence, a First Person Shooter (FPS) game, and the like. 
Wherein the game may include a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Additionally, one or more obstacles, such as railings, ravines, walls, etc., may also be included in the virtual scene of the game to limit movement of the virtual objects, e.g., to limit movement of one or more objects to a particular area within the virtual scene. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
It should be noted that the system schematic diagram of the viewing angle adjusting system shown in fig. 1a is merely an example, and the viewing angle adjusting system and the scenario described in the embodiment of the present application are for more clearly illustrating the technical solution of the embodiment of the present application, and do not form a limitation on the technical solution provided in the embodiment of the present application, and as can be known by those skilled in the art, along with the evolution of the viewing angle adjusting system and the appearance of a new service scenario, the technical solution provided in the embodiment of the present application is also applicable to similar technical problems.
The present embodiment is described from the perspective of a viewing angle adjusting apparatus, which may be integrated in a computer device that has a storage unit, is fitted with a microprocessor, and has computing capability.
Referring to fig. 1b, fig. 1b is a schematic flow chart illustrating a method for adjusting a viewing angle according to an embodiment of the present disclosure. The visual angle adjusting method comprises the following steps:
in step 101, the graphical user interface provides a first view angle control area and a second view angle control area, and the second view angle control area includes a view angle height control.
The graphical user interface displays a virtual environment in a game scene. The virtual environment is provided when an application program runs on the terminal; it may be a simulation of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. The target object is a camera model: in an FPS game, for example, the camera model is located at the head or neck of the virtual character under a first-person view, and behind the virtual character under a third-person view. The displayed part of the game scene is the picture generated by the camera model observing the virtual environment in a certain viewing angle direction.
The graphical user interface further includes a first viewing angle control area 11 and a second viewing angle control area 12, which implement different viewing angle adjustment functions according to the user's touch operations.
Specifically, the second viewing angle control area 12 includes a viewing angle height control 121, which is used to adjust the height of the viewing angle. Referring to fig. 1c and fig. 1d, fig. 1c is a first application scenario diagram of the viewing angle adjustment method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of the world coordinate system in the virtual environment. Taking a user playing a game from an observer (OB) viewing angle as an example, the virtual environment has a world coordinate system constructed from an X axis, a Y axis, and a Z axis, so the camera model located in the virtual environment also has corresponding coordinates (X1, Y1, Z1).
In step 102, in response to the touch operation applied to the first view angle control area, a projection position of the view angle on a designated plane is adjusted.
The projection position of the viewing angle on the designated plane, that is, the projection position of the camera model on the designated plane, is here the plane 20 constructed by the X axis and the Y axis in the virtual environment. That is, the first viewing angle control area 11 is used to control the front-back and left-right movement of the camera model in the virtual scene.
As noted above with reference to fig. 1c and fig. 1d, the virtual environment has a world coordinate system constructed from the X, Y, and Z axes, and the camera model has corresponding coordinates (X1, Y1, Z1). The designated plane here is the plane constructed by the X axis and the Y axis.
Specifically, a wheel control 111 is provided in the first viewing angle control area 11. The user moves the joystick of the wheel control 111 forward, backward, left, or right to move the viewing angle in the corresponding direction.
In some embodiments, the content displayed on the user graphical interface is an environment picture when the target object observes the virtual environment in a preset view direction;
the step of adjusting the projection position of the viewing angle on the designated plane in response to the touch operation applied to the first viewing angle control area includes:
and adjusting the target object to move on the designated plane in the horizontal direction in response to the touch operation acting on the first visual angle control area.
The target object is a camera model; adjusting the projection position of the viewing angle on the designated plane means controlling the target object to move in the horizontal direction (forward, backward, left, and right) on the designated plane.
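As a rough illustration, the horizontal movement described above amounts to translating the camera's (X, Y) coordinates by the joystick deflection while leaving its Z coordinate untouched. The function name, speed parameter, and tuple representation below are illustrative assumptions, not part of the embodiment:

```python
import math

def move_camera_xy(camera_pos, stick_dx, stick_dy, speed, dt):
    """Translate the camera on the X/Y ("designated") plane from a
    virtual-joystick deflection; the Z (height) component is unchanged."""
    x, y, z = camera_pos
    # Normalize the stick deflection so diagonal input is not faster.
    length = math.hypot(stick_dx, stick_dy)
    if length > 1e-6:
        stick_dx, stick_dy = stick_dx / length, stick_dy / length
    return (x + stick_dx * speed * dt, y + stick_dy * speed * dt, z)
```

The sketch also normalizes the deflection so that a diagonal push of the rocker does not move the camera faster than a straight one.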
In step 103, in response to a first operation on the viewing angle height control, the height of the viewing angle perpendicular to the designated plane is adjusted.
The first operation is a pressing operation on the viewing angle height control 121, and the viewing angle height control 121 adjusts the position of the camera model on the Z axis according to the pressing duration of the pressing operation. Since the Z axis is divided into a positive half axis and a negative half axis, the viewing angle height control 121 may further include a first sub viewing angle height control 1211 for adjustment along the positive half axis of the Z axis and a second sub viewing angle height control 1212 for adjustment along the negative half axis of the Z axis. The height perpendicular to the designated plane is the distance from the camera model to the plane 20 constructed by the X axis and the Y axis.
In some embodiments, the step of adjusting the height of the perspective perpendicular to the specified plane in response to the first operation on the perspective height control comprises:
and adjusting the height of the target object perpendicular to the specified plane in response to a first operation acting on the visual angle height control.
Adjusting the height of the viewing angle perpendicular to the designated plane means controlling the target object to move along the positive half axis direction or the negative half axis direction of the Z axis.
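The press-duration-based height adjustment can be sketched as moving the camera along the Z axis at a fixed rate while the control is held, with the positive or negative half axis chosen by which sub control is pressed. The rate and clamping bounds below are assumptions for illustration only:

```python
def adjust_camera_height(z, press_duration, rate, direction, z_min=0.0, z_max=100.0):
    """Move the camera along the Z axis in proportion to how long the
    height control is held; direction is +1 (positive half axis, raise)
    or -1 (negative half axis, lower). The result is clamped to a
    hypothetical valid height range."""
    z += direction * rate * press_duration
    return max(z_min, min(z_max, z))
```

In a real game loop this would be called once per frame with the frame time, so the camera rises smoothly for as long as the finger stays on the control.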
And 104, responding to a second operation which starts from the visual angle height control and acts on the specified area, and adjusting the orientation of the visual angle.
Referring to fig. 1e, fig. 1e is a schematic view of a second application scenario of the viewing angle adjusting method according to the embodiment of the present application, where a designated area 13 is used to receive a second operation of a user, and the viewing angle orientation is adjusted according to the second operation. Taking the first operation as a pressing operation as an example, when the user presses the viewing angle height control 121, the height adjusting function of the viewing angle height control 121 is triggered and the designated area 13 is displayed on the graphical user interface. In order to facilitate the user operating both the viewing angle height control 121 and the designated area 13 with a single finger, the designated area 13 may be set on the outer side of the viewing angle height control 121, that is, the inner periphery of the designated area 13 is adjacent to the outer periphery of the viewing angle height control 121. The designated area 13 may also be any other area of the graphical user interface except the first viewing angle control area 11 and the second viewing angle control area 12, and the outer periphery of the designated area may be a circle or a polygon, which is not limited herein.
in some embodiments, please refer to fig. 1f, where fig. 1f is a schematic rotation diagram of a rotation angle of a camera model according to an embodiment of the present disclosure. The view angle direction of the lens 30 in the camera model can be adjusted by rotating the U-axis and the R-axis. Based on second operations of different operation modes, the second operations can be divided into two types, the first type is that a user slides a finger pressing the visual angle height control element 121 to the designated area 13, namely the second operation is continuous with the first operation, so that the transition from the adjustment of the visual angle height to the adjustment of the visual angle rotation is smoother, and the operations are more coherent; the second is that the user lifts the finger that pressed the viewing angle height control 121 and presses the designated area 13 for a designated period of time. Therefore, the way of rotating the view angle of the camera model is different for different operation modes, and the following description is divided into two modes.
Referring to fig. 1g, for the case where the second operation is a sliding operation from the viewing angle height control to the designated area, fig. 1g is a third application scenario diagram of the viewing angle adjusting method according to the embodiment of the present application. In some embodiments, the step of adjusting the orientation of the viewing angle in response to the second operation that starts at the viewing angle height control and acts on the designated area includes:
(1) acquiring a touch position of a second operation in the designated area and a reference point position in the graphical user interface;
(2) determining a current sliding direction of a second operation based on the touch position and the reference point position;
(3) and determining the target orientation of the visual angle according to the current sliding direction, and adjusting the orientation of the visual angle according to the target orientation.
The rotation mode of the camera model can be divided into rotation around the U axis only, and rotation around both the U axis and the R axis. In the working mode of rotating around the U axis only, the touch position of the sliding operation on the terminal device can be obtained in real time, and the sliding direction of the user's sliding operation (namely the current sliding direction) is determined according to the touch position and the reference point position on the graphical user interface.
For example, if the touch coordinates are (X2, Y2) and the reference point is the coordinates (X3, Y3) of the center point of the viewing angle height control 121, then the tangent of the angle can be determined from |Y2-Y3|/|X2-X3|, the angle is determined from the tangent value, and the current sliding direction is determined from the angle. The orientation corresponding to the current sliding direction is determined as the target orientation of the viewing angle, and the camera model is controlled to rotate around the U axis until the lens orientation of the camera model coincides with the target orientation.
In some embodiments, a mapping relationship between the current sliding direction and the viewing angle orientation may be set, so that after the current sliding direction is determined, the corresponding viewing angle orientation is determined as the target orientation according to the mapping relationship.
For example, with the touch coordinates (X2, Y2) and the center point coordinates (X3, Y3) of the viewing angle height control 121, the angle is determined from the tangent value |Y2-Y3|/|X2-X3| and the current sliding direction is determined from the angle; a mapping relation between the viewing angle orientation and each sliding direction can be established in advance, and after the current sliding direction is determined, the target orientation of the viewing angle is determined according to the mapping relation.
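One robust way to compute the current sliding direction is `atan2`, which recovers the full 0 to 360 degree quadrant and avoids division by zero when the slide is purely vertical; the coordinate ratio in the example above is the tangent of the same angle. This is a hedged sketch, not the embodiment's exact formula:

```python
import math

def sliding_direction(touch, ref):
    """Angle (degrees, counterclockwise from the positive X direction)
    of the slide from the control's center point. atan2 handles all
    four quadrants and a zero horizontal offset."""
    dx, dy = touch[0] - ref[0], touch[1] - ref[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

The returned angle could then be looked up in the direction-to-orientation mapping described above.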
In some embodiments, as shown in fig. 1h, fig. 1h is a schematic view of a fourth application scenario of the method for adjusting a viewing angle provided in this embodiment of the present application. When the sliding operation from the viewing angle height control 121 to the designated area 13 is detected, the display position of the viewing angle height control 121 on the graphical user interface is adjusted according to the touch position of the second operation.
The position of the center point of the viewing angle height control 121 can be controlled to move along with the second operation, so that the viewing angle height control 121 follows the second operation and the user receives visual feedback of the sliding operation.
In some embodiments, the step of determining the current sliding direction of the second operation based on the touch position and the reference point position includes:
(1) determining a current sliding direction and a current sliding distance of the second operation based on the touch position and the reference point position;
and the step of determining the target orientation of the viewing angle according to the current sliding direction and adjusting the orientation of the viewing angle according to the target orientation includes:
(2) determining the target orientation of the viewing angle according to the current sliding direction and the current sliding distance, and adjusting the orientation of the viewing angle according to the target orientation.
In the working mode in which the camera model can rotate around both the U axis and the R axis, the touch position of the sliding operation on the terminal device can be obtained in real time, the sliding direction (current sliding direction) and the sliding distance (current sliding distance) of the user's sliding operation are determined according to the touch position and the reference point position on the graphical user interface, and the viewing angle direction of the camera model is adjusted in real time based on the current sliding direction and the current sliding distance. Specifically, the touch position is the touch coordinate at which the user touches the screen, and the reference point position may be the center point coordinate of the viewing angle height control 121.
Specifically, in order to enable the user to achieve an accurate adjustment effect according to the viewing requirement, after the current sliding direction is determined, the distance between the touch coordinate and the center point coordinate, namely the current sliding distance, can be calculated according to the Pythagorean theorem. The viewing angle rotation angle in the viewing angle rotation direction corresponding to the current sliding direction is then determined according to the current sliding distance, and the orientation reached by rotating the viewing angle by that angle in the viewing angle rotation direction is determined as the target orientation.
In some embodiments, the step of determining the target orientation of the view according to the current sliding direction and the current sliding distance includes:
(1.1) determining the current sliding direction as a viewing angle rotating direction;
(1.2) acquiring a maximum angle by which the viewing angle is rotatable in the viewing angle rotating direction, and a maximum distance by which the second operation is slidable in the current sliding direction;
(1.3) determining a view angle rotation angle based on the current sliding distance, the maximum distance, and the maximum angle;
and (1.4) determining the target orientation of the visual angle according to the visual angle rotating direction and the visual angle rotating angle.
After the current sliding direction is determined, the lens of the camera model can be controlled to rotate in the current sliding direction; the specific rotation mode may be rotation around the U axis, so that the lens of the camera model rotates.
The angle by which the lens of the camera model is raised in the viewing angle rotation direction can be determined by acquiring the maximum angle rotatable in the viewing angle rotation direction, for example 90°, and the maximum distance the second operation can slide in the current sliding direction, for example 6 cm, and then determining the target orientation based on the current sliding distance, the maximum distance, and the maximum angle.
When the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate according to the viewing angle rotation angle; the specific rotation mode may be rotation around the R axis, so that the lens of the camera model is raised or lowered.
In some embodiments, the step of determining the target orientation of the viewing angle according to the viewing angle rotation direction and the viewing angle rotation angle comprises:
(1.1) determining a distance ratio of the current sliding distance to the maximum distance;
and (1.2) determining a view angle rotation angle according to the distance ratio and the maximum angle, and determining the orientation of the target according to the current sliding direction and the view angle rotation angle.
If the maximum angle rotatable in the viewing angle rotation direction is 90°, the maximum distance the second operation can slide in the current sliding direction is 6 cm, and the current sliding distance is 2 cm, then the distance ratio of the current sliding distance to the maximum distance is 1/3, and the viewing angle rotation angle is 90° × 1/3 = 30°. The target orientation is therefore the orientation obtained by rotating the viewing angle by this rotation angle in the viewing angle rotation direction corresponding to the current sliding direction.
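The proportional mapping from slide distance to rotation angle (distance ratio times maximum angle) can be sketched as follows; the clamp at the maximum distance is an assumption about what happens when the finger slides past the edge of the designated area:

```python
def view_rotation_angle(slide_dist, max_dist, max_angle):
    """Scale the current sliding distance into a viewing angle
    rotation angle, capped at the maximum rotatable angle."""
    ratio = min(slide_dist / max_dist, 1.0)
    return max_angle * ratio
```

With the worked example from the text (2 cm of a 6 cm maximum, 90° maximum angle) this yields a 30° rotation.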
In some embodiments, the method further comprises:
(1.1) if the sliding operation of moving from the specified area to the visual angle height control is detected, acquiring the stay time of the sliding operation in the visual angle height control;
(1.2) comparing the stay time with a preset time to obtain a comparison result;
(1.3) if the comparison result shows that the staying time is longer than the preset time, adjusting the height of the visual angle vertical to the specified plane;
(1.4) if the comparison result shows that the stay time is less than the preset time, judging whether the second operation is detected;
(1.5) if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
A preset time duration may be set to avoid the following situation: when the lens raising angle has been adjusted too high, the user's sliding operation in the opposite direction passes through the viewing angle height control 121, and instead of the lens raising angle being adjusted, the camera model would be moved. By comparing the stay duration of the second operation within the viewing angle height control 121 with the preset duration, how to adjust the viewing angle of the target object is further determined.
Specifically, if the stay duration is longer than the preset duration, it can be determined that the user currently wants to adjust the vertical height of the camera model above the designated plane. If the stay duration is shorter than the preset duration, the intended adjustment mode cannot yet be determined, so it is judged whether a second operation of moving from the viewing angle height control 121 to the designated area 13 is detected; if the second operation is detected, the process returns to the step of acquiring the touch position of the sliding operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
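The dwell-time decision in steps (1.1) through (1.5) above amounts to a small classifier over the gesture state. The string labels, threshold value, and parameter names below are illustrative assumptions:

```python
def classify_return_to_control(dwell_s, threshold_s, second_op_active):
    """Decide what a slide back onto the height control means:
    a dwell longer than the threshold re-enters height adjustment;
    otherwise the orientation gesture keeps being tracked while the
    finger is still on the screen, and the gesture ends when it lifts."""
    if dwell_s > threshold_s:
        return "adjust_height"
    return "track_orientation" if second_op_active else "idle"
```

A touch handler would call this each frame while the finger rests on the viewing angle height control.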
In some embodiments, the method further comprises:
if the second operation cannot be detected after the sliding operation is moved from the designated area 13 to the viewing angle height control 121, the designated area 13 is not displayed.
If the second operation cannot be detected, that is, it is determined that the user has lifted the finger, and the screen of the terminal device is no longer touched, the state of each control in fig. 1c is restored, that is, the designated area 13 is not displayed.
In some embodiments, the step of adjusting the viewing orientation of the target object in response to a second operation acting on the designated area includes:
(1.1) acquiring the pressing position and pressing duration of the pressing operation in the designated area, and the reference point position in the graphical user interface;
(1.2) determining a viewing angle rotation direction based on the pressing position and the reference point position;
(1.3) determining a visual angle rotation angle corresponding to the pressing duration according to a preset mapping relation;
and (1.4) determining the target orientation of the visual angle according to the visual angle rotating direction and the visual angle rotating angle.
The difference between the pressing operation and the sliding operation in determining the target viewing angle rotation is that, for the pressing operation, the viewing angle rotation angle is determined by the pressing duration. By determining the mapping relation between the pressing duration and the viewing angle rotation angle in advance, the corresponding viewing angle rotation angle can be determined once the user's pressing duration is obtained. For example: a pressing duration of 1 s rotates the viewing angle by 30°, and a pressing duration of 2 s rotates it by 60°.
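The press-duration mapping (1 s to 30°, 2 s to 60°) can be sketched as a linear mapping capped at the maximum rotatable angle; the 30°-per-second rate follows the example above, while the 90° cap is an assumption carried over from the sliding case:

```python
def rotation_from_press(press_s, degrees_per_second=30.0, max_angle=90.0):
    """Map press duration to a viewing angle rotation angle:
    1 s -> 30 deg, 2 s -> 60 deg, capped at max_angle."""
    return min(press_s * degrees_per_second, max_angle)
```

A lookup table keyed by discrete durations would serve equally well if the mapping relation is not linear.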
As can be seen from the above, in the embodiment of the present application, a first viewing angle control area and a second viewing angle control area are provided through a graphical user interface, where the second viewing angle control area includes a viewing angle height control; the projection position of the viewing angle on a designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts from the viewing angle height control and acts on the designated area. In this way, different operations on the viewing angle height control implement multiple different viewing angle adjusting functions (adjusting the viewing angle height and the viewing angle orientation), which reduces the complexity of viewing angle adjusting operations and improves viewing angle adjusting efficiency.
The method described in connection with the above embodiments will be described in further detail below by way of example.
Referring to fig. 2, fig. 2 is another schematic flow chart of a method for adjusting a viewing angle according to an embodiment of the present application, in which the second operation is an example of a sliding operation, and a specific flow of the method may be as follows:
in step 201, the graphical user interface provides a first view angle control area and a second view angle control area, and the second view angle control area includes a view angle height control.
The graphical user interface displays a virtual environment in a game scene; the virtual environment is provided when an application program runs on the terminal, and may be a simulated environment of the real world, a semi-simulated semi-fictional environment, or a purely fictional environment. The target object is a camera model; for example, in an FPS game, the camera model is located at the head or neck of the virtual character in the first person view, and behind the virtual character in the third person view. Part of the game scene is the picture generated by the camera model observing the virtual environment in a certain viewing angle direction.
Specifically, the second viewing angle control area 12 includes a viewing angle height control 121, and the viewing angle height control 121 is used for adjusting the height of the viewing angle. Referring to fig. 1c and fig. 1d, fig. 1c is a first application scenario diagram of a viewing angle adjusting method provided in the embodiment of the present application, and fig. 1d is a schematic diagram of a world coordinate system in a virtual environment provided in the embodiment of the present application. Taking a user playing a game in an observer view (OB) as an example, the virtual environment has a world coordinate system constructed from an X axis, a Y axis, and a Z axis, and the camera model located in the virtual environment therefore has corresponding coordinates (X1, Y1, Z1).
In step 202, in response to the touch operation applied to the first viewing angle control area, the target object is adjusted to move in the horizontal direction on the designated plane.
The projection position of the viewing angle on the designated plane, that is, the projection position of the camera model on the designated plane, is here the plane 20 constructed by the X axis and the Y axis in the virtual environment. That is, the first viewing angle control area 11 is used to control the front-back and left-right movement of the camera model in the virtual scene.
As shown in fig. 1c and fig. 1d, the designated plane here is the plane constructed by the X axis and the Y axis of the world coordinate system.
Specifically, a wheel control 111 is arranged in the first viewing angle control area 11, and the user operates the rocker in the wheel control 111 to move forward, backward, left, or right so as to control the viewing angle to move correspondingly. The target object is a camera model; controlling the projection position of the viewing angle on the designated plane means controlling the target object to move in the horizontal direction (forward, backward, left, and right) on the designated plane.
In step 203, the height of the target object perpendicular to the specified plane is adjusted in response to the first operation acting on the viewing angle height control.
The first operation is a pressing operation on the viewing angle height control 121, and the viewing angle height control 121 adjusts the position of the camera model on the Z axis according to the pressing duration of the pressing operation. Since the Z axis is divided into a positive half axis and a negative half axis, the viewing angle height control 121 may further include a first sub viewing angle height control 1211 for adjustment along the positive half axis of the Z axis and a second sub viewing angle height control 1212 for adjustment along the negative half axis of the Z axis. The height perpendicular to the designated plane is the distance from the camera model to the plane 20 constructed by the X axis and the Y axis.
Specifically, the target object is a camera model; adjusting the height of the viewing angle perpendicular to the designated plane means controlling the target object to move along the positive half axis direction or the negative half axis direction of the Z axis.
In step 204, the touch position of the second operation in the designated area and the reference point position in the graphical user interface are obtained.
For example, with the touch coordinates (X2, Y2) and the center point coordinates (X3, Y3) of the viewing angle height control 121, the tangent of the angle can be determined from |Y2-Y3|/|X2-X3|, the angle is determined from the tangent value, and the current sliding direction is determined from the angle; a mapping relation between the viewing angle orientation and each sliding direction can be established in advance, and after the current sliding direction is determined, the target orientation of the viewing angle is determined according to the mapping relation.
The position of the center point of the viewing angle height control 121 can be controlled to move along with the second operation, so that the viewing angle height control 121 follows the second operation and the user receives visual feedback of the sliding operation.
In step 205, a current sliding direction and a current sliding distance of the second operation are determined based on the touch position and the position of the reference point.
In order to enable the user to achieve an accurate adjustment effect according to the viewing requirement, after the current sliding direction is determined, the distance between the touch coordinate and the center point coordinate, namely the current sliding distance, is calculated according to the Pythagorean theorem. The viewing angle rotation angle in the viewing angle rotation direction corresponding to the current sliding direction is then determined according to the current sliding distance, and the orientation reached by rotating the viewing angle by that angle in the viewing angle rotation direction is determined as the target orientation.
In step 206, the current sliding direction is determined as the viewing angle rotating direction.
After the current sliding direction is determined, the lens of the camera model can be controlled to rotate in the current sliding direction; the specific rotation mode may be rotation around the U axis, so that the lens of the camera model rotates.
The angle by which the lens of the camera model is raised in the viewing angle rotation direction can be determined by acquiring the maximum angle rotatable in the viewing angle rotation direction, for example 90°, and the maximum distance the second operation can slide in the current sliding direction, for example 6 cm, and then determining the target orientation based on the current sliding distance, the maximum distance, and the maximum angle. When the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate according to the viewing angle rotation angle; the specific rotation mode may be rotation around the R axis, so that the lens of the camera model is raised or lowered.
In step 207, the maximum angle by which the viewing angle is rotatable in the viewing angle rotating direction and the maximum distance by which the second operation is slidable in the current sliding direction are acquired.
For example, if the maximum angle rotatable in the viewing angle rotation direction is 90 °, and the maximum distance by which the sliding operation can slide in the current sliding direction is 6cm, and the current sliding distance is 2cm, the distance ratio of the current sliding distance to the maximum distance is 1/3.
In step 208, a viewing angle rotation angle is determined based on the current sliding distance, the maximum distance, and the maximum angle.
For example, if the maximum angle is 90° and the distance ratio is 1/3, the viewing angle rotation angle is 90° × 1/3 = 30°.
In step 209, the target orientation of the viewing angle is determined according to the viewing angle rotation direction and the viewing angle rotation angle.
After the viewing angle rotation angle is determined, the lens of the camera model can be controlled to rotate according to the viewing angle rotation angle; the specific rotation mode may be rotation around the R axis, so that the lens of the camera model is raised or lowered.
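Steps 204 through 209 can be sketched end to end: the direction comes from `atan2`, the distance from the Pythagorean theorem (`hypot`), and the rotation angle from the distance ratio. Function and parameter names are illustrative, not the embodiment's own:

```python
import math

def target_orientation(touch, ref, max_dist, max_angle):
    """One pass over steps 204-209: returns (rotation direction in
    degrees, rotation angle in degrees) from the touch position and
    the reference point (the height control's center)."""
    dx, dy = touch[0] - ref[0], touch[1] - ref[1]
    direction_deg = math.degrees(math.atan2(dy, dx)) % 360.0   # current sliding direction
    dist = math.hypot(dx, dy)                                  # current sliding distance
    angle = max_angle * min(dist / max_dist, 1.0)              # distance ratio times max angle
    return direction_deg, angle
```

The camera would then rotate around the U axis toward the returned direction and around the R axis by the returned angle.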
In step 210, if a sliding operation moving from the designated area to the viewing angle height control is detected, a staying time of the sliding operation in the viewing angle height control is obtained.
As described above, a preset time duration may be set to avoid the situation in which, after the lens raising angle has been adjusted too high, a sliding operation in the opposite direction passes through the viewing angle height control 121 and moves the camera model instead of adjusting the lens raising angle. How to adjust the viewing angle of the target object is determined by comparing the stay duration of the second operation within the viewing angle height control 121 with the preset duration. Therefore, in this step, the stay duration of the sliding operation within the viewing angle height control is acquired, for example, 4 s.
In step 211, the staying time is compared with a preset time to obtain a comparison result.
For example, if the preset time is 5s and the stay time is 4s, the comparison result indicates that the stay time is less than the preset time; if the staying time is 6s, the comparison result shows that the staying time is longer than the preset time.
In step 212, if the comparison result indicates that the staying time is longer than the predetermined time, the height of the viewing angle perpendicular to the designated plane is adjusted.
If the stay duration is longer than the preset duration, it can be judged that the user currently wants to adjust the vertical height of the camera model above the designated plane.
In step 213, if the comparison result indicates that the staying time is less than the predetermined time, it is determined whether the second operation is detected.
If the stay duration is shorter than the preset duration, the adjustment mode the user currently intends cannot be judged, so it is determined whether a sliding operation of moving from the viewing angle height control 121 to the designated area 13 is detected.
In step 214, if the second operation is detected, the process returns to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
If the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
As can be seen from the above, in the embodiment of the present application, a first viewing angle control area and a second viewing angle control area are provided through a graphical user interface, where the second viewing angle control area includes a viewing angle height control; the projection position of the viewing angle on a designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts from the viewing angle height control and acts on the designated area. In this way, different operations on the viewing angle height control implement multiple different viewing angle adjusting functions (adjusting the viewing angle height and the viewing angle orientation), which reduces the complexity of viewing angle adjusting operations and improves viewing angle adjusting efficiency.
In order to better implement the method for adjusting a viewing angle provided by the embodiments of the present application, the embodiments of the present application further provide an apparatus based on the method for adjusting a viewing angle. The terms are the same as those in the above-described viewing angle adjusting method, and details of implementation may refer to the description in the method embodiment.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a viewing angle adjusting apparatus according to an embodiment of the present disclosure, where the viewing angle adjusting apparatus may include a display module 301, a first adjusting module 302, a second adjusting module 303, a third adjusting module 304, and the like.
The display module 301 is configured to provide a first viewing angle control area and a second viewing angle control area on the graphical user interface, where the second viewing angle control area includes a viewing angle height control.
The first adjusting module 302 is configured to adjust the projection position of the viewing angle on a designated plane in response to a touch operation acting on the first viewing angle control area.
The second adjusting module 303 is configured to adjust the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control.
The third adjusting module 304 is configured to adjust the orientation of the viewing angle in response to a second operation that starts from the viewing angle height control and acts on the designated area.
In some embodiments, the second operation is a sliding operation from the viewing angle height control to the designated area;
the third adjusting module 304 includes:
a first obtaining submodule, configured to obtain the touch position of the sliding operation in the designated area and the reference point position in the graphical user interface;
a first determining submodule, configured to determine the current sliding direction of the second operation based on the touch position and the reference point position; and
a first adjusting submodule, configured to determine the target orientation of the viewing angle according to the current sliding direction, and adjust the orientation of the viewing angle according to the target orientation.
In some embodiments, the first determining submodule includes:
a determining unit, configured to determine the current sliding direction and the current sliding distance of the second operation based on the touch position and the reference point position;
and the first adjusting submodule includes:
an adjusting unit, configured to determine the target orientation of the viewing angle according to the current sliding direction and the current sliding distance, and adjust the orientation of the viewing angle according to the target orientation.
In some embodiments, the adjusting unit is configured to:
determine the current sliding direction as the viewing angle rotation direction;
acquire the maximum angle of the viewing angle in the viewing angle rotation direction and the maximum slidable distance of the second operation in the current sliding direction;
determine the viewing angle rotation angle based on the current sliding distance, the maximum slidable distance, and the maximum angle; and
determine the target orientation of the viewing angle according to the viewing angle rotation direction and the viewing angle rotation angle.
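The mapping above can be sketched as follows: the slide direction (the vector from the reference point to the touch position) supplies the rotation direction, and the rotation magnitude is the slide distance scaled against the maximum slidable distance, capped at the maximum angle. All parameter names here are assumptions for the illustration.

```python
import math

def target_orientation(touch, ref, max_distance, max_angle):
    """Map the second operation's slide to a viewing-angle rotation.

    Returns (rotation direction in degrees, rotation angle in degrees).
    """
    dx, dy = touch[0] - ref[0], touch[1] - ref[1]
    slide_distance = math.hypot(dx, dy)
    direction_deg = math.degrees(math.atan2(dy, dx))   # current sliding direction
    ratio = min(slide_distance / max_distance, 1.0)    # clamp at the maximum slidable distance
    return direction_deg, ratio * max_angle            # rotation angle scales with the slide
```

A slide halfway to the maximum distance therefore rotates the viewing angle by half the maximum angle, which keeps the gesture proportional and predictable.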
In some embodiments, the adjusting unit is further configured to:
if a sliding operation moving from the designated area to the viewing angle height control is detected, obtain the stay time of the sliding operation on the viewing angle height control;
compare the stay time with a preset time to obtain a comparison result;
if the comparison result indicates that the stay time is longer than the preset time, adjust the height of the viewing angle perpendicular to the designated plane;
if the comparison result indicates that the stay time is shorter than the preset time, determine whether the second operation is detected; and
if the second operation is detected, return to the step of acquiring the touch position of the sliding operation in the designated area and the reference point position in the graphical user interface, until the second operation can no longer be detected.
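The dwell-time branching described above amounts to a three-way decision. The sketch below is illustrative only; the return labels are assumptions, not terms from the patent.

```python
def on_slide_into_height_control(stay_time: float, preset_time: float,
                                 second_op_detected: bool) -> str:
    """Decide the follow-up action when the sliding operation moves from the
    designated area back onto the viewing angle height control."""
    if stay_time > preset_time:
        # Long stay on the height control: switch back to height adjustment.
        return "adjust_height"
    if second_op_detected:
        # Short stay and the slide continues: keep adjusting the orientation.
        return "continue_orientation"
    # Second operation no longer detected: the gesture ends.
    return "stop"
```

The stay-time threshold is what lets one control carry both functions: a lingering finger means "adjust height," a passing finger means "keep rotating."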
In some embodiments, the adjusting unit is further configured to:
if the second operation cannot be detected after the sliding operation moves from the designated area to the viewing angle height control, hide the designated area.
In some embodiments, the content displayed on the graphical user interface is an environment picture obtained when a target object observes the virtual environment in a preset viewing direction;
the first adjusting module 302 includes:
a first adjusting submodule, configured to adjust the target object to move horizontally on the designated plane in response to a touch operation acting on the first viewing angle control area;
and the second adjusting module includes:
a second adjusting submodule, configured to adjust the height of the target object perpendicular to the designated plane in response to the first operation acting on the second viewing angle control area.
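In the target-object embodiment above, the same two inputs drive the object itself rather than a free camera: the first control area moves it horizontally on the designated plane, and the height control moves it perpendicular to that plane. A minimal sketch, with `pos` as an assumed (x, z, height) triple:

```python
def adjust_target_object(pos, dx, dz, dh):
    """Apply a horizontal move (dx, dz) on the designated plane and a height
    change dh perpendicular to it; height is clamped at the plane itself."""
    x, z, h = pos
    return (x + dx, z + dz, max(0.0, h + dh))
```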
As can be seen from the above, in the embodiment of the present application, the display module 301 provides a first viewing angle control area and a second viewing angle control area through the graphical user interface, where the second viewing angle control area includes a viewing angle height control. The first adjusting module 302 adjusts the projection position of the viewing angle on the designated plane in response to a touch operation acting on the first viewing angle control area. The second adjusting module 303 adjusts the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control. The third adjusting module 304 adjusts the orientation of the viewing angle in response to a second operation that starts from the viewing angle height control and acts on the designated area. In this way, different operations on the viewing angle height control implement multiple viewing angle adjusting functions (adjusting the viewing angle height and the viewing angle orientation), which reduces the complexity of viewing angle adjusting operations and improves viewing angle adjusting efficiency.
Correspondingly, an embodiment of the present application further provides a computer device, which may be a terminal or a server; the terminal may be a terminal device such as a smartphone, a tablet computer, a notebook computer, a touch screen device, a game console, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 4, fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor 401. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the computer device structure illustrated in the figure does not constitute a limitation on the computer device, which may include more or fewer components than those illustrated, combine some components, or arrange the components differently.
The processor 401 is the control center of the computer device 400; it connects the various parts of the entire computer device 400 using various interfaces and lines, and performs the various functions of the computer device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and invoking data stored in the memory 402, thereby monitoring the computer device 400 as a whole.
In the embodiment of the present application, the processor 401 in the computer device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402, and runs the application programs stored in the memory 402, thereby implementing various functions according to the following steps:
providing a first viewing angle control area and a second viewing angle control area through the graphical user interface, where the second viewing angle control area includes a viewing angle height control; adjusting the projection position of the viewing angle on a designated plane in response to a touch operation acting on the first viewing angle control area; adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control; and adjusting the orientation of the viewing angle in response to a second operation that starts from the viewing angle height control and acts on a designated area.
Details of the above operations have been described in the foregoing embodiments and are not repeated herein.
Optionally, as shown in fig. 4, the computer device 400 further includes a touch display screen 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407. Those skilled in the art will appreciate that the computer device structure shown in fig. 4 does not constitute a limitation on the computer device, which may include more or fewer components than those illustrated, combine some components, or arrange the components differently.
The touch display screen 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations performed by the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions, according to which the corresponding programs are executed. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401; it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the touch panel transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to implement the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 403 may also serve as part of the input unit 406 to implement an input function.
In this embodiment of the application, a game application is executed by the processor 401 to generate a graphical user interface on the touch display screen 403, where a virtual scene on the graphical user interface includes at least one function control or a wheel control. The touch display screen 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency circuit 404 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or another computer device and to exchange signals with the network device or the other computer device.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 405 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 405 and converted into audio data; the audio data is then output to the processor 401 for processing and subsequently sent, for example, to another computer device via the radio frequency circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to provide communication between a peripheral headset and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the computer device 400. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system. The power supply 407 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, or any other such component.
Although not shown in fig. 4, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
As can be seen from the above, in the computer device provided in this embodiment, a first viewing angle control area and a second viewing angle control area are provided through the graphical user interface, where the second viewing angle control area includes a viewing angle height control; the projection position of the viewing angle on a designated plane is adjusted in response to a touch operation acting on the first viewing angle control area; the height of the viewing angle perpendicular to the designated plane is adjusted in response to a first operation on the viewing angle height control; and the orientation of the viewing angle is adjusted in response to a second operation that starts from the viewing angle height control and acts on the designated area. In this way, different operations on the viewing angle height control implement multiple viewing angle adjusting functions (adjusting the viewing angle height and the viewing angle orientation), which reduces the complexity of viewing angle adjusting operations and improves viewing angle adjusting efficiency.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be completed by instructions, or by relevant hardware controlled by instructions; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer-readable storage medium in which a plurality of computer programs are stored; the computer programs can be loaded by a processor to perform the steps in any of the viewing angle adjusting methods provided by the embodiments of the present application. For example, a computer program may perform the following steps:
providing a first viewing angle control area and a second viewing angle control area through the graphical user interface, where the second viewing angle control area includes a viewing angle height control; adjusting the projection position of the viewing angle on a designated plane in response to a touch operation acting on the first viewing angle control area; adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control; and adjusting the orientation of the viewing angle in response to a second operation that starts from the viewing angle height control and acts on a designated area.
Details of the above operations have been described in the foregoing embodiments and are not repeated herein.
The storage medium may include a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Since the computer programs stored in the storage medium can execute the steps in any of the viewing angle adjusting methods provided in the embodiments of the present application, the beneficial effects achievable by any of those methods can likewise be achieved; details are given in the foregoing embodiments and are not repeated herein.
The method, apparatus, storage medium, and computer device for adjusting a viewing angle provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific embodiments and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (13)

1. A method for adjusting a viewing angle, wherein a graphical user interface is provided through a display component of a terminal, and content displayed by the graphical user interface at least includes a part of a game scene, the method comprising:
the graphical user interface provides a first visual angle control area and a second visual angle control area, and the second visual angle control area comprises a visual angle height control;
responding to touch operation acting on the first visual angle control area, and adjusting the projection position of the visual angle on a designated plane;
adjusting the height of the viewing angle perpendicular to the designated plane in response to a first operation on the viewing angle height control;
and responding to a second operation which starts from the visual angle height control and acts on a specified area, and adjusting the orientation of the visual angle.
2. The viewing angle adjustment method according to claim 1, wherein the second operation is a slide operation of sliding from the height control to the designated area;
the step of adjusting the orientation of the viewing angle in response to a second operation starting from the viewing angle height control and acting on a designated area comprises:
acquiring a touch position of a second operation in the designated area and a reference point position in the graphical user interface;
determining a current sliding direction of a second operation based on the touch position and the reference point position;
and determining the target orientation of the visual angle according to the current sliding direction, and adjusting the orientation of the visual angle according to the target orientation.
3. The method according to claim 2, wherein the step of determining a current sliding direction of the second operation based on the touch position and the position of the reference point comprises:
determining a current sliding direction and a current sliding distance of a second operation based on the touch position and the position of the reference point;
the step of determining the target orientation of the view angle according to the current sliding direction and adjusting the orientation of the view angle according to the target orientation comprises the following steps:
and determining the target orientation of the visual angle according to the current sliding direction and the current sliding distance, and adjusting the orientation of the visual angle according to the target orientation.
4. The method according to claim 3, wherein the step of determining the target orientation of the viewing angle according to the current sliding direction and the current sliding distance comprises:
determining the current sliding direction as a visual angle rotating direction;
acquiring a maximum angle of the visual angle in the visual angle rotating direction and a maximum distance of a second operation in the current sliding direction;
determining a view angle rotation angle based on the current sliding distance, the maximum distance and the maximum angle;
and determining the target orientation of the visual angle according to the visual angle rotating direction and the visual angle rotating angle.
5. The viewing angle adjustment method according to claim 4, characterized in that the method further comprises:
if the sliding operation of moving from the designated area to the visual angle height control is detected, the stay time of the sliding operation in the visual angle height control is obtained;
comparing the stay time with a preset time to obtain a comparison result;
if the comparison result shows that the stay time is longer than the preset time, adjusting the height of the visual angle perpendicular to the specified plane;
if the comparison result is that the stay time is shorter than the preset time, judging whether the second operation is detected;
and if the second operation is detected, returning to the step of acquiring the touch position of the second operation in the designated area and the reference point position in the graphical user interface until the second operation cannot be detected.
6. The viewing angle adjustment method according to claim 5, characterized in that the method further comprises:
and if the second operation cannot be detected after the sliding operation slides from the designated area to the visual angle height control, hiding the designated area.
7. The method according to claim 1, wherein an inner circumference of the designated area is adjacent to an outer circumference of the second viewing angle control area, and the outer circumference of the designated area is circular or polygonal.
8. The method according to claim 1, wherein the designated area is an area other than the first view control area and the second view control area within the graphical user interface.
9. The method according to claim 1, wherein the content displayed on the graphical user interface is an environment picture obtained when a target object observes the virtual environment in a preset viewing direction;
the step of adjusting the projection position of the view on the designated plane in response to the touch operation applied to the first view control area includes:
responding to touch operation acting on the first visual angle control area, and adjusting the target object to move on the designated plane in the horizontal direction;
the step of adjusting the height of the viewing angle perpendicular to the specified plane in response to a first operation on the viewing angle height control comprises:
adjusting the height of the target object perpendicular to the specified plane in response to a first operation on the perspective height control.
10. The method according to claim 1, wherein the first operation is continuous with the second operation.
11. A viewing angle adjusting apparatus for providing a graphical user interface through a display component of a terminal, the content displayed by the graphical user interface including at least a part of a game scene, the apparatus comprising:
the display module is used for providing a first visual angle control area and a second visual angle control area for the graphical user interface, and the second visual angle control area comprises a visual angle height control;
the first adjusting module is used for responding to touch operation acting on the first visual angle control area and adjusting the projection position of the visual angle on a specified plane;
the second adjusting module is used for responding to the first operation acted on the visual angle height control and adjusting the height of the visual angle vertical to the specified plane;
and the third adjusting module is used for responding to a second operation which is started from the visual angle height control and acts on the specified area, and adjusting the orientation of the visual angle.
12. A computer-readable storage medium storing instructions adapted to be loaded by a processor to perform the steps of the method of adjusting a viewing angle according to any one of claims 1 to 10.
13. A computer device characterized by comprising a memory in which a computer program is stored and a processor that executes the steps in the perspective adjustment method according to any one of claims 1 to 10 by calling the computer program stored in the memory.
CN202011252862.5A 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment Active CN112245914B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011252862.5A CN112245914B (en) 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment


Publications (2)

Publication Number Publication Date
CN112245914A true CN112245914A (en) 2021-01-22
CN112245914B CN112245914B (en) 2024-03-12

Family

ID=74265332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011252862.5A Active CN112245914B (en) 2020-11-11 2020-11-11 Viewing angle adjusting method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN112245914B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113570693A (en) * 2021-07-26 2021-10-29 北京达佳互联信息技术有限公司 Method, device and equipment for changing visual angle of three-dimensional model and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060250398A1 (en) * 2003-01-07 2006-11-09 Konami Corporation Image display control program, image display control method, and video game device
JP2007044320A (en) * 2005-08-11 2007-02-22 Taito Corp Video game machine
CN106975219A (en) * 2017-03-27 2017-07-25 网易(杭州)网络有限公司 Display control method and device, storage medium, the electronic equipment of game picture
CN108920084A (en) * 2018-06-29 2018-11-30 网易(杭州)网络有限公司 Visual field control method and device in a kind of game
CN110141855A (en) * 2019-05-24 2019-08-20 网易(杭州)网络有限公司 Method of controlling viewing angle, device, storage medium and electronic equipment
CN110665226A (en) * 2019-10-09 2020-01-10 网易(杭州)网络有限公司 Method, device and storage medium for controlling virtual object in game
CN111603758A (en) * 2020-05-28 2020-09-01 网易(杭州)网络有限公司 Visual angle adjusting method for game role, electronic equipment and storage medium



Also Published As

Publication number Publication date
CN112245914B (en) 2024-03-12

Similar Documents

Publication Publication Date Title
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113426124A (en) Display control method and device in game, storage medium and computer equipment
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN113101650A (en) Game scene switching method and device, computer equipment and storage medium
CN112206517A (en) Rendering method, device, storage medium and computer equipment
CN113398566A (en) Game display control method and device, storage medium and computer equipment
CN112870718A (en) Prop using method and device, storage medium and computer equipment
CN113082707A (en) Virtual object prompting method and device, storage medium and computer equipment
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN115970284A (en) Attack method and device of virtual weapon, storage medium and computer equipment
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
WO2024045528A1 (en) Game control method and apparatus, and computer device and storage medium
CN113426129A (en) User-defined role appearance adjusting method, device, terminal and storage medium
WO2024051116A1 (en) Control method and apparatus for virtual character, and storage medium and terminal device
CN112245914B (en) Viewing angle adjusting method and device, storage medium and computer equipment
WO2024031942A1 (en) Game prop control method and apparatus, computer device and storage medium
CN115501581A (en) Game control method and device, computer equipment and storage medium
CN113413600B (en) Information processing method, information processing device, computer equipment and storage medium
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN113350801A (en) Model processing method and device, storage medium and computer equipment
CN113867873A (en) Page display method and device, computer equipment and storage medium
CN113426115A (en) Game role display method and device and terminal
WO2024152504A1 (en) Game interaction method and apparatus, and computer device and storage medium
CN113398564A (en) Virtual role control method, device, storage medium and computer equipment
CN115430150A (en) Game skill release method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant