CN110732135A - Virtual scene display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN110732135A
Authority
CN
China
Prior art keywords
visual angle
force
virtual object
target
adsorption
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910992462.9A
Other languages
Chinese (zh)
Other versions
CN110732135B (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910992462.9A priority Critical patent/CN110732135B/en
Publication of CN110732135A publication Critical patent/CN110732135A/en
Application granted granted Critical
Publication of CN110732135B publication Critical patent/CN110732135B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/837 - Shooting of targets
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 - Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426 - Processing input control signals of video game devices by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application discloses a virtual scene display method and apparatus, an electronic device, and a storage medium, belonging to the field of computer technology.

Description

Virtual scene display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to virtual scene display methods and apparatuses, an electronic device, and a storage medium.
Background
Shooting games are among the more popular games. Such games usually display an aiming point at the center of the terminal screen, and the user can adjust the currently displayed virtual scene by adjusting the viewing angle of the virtual scene, thereby adjusting the area at which the aiming point aims.
Currently, a common virtual scene display method works as follows: when a target virtual object is detected and the aiming point is located within the adsorption area of the target virtual object, the sensitivity of the viewing-angle adjustment operation is changed while the user performs that operation, so as to assist the user in moving the aiming point into the adsorption area.
Because this method merely adjusts the user's operation, the assistance it provides is weak. If the target virtual object moves, the user still has difficulty moving the aiming point onto the body of the target virtual object and cannot strike it accurately. The displayed virtual scene therefore fails to meet the user's expectations and needs, and the display effect is poor.
Disclosure of Invention
The embodiments of the present application provide a virtual scene display method and apparatus, an electronic device, and a storage medium, which can solve the problems in the related art that user requirements cannot be met and the display effect is poor.
In one aspect, a virtual scene display method is provided. The method includes:
when a viewing-angle adjustment operation is detected and the aiming point is located within the adsorption area of a target virtual object, acquiring, according to the motion state of the target virtual object, the adsorption force applied to the viewing angle and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation, where the adsorption force is used to move the aiming point toward the target virtual object;
acquiring the target rotation speed of the viewing angle of the virtual scene according to the adsorption force and the viewing-angle adjustment force; and
in the process of controlling the viewing angle to rotate at the target rotation speed, displaying the virtual scene as it changes with the rotation of the viewing angle.
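The three steps of the method can be sketched as follows. This is a minimal illustration only: the vector addition of the two forces and the linear force-to-speed mapping are assumptions, since the text only states that the target rotation speed is obtained from the two forces.

```python
def target_rotation_speed(adsorption_force, adjustment_force, gain=1.0):
    """Combine the adsorption force (pulling the aiming point toward the
    target) and the viewing-angle adjustment force (from the user's
    operation) into a target rotation speed in (yaw, pitch) degrees/s.

    The vector addition and linear mapping are illustrative assumptions.
    """
    ax, ay = adsorption_force
    ux, uy = adjustment_force
    return ((ax + ux) * gain, (ay + uy) * gain)


def rotate_view(view, speed, dt):
    """Advance the viewing angle by the target rotation speed over dt
    seconds; the displayed virtual scene then changes with the rotation."""
    yaw, pitch = view
    vy, vp = speed
    return (yaw + vy * dt, pitch + vp * dt)
```

A per-frame loop would call `target_rotation_speed` with the current forces, then `rotate_view` with the frame time, and redraw the scene from the new viewing angle.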
In another aspect, a virtual scene display apparatus is provided. The apparatus includes:
an acquisition module, configured to acquire, when a viewing-angle adjustment operation is detected and the aiming point is located within the adsorption area of a target virtual object, the adsorption force applied to the viewing angle and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation according to the motion state of the target virtual object, where the adsorption force is used to move the aiming point toward the target virtual object;
the acquisition module being further configured to acquire the target rotation speed of the viewing angle of the virtual scene according to the adsorption force and the viewing-angle adjustment force; and
a display module, configured to display, in the process of controlling the viewing angle to rotate at the target rotation speed, the virtual scene as it changes with the rotation of the viewing angle.
In one possible implementation, the acquisition module is configured to:
when the viewing-angle adjustment operation is detected, acquire the adsorption area of the target virtual object;
emit a ray from the position of the aiming point along the current viewing angle; and
when the ray passes through the adsorption area, acquire, according to the motion state of the target virtual object, the adsorption force applied to the aiming point and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation.
In one possible implementation, the acquisition module is configured to take the larger of the first adsorption force and the second adsorption force as the adsorption force applied to the viewing angle.
In one possible implementation, the acquisition module is configured to obtain, when the target virtual object is stationary, the second adsorption force applied to the viewing angle as the adsorption force applied to the viewing angle, where the second adsorption force is used to assist the aiming point in moving toward the target virtual object.
In one possible implementation, the acquisition module is configured to perform either of the following:
when the damping force applied to the viewing angle is greater than the viewing-angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene; or
when the resultant of the damping force applied to the viewing angle and the adsorption force is greater than the viewing-angle adjustment force, acquiring zero as the target rotation speed of the viewing angle of the virtual scene.
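The two zero-speed branches above can be sketched with scalar force magnitudes. The names and the residual-speed formula for the non-zero case are illustrative assumptions; the text only specifies when the speed is zero.

```python
def target_speed_with_damping(adjust_force, damping_force,
                              adsorption_force=0.0, gain=1.0):
    """Return the target rotation speed of the viewing angle.

    Branch 1: the damping force alone exceeds the adjustment force -> 0.
    Branch 2: the resultant of damping force and adsorption force (when
    both oppose the operation) exceeds the adjustment force -> 0.
    Otherwise, a residual speed proportional to the net force (an
    assumption for illustration).
    """
    opposing = damping_force + adsorption_force
    if opposing > adjust_force:
        return 0.0
    return gain * (adjust_force - opposing)
```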
In one possible implementation, the acquisition module is further configured to:
when the end of the viewing-angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, acquire the adsorption force applied to the viewing angle if the target virtual object is moving; and
acquire the target rotation speed of the viewing angle of the virtual scene according to the adsorption force applied to the viewing angle.
In one possible implementation, the acquisition module is configured to, when the viewing-angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, execute the step of acquiring the adsorption force applied to the viewing angle and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a first state.
In one possible implementation, the acquisition module is further configured to:
when the viewing-angle adjustment operation is detected and the aiming point is located within the adsorption area of the target virtual object, acquire the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in a second state; and
acquire the target rotation speed of the viewing angle of the virtual scene according to the viewing-angle adjustment force.
In another aspect, an electronic device is provided, including one or more processors and one or more memories, the one or more memories having stored therein at least one piece of program code, the program code being loaded and executed by the one or more processors to implement the operations performed by the virtual scene display method.
In another aspect, a computer-readable storage medium is provided, having stored therein at least one piece of program code, the program code being loaded and executed by a processor to implement the operations performed by the virtual scene display method.
In the embodiments of the present application, the motion state of the target virtual object is taken into account. When the condition for providing the auxiliary aiming service is met, the forces applied to the viewing angle, including the adsorption force and the viewing-angle adjustment force, can be acquired according to that motion state, so that the target rotation speed of the viewing angle, and hence how the virtual scene is displayed, can be determined from those forces. Good auxiliary aiming can thus be provided even when the target virtual object is moving, so that the virtual scene display meets the user's expectations and needs, and the display effect is good.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a virtual scene display method provided in an embodiment of the present application;
fig. 2 is a flowchart of a virtual scene display method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of an adsorption area provided in an embodiment of the present application;
FIG. 4 is a schematic view of an adsorption area provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of a ray detection process provided by an embodiment of the present application;
FIG. 6 is a schematic diagram of an adsorption force provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of a damping force provided by an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating the relationship between a damping force and the viewing-angle adjustment force provided by an embodiment of the present application;
FIG. 9 is a schematic view of a magnetic adsorption process provided in an embodiment of the present application;
FIG. 10 is a schematic view of a magnetic adsorption process provided by an embodiment of the present application;
FIG. 11 is a schematic view of a rotation angle provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of a virtual scene display apparatus provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the embodiments of the present application are described in further detail below with reference to the accompanying drawings.
The technical solutions in the embodiments of the present application are described clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application.
The terms "first", "second", and the like are used herein to distinguish between similar or identical items having substantially the same function or functionality; it should be understood that "first", "second", and "nth" imply no logical or chronological dependency and no limitation on number or order of execution.
The term "at least one" means one or more, and "a plurality of" means two or more; for example, a plurality of positions means two or more positions.
Hereinafter, terms related to the present application are explained.
The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, which is not limited in the present application.
A virtual object is a virtual avatar in the virtual scene. The virtual scene may include a plurality of virtual objects, each having its own shape and volume and occupying part of the space in the virtual scene. A virtual character may be a character controlled through operations on a client, an artificial intelligence (AI) set up in a virtual-environment battle, or a non-player character (NPC) set up in a virtual-environment battle. The virtual character may be one competing in the virtual-environment battle, and the number of virtual characters in the battle may be preset or determined according to the number of clients joining the battle.
Taking a shooting game as an example, the user may control the virtual object to fall freely, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or move forward bent over on land; or to swim, float, or dive in the sea. Of course, the user may also control the virtual object to move through the virtual scene riding a virtual vehicle; these are merely examples, and the embodiments of the present application are not limited thereto. The user can also control the virtual object to fight other virtual objects with a virtual prop, which may simulate a cold weapon or a hot weapon; this is not specifically limited in the present application.
For example, the sliding direction of a sliding operation may correspond to the rotation direction of the viewing angle, and the sliding distance may be positively correlated with the rotation angle of the viewing angle; the sliding speed may likewise be positively correlated with the rotation speed of the viewing angle.
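The sliding mapping just described can be sketched as follows. The scale factor and the sign convention are illustrative assumptions, not values from the text.

```python
def slide_to_rotation(slide_dx, slide_dy, degrees_per_pixel=0.2):
    """Map a sliding operation to a viewing-angle rotation: the slide
    direction gives the rotation direction, and the slide distance is
    positively correlated with the rotation angle. The per-pixel scale
    and the slide-up-looks-up sign convention are assumptions."""
    yaw = slide_dx * degrees_per_pixel
    pitch = -slide_dy * degrees_per_pixel
    return (yaw, pitch)
```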
In another possible implementation, the viewing-angle adjustment operation may be a pressing operation. Specifically, a control area may be preset on the terminal, and the user may press within it; when the terminal detects a pressing operation in the control area, it may determine the corresponding rotation direction, rotation speed, and rotation angle of the viewing angle based on the position of the press relative to the control area, the pressing force, and the pressing duration.
In another possible implementation, the viewing-angle adjustment operation may be a rotation of the terminal itself; when an angular velocity sensor (e.g., a gyroscope) in the terminal detects the rotation operation, the rotation direction, rotation angle, and rotation speed of the viewing angle may be determined according to the rotation direction, angle, and speed of that operation.
Of course, when controlling the virtual object, the user may also achieve different control effects through combinations of the above viewing-angle adjustment operations. For example, if the viewing-angle adjustment operation is a sliding operation, the terminal may detect the pressing force during the slide and decide whether to fire based on whether that force exceeds a preset threshold.
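The combined slide-plus-press control reduces to a threshold test; the threshold value below is an assumed constant, not one given in the text.

```python
PRESS_THRESHOLD = 0.6  # assumed preset pressing-force threshold (normalized)


def should_fire(press_force):
    """During a sliding viewing-angle operation the terminal also reads
    the pressing force; firing is triggered only when that force exceeds
    the preset threshold."""
    return press_force > PRESS_THRESHOLD
```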
Hereinafter, a system architecture according to the present application will be described.
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment of a virtual scene display method according to an embodiment of the present application. The implementation environment includes a first terminal 120, a server 140, and a second terminal 160.
The first terminal 120 has installed and running an application program that supports virtual scenes, which may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a first-person shooting game (FPS), a multiplayer online battle arena (MOBA) game, or a multiplayer gunfight survival game. The first terminal 120 is a terminal used by a first user, who uses it to operate a first virtual object located in the virtual scene to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, and attacking. Illustratively, the first virtual object is a first virtual character, such as a simulated character or an animated character.
The first terminal 120 is connected to the server 140 through a wireless network or a wired network.
The server 140 includes at least one of a server, a plurality of servers, a cloud computing platform, or a virtualization center. The server 140 is used to provide background services for the application programs supporting the virtual scenes. Optionally, the server 140 undertakes the primary computing work while the first terminal 120 and the second terminal 160 undertake the secondary computing work; or the server 140 undertakes the secondary computing work while the first terminal 120 and the second terminal 160 undertake the primary computing work; or the server 140, the first terminal 120, and the second terminal 160 perform collaborative computing using a distributed computing architecture.
The second terminal 160 has installed and running an application program supporting virtual scenes, which may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, an FPS, a MOBA, or a multiplayer gunfight survival game. The second terminal 160 is a terminal used by a second user, who uses it to operate a second virtual object located in the virtual scene to perform activities including, but not limited to, at least one of adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animated character.
The second terminal 160 is connected to the server 140 through a wireless network or a wired network.
Optionally, the first virtual object controlled by the first terminal 120 and the second virtual object controlled by the second terminal 160 are in the same virtual scene. In some embodiments, the first virtual object and the second virtual object may be in a hostile relationship; for example, they may belong to different teams or organizations, and the first terminal 120 may control the first virtual object to attack the second virtual object. In other embodiments, the first virtual object and the second virtual object may be in a teammate relationship; for example, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication rights.
Optionally, the applications installed on the first terminal 120 and the second terminal 160 may be the same, or may be the same type of application on different operating system platforms. The first terminal 120 may refer broadly to one of a plurality of terminals, and the second terminal 160 to another; this embodiment is illustrated with only the first terminal 120 and the second terminal 160. The two terminals may be of the same or different device types, including at least one of a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, and a desktop computer. For example, the first terminal 120 and the second terminal 160 may both be smart phones or other handheld portable gaming devices; the following embodiments are illustrated with the terminal being a smart phone.
For example, the number of terminals may be only one, or may be dozens, hundreds, or more.
Fig. 2 is a flowchart of a virtual scene display method provided in an embodiment of the present application. Referring to fig. 2, this embodiment illustrates the method as applied to a terminal, which may be the first terminal 120 shown in fig. 1. The method includes:
201. When a viewing-angle adjustment operation is detected, the terminal detects whether the virtual scene includes a target virtual object; when the virtual scene includes the target virtual object, step 202 is executed, and when it does not, step 205 is executed.
The user can perform a viewing-angle adjustment operation on the terminal to adjust the viewing angle of the virtual scene. Because the aiming point is usually located at the center of the screen, adjusting the viewing angle changes the position in the virtual scene to which the aiming point corresponds. In this way, when aiming and shooting at a virtual object in the virtual scene, the user can adjust the aiming position and the point of attack of the currently controlled virtual object through the viewing-angle adjustment operation, thereby achieving an accurate strike on the target.
In the embodiment of the present application, when the user performs the viewing-angle adjustment operation described above, an auxiliary aiming service may be provided to assist the user in quickly moving the aiming point to the virtual object to be aimed at; if that virtual object is moving, corresponding auxiliary adjustment may be performed according to its movement state to reduce the difficulty of the user's operation. Accordingly, when the viewing-angle adjustment operation is detected, the terminal may detect whether the virtual scene includes a target virtual object, so as to determine whether the auxiliary aiming service needs to be provided.
It can be understood that if the virtual scene does not include a target virtual object, that is, if no other virtual object exists in the field of view of the currently controlled virtual object, then there is no target to aim at or shoot. The viewing-angle adjustment operation may then simply be the user adjusting the viewing angle rather than aiming, so the auxiliary aiming service need not be provided, and the following steps 205 and 206 may be performed to adjust the viewing angle directly based on the operation.
If the virtual scene does include a target virtual object, that is, if another virtual object exists in the field of view of the currently controlled virtual object and the user may want to aim at it, the terminal may make the following determination to decide how to provide the auxiliary aiming service; specifically, step 202 below may be executed.
The virtual scene may display one or more virtual objects, and the terminal may take the virtual object closest to the aiming point as the target virtual object.
In one possible implementation, the currently controlled virtual object may be grouped with other virtual objects into the same team, and it does not need to aim at or shoot a virtual object in its own team; the target virtual object may therefore also be any virtual object belonging to a team different from that of the currently controlled virtual object.
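Target selection as described above might look like the following sketch; the dict fields `screen_pos` and `team` are hypothetical names introduced for illustration.

```python
import math


def pick_target(aim_point, virtual_objects, own_team=None):
    """Return the virtual object closest to the aiming point, skipping
    any object on the currently controlled object's own team."""
    best, best_dist = None, math.inf
    for obj in virtual_objects:
        if own_team is not None and obj["team"] == own_team:
            continue  # no auxiliary aiming at teammates
        d = math.dist(aim_point, obj["screen_pos"])
        if d < best_dist:
            best, best_dist = obj, d
    return best
```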
202. The terminal acquires the adsorption area of the target virtual object, and executes step 203 when the aiming point is located in the adsorption area of the target virtual object, and executes step 205 when the aiming point is located outside the adsorption area of the target virtual object.
After determining that a target virtual object is within the field of view of the currently controlled virtual object, the terminal may further determine whether to provide auxiliary aiming for the target virtual object.
It can be understood that if auxiliary aiming were provided while the aiming point is far away from the target virtual object, adsorbing the aiming point to the vicinity of the target virtual object would undermine the fairness of the electronic game and render the user's operation meaningless. Therefore, the target virtual object is provided with an adsorption area, and auxiliary aiming is provided only when the aiming point is located within it; this reduces the complexity of the user's operation while ensuring the fairness of the electronic game.
The terminal may first perform step 202 to obtain the adsorption area of the target virtual object and then determine whether the aiming point is located within it. If so, step 203 below may be performed to provide auxiliary aiming; if not, step 205 below may be performed to adjust the viewing angle directly according to the viewing-angle adjustment operation.
The size of the adsorption area may be preset by a person skilled in the art. For example, as shown in figs. 3 and 4, the adsorption area may be a cylindrical or ellipsoidal area surrounding the target virtual object; the shape of the adsorption area is not limited in the embodiments of the present application.
Accordingly, in step 202 the terminal may acquire the adsorption area of the target virtual object according to the distance between the currently controlled virtual object and the target virtual object, with the size of the adsorption area positively correlated with that distance: the larger the distance, the larger the adsorption area, and the smaller the distance, the smaller the adsorption area. Thus, when the first virtual object is far from the second virtual object, although the display size of the first virtual object is small, the display size of its adsorption area is not too small, so the user can still easily move the aiming point into the adsorption area of the target virtual object through the viewing-angle adjustment operation and obtain the assistance of auxiliary aiming.
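The positive correlation between distance and adsorption-area size could be realized as simply as the following; all constants are illustrative assumptions, not values from the text.

```python
def adsorption_radius(distance, base=0.5, per_unit=0.25, cap=3.0):
    """Radius of the adsorption area grows linearly with the distance
    between the controlled virtual object and the target, clamped to a
    cap, so that distant targets appearing small on screen still have a
    usable adsorption area. base/per_unit/cap are assumed constants."""
    return min(base + per_unit * distance, cap)
```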
The terminal may determine the positional relationship between the aiming point and the adsorption area in various ways. In one possible implementation, the terminal may use ray detection: it emits a ray from the position of the aiming point along the current viewing angle, and when the ray passes through the adsorption area, the terminal determines that the aiming point is located within the adsorption area. The terminal may then perform step 203 below, acquiring, according to the motion state of the target virtual object, the adsorption force applied to the aiming point and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation.
In another possible implementation, the terminal may determine whether the aiming point and the adsorption area intersect according to their positions in the virtual scene. For example, the aiming point may correspond to a straight line in the virtual scene and the adsorption area to an ellipsoid; if their coordinate ranges intersect, the aiming point is determined to be within the adsorption area, and if they do not, it is determined to be outside it.
The foregoing provides only two determination manners; the terminal may also determine the positional relationship between the aiming point and the adsorption area in other ways, which is not limited in the embodiments of the present application.
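The ray-detection manner can be sketched for an ellipsoidal adsorption area; the axis-aligned simplification and all names are assumptions for illustration.

```python
def ray_hits_ellipsoid(origin, direction, center, radii):
    """Emit a ray from the aiming point (origin) along the current
    viewing angle (direction) and test whether it passes through an
    axis-aligned ellipsoidal adsorption area (center, per-axis radii).

    Scaling space by 1/radii turns the ellipsoid into a unit sphere;
    we then solve |o + t*d|^2 = 1 and accept any root with t >= 0."""
    o = [(origin[i] - center[i]) / radii[i] for i in range(3)]
    d = [direction[i] / radii[i] for i in range(3)]
    a = sum(x * x for x in d)
    b = 2.0 * sum(o[i] * d[i] for i in range(3))
    c = sum(x * x for x in o) - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False  # the ray's line misses the ellipsoid entirely
    root = disc ** 0.5
    return (-b - root) / (2.0 * a) >= 0.0 or (-b + root) / (2.0 * a) >= 0.0
```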
203. The terminal acquires, according to the motion state of the target virtual object, the adsorption force applied to the viewing angle and the viewing-angle adjustment force corresponding to the viewing-angle adjustment operation, where the adsorption force is used to move the aiming point toward the target virtual object.
In the embodiment of the application, when the motion states of the target virtual object are different, the terminal can provide different auxiliary aiming services. In step 203, the terminal may first obtain a motion state of the target virtual object, and then analyze a force applied to the viewing angle according to the motion state, so as to determine how to adjust the viewing angle.
The motion state of the target virtual object may include two types: moving and stationary. Specifically, the process of the terminal acquiring the adsorption force applied to the viewing angle in step 203 may include the following two cases:
Case one: when the target virtual object is moving, a first adsorption force and a second adsorption force borne by the visual angle are acquired; the first adsorption force is used to control the aiming point to follow the target virtual object, and the second adsorption force is used to assist the aiming point in moving toward the target virtual object. The terminal may acquire the first adsorption force or the second adsorption force as the adsorption force borne by the visual angle.
In case one, because the aiming point is located in the adsorption area, the terminal may provide the second adsorption force to assist the user in aiming at the target virtual object; because the target virtual object is moving, the terminal may also provide the first adsorption force to help the user aim at the moving target more quickly. If the two forces were superimposed, the auxiliary force would be too large: the aiming point would be pulled onto the target virtual object too easily, and the quickly moving target would carry it along so that accurate aiming is impossible. The terminal therefore takes one of the two as the adsorption force. For example, as shown in fig. 6, the adsorption force points from the aiming point to the location of the target virtual object.
In one possible implementation, to better assist the user, the terminal may select the larger of the two forces as the adsorption force; specifically, the terminal may acquire the larger of the first adsorption force and the second adsorption force as the adsorption force borne by the visual angle.
Case two: when the target virtual object is stationary, the terminal acquires a second adsorption force borne by the visual angle as the adsorption force borne by the visual angle, and the second adsorption force is used for assisting the aiming point in moving toward the target virtual object.
In the second case, since the target virtual object is stationary, the terminal does not need to provide the first adsorption force for following the target virtual object, and naturally does not need to choose between the two forces; it may directly use the second adsorption force, which assists the aiming, as the adsorption force borne by the visual angle.
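The two cases above reduce to a simple selection rule, sketched below; the function and parameter names are hypothetical, and the forces are treated as scalar magnitudes for illustration.

```python
def select_adsorption_force(target_moving, first_force, second_force):
    """Select the adsorption force borne by the visual angle.

    Case one: the target is moving, so both the follow force (first) and the
    aim-assist force (second) are candidates; the larger is taken so the two
    forces are never superimposed.  Case two: the target is stationary, so
    only the aim-assist (second) force applies."""
    if target_moving:
        return max(first_force, second_force)   # one of the two, never their sum
    return second_force
```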
The above steps 201 to 203 describe the process of acquiring, when the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, the adsorption force borne by the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object. Because the motion state of the target virtual object is taken into account, a more appropriate auxiliary aiming service can be provided for the user.
In some possible implementations, depending on the motion state or health state of the currently controlled virtual object, or on the virtual scene in which it is located, the auxiliary aiming service may not be provided even if the aiming point and the adsorption area satisfy the above positional relationship.
Specifically, the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, can be divided into two states: a first state and a second state.
When the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in the first state, the terminal can provide auxiliary aiming, and step 203 can be executed to acquire, according to the motion state of the target virtual object, the adsorption force borne by the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation.
For example, when the currently controlled virtual object stands or lies on the ground or a building in the virtual scene, or is in an environment where it can aim and shoot, or has not been eliminated and can still compete with other virtual objects in the virtual scene, the terminal can provide the auxiliary aiming service.
When the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in the second state, the terminal may not provide auxiliary aiming; instead, the terminal may execute step 205, acquiring the visual angle adjusting force corresponding to the visual angle adjusting operation and obtaining the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
For example, when the currently controlled virtual object is in a falling or flying state in the virtual scene, or is hit by a smoke bomb and shrouded in smoke, or has been eliminated and can no longer compete with other virtual objects in the virtual scene, the terminal may not provide the auxiliary aiming service.
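The state gating described above can be sketched as follows; the state labels are hypothetical stand-ins for the examples the text lists.

```python
def assist_available(player_state):
    """Whether the auxiliary aiming service is provided, gated on the state
    of the currently controlled virtual object: assist in the first state,
    no assist in the second state."""
    second_state = {"falling", "flying", "in_smoke", "eliminated"}  # no assist
    return player_state not in second_state                         # first state otherwise
```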
204. The terminal acquires the target rotation speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force.
After the terminal acquires the adsorption force and the visual angle adjusting force, the target rotation speed of the visual angle can be determined according to the adsorption force and the visual angle adjusting force.
In another possible implementation, the terminal may obtain a rotation speed corresponding to each of the two forces and take the resultant of the two rotation speeds as the target rotation speed.
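Under a linear force-to-speed mapping the two routes coincide; a minimal sketch, assuming a hypothetical linear `gain` (the text leaves the mapping unspecified):

```python
def target_rotation_speed(adsorption_force, adjust_force, gain=0.1):
    """Combine the adsorption force and the visual angle adjusting force into
    a target rotation speed.  With a linear mapping, converting each force to
    a speed first and summing the speeds equals summing the forces first."""
    return gain * (adsorption_force + adjust_force)
```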
For example, when the visual angle adjusting operation is used to move the aiming point toward the target virtual object, the terminal may perform the above step 203 and step 204. When the visual angle adjusting operation is used to move the aiming point away from the target virtual object, in addition to step 203, the terminal may obtain a force opposite to the direction of the visual angle adjusting operation as a damping force borne by the visual angle; in step 204, the terminal then obtains the target rotation speed of the visual angle of the virtual scene according to the adsorption force, the damping force, and the visual angle adjusting force.
When the aiming point is close to the target virtual object, a moving speed that is too high makes misoperation likely and can quickly carry the aiming point away from the target virtual object. A damping effect can therefore be provided: a reaction force reduces the rotation speed of the visual angle. With the reaction force provided in this way, the user can master the operation more easily and can better adjust the visual angle to aim.
A setting can be provided for the damping force, the adsorption force and the visual angle adjusting force: when the damping force borne by the visual angle is greater than the visual angle adjusting force, the terminal obtains zero as the target rotation speed of the visual angle of the virtual scene. In this way, when the user moves the aiming point away from the target virtual object, the damping force never moves the aiming point in the reverse direction, which better respects the user's operation and meets the user's needs.
For example, as shown in fig. 7, the direction of the damping force may be opposite to the direction of the user's visual angle adjusting operation. As shown in fig. 8, when the damping force (the generated reaction force) is larger than the visual angle adjusting force (for example, the force generated by dragging the mouse), the aiming point cannot be moved away from the target virtual object; when the damping force is smaller than the visual angle adjusting force, the aiming point is moved away from the target virtual object.
In some possible implementations, the setting may further be that, when the resultant of the damping force borne by the visual angle and the adsorption force is greater than the visual angle adjusting force, the terminal acquires zero as the target rotation speed of the visual angle of the virtual scene.
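The damping behavior in the last few paragraphs can be sketched as follows; the linear mapping, parameter names, and the optional inclusion of the adsorption force in the resisting force are illustrative assumptions.

```python
def away_rotation_speed(adjust_force, damping_force,
                        adsorption=0.0, include_adsorption=False, gain=0.1):
    """Target rotation speed when the adjustment moves the aiming point away
    from the target.  If the damping force (optionally combined with the
    adsorption force, per the variant setting) exceeds the adjustment force,
    the speed is clamped to zero so the aiming point never moves backwards
    against the user's input."""
    resisting = damping_force + (adsorption if include_adsorption else 0.0)
    if resisting > adjust_force:
        return 0.0                      # dragging away from the target is ineffective
    return gain * (adjust_force - resisting)
```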
In some specific embodiments, if the user's visual angle adjusting operation ends while the target virtual object is moving, the aiming point may continue to move along with the target virtual object until it reaches the body of the target virtual object.
205. The terminal acquires the visual angle adjusting force corresponding to the visual angle adjusting operation, and acquires the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
Different from step 203, when the terminal does not need to provide auxiliary aiming, it may determine the target rotation speed of the visual angle directly according to the visual angle adjusting force corresponding to the visual angle adjusting operation. That is, when the user performs the visual angle adjusting operation, the visual angle rotates normally according to the operation.
206. The terminal displays a virtual scene which changes along with the rotation of the visual angle in the process of controlling the visual angle to rotate according to the target rotation speed.
Through the above steps, after acquiring the target rotation speed of the visual angle, the terminal can control the visual angle to rotate at the target rotation speed. As the visual angle rotates, the virtual scene observed through it changes, and during this process the terminal displays the changing virtual scene in the graphical user interface.
Two specific examples are provided below. In the first example, the above-mentioned adsorption force is described. As shown in fig. 9, the adsorption force may be referred to as magnetic adsorption, and the magnetic adsorption process may include the following steps:
Step one, it is detected whether the player (the user of the current terminal) performs a screen input operation; since the magnetic adsorption is generated only under the player's operation, the next step is performed after the player touches the screen.
And step two, judging whether the current state of the player (the currently controlled virtual object) can generate adsorption: the player cannot generate magnetic adsorption when falling, dying or under a smoke bomb; if the player is not in these states, adsorption can be generated.
And step three, when it is judged that the current state can generate magnetic force, judging whether the current aiming point position aims at the collision box of a target: a ray is emitted from the muzzle (aiming point) position to detect target collision boxes, and when the collision box of a certain target is detected, magnetic adsorption is generated.
And step four, acquiring the current position of the player and the position of the muzzle, and calculating the angle required to rotate the current direction to the target central axis. The target central axis is the center of the adsorption region, that is, the target center position. The magnetic adsorption is completed when the aiming point is adsorbed onto the target central axis. Of course, if the target is moving, rotating the visual angle by the calculated angle once will not adsorb the aiming point onto the target central axis; instead, the visual angle keeps rotating by the angle calculated while the target moves, so that the aiming point is eventually adsorbed onto the target central axis. This calculation is only an example; the rotation direction may also be determined from the positions without calculating the angle, which is not limited in the embodiments of this application.
As shown in fig. 10, assume that O is the muzzle position (the aiming point position), OA is the direction in which the gun points, that is, the visual angle direction, and OB is the direction from the muzzle toward the target central axis. When magnetic adsorption is generated, if the target does not move, the visual angle is rotated from OA to OB through angle C, and when it reaches OB the magnetic adsorption stops.
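The angle C between OA and OB can be computed with a standard dot-product formula; this is a sketch assuming plain 3D vectors, not the patent's actual calculation.

```python
import math

def rotation_angle_to_target(view_dir, to_target):
    """Angle C between the current visual angle direction OA and the
    direction OB from the muzzle O toward the target central axis (fig. 10).
    Rotating the visual angle through this angle completes the adsorption."""
    dot = sum(a * b for a, b in zip(view_dir, to_target))
    na = math.sqrt(sum(a * a for a in view_dir))
    nb = math.sqrt(sum(b * b for b in to_target))
    cos_c = max(-1.0, min(1.0, dot / (na * nb)))   # clamp for numeric safety
    return math.acos(cos_c)                        # angle C, in radians
```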
And step five, when magnetic adsorption is generated, the current auxiliary adsorption force is calculated, the magnitudes of the two forces are compared, and the larger one is taken.
As for the damping force, as shown in fig. 11, only the damping force and the visual angle adjusting force are taken as an example for description; the adsorption force may be superimposed on the calculation in the example shown in fig. 10, which is not described here again. The damping force is referred to as damping adsorption, and the damping adsorption process, shown in fig. 11, is as follows:
Step one, it is detected whether the player performs a screen input operation; because damping adsorption takes effect only under the player's operation, the next step is performed after the player touches the screen.
And step two, judging whether the current player state can generate adsorption: when the player is falling, dying or under a smoke bomb, damping adsorption cannot be generated; when the player is not in these states, damping adsorption may be generated.
And step three, when the player drags the mouse in the direction away from the target, a counterforce is generated.
The mouse sliding offset corresponds to the visual angle adjusting force and the counterforce is the damping force; the difference between the two forces is calculated, and if the counterforce is larger than the mouse offset, that is, the damping force is larger than the visual angle adjusting force, the aiming point cannot be moved away from the target, so dragging the mouse is ineffective.
And step four, when the player's touch leaves the screen or the player can no longer drag, the damping adsorption is finished.
In the embodiments of this application, the motion state of the target virtual object is taken into account. When the conditions for providing the auxiliary aiming service are met, the forces borne by the visual angle, including the adsorption force and the visual angle adjusting force, can be acquired according to the motion state, so that the target rotation speed of the visual angle, and hence how to display the virtual scene, can be determined. Good auxiliary aiming can thus be provided even when the target virtual object is moving, so that the virtual scene display meets the user's expectation, satisfies the user's needs, and achieves a good display effect.
All the above-mentioned optional technical solutions can be combined in any manner to form optional embodiments of the present application, and are not described in detail herein again.
Fig. 12 is a schematic structural diagram of a virtual scene display apparatus provided in an embodiment of the present application. Referring to fig. 12, the apparatus includes:
an obtaining module 1201, configured to, when a viewing angle adjustment operation is detected and an aiming point is located in an adsorption area of a target virtual object, obtain, according to a motion state of the target virtual object, an adsorption force borne by the viewing angle and a viewing angle adjustment force corresponding to the viewing angle adjustment operation, where the adsorption force is used to move the aiming point toward the target virtual object;
the obtaining module 1201 is further configured to obtain a target rotation speed of a visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and a display module 1202, configured to display a virtual scene that changes with the rotation of the angle of view in the process of controlling the angle of view to rotate according to the target rotation speed.
In one possible implementation, the obtaining module 1201 is configured to:
when the visual angle adjusting operation is detected, acquiring an adsorption area of the target virtual object;
emitting rays from the position of the aiming point along the current visual angle;
when the ray passes through the adsorption area, the adsorption force borne by the aiming point and the visual angle adjusting force corresponding to the visual angle adjusting operation are obtained according to the motion state of the target virtual object.
In one possible implementation, the obtaining module 1201 is configured to:
when the target virtual object is moving, a first adsorption force and a second adsorption force borne by the visual angle are obtained, the first adsorption force is used for controlling the aiming point to follow the target virtual object, and the second adsorption force is used for assisting the aiming point to move towards the target virtual object;
the first adsorption force or the second adsorption force is obtained as the adsorption force received by the visual angle.
In one possible implementation, the acquiring module 1201 is configured to acquire the larger of the first adsorption force and the second adsorption force as the adsorption force borne by the visual angle.
In one possible implementation, the obtaining module 1201 is configured to obtain a second adsorption force applied to the visual angle as the adsorption force applied to the visual angle when the target virtual object is stationary, where the second adsorption force is used to assist the aiming point to move towards the target virtual object.
In one possible implementation, the obtaining module 1201 is further configured to:
when the visual angle adjusting operation is used for keeping the aiming point away from the target virtual object, acquiring a force opposite to the visual angle adjusting operation direction as a damping force borne by the visual angle;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the damping force of the visual angle and the visual angle adjusting force.
In one possible implementation, the obtaining module 1201 is configured to perform any one of the following:
when the damping force borne by the visual angle is larger than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene;
and when the resultant force of the damping force borne by the visual angle and the adsorption force is greater than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene.
In one possible implementation, the obtaining module 1201 is further configured to:
when the end of the visual angle adjusting operation is detected and the aiming point is positioned in the adsorption area of the target virtual object, if the target virtual object moves, the adsorption force borne by the visual angle is acquired;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force borne by the visual angle.
In one possible implementation, the obtaining module 1201 is configured to, when the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state or health state of the currently controlled virtual object, or the virtual scene in which it is located, is in the first state, execute the step of obtaining the adsorption force applied to the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object.
In one possible implementation, the obtaining module 1201 is further configured to:
when the visual angle adjusting operation is detected and the aiming point is positioned in the adsorption area of the target virtual object, if the motion state and the health state of the currently controlled virtual object or the virtual scene in which the currently controlled virtual object is positioned are in a second state, the visual angle adjusting force corresponding to the visual angle adjusting operation is acquired;
and acquiring the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
The apparatus provided in the embodiments of the present application takes the motion state of the target virtual object into account. When the conditions for providing the auxiliary aiming service are met, the forces borne by the visual angle, including the adsorption force and the visual angle adjusting force, can be acquired according to the motion state, so that the target rotation speed of the visual angle, and hence how to display the virtual scene, can be determined. Good auxiliary aiming can thus be provided even when the target virtual object is moving, so that the virtual scene display meets the user's expectation, satisfies the user's needs, and achieves a good display effect.
It should be noted that, when the virtual scene display apparatus provided in the foregoing embodiment displays a virtual scene, only the division of the functional modules is illustrated, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the electronic device is divided into different functional modules to complete all or part of the functions described above.
Fig. 13 is a schematic structural diagram of an electronic device provided in this embodiment. The electronic device 1300 may vary greatly with configuration or performance, and may include one or more processors (CPUs) 1301 and one or more memories 1302, where at least one program code is stored in the one or more memories 1302 and is loaded and executed by the one or more processors 1301 to implement the virtual scene display methods provided in the above method embodiments.
For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A virtual scene display method, characterized in that the method comprises:
when visual angle adjusting operation is detected and an aiming point is located in an adsorption area of a target virtual object, acquiring adsorption force borne by the visual angle and visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object, wherein the adsorption force is used for moving the aiming point to the target virtual object;
acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and in the process of controlling the visual angle to rotate according to the target rotating speed, displaying the virtual scene which changes along with the rotation of the visual angle.
2. The method according to claim 1, wherein when the angle-of-view adjustment operation is detected and the aiming point is located in the absorption area of the target virtual object, acquiring the absorption force applied to the angle of view and the angle-of-view adjustment force corresponding to the angle-of-view adjustment operation according to the motion state of the target virtual object, comprises:
when the visual angle adjusting operation is detected, acquiring an adsorption area of the target virtual object;
emitting rays from the position of the aiming point along the current visual angle;
and when the ray passes through the adsorption area, acquiring the adsorption force borne by the aiming point and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object.
3. The method according to claim 1, wherein the obtaining the attraction force applied to the view according to the motion state of the target virtual object comprises:
when the target virtual object is moving, a first adsorption force and a second adsorption force borne by the visual angle are obtained, the first adsorption force is used for controlling the aiming point to follow the target virtual object, and the second adsorption force is used for assisting the aiming point to move towards the target virtual object;
and acquiring the first adsorption force or the second adsorption force as the adsorption force received by the visual angle.
4. The method according to claim 3, wherein the acquiring the first adsorption force or the second adsorption force as the adsorption force borne by the visual angle comprises:
and acquiring the adsorption force with the larger value of the first adsorption force and the second adsorption force as the adsorption force borne by the visual angle.
5. The method according to claim 1, wherein the obtaining the attraction force applied to the view according to the motion state of the target virtual object comprises:
and when the target virtual object is static, acquiring a second adsorption force borne by the visual angle as an adsorption force borne by the visual angle, wherein the second adsorption force is used for assisting the aiming point to move towards the target virtual object.
6. The method of any of , wherein the method further comprises:
when the visual angle adjusting operation is used for keeping the aiming point away from the target virtual object, acquiring a force opposite to the visual angle adjusting operation direction as a damping force borne by the visual angle;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the damping force of the visual angle and the visual angle adjusting force.
7. The method according to claim 6, wherein the obtaining of the target rotation speed of the visual angle of the virtual scene according to the adsorption force, the damping force and the visual angle adjusting force borne by the visual angle comprises any one of the following:
when the damping force borne by the visual angle is larger than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene;
and when the resultant force of the damping force borne by the visual angle and the adsorption force is greater than the visual angle adjusting force, acquiring zero as the target rotating speed of the visual angle of the virtual scene.
8. The method of claim 1, further comprising:
when the end of the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the target virtual object moves, the adsorption force borne by the visual angle is acquired;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force borne by the visual angle.
9. The method according to claim 1, wherein when the angle-of-view adjustment operation is detected and the aiming point is located in the absorption area of the target virtual object, acquiring the absorption force applied to the angle of view and the angle-of-view adjustment force corresponding to the angle-of-view adjustment operation according to the motion state of the target virtual object, comprises:
when the visual angle adjusting operation is detected and the aiming point is located in the adsorption area of the target virtual object, if the motion state, the health state or the virtual scene of the currently controlled virtual object is in the first state, the step of acquiring the adsorption force borne by the visual angle and the visual angle adjusting force corresponding to the visual angle adjusting operation according to the motion state of the target virtual object is executed.
10. The method of claim 1, further comprising:
when the visual angle adjusting operation is detected and the aiming point is positioned in the adsorption area of the target virtual object, if the motion state and the health state of the currently controlled virtual object or the virtual scene in which the currently controlled virtual object is positioned are in a second state, the visual angle adjusting force corresponding to the visual angle adjusting operation is acquired;
and acquiring the target rotation speed of the visual angle of the virtual scene according to the visual angle adjusting force.
11. An apparatus for displaying a virtual scene, comprising:
the acquisition module is used for acquiring the adsorption force borne by a visual angle and the visual angle adjustment force corresponding to the visual angle adjustment operation according to the motion state of a target virtual object when the visual angle adjustment operation is detected and an aiming point is positioned in the adsorption area of the target virtual object, wherein the adsorption force is used for moving the aiming point to the target virtual object;
the acquisition module is further used for acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the visual angle adjusting force;
and the display module is used for displaying the virtual scene which changes along with the rotation of the visual angle in the process of controlling the visual angle to rotate according to the target rotation speed.
12. The apparatus of claim 11, wherein the obtaining module is configured to:
when the target virtual object is moving, a first adsorption force and a second adsorption force borne by the visual angle are obtained, the first adsorption force is used for controlling the aiming point to follow the target virtual object, and the second adsorption force is used for assisting the aiming point to move towards the target virtual object;
and acquiring the first adsorption force or the second adsorption force as the adsorption force received by the visual angle.
13. The apparatus of claim 11 or 12, wherein the obtaining module is further configured to:
when the visual angle adjusting operation is used for keeping the aiming point away from the target virtual object, acquiring a force opposite to the visual angle adjusting operation direction as a damping force borne by the visual angle;
and acquiring the target rotating speed of the visual angle of the virtual scene according to the adsorption force and the damping force of the visual angle and the visual angle adjusting force.
14. An electronic device, characterized in that the electronic device comprises one or more processors and one or more memories, the one or more memories storing at least one program code, the program code being loaded and executed by the one or more processors to implement the operations performed by the virtual scene display method of any one of claims 1 to 10.
15. A computer-readable storage medium having stored therein at least one program code that is loaded and executed by a processor to perform the operations performed by the virtual scene display method of any of claims 1 to .
CN201910992462.9A 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium Active CN110732135B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910992462.9A CN110732135B (en) 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910992462.9A CN110732135B (en) 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110732135A true CN110732135A (en) 2020-01-31
CN110732135B CN110732135B (en) 2022-03-08

Family

ID=69269255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910992462.9A Active CN110732135B (en) 2019-10-18 2019-10-18 Virtual scene display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110732135B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1086729A2 (en) * 1999-09-24 2001-03-28 Konami Corporation Shooting video game system and image displaying method in shooting video game
CN107913515A (en) * 2017-10-25 2018-04-17 网易(杭州)网络有限公司 Information processing method and device, storage medium, electronic equipment
CN108415639A (en) * 2018-02-09 2018-08-17 腾讯科技(深圳)有限公司 Visual angle regulating method, device, electronic device and computer readable storage medium
CN109847336A (en) * 2019-02-26 2019-06-07 腾讯科技(深圳)有限公司 Virtual scene display method, apparatus, electronic equipment and storage medium
CN110147159A (en) * 2017-09-21 2019-08-20 腾讯科技(深圳)有限公司 Object localization method, device and electronic equipment in virtual interacting scene

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111408132A (en) * 2020-02-17 2020-07-14 网易(杭州)网络有限公司 Game picture display method, device, equipment and storage medium
CN111408132B (en) * 2020-02-17 2023-04-07 网易(杭州)网络有限公司 Game picture display method, device, equipment and storage medium
CN111589132A (en) * 2020-04-26 2020-08-28 腾讯科技(深圳)有限公司 Virtual item display method, computer equipment and storage medium
CN111784844A (en) * 2020-06-09 2020-10-16 当家移动绿色互联网技术集团有限公司 Method and device for observing virtual object, storage medium and electronic equipment
CN111784844B (en) * 2020-06-09 2024-01-05 北京五一视界数字孪生科技股份有限公司 Method and device for observing virtual object, storage medium and electronic equipment
CN111888762A (en) * 2020-08-13 2020-11-06 网易(杭州)网络有限公司 Method for adjusting visual angle of lens in game and electronic equipment
CN113144593A (en) * 2021-03-19 2021-07-23 网易(杭州)网络有限公司 Target aiming method and device in game, electronic equipment and storage medium
CN113633976B (en) * 2021-08-16 2023-06-20 腾讯科技(深圳)有限公司 Operation control method, device, equipment and computer readable storage medium
CN117170504A (en) * 2023-11-01 2023-12-05 南京维赛客网络科技有限公司 Method, system and storage medium for viewing with person in virtual character interaction scene
CN117170504B (en) * 2023-11-01 2024-01-19 南京维赛客网络科技有限公司 Method, system and storage medium for viewing with person in virtual character interaction scene

Also Published As

Publication number Publication date
CN110732135B (en) 2022-03-08

Similar Documents

Publication Publication Date Title
CN110732135B (en) Virtual scene display method and device, electronic equipment and storage medium
KR102592632B1 (en) Methods and devices, electronic devices and storage media for generating mark information in a virtual environment
CN109847336B (en) Virtual scene display method and device, electronic equipment and storage medium
US20210387087A1 (en) Method for controlling virtual object and related apparatus
US9764226B2 (en) Providing enhanced game mechanics
JP2022533321A (en) VIRTUAL OBJECT CONTROL METHOD, APPARATUS, DEVICE AND COMPUTER PROGRAM
CN111399639B (en) Method, device and equipment for controlling motion state in virtual environment and readable medium
US9004997B1 (en) Providing enhanced game mechanics
CN110465087B (en) Virtual article control method, device, terminal and storage medium
JP2022539289A (en) VIRTUAL OBJECT AIMING METHOD, APPARATUS AND PROGRAM
KR102665665B1 (en) Method and apparatus for displaying virtual environment pictures, devices, and storage media
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
US20230013014A1 (en) Method and apparatus for using virtual throwing prop, terminal, and storage medium
WO2022242400A1 (en) Method and apparatus for releasing skills of virtual object, device, medium, and program product
US20240070974A1 (en) Method and apparatus for displaying virtual environment picture, device, and storage medium
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
US20220161138A1 (en) Method and apparatus for using virtual prop, device, and storage medium
WO2022156491A1 (en) Virtual object control method and apparatus, and device, storage medium and program product
US11786817B2 (en) Method and apparatus for operating virtual prop in virtual environment, device and readable medium
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
WO2023130807A1 (en) Front sight control method and apparatus in virtual scene, electronic device, and storage medium
CN112316429A (en) Virtual object control method, device, terminal and storage medium
JP2022552752A (en) Screen display method and device for virtual environment, and computer device and program
WO2022007567A1 (en) Virtual resource display method and related device
WO2023071808A1 (en) Virtual scene-based graphic display method and apparatus, device, and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40021454

Country of ref document: HK

GR01 Patent grant