CN114100128A - Prop special effect display method and device, computer equipment and storage medium - Google Patents

Prop special effect display method and device, computer equipment and storage medium

Info

Publication number
CN114100128A
CN114100128A (application CN202111500232.XA)
Authority
CN
China
Prior art keywords
virtual
prop
virtual object
special effect
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111500232.XA
Other languages
Chinese (zh)
Other versions
CN114100128B (en
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111500232.XA priority Critical patent/CN114100128B/en
Publication of CN114100128A publication Critical patent/CN114100128A/en
Application granted granted Critical
Publication of CN114100128B publication Critical patent/CN114100128B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a prop special effect display method and device, computer equipment and a storage medium, belonging to the technical field of computers. The method comprises the following steps: displaying a view picture of a first virtual object controlled by the local terminal device; displaying the prop special effect of a triggered virtual prop when the virtual prop is within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance; and not displaying the prop special effect of the virtual prop when the triggered virtual prop is within the field of view but the distance between the virtual prop and the first virtual object is greater than the target distance. With the method provided by the embodiment of the application, the prop special effect of a virtual prop is displayed only when the triggered virtual prop is within the field of view and the distance between the virtual prop and the virtual object is not greater than the target distance, which saves the resources occupied by displaying prop special effects.

Description

Prop special effect display method and device, computer equipment and storage medium
Technical Field
The embodiment of the application relates to the technical field of computers, and in particular to a prop special effect display method and device, computer equipment and a storage medium.
Background
With the development of computer technology, online games have become increasingly rich and diverse. In an online game, when any virtual prop is triggered, the prop special effect of the virtual prop can be displayed to improve the display effect. However, when multiple virtual props are triggered, the prop special effects of all of those props need to be displayed, which occupies more resources.
Disclosure of Invention
The embodiment of the application provides a prop special effect display method and device, computer equipment and a storage medium, which can save the resources occupied by displaying prop special effects. The technical scheme is as follows:
in one aspect, a prop special effect display method is provided, the method comprising:
displaying a view picture of a first virtual object controlled by the local terminal device;
displaying the prop special effect of a triggered virtual prop when the virtual prop is within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance; and
not displaying the prop special effect of the virtual prop when the triggered virtual prop is within the field of view but the distance between the virtual prop and the first virtual object is greater than the target distance.
In another aspect, a prop special effect display apparatus is provided, the apparatus including:
the display module is used for displaying a view picture of a first virtual object controlled by the local terminal device;
the display module is further configured to display a prop special effect of the virtual prop when the triggered virtual prop is located in a field of view of the first virtual object and a distance between the virtual prop and the first virtual object is not greater than a target distance;
the display module is further configured to not display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view and a distance between the virtual prop and the first virtual object is greater than the target distance.
In another possible implementation manner, the display module is configured to display the prop special effect of the virtual prop when the virtual prop is triggered, a second virtual object holding the virtual prop is located in the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display the firing special effect of the virtual firearm if any virtual firearm is triggered, a second virtual object holding the virtual firearm is located in the field of view, and a distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display a prop special effect of the virtual prop when the virtual prop moves in the field of view after being triggered and a distance between the virtual prop and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display the movement special effect of the virtual bullet if any virtual bullet moves within the field of view after being ejected and the distance between the virtual bullet and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display the prop special effect of the virtual prop at the contact position when the virtual prop is in contact with any virtual article after being triggered, the contact position is within the field of view, and a distance between the contact position and the position of the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module is configured to display the bullet hole special effect of the virtual bullet at the contact position if any virtual bullet is in contact with the virtual article after being ejected, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module includes:
a determination unit, configured to determine a bullet hole special effect matched with the virtual article when the virtual bullet is in contact with the virtual article after being ejected, the contact position is within the visual field, and a distance between the contact position and the position of the first virtual object is not greater than the target distance;
and the display unit is used for displaying the determined bullet hole special effect at the contact position.
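The determination unit's matching of a bullet hole special effect to the contacted virtual article can be sketched as a simple lookup keyed by the article's surface material. This is an illustrative assumption, not the patent's implementation; the material names and effect identifiers below are hypothetical:

```python
# Hypothetical mapping from an article's surface material to a bullet hole
# special effect asset identifier.
BULLET_HOLE_EFFECTS = {
    "wood": "bullet_hole_wood",
    "metal": "bullet_hole_metal",
    "glass": "bullet_hole_glass",
}

def bullet_hole_effect(material, default="bullet_hole_generic"):
    """Pick the bullet hole special effect matched to the contacted article."""
    return BULLET_HOLE_EFFECTS.get(material, default)
```

The display unit would then render the returned effect at the contact position.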
In another possible implementation manner, the apparatus further includes:
the acquisition module is used for acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered;
and the display module is used for displaying the prop special effect of the virtual prop under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance.
In another possible implementation manner, the apparatus further includes:
the acquisition module is used for acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop under the condition that the virtual prop is triggered;
a determining module, configured to determine a position of the virtual prop based on the position of the second virtual object, an orientation of the second virtual object, and a relative positional relationship between the second virtual object and the virtual prop;
and the display module is used for displaying the prop special effect of the virtual prop under the condition that the position of the virtual prop is in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
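One way to realize the position derivation above is to store the relative positional relationship as an offset in the holder's local frame and rotate it by the holder's orientation. The 2D rotation below is an illustrative simplification (the function name and parameters are assumptions, not the patent's implementation):

```python
import math

def prop_position(holder_pos, holder_yaw_deg, relative_offset):
    """Derive the prop's world position from the holder's position, the
    holder's orientation (yaw in degrees), and the prop's offset expressed
    in the holder's local (forward, right) frame."""
    yaw = math.radians(holder_yaw_deg)
    fwd, right = relative_offset  # offset along the holder's forward and right axes
    x = holder_pos[0] + fwd * math.cos(yaw) - right * math.sin(yaw)
    y = holder_pos[1] + fwd * math.sin(yaw) + right * math.cos(yaw)
    return (x, y)
```

The resulting position can then be tested against the field of view and the target distance.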
In another possible implementation manner, the apparatus further includes:
the obtaining module is used for obtaining the position of the first virtual object and the position of a third virtual object for triggering the virtual prop under the condition that the virtual prop is triggered and starts to move;
a determining module, configured to determine a current position of the virtual prop based on a position of the third virtual object and a moving direction of the virtual prop;
and the display module is used for displaying the moving special effect of the virtual prop under the condition that the virtual prop moves in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
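Determining the current position of a moving prop from the trigger position and the moving direction can be sketched with a constant-speed straight-line model. This is a hypothetical simplification; the speed parameter and straight-line assumption are not stated in the source:

```python
def current_position(start_pos, direction, speed, elapsed):
    """Position of a projectile moving in a straight line at constant speed.
    `direction` is assumed to be a unit vector; `elapsed` is seconds since
    the prop was triggered."""
    return tuple(s + d * speed * elapsed for s, d in zip(start_pos, direction))
```

The display module would recompute this position each frame and re-evaluate the field-of-view and target-distance conditions.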
In another aspect, a computer device is provided, and the computer device includes a processor and a memory, where at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the operations performed by the prop special effect display method according to the above aspect.
In another aspect, a computer-readable storage medium is provided, in which at least one computer program is stored, and the at least one computer program is loaded and executed by a processor to implement the operations performed by the prop special effect display method according to the above aspect.
In a further aspect, a computer program product is provided, which includes a computer program, and when the computer program is executed by a processor, the computer program implements the operations performed by the prop special effect display method according to the above aspect.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
the method, the device, the computer device and the storage medium provided by the embodiment of the application determine whether to display the special effect of the triggered virtual prop based on the visual field of the virtual object controlled by the local device and the target distance, only when the triggered virtual prop is in the visual field of the virtual object controlled by the local device and the distance between the virtual prop and the virtual object is less than the target distance, the special effect of the virtual prop is displayed, and the special effect of the virtual prop which is in the visual field and the distance between the virtual prop and the virtual object is not required to be displayed and is greater than the target distance is not required to be displayed, so that resources required to be occupied for displaying the special effect of the prop are saved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application;
fig. 2 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
fig. 3 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 5 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a virtual scene provided in an embodiment of the present application;
fig. 7 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
fig. 8 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
fig. 9 is a schematic diagram of a bullet hole special effect provided in an embodiment of the present application;
FIG. 10 is a schematic diagram of a bullet hole effect provided by an embodiment of the present application;
fig. 11 is a schematic diagram of a bullet hole effect provided in an embodiment of the present application;
fig. 12 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a firing effect and a movement effect provided by an embodiment of the present application;
FIG. 14 is a schematic diagram of a firing effect and a movement effect provided by an embodiment of the present application;
fig. 15 is a flowchart of a prop special effect display method provided in an embodiment of the present application;
fig. 16 is a schematic structural diagram of a prop special effect display apparatus provided in an embodiment of the present application;
fig. 17 is a schematic structural diagram of a prop special effect display apparatus provided in an embodiment of the present application;
fig. 18 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 19 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
The terms "first," "second," "third," and the like as used herein may describe various concepts, but the concepts are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, a first virtual object may be referred to as a second virtual object, and similarly, a second virtual object may be referred to as a first virtual object, without departing from the scope of the present application.
As used herein, "at least one" includes one, two, or more than two; "a plurality" includes two or more than two; "each" refers to every one of the corresponding plurality; and "any" refers to any one of the plurality. For example, if a plurality of virtual objects includes 3 virtual objects, "each" refers to every one of the 3 virtual objects, and "any" refers to any one of the 3, which may be the first virtual object, the second virtual object, or the third virtual object.
For the convenience of understanding the embodiments of the present application, some terms referred to in the embodiments of the present application are explained as follows:
Mobile terminal: includes a cell phone, tablet, or other handheld portable gaming device.
Shooting games: include first-person shooting games, third-person shooting games, or other games that use hot-weapon types to carry out ranged attacks.
Virtual scene: a scene displayed (or provided) by an application program when it runs on a terminal. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. It is any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, which is not limited in the present application. For example, the virtual scene includes sky, land, sea, and the like; the land includes environmental elements such as deserts and cities; and the user can control a virtual object to move in the virtual scene. The virtual scene also includes virtual items, for example throwing objects, buildings, vehicles, or props such as the weapons a virtual object needs to arm itself or fight other virtual objects, and it can also simulate real environments under different weather, such as sunny, rainy, foggy, or night conditions. The variety of scene elements enhances the diversity and realism of the virtual scene.
Virtual object: a character that can move in a virtual scene, such as a virtual character, a virtual animal, or an animation character. The virtual object is an avatar in the virtual scene that represents the user. The virtual scene may include multiple virtual objects, each of which has its own shape and volume and occupies part of the space in the virtual scene. Optionally, a virtual object is a character controlled by operations on the client, an Artificial Intelligence (AI) placed in the virtual-scene match through training, or a Non-Player Character (NPC) placed in the virtual-scene match. Optionally, the virtual object is a virtual character competing in the virtual scene. Optionally, the number of virtual objects in a match is preset, or determined dynamically according to the number of clients participating in the match, which is not limited in the embodiment of the present application.
Optionally, the user can control the virtual object to move in the virtual scene. For example, in a shooting game, the user controls the virtual object to free-fall, glide, or open a parachute to descend in the sky of the virtual scene; to run, jump, crawl, or bend forward on land; to swim, float, or dive in the sea; or to move through the virtual scene in a vehicle. The user can also control the virtual object to enter and exit buildings in the virtual scene and to discover and pick up virtual props (e.g., throwing objects, weapons, and the like) so as to fight other virtual objects with the picked-up props. The virtual props are, for example, clothing, helmets, bulletproof vests, medical supplies, cold weapons, or hot weapons, or props left behind after other virtual objects are eliminated. The above scenarios are merely illustrative, and the embodiments of the present application are not limited thereto.
Virtual prop: a prop that a virtual object can use in a virtual scene. Taking shooting games as an example, they provide props such as virtual bombs and virtual torpedoes, as well as shooting props such as virtual firearms and virtual crossbows; the virtual bullets ejected by virtual firearms and the virtual arrows ejected by virtual crossbows can also be called virtual props. These props can damage the attacked virtual objects. Virtual props can also assist a virtual object in achieving a certain purpose; for example, a smoke cartridge can help a virtual object conceal its form. It should be noted that the type of the virtual prop is not limited in the embodiment of the present application.
The prop special effect display method provided by the embodiment of the application is executed by a computer device. Optionally, the computer device is a terminal or a server. Optionally, the server is an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms. Optionally, the terminal is a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart voice interaction device, a smart home appliance, a vehicle-mounted terminal, or the like, but is not limited thereto.
In some embodiments, the computer program according to the embodiments of the present application may be deployed and executed on one computer device, on multiple computer devices located at one site, or on multiple computer devices distributed across multiple sites and interconnected by a communication network; the multiple distributed, interconnected computer devices can form a blockchain system.
In some embodiments, the computer device is provided as a terminal. Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application. Referring to fig. 1, the implementation environment includes at least one terminal 101 (2 are taken as an example in fig. 1) and a server 102. The terminal 101 and the server 102 are connected via a wireless or wired network.
The terminal 101 is installed with a target application served by the server 102, and the target application supports virtual scene display; for example, the target application is any one of a military simulation application, a Role-Playing Game (RPG), a Multiplayer Online Battle Arena (MOBA) game, or a multiplayer gunfight survival game. The terminal 101 is a terminal used by any user, who uses the terminal 101 to operate a virtual object located in the virtual scene to carry out activities including at least one of crawling, walking, running, jumping, driving, picking up, shooting, attacking, and throwing.
In one possible implementation, different users use different terminals to control their respective virtual objects, and the virtual objects controlled by the different terminals are located in the same virtual scene, where they can all be active. For example, a first terminal controls a first virtual object and displays the view picture of the first virtual object; in the virtual scene, the first virtual object or virtual objects controlled by other terminals can trigger a virtual prop. If any virtual prop is triggered, is within the field of view of the first virtual object, and its distance from the first virtual object is not greater than the target distance, the prop special effect of the virtual prop is displayed in the view picture. If the virtual prop is triggered but is not within the field of view of the first virtual object, or it is within the field of view but its distance from the first virtual object is greater than the target distance, the prop special effect of the virtual prop is not displayed.
Fig. 2 is a flowchart of a prop special effect display method provided in an embodiment of the present application. The method is executed by a terminal and, as shown in fig. 2, includes:
201. The terminal displays a view picture of the first virtual object controlled by the local terminal device.
The first virtual object is a virtual object controlled by the terminal; for example, the virtual object is a virtual character, a virtual animal, or a virtual vehicle. The view picture displays the virtual scene as observed from the first virtual object, for example, the view picture corresponding to the field of view from the first-person perspective of the first virtual object, or from the third-person perspective corresponding to the first virtual object. For example, when the first virtual object is a virtual character and the virtual scene is observed from the first-person perspective of that character, the displayed view picture simulates what a real person could observe in a real scene. For example, the view picture includes a virtual building, a vehicle, a virtual object, or the like.
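The difference between the two perspectives can be illustrated by how the camera position is derived from the first virtual object. This is a purely illustrative 2D sketch; the function name and the pull-back offset are assumptions, not anything specified in the source:

```python
def camera_position(object_pos, facing, perspective, third_person_offset=3.0):
    """Camera placement for a view picture: at the object's own position for
    first person, or pulled back behind the object for third person.
    `facing` is assumed to be a unit vector."""
    if perspective == "first":
        return object_pos
    # Third person: move the camera back along the opposite of the facing direction.
    return tuple(p - f * third_person_offset for p, f in zip(object_pos, facing))
```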
202. The terminal displays the prop special effect of the virtual prop when the triggered virtual prop is within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance.
The virtual prop is any type of prop in the virtual scene that can be triggered. For example, the virtual prop is a virtual firearm, a virtual torpedo, a virtual bullet, or the like. The target distance is an arbitrary distance, for example 100 meters or 200 meters. The prop special effect is the effect presented when the virtual prop is triggered. For example, for a virtual firearm, the prop special effect is a firing special effect, which presents the effect of the firearm firing. For another example, for a virtual bullet, the prop special effect is a movement special effect or a bullet hole special effect: the movement special effect shows the bullet's movement, and the bullet hole special effect shows the bullet hole the bullet forms.
In the embodiment of the present application, the field of view includes the part of the virtual scene observed from the perspective of the first virtual object; the field of view corresponds to the view picture of the first virtual object, and the virtual scene within the field of view is displayed in the view picture. The triggered virtual prop being within the field of view of the first virtual object indicates that the triggered virtual prop can be seen. The target distance represents the maximum distance at which a prop special effect can be presented in the field of view: if the distance between the triggered virtual prop and the first virtual object is not greater than the target distance, both the virtual prop and the prop special effect triggered on it can be seen, so the prop special effect is displayed in the view picture.
203. The terminal does not display the prop special effect of the virtual prop when the triggered virtual prop is within the field of view but the distance between the virtual prop and the first virtual object is greater than the target distance.
In this embodiment of the application, when the distance between the triggered virtual prop and the first virtual object is greater than the target distance, the virtual prop can be seen in the field of view, but the prop special effect triggered on it would not be discernible: even if the prop special effect were displayed, its display effect would not be obvious. The prop special effect of the virtual prop is therefore not displayed in the view picture, which saves the resources occupied by displaying prop special effects.
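The decision in steps 202 and 203 can be sketched as a single check combining a field-of-view test with the target-distance test. The sketch below is a hypothetical illustration, not the patent's implementation; the function names, the viewing-cone test, and the half-angle value are assumptions (`facing` is assumed to be a unit vector):

```python
import math

def distance(a, b):
    """Euclidean distance between two (x, y, z) positions."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def in_field_of_view(observer_pos, facing, target_pos, half_angle_deg=60.0):
    """Rough field-of-view test: is the target within the observer's viewing cone?"""
    to_target = [t - o for o, t in zip(observer_pos, target_pos)]
    norm = math.sqrt(sum(c * c for c in to_target))
    if norm == 0:
        return True  # target coincides with the observer
    cos_angle = sum(f * c for f, c in zip(facing, to_target)) / norm
    return cos_angle >= math.cos(math.radians(half_angle_deg))

def should_show_effect(observer_pos, facing, prop_pos, target_distance):
    """Show the prop special effect only if the prop is in view and close enough."""
    return (in_field_of_view(observer_pos, facing, prop_pos)
            and distance(observer_pos, prop_pos) <= target_distance)
```

A prop behind the observer, or in view but beyond the target distance, fails the check and its effect is skipped.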
The method provided by the embodiment of the application determines whether to display the prop special effect of a triggered virtual prop based on the field of view of the virtual object controlled by the local terminal device and on the target distance. The prop special effect is displayed only when the triggered virtual prop is within that field of view and its distance from the virtual object is not greater than the target distance; for a virtual prop that is within the field of view but farther from the virtual object than the target distance, the prop special effect need not be displayed, which saves the resources occupied by displaying prop special effects.
On the basis of the embodiment shown in fig. 2, the virtual prop is one that can be held by a virtual object. When the virtual prop is triggered, whether the second virtual object holding the virtual prop is in the visual field and whether the distance between the second virtual object and the first virtual object is greater than the target distance are determined, so as to decide whether to display the prop special effect of the virtual prop.
Fig. 3 is a flowchart of a method for displaying a prop special effect provided in an embodiment of the present application, executed by a terminal. As shown in fig. 3, the method includes:
301. And the terminal displays a visual field picture of the first virtual object controlled by the local terminal equipment.
In the embodiment of the present application, a virtual scene included in the visual field of the first virtual object is presented in the visual field screen displayed by the terminal.
In one possible implementation, the distance between any location within the field of view and the location of the first virtual object is no greater than the viewable distance.
The visible distance is the farthest distance that the first virtual object can observe, and can be any distance; for example, 1000 meters. The visual field includes only those positions that are no farther from the position of the first virtual object than the visible distance and that can be observed from the perspective of the first virtual object. For example, if the distance between a first position and the position of the first virtual object is not greater than the visible distance but the first position is behind the first virtual object, the first position cannot be observed from the perspective of the first virtual object, and the first position is not within the visual field. Alternatively, if the distance between a first position and the position of the first virtual object is not greater than the visible distance but an obstacle lies between them, the first position is blocked by the obstacle when the virtual scene is observed from the perspective of the first virtual object, so the first position cannot be observed and is not within the visual field. As shown in fig. 4, an obstacle 403 exists between the virtual object 401 and the virtual object 402; even though the two virtual objects are close together, the virtual object 402 is not in the field of view of the virtual object 401.
Optionally, a ray is cast to determine whether an obstacle exists between two positions: a ray is emitted from the position of the first virtual object toward a first position; if the ray does not contact any virtual object before reaching the first position, it is determined that no obstacle exists between the first position and the position of the first virtual object; if the ray contacts a virtual object before reaching the first position, it is determined that an obstacle exists between the first position and the position of the first virtual object. As shown in fig. 5, the first virtual object shoots based on a shooting prop, and a ray 501 is emitted from the position of the muzzle of the virtual firearm shown in fig. 5; the ray 501 is intercepted by the collision box of the box beside the virtual object in fig. 5, so the ray 501 does not hit the virtual object to be detected but hits the box beside it.
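The ray test can be approximated as follows. This is a 2-D sketch under assumed names: obstacles are modeled as axis-aligned collision boxes, and the ray is sampled at small steps rather than using a game engine's physics raycast:

```python
import math

def line_of_sight_blocked(start, end, collision_boxes, step=0.1):
    """March along the ray from `start` toward `end`; if any sample point
    before `end` falls inside an obstacle's collision box, the ray is
    intercepted and an obstacle exists between the two positions."""
    sx, sy = start
    ex, ey = end
    length = math.hypot(ex - sx, ey - sy)
    samples = max(1, int(length / step))
    for i in range(1, samples):
        t = i / samples
        px = sx + (ex - sx) * t
        py = sy + (ey - sy) * t
        for (x0, y0, x1, y1) in collision_boxes:
            if x0 <= px <= x1 and y0 <= py <= y1:
                return True   # ray hit a collision box before the target
    return False              # ray reached the target unobstructed
```

This mirrors fig. 5: a box whose collision volume lies between the shooter and the target intercepts the ray, so the target is treated as occluded.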
In one possible implementation, the first virtual object corresponds to a virtual camera, and the step 301 includes: the terminal displays the virtual scene shot by the virtual camera.
The virtual scene shot by the virtual camera is the visual field picture of the first virtual object. Optionally, the visual field of the first virtual object is displayed at a first-person perspective or a third-person perspective: the virtual scene captured by a first virtual camera is the visual field displayed at the first-person perspective, and the virtual scene captured by a second virtual camera is the visual field displayed at the third-person perspective.
Wherein the first virtual camera and the second virtual camera are located at different positions. For example, the position of the first virtual camera coincides with the eyes of the first virtual object, while the second virtual camera is above and behind the head of the first virtual object; only the arm of the first virtual object and the virtual prop it holds are displayed in the visual field picture at the first-person perspective, whereas the back, arm, and hand-held virtual prop of the first virtual object can all be displayed at the third-person perspective.
302. And the terminal displays the prop special effect of the virtual prop under the conditions that the virtual prop is triggered, the second virtual object of the handheld virtual prop is positioned in the visual field of the first virtual object, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In the embodiment of the present application, the second virtual object is located in the same virtual scene as the first virtual object. Optionally, the second virtual object is a virtual object controlled by another terminal, or is a non-player character. For example, the virtual scene is a multi-player competition scene, each player controls one virtual object through a terminal, and a plurality of players can realize multi-player competition in the virtual scene through the controlled plurality of virtual objects. For another example, the virtual scene is a stand-alone scene, in the virtual scene, only the first virtual object is a virtual object controlled by the player through the terminal, the other virtual objects in the virtual scene are non-player characters, and the player performs a match with the non-player characters in the virtual scene through the controlled first virtual object.
The virtual prop is a virtual prop that can be held by a virtual object in the virtual scene: for example, a shooting prop such as a virtual firearm or a virtual crossbow, or a throwing prop such as a virtual bomb or a virtual grenade. The virtual prop can be triggered while a virtual object holds it. For example, if the virtual prop is a virtual firearm, the virtual object can shoot based on the held virtual firearm; if the virtual prop is a virtual grenade, the fuse of the virtual grenade can be pulled while the virtual object holds it; or, if the virtual prop is a fire bottle, the hand-held fire bottle can be ignited while the virtual object holds it.
Different virtual props have different prop special effects when triggered. For example, if the virtual prop is a virtual grenade, the virtual grenade emits smoke once a virtual object pulls the fuse of the hand-held virtual grenade; that is, the prop special effect when the virtual grenade is triggered is a smoke special effect. Or, if the virtual prop is a fire bottle, the fire bottle burns once the virtual object ignites the hand-held fire bottle; that is, the prop special effect when the fire bottle is triggered is a burning special effect.
When the virtual prop is triggered, it is held by a second virtual object, and the position of the second virtual object holding the virtual prop is close to the position of the virtual prop, so the position of the second virtual object can be taken as the position of the virtual prop. Determining whether the second virtual object is in the visual field of the first virtual object therefore determines whether the virtual prop is in that visual field, and determining whether the distance between the second virtual object and the first virtual object is greater than the target distance determines whether the distance between the virtual prop and the first virtual object is greater than the target distance, which in turn determines whether to display the prop special effect when the virtual prop is triggered.
In one possible implementation, this step 302 includes: and the terminal loads the prop special effect of the virtual prop and displays the loaded prop special effect under the conditions that the virtual prop is triggered, a second virtual object of the handheld virtual prop is positioned in the visual field, and the distance between the second virtual object and the first virtual object is not more than the target distance.
When it is determined that the prop special effect of the virtual prop needs to be displayed, the prop special effect of the virtual prop is loaded and the loaded prop special effect is displayed, so as to present the prop special effect of the triggered virtual prop.
In one possible implementation, the positions within the field of view of the first virtual object whose distance to the position of the first virtual object is not greater than the target distance constitute the special effect visible range. If any virtual prop is triggered while within the special effect visible range, the prop special effect of the virtual prop is displayed in the visual field picture; if any virtual prop is triggered while outside the special effect visible range, the prop special effect of the virtual prop is not displayed in the visual field picture. When the virtual prop is not in the special effect visible range, displaying its prop special effect would not produce a noticeable effect anyway, so not displaying prop special effects outside the special effect visible range does not reduce the display effect but does save the resources occupied by displaying special effects. In the embodiment of the application, only the prop special effects of triggered virtual props within the special effect visible range are displayed, and no prop special effect is displayed when a virtual prop that is in the visual field but outside the special effect visible range is triggered, so the prop special effects of virtual props do not need to be loaded frequently, saving the resources occupied by displaying special effects.
In one possible implementation, this step 302 includes: and displaying the firing effect of the virtual firearm under the condition that any virtual firearm is triggered, a second virtual object of the handheld virtual firearm is positioned in the visual field, and the distance between the second virtual object and the first virtual object is not more than the target distance.
In this embodiment, the virtual prop is a virtual firearm and the second virtual object shoots based on the hand-held virtual firearm. If the second virtual object is in the visual field and within the special effect visible range, the firing special effect of the virtual firearm is displayed in the visual field picture; for example, fire light emitted from the muzzle of the virtual firearm.
303. And the terminal does not display the prop special effect of the virtual prop under the condition that the virtual prop is triggered, a second virtual object of the handheld virtual prop is positioned in the visual field, and the distance between the second virtual object and the first virtual object is greater than the target distance.
In some embodiments, this step 303 comprises: and in the case that any virtual gun is triggered, a second virtual object of the handheld virtual gun is positioned in the visual field, and the distance between the second virtual object and the first virtual object is greater than the target distance, the firing special effect of the virtual gun is not displayed.
In a possible implementation manner, after step 301, the position of the second virtual object holding the virtual prop is obtained, and whether to display the prop special effect of the virtual prop is then determined according to the position of the second virtual object. The process includes the following steps 304 to 306:
304. And the terminal acquires the position of the first virtual object and the position of the second virtual object holding the virtual item under the condition that any virtual item is triggered.
Wherein the position of the first virtual object represents the position of the first virtual object in the virtual scene and the position of the second virtual object represents the position of the second virtual object in the virtual scene. The position can be expressed in any form, for example, the position is expressed in the form of coordinates.
In one possible implementation, this step 304 includes: the terminal checks the position of a first virtual object controlled by the terminal and the position of a second virtual object holding the virtual item under the condition that any virtual item is triggered.
In the embodiment of the application, the terminal can check the position of any virtual object in the virtual scene. When any virtual prop is triggered, the terminal checks, in the virtual scene, the position of the first virtual object it controls and the position of the second virtual object holding the virtual prop.
Optionally, the process of the terminal acquiring the position of the virtual object includes: the terminal responds to a trigger instruction of any virtual item, and the position of a first virtual object controlled by the terminal and the position of a second virtual object holding the virtual item are checked.
Wherein, the triggering instruction indicates to trigger the virtual prop.
For example, in a stand-alone game scene, the second virtual object is a non-player character, and the first virtual object competes with non-player characters in the virtual scene. During the match, the terminal generates a trigger instruction based on the running game logic of the stand-alone game; the trigger instruction indicates that a non-player character triggers its hand-held virtual prop, and the terminal then acquires the position of that non-player character and the position of the first virtual object.
In a possible implementation manner, the second virtual object is a virtual object controlled by another terminal, and step 304 then includes: the terminal receives the trigger instruction for any virtual prop and the position of the second virtual object holding the virtual prop, both synchronized by the server, and obtains the position of the first virtual object.
Wherein the trigger instruction is sent to the server by the terminal controlling the second virtual object.
In the embodiment of the application, each terminal controls one virtual object, and the virtual objects controlled by a plurality of terminals are located in the same virtual scene. Any terminal checks the position of its controlled virtual object in the virtual scene and synchronizes the detected position to the server. When the terminal detects a trigger operation on the virtual prop held by its controlled virtual object, it also synchronizes a trigger instruction for the virtual prop to the server. The server receives the positions of virtual objects and the trigger instructions of virtual props synchronized by the terminals, and synchronizes them to the other terminals, so that the other terminals can subsequently determine whether to display the special effect of the triggered virtual prop.
305. And the terminal displays the prop special effect of the virtual prop under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is not more than the target distance.
In this embodiment of the application, after the terminal acquires the position of the second virtual object, if that position is within the field of view of the first virtual object and its distance to the position of the first virtual object is not greater than the target distance, it is determined that the second virtual object, and therefore the virtual prop, is within the special effect visible range, so the prop special effect of the virtual prop can be displayed. Taking the position of the virtual object holding the virtual prop as the position of the virtual prop makes it convenient to judge, based on the position of the virtual object, whether the prop special effect of the virtual prop needs to be displayed; this simplifies the process of determining the position of the virtual prop, so the prop special effect of the virtual prop can be displayed in a timely manner.
In one possible implementation, the prop special effect of the virtual prop is a special effect with a display range.
Because the prop special effect of the virtual prop is a special effect with a display range, even if the triggered virtual prop itself is not in the visual field of the first virtual object, as long as the second virtual object holding the virtual prop is in the visual field, the prop special effect of the virtual prop can be displayed, which ensures the accuracy of displaying the prop special effect. As shown in fig. 6, the virtual object 601 is in the field of view of the first virtual object and within the special effect visible range, but the virtual firearm held by the virtual object 601 is not in the field of view; when the virtual object 601 shoots based on the held virtual firearm, the firing special effect of the virtual firearm is loaded and displayed, and part of the firing special effect can be seen at the edge of the field of view.
In one possible implementation, the process of determining whether the position of the second virtual object is within the field of view includes: projecting the position of the second virtual object onto the view angle picture shot by the virtual camera corresponding to the first virtual object; if the position of the second virtual object can be projected onto the view angle picture, it is determined that the position of the second virtual object is within the visual field, and if it cannot, it is determined that the position of the second virtual object is not within the visual field.
In the embodiment of the present application, a virtual scene included in a view angle picture captured by the virtual camera is a virtual scene in a field of view of the first virtual object. And projecting the visual angle picture shot by the virtual camera based on the position of the second virtual object to determine whether the position of the second virtual object is in the visual field of the first virtual object.
Optionally, the process of determining, based on the position of the second virtual object and the view angle picture shot by the virtual camera, whether the position of the second virtual object is within the field of view of the first virtual object includes: determining the position of the virtual camera in the virtual scene and the shooting direction of the virtual camera; mapping the position of the second virtual object to the plane where the virtual camera is located based on the mapping relation between a first coordinate system and a second coordinate system, to obtain a mapping position corresponding to the position of the second virtual object; if the mapping position is within the view angle range of the virtual camera, determining that the position of the second virtual object is within the visual field; and if the mapping position is not within the view angle range of the virtual camera, determining that the position of the second virtual object is not within the visual field.
The first coordinate system is a coordinate system in the virtual scene that takes any point in the virtual scene as its origin; the position of the second virtual object and the position of the virtual camera are both expressed as coordinates in the first coordinate system. The second coordinate system is the coordinate system of the virtual camera, with the intersection of the shooting direction of the virtual camera and the virtual lens as its origin; the mapping position is expressed as coordinates in the second coordinate system, and the mapping relation maps coordinates in the first coordinate system to coordinates in the second coordinate system. The view angle range is the range captured by the virtual camera in the second coordinate system; for example, the view angle range is represented in matrix form, and a position within the view angle range is one that can be projected, that is, one that can be captured by the virtual camera. Accordingly, whether the position of the second virtual object is within the field of view of the first virtual object is determined based on whether the mapping position is within the view angle range.
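The mapping described above can be sketched in two dimensions, where the camera's view angle range reduces to a half angle around the shooting direction. All names here are illustrative assumptions, and a real engine would use the full camera matrix rather than this scalar test:

```python
import math

def position_in_view_angle(cam_pos, cam_dir, point, half_angle_deg=45.0):
    """Map `point` from the scene coordinate system into the camera's frame
    and test whether it falls within the camera's view angle range: the
    angle between the shooting direction and the direction to the point
    must not exceed the half view angle."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return True  # the point coincides with the camera position
    fx, fy = cam_dir
    flen = math.hypot(fx, fy)
    cos_angle = (dx * fx + dy * fy) / (dist * flen)
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

A point in front of the camera passes the test; a point behind it fails, matching the "first position is behind the first virtual object" case discussed earlier.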
306. And under the condition that the position of the second virtual object is in the visual field and the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance, the terminal does not display the prop special effect of the virtual prop.
Step 306 is similar to step 305, and will not be described herein again.
In the embodiment of the present application, after the terminal acquires the position of the first virtual object and the position of the second virtual object, the terminal determines whether the position of the second virtual object is within the field of view and whether the distance between the position of the second virtual object and the position of the first virtual object is greater than the target distance. In yet another embodiment, the server makes these determinations: when the server determines that the position of the second virtual object is within the field of view of the first virtual object and that the distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance, the server sends a display notification to the terminal, and the terminal displays the prop special effect of the virtual prop in the visual field picture based on the display notification.
In one possible implementation manner, the server interacts with the terminal, and the process of displaying the prop special effect of the virtual prop by the terminal includes: when another terminal detects a trigger operation on a virtual prop held by the second virtual object it controls, it synchronizes the position of the second virtual object and the trigger instruction of the virtual prop to the server; the server receives the position of the second virtual object and the trigger instruction of the virtual prop, determines the visual field of the first virtual object based on the position and the sight direction of the first virtual object, and determines, based on the position of the second virtual object, whether that position is within the visual field of the first virtual object and whether the distance between it and the position of the first virtual object is greater than the target distance; when the server determines that the position of the second virtual object is within the visual field of the first virtual object and that the distance is not greater than the target distance, it sends a display notification to the terminal; and the terminal receives the display notification and, based on it, displays the prop special effect of the virtual prop held by the second virtual object in the visual field picture.
Wherein the display notification indicates that the virtual item held by the second virtual object is triggered and within the special effect visual range of the first virtual object.
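A minimal server-side sketch of this interaction follows. The callback-based message passing, the `in_view` predicate, and the notification payload shape are all assumptions made for illustration, not the application's actual protocol:

```python
import math

def handle_trigger_sync(first_pos, second_pos, in_view, target_distance, notify):
    """On receiving the second terminal's position and trigger instruction,
    the server checks whether the second virtual object lies in the first
    virtual object's special effect visible range; if so, it sends the
    first terminal a display notification for the triggered prop."""
    within_view = in_view(second_pos)
    within_range = math.dist(first_pos, second_pos) <= target_distance
    if within_view and within_range:
        notify({"event": "display_prop_effect", "holder_pos": second_pos})
        return True
    return False
```

On receipt of the notification, the terminal would load and display the prop special effect; when the function returns False, no notification is sent and the terminal displays nothing.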
The method provided by the embodiment of the application determines whether to display the prop special effect of a triggered virtual prop based on the visual field of the virtual object controlled by the local terminal equipment and on the target distance. The prop special effect is displayed only when the triggered virtual prop is in the visual field of that virtual object and the distance between the virtual prop and the virtual object is not greater than the target distance; a virtual prop that is in the visual field but farther than the target distance has no prop special effect displayed, saving the resources occupied by displaying prop special effects.
In addition, taking the position of the virtual object holding the virtual prop as the position of the virtual prop makes it convenient to judge, based on the position of the virtual object, whether the prop special effect of the virtual prop needs to be displayed; this simplifies the process of determining the position of the virtual prop, so the prop special effect of the virtual prop can be displayed in a timely manner.
In addition, because the prop special effect of the virtual prop is a special effect with a display range, even if the triggered virtual prop is not in the visual field of the first virtual object, as long as the second virtual object holding the virtual prop is in the visual field, the prop special effect of the virtual prop can be displayed, which ensures the accuracy of displaying the prop special effect.
On the basis of the embodiment shown in fig. 2, the virtual prop can move after being triggered, and the prop special effect of the virtual prop is displayed only while the virtual prop moves within the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance, as described in detail in the following embodiment.
Fig. 7 is a flowchart of a method for displaying a prop special effect provided in an embodiment of the present application, executed by a terminal. As shown in fig. 7, the method includes:
701. And the terminal displays a visual field picture of the first virtual object controlled by the local terminal equipment.
This step is the same as the above steps 201 and 301, and will not be described herein again.
702. And the terminal displays the prop special effect of the virtual prop under the condition that the virtual prop moves in the visual field of the first virtual object after being triggered and the distance between the virtual prop and the first virtual object is not greater than the target distance.
In the embodiment of the application, the virtual prop can move after being triggered. If, during the movement, the virtual prop moves within the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance, the prop special effect of the virtual prop is displayed in the visual field picture. Optionally, while the virtual prop is moving, the displayed prop special effect is a movement special effect. For example, if the virtual prop is a fire bottle thrown by a virtual object, a burning special effect of the fire bottle is displayed while the fire bottle is moving.
In one possible implementation, this step 702 includes: and displaying the moving special effect of the virtual bullet when any virtual bullet moves in the visual field after being shot and the distance between the virtual bullet and the first virtual object is not more than the target distance.
Wherein the virtual bullet is shot by any virtual object in the virtual scene based on a hand-held virtual firearm, and that virtual object may or may not be in the visual field. The movement special effect of the virtual bullet can be any type of special effect; for example, a tracer effect. For example, if a virtual bullet moves within the visual field and, over some movement distance, the distance between the virtual bullet and the first virtual object is never greater than the target distance, the movement special effect of the virtual bullet is displayed, presenting the trajectory of the virtual bullet.
Optionally, the virtual bullet moves after being shot until it contacts any virtual object or reaches a maximum movement distance. Optionally, the virtual bullet follows a parabolic motion after being shot. The maximum movement distance represents the range of the virtual firearm that fired the virtual bullet.
703. And the terminal does not display the prop special effect of the virtual prop under the condition that the virtual prop moves in the visual field after being triggered and the distance between the virtual prop and the first virtual object is greater than the target distance.
In one possible implementation, this step 703 includes: when any virtual bullet moves in the visual field after being shot and the distance between the virtual bullet and the first virtual object is larger than the target distance, the moving special effect of the virtual bullet is not displayed in the visual field picture.
In a possible implementation manner, after step 701, the position of the third virtual object that triggered the virtual prop is obtained, and whether to display the prop special effect of the virtual prop is then determined according to the position of the virtual prop during its movement. The process includes the following steps 704 to 707:
704. And the terminal acquires the position of the first virtual object and the position of a third virtual object triggering the virtual prop under the condition that the virtual prop is triggered and starts to move.
In this embodiment of the application, the third virtual object is any virtual object in the virtual scene, and the virtual prop is a movable virtual prop. For example, the virtual prop is a virtual bullet that starts to move after being shot. For another example, the virtual prop is a virtual grenade that starts to move when, after being triggered, the virtual object throws it.
This step is similar to step 304, and will not be described herein again.
705. And the terminal determines the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop.
The moving direction of the virtual prop represents the direction in which the virtual prop moves. For example, if the virtual prop is a virtual bullet, the moving direction is the shooting direction of the virtual firearm that shot the virtual bullet; or, if the virtual prop is a virtual grenade, the moving direction is the throwing direction of the virtual object that threw the virtual grenade. In this embodiment of the application, when the virtual prop is triggered and starts to move, it takes the position of the third virtual object as its starting point and moves along the moving direction of the virtual prop, so the position of the virtual prop during the movement can be determined based on the position of the third virtual object and the moving direction of the virtual prop. Optionally, the terminal determines the position of the virtual prop in real time based on the position of the third virtual object and the moving direction of the virtual prop.
In one possible implementation, this step 705 includes: and determining the position of the virtual prop in real time based on the position of the third virtual object, the moving direction of the virtual prop and the moving speed of the virtual prop.
The moving speed of the virtual prop is a fixed value or a speed meeting a gravity condition. The virtual prop takes the position of the third virtual object as a starting point and moves along the moving direction at the moving speed; therefore, during the movement of the virtual prop, its position can be determined in real time based on the position of the third virtual object, the moving direction of the virtual prop, and the moving speed of the virtual prop.
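The real-time position computation in step 705 can be sketched as follows. This is a minimal illustration assuming a constant speed with an optional downward gravity term (for a straight-flying bullet versus a thrown grenade); the names `Vec3` and `prop_position` are hypothetical, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def prop_position(start: Vec3, direction: Vec3, speed: float,
                  t: float, gravity: float = 0.0) -> Vec3:
    """Position of a moving prop at time t.

    start: launch position (position of the third virtual object);
    direction: unit movement direction; speed: scalar moving speed.
    With gravity=0 the prop flies straight (e.g. a virtual bullet);
    a positive gravity bends the path downward (e.g. a virtual grenade).
    """
    return Vec3(
        start.x + direction.x * speed * t,
        start.y + direction.y * speed * t - 0.5 * gravity * t * t,
        start.z + direction.z * speed * t,
    )
```

Sampling `t` each frame gives the real-time position used to test against the special effect visible range.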
706. The terminal displays the moving special effect of the virtual prop in a case that the virtual prop moves within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
If the position of the virtual prop is within the field of view and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance, the virtual prop is within the special effect visible range, and the moving special effect of the virtual prop is displayed.
707. The terminal does not display the moving special effect of the virtual prop in a case that the virtual prop moves within the field of view but the distance between the position of the virtual prop and the position of the first virtual object is greater than the target distance.
During the movement of the virtual prop, its position changes in real time; therefore, whether the virtual prop is within the special effect visible range is determined in real time based on its position, and the moving special effect of the virtual prop is displayed only while the virtual prop is within that range. For example, during its movement the virtual prop may enter the special effect visible range from outside and later leave it again; the prop special effect of the virtual prop is displayed only while the virtual prop moves within the special effect visible range.
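The two-part check of steps 706 and 707 — within the field of view and no farther than the target distance — might be sketched like this. The sketch assumes a 2-D scene, a unit-length view direction, and an angular field of view; all names (`in_special_effect_range`, `fov_deg`) are illustrative, not from the patent.

```python
import math

def in_special_effect_range(prop_pos, viewer_pos, view_dir,
                            fov_deg, target_distance):
    """True when the prop is inside the viewer's field of view AND no
    farther than target_distance (the conditions of steps 706/707).
    view_dir is assumed to be a unit vector."""
    dx = prop_pos[0] - viewer_pos[0]
    dy = prop_pos[1] - viewer_pos[1]
    dist = math.hypot(dx, dy)
    if dist > target_distance:   # step 707: too far, do not display
        return False
    if dist == 0:
        return True
    # angle between the view direction and the direction to the prop
    dot = (dx * view_dir[0] + dy * view_dir[1]) / dist
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= fov_deg / 2  # inside the field of view
```

The same predicate, evaluated each frame against the prop's real-time position, decides whether the moving special effect is drawn.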
In the method provided by this embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local terminal device and the target distance. The prop special effect is displayed only when the triggered virtual prop is within the field of view of that virtual object and the distance between the virtual prop and the virtual object is not greater than the target distance; a prop special effect that is within the field of view but farther from the virtual object than the target distance does not need to be displayed, which saves the resources occupied by displaying prop special effects.
In addition, the position of the virtual prop during movement is determined in real time, and whether the virtual prop is within the special effect visible range is determined according to that position; the moving special effect of the virtual prop is displayed only while the virtual prop moves within the special effect visible range, which enriches the display styles of the prop special effect and also ensures the accuracy of the special effect display.
On the basis of the embodiment shown in fig. 2, during its movement after being triggered, the virtual prop may come into contact with a virtual article; if the contact position is within the special effect visible range, the prop special effect of the virtual prop may be displayed at the contact position. The specific process is described in the following embodiment.
Fig. 8 is a flowchart of a prop special effect display method provided in an embodiment of the present application. The method is executed by a terminal and, as shown in fig. 8, includes:
801. The terminal displays a visual field picture of the first virtual object controlled by the local terminal device.
This step is the same as the above steps 201 and 301, and will not be described herein again.
802. The terminal displays the prop special effect of the virtual prop at the contact position in a case that the virtual prop is in contact with any virtual article after being triggered, the contact position is within the field of view of the first virtual object, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
In this embodiment of the application, the virtual prop may contact any virtual article in the virtual scene after being triggered; a prop special effect is generated when the virtual prop contacts the virtual article, and this prop special effect shows the effect of the virtual prop contacting the virtual article. For example, if the virtual prop is a virtual torpedo, an explosion special effect is displayed at the contact position when the virtual torpedo contacts the ground.
The process of determining whether the contact position is within the field of view is the same as the process of determining whether the position of the second virtual object is within the field of view in step 305, and the process of determining whether the distance between the contact position and the position of the first virtual object is greater than the target distance is the same as the corresponding process for the second virtual object in step 305; these are not repeated here.
In one possible implementation, this step 802 includes: displaying the bullet hole special effect of the virtual bullet at the contact position in a case that any virtual bullet is in contact with a virtual article after being shot, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
In this embodiment of the application, contact of the virtual bullet with the virtual article means that the virtual bullet hits the virtual article and forms a bullet hole on it; therefore, if the contact position is within the field of view and the distance between the contact position and the first virtual object is not greater than the target distance, the bullet hole special effect of the virtual bullet is displayed at the contact position in the visual field picture. As shown in fig. 9, the first virtual object shoots based on a handheld virtual firearm, the shot bullet contacts the wall, the contact position is within the special effect visible range, and the bullet hole special effect 901 is displayed on the wall.
Optionally, in a case that the virtual bullet is in contact with the virtual article after being shot, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance, a bullet hole special effect matching the virtual article is determined, and the determined bullet hole special effect is displayed at the contact position.
In this embodiment of the application, the bullet holes formed on different virtual articles differ, that is, the bullet hole special effects generated on different virtual articles also differ. Therefore, when it is determined that the virtual bullet contacts a virtual article, the bullet hole special effect matched with that article is obtained and displayed at the contact position in the visual field picture, which enriches the display styles of the bullet hole special effect and ensures the accuracy of the displayed bullet hole special effect.
Optionally, the process of obtaining the bullet hole special effect matched with the virtual article includes: acquiring the material of the virtual article, and querying the database for the bullet hole special effect matched with that material.
The database includes a correspondence between materials and bullet hole special effects; after the material of the virtual article is obtained, the correspondence in the database is queried, and the bullet hole special effect matched with the material can be determined. The bullet hole special effect matched with the material of the virtual article is then taken as the bullet hole special effect matched with the virtual article. In this embodiment of the application, the materials of different virtual articles may be the same or different. If two virtual articles are made of the same material, the bullet hole special effects displayed at the contact positions when virtual bullets respectively contact the two articles are also the same; if the two virtual articles are made of different materials, the displayed bullet hole special effects differ.
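The material-to-effect correspondence described above can be sketched as a simple lookup table. The table contents and the names `BULLET_HOLE_EFFECTS` and `bullet_hole_effect_for` are hypothetical stand-ins for the patent's database; the fallback effect is also an assumption.

```python
# Hypothetical material-to-effect table standing in for the patent's database.
BULLET_HOLE_EFFECTS = {
    "concrete": "bullet_hole_concrete",
    "wood": "bullet_hole_wood",
    "grass": "bullet_hole_grass",
    "metal": "bullet_hole_metal",
}

def bullet_hole_effect_for(article_material, default="bullet_hole_generic"):
    """Obtain the article's material, then query the table for the matching
    bullet hole special effect; fall back to a generic effect if unknown."""
    return BULLET_HOLE_EFFECTS.get(article_material, default)
```

Two articles sharing a material return the same effect, matching the behavior described in the text.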
As shown in fig. 10, the bullet hole special effect 1001 formed when the virtual bullet is shot onto a virtual wall is different from the bullet hole special effect 1002 formed when the virtual bullet is shot onto grass.
Optionally, the virtual bullet is shot by a virtual object controlled by another terminal based on a handheld virtual firearm, and the process in which the terminal acquires the contact position includes: the other terminal controls its virtual object to shoot the virtual bullet based on the handheld virtual firearm, emits a ray from the position of the virtual firearm, detects the collision box of any virtual article along the ray, determines the contact position between the ray and the virtual article as the contact position between the virtual bullet and the virtual article, and synchronizes the contact position to the server; the server then synchronizes it to the terminal controlling the first virtual object, so that the terminal obtains the contact position.
Optionally, the virtual bullet is shot by the first virtual object controlled by the terminal based on a handheld virtual firearm, and the process in which the terminal acquires the contact position includes: the terminal controls the first virtual object to shoot the virtual bullet based on the handheld virtual firearm, emits a ray from the position of the virtual firearm, detects the collision box of any virtual article along the ray, and determines the contact position between the ray and the virtual article as the contact position between the virtual bullet and the virtual article.
803. The terminal does not display the prop special effect of the virtual prop at the contact position in a case that the virtual prop is in contact with any virtual article after being triggered, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is greater than the target distance.
When the virtual bullet contacts the virtual article and the contact position is within the field of view but the distance between the contact position and the first virtual object is greater than the target distance, the contact position is not within the special effect visible range; even if the bullet hole special effect of the virtual bullet were displayed, the display effect would not be obvious. Therefore, a bullet hole special effect outside the special effect visible range is not displayed, which does not reduce the display effect and saves the resources occupied by displaying special effects. As shown in fig. 11, the contact position of the virtual bullet and the virtual article is far from the first virtual object; even if the bullet hole special effect 1101 were displayed, its display effect would not be obvious, and therefore the bullet hole special effect 1101 is not displayed.
In one possible implementation, after step 801, the contact position of the virtual prop and the virtual article is obtained, and whether to display the prop special effect of the virtual prop is determined according to the contact position. The process includes: obtaining the contact position of the virtual prop and the virtual article in a case that the virtual prop is in contact with any virtual article after being triggered; displaying the prop special effect of the virtual prop at the contact position in a case that the contact position is within the field of view and the distance between the contact position and the position of the first virtual object is not greater than the target distance; and not displaying the prop special effect of the virtual prop at the contact position in a case that the contact position is within the field of view but the distance between the contact position and the position of the first virtual object is greater than the target distance.
The process of obtaining the contact position is the same as the process of obtaining the contact position in step 802, and is not described herein again.
It should be noted that, in this embodiment of the application, the virtual prop is described as contacting a virtual article, but in another embodiment the virtual prop can also contact a virtual object. For example, the terminal controls the first virtual object to shoot based on a handheld virtual firearm and emits a ray from the muzzle of the virtual firearm for collision detection. If the ray detects the collision box of another virtual object, the bullet fired by the virtual firearm hits that virtual object; if not, the bullet does not hit it. In addition, the position at which the other virtual object is hit can be detected through the ray, and the damage to the other virtual object is determined based on the hit position. The terminal sends the damage to the server; after the server verifies the damage, the health value of the other virtual object is reduced by the corresponding damage value, and a special effect of the virtual bullet hitting the other virtual object, such as a bleeding special effect, is displayed at the position where the virtual bullet hits.
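Determining damage from the hit position might look like the following sketch. The patent only states that damage depends on the hit position and is verified server-side; the base damage, the per-body-part multipliers, and the names `apply_hit` and `MULTIPLIERS` are all invented here for illustration.

```python
# Hypothetical per-body-part damage multipliers; the patent does not
# specify concrete values, only that damage depends on the hit position.
BASE_DAMAGE = 30
MULTIPLIERS = {"head": 2.0, "torso": 1.0, "limb": 0.6}

def apply_hit(health, body_part):
    """Return (new_health, damage_dealt) after a server-verified hit;
    health is clamped at zero."""
    damage = int(BASE_DAMAGE * MULTIPLIERS.get(body_part, 1.0))
    return max(0, health - damage), damage
```

In the flow described above, the client would send `damage` to the server, which verifies it before reducing the target's health value.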
In the method provided by this embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local terminal device and the target distance. The prop special effect is displayed only when the triggered virtual prop is within the field of view of that virtual object and the distance between the virtual prop and the virtual object is not greater than the target distance; a prop special effect that is within the field of view but farther from the virtual object than the target distance does not need to be displayed, which saves the resources occupied by displaying prop special effects.
Moreover, when the virtual prop contacts a virtual article and the contact position is within the special effect visible range, the prop special effect of the virtual prop is displayed at the contact position to show that the virtual prop has contacted the virtual article, which enriches the display styles of the prop special effect.
In addition, when the virtual prop contacts a virtual article and the contact position is within the special effect visible range, the prop special effect matched with the virtual article is determined and displayed at the contact position, which enriches the display styles of the bullet hole special effect and ensures the accuracy of the displayed bullet hole special effect.
On the basis of the embodiment shown in fig. 2, there is a relative positional relationship between each virtual object and the virtual prop it holds. The position of the virtual prop is determined according to the position of the second virtual object, the orientation of the second virtual object, and this relative positional relationship; after the virtual prop is triggered, whether the virtual prop is within the special effect visible range is determined according to the position of the virtual prop.
Fig. 12 is a flowchart of a prop special effect display method according to an embodiment of the present application, executed by a terminal. As shown in fig. 12, the method includes:
1201. The terminal displays a visual field picture of the first virtual object controlled by the local terminal device.
This step is the same as the above steps 201 and 301, and will not be described herein again.
1202. The terminal acquires the position of the first virtual object and the position of the second virtual object holding the virtual prop in a case that the virtual prop is triggered.
This step is similar to step 304, and will not be described herein again.
1203. The terminal determines the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object, and the relative positional relationship between the second virtual object and the virtual prop.
The relative positional relationship represents the relationship between the position of a virtual object and the position of the virtual prop it holds. Optionally, the relative positional relationship is represented by a position offset vector. The orientation of the second virtual object is the direction it faces; because the held virtual prop is in front of the virtual object, the offset direction of the prop's position relative to the second virtual object's position can be determined from the orientation, and the position of the virtual prop can then be determined from the relative positional relationship and the offset direction.
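The computation in step 1203 can be sketched as rotating the holder-to-prop offset vector by the holder's facing angle and adding it to the holder's position. This assumes a 2-D plane and a yaw angle for the orientation; the names `held_prop_position` and the local-frame convention (x forward, y left) are illustrative.

```python
import math

def held_prop_position(holder_pos, yaw_deg, offset):
    """Place the handheld prop: rotate the relative-position offset by the
    holder's facing direction (yaw) and add it to the holder's position.
    offset is given in the holder's local frame (x forward, y left)."""
    yaw = math.radians(yaw_deg)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    local_x, local_y = offset
    return (
        holder_pos[0] + local_x * cos_y - local_y * sin_y,
        holder_pos[1] + local_x * sin_y + local_y * cos_y,
    )
```

The resulting position is then tested against the special effect visible range in step 1204.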
1204. The terminal displays the prop special effect of the virtual prop in a case that the position of the virtual prop is within the field of view of the first virtual object and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
This step is similar to step 305 described above and will not be described herein again.
1205. The terminal does not display the prop special effect of the virtual prop in a case that the position of the virtual prop is within the field of view but the distance between the position of the virtual prop and the position of the first virtual object is greater than the target distance.
In the method provided by this embodiment of the application, whether to display the prop special effect of a triggered virtual prop is determined based on the field of view of the virtual object controlled by the local terminal device and the target distance. The prop special effect is displayed in the visual field picture only when the triggered virtual prop is within the field of view of that virtual object and the distance between the virtual prop and the virtual object is not greater than the target distance; a prop special effect that is within the field of view but farther from the virtual object than the target distance does not need to be displayed, which saves the resources occupied by displaying prop special effects.
In addition, the position of the virtual prop is determined according to the position of the second virtual object, the orientation of the second virtual object, and the relative positional relationship, and whether the virtual prop is within the special effect visible range is determined based on that position, which ensures the accuracy of the prop's position and thereby the accuracy of the special effect display.
The embodiments shown in fig. 2, 3, 7, 8, and 12 can be combined arbitrarily. As shown in fig. 13, the first virtual object holds a virtual firearm and shoots based on it; the firing special effect 1301 of the virtual firearm is displayed within the special effect visible range, the virtual bullet 1302 shot from the virtual firearm starts to move, the moving special effect 1303 of the virtual bullet 1302 is displayed while the bullet moves within the special effect visible range, the virtual bullet 1302 hits a virtual article, the contact position of the virtual bullet 1302 and the virtual article is within the special effect visible range, and the bullet hole special effect of the virtual bullet 1302 is displayed at the contact position.
As shown in fig. 14, a visual field picture of the first virtual object is displayed, in which a second virtual object 1401 appears. When the second virtual object 1401 shoots based on a handheld virtual firearm 1402 and is within the special effect visible range, the firing special effect 1403 of the virtual firearm 1402 is displayed in the visual field picture; the virtual bullet 1404 shot by the virtual firearm 1402 starts to move, and the moving special effect 1405 of the virtual bullet 1404 is displayed in the visual field picture while the bullet moves within the special effect visible range. The virtual bullet 1404 hits a virtual article, the contact position of the virtual bullet 1404 and the virtual article is within the special effect visible range, and the bullet hole special effect of the virtual bullet 1404 is displayed at the contact position.
For another example, a visual field picture of the first virtual object is displayed, and a second virtual object outside the visual field picture shoots based on a handheld virtual firearm. Because the second virtual object is not within the special effect visible range, the firing special effect of the virtual firearm is not displayed in the visual field picture; however, the virtual bullet fired by the virtual firearm starts to move, and as the bullet enters and moves within the special effect visible range, its moving special effect is displayed in the visual field picture. The virtual bullet then hits a virtual article, the contact position between the virtual bullet and the virtual article is within the special effect visible range, and the bullet hole special effect of the virtual bullet is displayed at the contact position.
Fig. 15 is a flowchart of a prop special effect display method according to an embodiment of the present application. The execution subject of this embodiment is any terminal in the implementation environment described above. Referring to fig. 15, the method includes the following steps:
1. When the game is started, the first terminal displays a view screen of the first virtual object controlled by the terminal.
2. When the second terminal controls a virtual object to fire based on a handheld virtual firearm, it is determined whether that virtual object is within the special effect visible range in the field of view of the first virtual object. If not, the firing special effect of the virtual firearm is not displayed; if so, the firing special effect of the virtual firearm is displayed in the view screen.
3. The second terminal emits a ray from the muzzle of the handheld virtual firearm for ray detection, checking whether the ray collides with the collision box of an obstacle. If no collision is detected, ray detection continues; if a collision is detected, the contact position is obtained and synchronized to the first terminal through the server.
4. The first terminal determines whether the contact position is within the special effect visible range; if so, the bullet hole special effect is displayed at the contact position, and if not, it is not displayed.
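Steps 3 and 4 above can be sketched end to end as follows: the contact position computed on the shooter's terminal is relayed through the server, and the first terminal then applies the special effect visible range test. The function name and the boolean `first_in_view` parameter (standing in for the field-of-view check detailed earlier) are illustrative.

```python
import math

def relay_contact_and_decide(contact_pos, first_pos,
                             first_in_view, target_distance):
    """Sketch of steps 3-4: the shooter's terminal synchronizes the contact
    position via the server; the first terminal checks the special effect
    visible range and decides whether to draw the bullet hole effect."""
    synchronized = tuple(contact_pos)          # stands in for the server relay
    dist = math.dist(synchronized, first_pos)  # distance to first virtual object
    return first_in_view and dist <= target_distance
```

A contact at distance exactly equal to the target distance is still displayed, matching the "not greater than" wording used throughout.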
Fig. 16 is a schematic structural diagram of a prop special effect display device. As shown in fig. 16, the device includes:
a display module 1601, configured to display a view screen of a first virtual object controlled by the local terminal device;
the display module 1601 is further configured to display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view of the first virtual object and a distance between the virtual prop and the first virtual object is not greater than a target distance;
the display module 1601 is further configured to not display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view and a distance between the virtual prop and the first virtual object is greater than a target distance.
In one possible implementation manner, the display module 1601 is configured to display a prop special effect of the virtual prop when the virtual prop is triggered, the second virtual object holding the virtual prop is located in the field of view, and a distance between the second virtual object and the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display the firing effect of the virtual firearm if any of the virtual firearms is triggered, the second virtual object of the handheld virtual firearm is located within the field of view, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module 1601 is configured to display a special effect of the virtual prop when the virtual prop moves in the field of view after being triggered and a distance between the virtual prop and the first virtual object is not greater than a target distance.
In another possible implementation, the display module 1601 is configured to display a special movement effect of the virtual bullet if any virtual bullet moves in the field of view after being shot and the distance between the virtual bullet and the first virtual object is not greater than the target distance.
In another possible implementation manner, the display module 1601 is configured to display a prop special effect of the virtual prop at the contact position when the virtual prop is in contact with any virtual item after being triggered, the contact position is within the field of view, and a distance between the contact position and a position of the first virtual object is not greater than a target distance.
In another possible implementation manner, the display module 1601 is configured to display the bullet hole special effect of the virtual bullet at the contact position if any virtual bullet is in contact with the virtual article after being shot, the contact position is within the visual field, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, a display module 1601 includes:
a determination unit 1611, configured to determine a bullet hole special effect matching the virtual article in a case that the virtual bullet is in contact with the virtual article after being shot, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance;
and a display unit 1612 for displaying the determined bullet hole special effect at the contact position.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain a position of the first virtual object and a position of a second virtual object holding the virtual item when the virtual item is triggered;
the display module 1601 is configured to display a prop special effect of the virtual prop when the position of the second virtual object is within the field of view and a distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain a position of the first virtual object and a position of a second virtual object holding the virtual item when the virtual item is triggered;
a determining module 1603, configured to determine a position of the virtual item based on the position of the second virtual object, the orientation of the second virtual object, and a relative position relationship between the second virtual object and the virtual item;
the display module 1601 is configured to display a prop special effect of the virtual prop when the position of the virtual prop is within the field of view and a distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
In another possible implementation, as shown in fig. 17, the apparatus further includes:
an obtaining module 1602, configured to obtain a position of the first virtual object and a position of a third virtual object that triggers the virtual item when the virtual item is triggered and starts to move;
a determining module 1603, configured to determine a current position of the virtual item based on the position of the third virtual object and the moving direction of the virtual item;
the display module 1601 is configured to display a moving special effect of the virtual prop when the virtual prop moves in the field of view and a distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
It should be noted that: the prop special effect display device provided by the above embodiment is exemplified by only the division of the above functional modules, and in practical application, the above functions can be allocated to different functional modules according to needs, that is, the internal structure of the computer device is divided into different functional modules to complete all or part of the above described functions. In addition, the prop special effect display device and the prop special effect display method provided by the embodiment belong to the same concept, and the specific implementation process is detailed in the method embodiment and is not described again.
The embodiment of the present application further provides a computer device, where the computer device includes a processor and a memory, where the memory stores at least one computer program, and the at least one computer program is loaded and executed by the processor to implement the operation performed by the prop special effect display method of the foregoing embodiment.
Optionally, the computer device is provided as a terminal. Fig. 18 shows a block diagram of a terminal 1800 according to an exemplary embodiment of the present application. The terminal 1800 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. The terminal 1800 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and the like.
The terminal 1800 includes: a processor 1801 and a memory 1802.
The processor 1801 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1801 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1801 may also include a main processor and a coprocessor. The main processor, also called a CPU (Central Processing Unit), is a processor for processing data in the awake state; the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1801 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
Memory 1802 may include one or more computer-readable storage media, which may be non-transitory. Memory 1802 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1802 is used to store at least one computer program for execution by processor 1801 to implement the prop special effects display methods provided by the method embodiments herein.
In some embodiments, the terminal 1800 may further optionally include: a peripheral interface 1803 and at least one peripheral. The processor 1801, memory 1802, and peripheral interface 1803 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1804, display 1805, camera assembly 1806, audio circuitry 1807, positioning assembly 1808, and power supply 1809.
The peripheral interface 1803 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1801 and the memory 1802. In some embodiments, the processor 1801, memory 1802, and peripheral interface 1803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1801, the memory 1802, and the peripheral device interface 1803 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 1804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1804 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1804 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1804 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1804 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1805 is a touch display screen, the display screen 1805 also has the ability to collect touch signals on or above the surface of the display screen 1805. The touch signal may be input to the processor 1801 as a control signal for processing. At this time, the display screen 1805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1805, disposed on the front panel of the terminal 1800; in other embodiments, there may be at least two display screens 1805, each disposed on a different surface of the terminal 1800 or in a foldable design; in still other embodiments, the display screen 1805 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 1800. The display screen 1805 may even be set as a non-rectangular irregular figure, that is, an irregularly shaped screen. The display screen 1805 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1806 is used to capture images or video. Optionally, the camera assembly 1806 includes a front camera and a rear camera. The front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1806 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electrical signals, and inputting the electrical signals to the processor 1801 for processing, or inputting them to the radio frequency circuit 1804 to achieve voice communication. For the purpose of stereo sound collection or noise reduction, there may be a plurality of microphones, respectively disposed at different positions of the terminal 1800. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1801 or the radio frequency circuit 1804 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 1807 may also include a headphone jack.
The positioning component 1808 is used to locate the current geographic position of the terminal 1800 for navigation or LBS (Location Based Service). The positioning component 1808 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1809 is used to power the various components within the terminal 1800. The power supply 1809 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1809 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charging technology.
In some embodiments, the terminal 1800 also includes one or more sensors 1810. The one or more sensors 1810 include, but are not limited to: acceleration sensor 1811, gyro sensor 1812, pressure sensor 1813, fingerprint sensor 1814, optical sensor 1815, and proximity sensor 1816.
The acceleration sensor 1811 may detect the magnitude of acceleration on three coordinate axes of a coordinate system established with the terminal 1800. For example, the acceleration sensor 1811 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1801 may control the display 1805 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1811. The acceleration sensor 1811 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1812 may detect a body direction and a rotation angle of the terminal 1800, and the gyro sensor 1812 may cooperate with the acceleration sensor 1811 to collect a 3D motion of the user on the terminal 1800. The processor 1801 may implement the following functions according to the data collected by the gyro sensor 1812: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensors 1813 may be disposed on the side bezel of the terminal 1800 and/or on the lower layer of the display 1805. When the pressure sensor 1813 is disposed on a side frame of the terminal 1800, a user's grip signal on the terminal 1800 can be detected, and the processor 1801 performs left-right hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 1813. When the pressure sensor 1813 is disposed at the lower layer of the display screen 1805, the processor 1801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1805. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1814 is used to collect the fingerprint of the user, and the processor 1801 identifies the user according to the fingerprint collected by the fingerprint sensor 1814, or the fingerprint sensor 1814 identifies the user according to the collected fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1801 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1814 may be disposed at the front, rear, or side of the terminal 1800. When a physical key or vendor Logo is provided on the terminal 1800, the fingerprint sensor 1814 may be integrated with the physical key or vendor Logo.
The optical sensor 1815 is used to collect the ambient light intensity. In one embodiment, the processor 1801 may control the display brightness of the display screen 1805 based on the ambient light intensity collected by the optical sensor 1815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1805 is increased; when the ambient light intensity is low, the display brightness of the display 1805 is reduced. In another embodiment, the processor 1801 may also dynamically adjust the shooting parameters of the camera assembly 1806 according to the intensity of the ambient light collected by the optical sensor 1815.
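As a hypothetical illustration of the ambient-light-driven brightness adjustment described above (the lux thresholds and brightness levels below are assumed for the sketch, not specified by the embodiment):

```python
def adjust_brightness(ambient_lux, low=50, high=500, min_level=0.2, max_level=1.0):
    """Map ambient light intensity (lux) to a display brightness level:
    brighter surroundings raise the brightness, dimmer surroundings lower it.
    Thresholds and levels here are illustrative assumptions."""
    if ambient_lux >= high:
        return max_level
    if ambient_lux <= low:
        return min_level
    # Linear interpolation between the two thresholds.
    frac = (ambient_lux - low) / (high - low)
    return min_level + frac * (max_level - min_level)

print(adjust_brightness(600))  # 1.0 (strong ambient light: full brightness)
print(adjust_brightness(10))   # 0.2 (dim surroundings: lowered brightness)
```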
A proximity sensor 1816, also called a distance sensor, is provided on the front panel of the terminal 1800. The proximity sensor 1816 is used to collect the distance between the user and the front surface of the terminal 1800. In one embodiment, when the proximity sensor 1816 detects that the distance between the user and the front surface of the terminal 1800 gradually decreases, the processor 1801 controls the display 1805 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1816 detects that the distance between the user and the front surface of the terminal 1800 gradually increases, the processor 1801 controls the display 1805 to switch from the dark-screen state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 18 is not intended to be limiting of terminal 1800 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Optionally, the computer device is provided as a server. Fig. 19 is a schematic structural diagram of a server according to an embodiment of the present application. The server 1900 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 1901 and one or more memories 1902, where the memory 1902 stores at least one computer program, and the at least one computer program is loaded and executed by the processor 1901 to implement the methods provided by the foregoing method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface to perform input and output, and the server may also include other components for implementing the functions of the device, which are not described here again.
The embodiment of the present application further provides a computer-readable storage medium, where at least one computer program is stored in the computer-readable storage medium, and the at least one computer program is loaded and executed by a processor to implement the operations executed by the prop special effect display method of the foregoing embodiment.
An embodiment of the present application further provides a computer program product, which includes a computer program, and when the computer program is executed by a processor, the computer program implements the operations performed by the prop special effect display method according to the foregoing aspect.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only an alternative embodiment of the present application and should not be construed as limiting the present application, and any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A prop special effect display method, the method comprising:
displaying a visual field picture of a first virtual object controlled by the local terminal equipment;
displaying a prop special effect of the virtual prop under the condition that the triggered virtual prop is positioned in a visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance;
and under the condition that the triggered virtual prop is positioned in the visual field and the distance between the virtual prop and the first virtual object is greater than the target distance, not displaying the prop special effect of the virtual prop.
2. The method of claim 1, wherein displaying the prop effect of the virtual prop if the triggered virtual prop is located within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance comprises:
and displaying the prop special effect of the virtual prop under the conditions that the virtual prop is triggered, a second virtual object of the handheld virtual prop is positioned in the visual field, and the distance between the second virtual object and the first virtual object is not greater than the target distance.
3. The method of claim 2, wherein the displaying the prop effect of the virtual prop if the virtual prop is triggered, a second virtual object holding the virtual prop is located within the field of view, and a distance between the second virtual object and the first virtual object is not greater than the target distance comprises:
displaying a firing special effect of any virtual firearm if the virtual firearm is triggered, a second virtual object holding the virtual firearm is located within the field of view, and a distance between the second virtual object and the first virtual object is not greater than the target distance.
4. The method of claim 1, wherein displaying the prop effect of the virtual prop if the triggered virtual prop is located within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance comprises:
and displaying the item special effect of the virtual item under the condition that the virtual item moves in the visual field after being triggered and the distance between the virtual item and the first virtual object is not greater than the target distance.
5. The method of claim 4, wherein displaying the prop effect of the virtual prop if the virtual prop moves within the field of view after being triggered and the distance between the virtual prop and the first virtual object is not greater than the target distance comprises:
displaying a movement special effect of any virtual bullet if the virtual bullet moves within the field of view after being fired and a distance between the virtual bullet and the first virtual object is not greater than the target distance.
6. The method of claim 1, wherein displaying the prop effect of the virtual prop if the triggered virtual prop is located within the field of view of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than a target distance comprises:
and displaying the prop special effect of the virtual prop at the contact position under the conditions that the virtual prop comes into contact with any virtual article after being triggered, the contact position is in the visual field, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
7. The method of claim 6, wherein the displaying the prop effect of the virtual prop at the contact location if the virtual prop is in contact with any virtual item after being triggered, a contact location is within the field of view, and a distance between the contact location and the location of the first virtual object is not greater than the target distance comprises:
and displaying the bullet hole special effect of the virtual bullet at the contact position under the conditions that any virtual bullet comes into contact with the virtual article after being fired, the contact position is in the visual field, and the distance between the contact position and the position of the first virtual object is not greater than the target distance.
8. The method of claim 7, wherein the displaying the bullet hole special effect of the virtual bullet at the contact position in the case that any virtual bullet is in contact with the virtual article after being fired, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance comprises:
determining a bullet hole special effect matched with the virtual article if the virtual bullet is in contact with the virtual article after being fired, the contact position is within the field of view, and the distance between the contact position and the position of the first virtual object is not greater than the target distance;
and displaying the determined bullet hole special effect at the contact position.
9. The method of claim 1, further comprising:
under the condition that the virtual prop is triggered, acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop;
the displaying the prop special effect of the virtual prop under the condition that the triggered virtual prop is located in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance comprises:
displaying a prop special effect of the virtual prop if the position of the second virtual object is within the field of view and the distance between the position of the second virtual object and the position of the first virtual object is not greater than the target distance.
10. The method of claim 1, further comprising:
under the condition that the virtual prop is triggered, acquiring the position of the first virtual object and the position of a second virtual object holding the virtual prop;
determining the position of the virtual prop based on the position of the second virtual object, the orientation of the second virtual object and the relative position relationship between the second virtual object and the virtual prop;
the displaying the prop special effect of the virtual prop under the condition that the triggered virtual prop is located in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance comprises:
and displaying the prop special effect of the virtual prop under the condition that the position of the virtual prop is in the visual field and the distance between the position of the virtual prop and the position of the first virtual object is not greater than the target distance.
11. The method of claim 1, further comprising:
under the condition that the virtual prop is triggered and starts to move, acquiring the position of the first virtual object and the position of a third virtual object triggering the virtual prop;
determining the current position of the virtual prop based on the position of the third virtual object and the moving direction of the virtual prop;
the displaying the prop special effect of the virtual prop under the condition that the triggered virtual prop is located in the visual field of the first virtual object and the distance between the virtual prop and the first virtual object is not greater than the target distance comprises:
displaying a movement special effect of the virtual prop when the virtual prop moves within the field of view and a distance between a position of the virtual prop and a position of the first virtual object is not greater than the target distance.
12. A prop special effect display device, the device comprising:
the display module is used for displaying a view picture of a first virtual object controlled by the local terminal equipment;
the display module is further configured to display a prop special effect of the virtual prop when the triggered virtual prop is located in a field of view of the first virtual object and a distance between the virtual prop and the first virtual object is not greater than a target distance;
the display module is further configured to not display a prop special effect of the virtual prop when the triggered virtual prop is located in the field of view and a distance between the virtual prop and the first virtual object is greater than the target distance.
13. A computer device, characterized in that the computer device comprises a processor and a memory, wherein at least one computer program is stored in the memory, and the at least one computer program is loaded by the processor and executed to implement the operations performed by the prop special effects display method according to any one of claims 1 to 11.
14. A computer-readable storage medium, wherein at least one computer program is stored in the computer-readable storage medium, and is loaded and executed by a processor to implement the operations performed by the method for displaying a prop special effect according to any one of claims 1 to 11.
15. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the operations performed by the prop special effects display method of any of claims 1 to 11.
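The position determinations recited in claims 10 and 11 — deriving the prop's position from the holder's position, orientation, and relative positional relationship, or deriving the moving prop's current position from the triggering object's position and the movement direction — can be sketched as follows. This is an illustrative 2D sketch with hypothetical function names and parameters, not a definitive implementation of the claimed method:

```python
import math

def prop_position_from_holder(holder_pos, holder_orientation_deg, relative_offset):
    """Claim 10 style: rotate the prop's offset from the holder by the
    holder's orientation, then add it to the holder's position."""
    theta = math.radians(holder_orientation_deg)
    ox, oy = relative_offset
    rx = ox * math.cos(theta) - oy * math.sin(theta)
    ry = ox * math.sin(theta) + oy * math.cos(theta)
    return (holder_pos[0] + rx, holder_pos[1] + ry)

def prop_position_while_moving(trigger_pos, move_direction, speed, elapsed):
    """Claim 11 style: advance the prop from the triggering object's
    position along its (normalized) movement direction."""
    dx, dy = move_direction
    norm = math.hypot(dx, dy) or 1.0
    return (trigger_pos[0] + dx / norm * speed * elapsed,
            trigger_pos[1] + dy / norm * speed * elapsed)

# A prop held 2 units in front of a holder at (5, 5) facing 90 degrees:
print(prop_position_from_holder((5, 5), 90, (2, 0)))  # (5.0, 7.0) up to rounding
# A fired prop travelling right at 100 units/s for 0.5 s from the origin:
print(prop_position_while_moving((0, 0), (1, 0), 100, 0.5))  # (50.0, 0.0)
```

Either position would then be fed into the field-of-view and target-distance test of claim 1 to decide whether the prop special effect is displayed.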
CN202111500232.XA 2021-12-09 2021-12-09 Prop special effect display method, device, computer equipment and storage medium Active CN114100128B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111500232.XA CN114100128B (en) 2021-12-09 2021-12-09 Prop special effect display method, device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114100128A true CN114100128A (en) 2022-03-01
CN114100128B CN114100128B (en) 2023-07-21

Family

ID=80363823


Country Status (1)

Country Link
CN (1) CN114100128B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024001450A1 (en) * 2022-06-27 2024-01-04 腾讯科技(深圳)有限公司 Method and apparatus for displaying special effect of prop, and electronic device and storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150073914A1 (en) * 2013-09-10 2015-03-12 Utechzone Co., Ltd. Playing method and electronic apparatus information
CN105808815A (en) * 2015-01-16 2016-07-27 纳宝株式会社 Apparatus and method for generating and displaying cartoon content
CN109126120A (en) * 2018-08-17 2019-01-04 Oppo广东移动通信有限公司 motor control method and related product
CN110478895A (en) * 2019-08-23 2019-11-22 腾讯科技(深圳)有限公司 Control method, device, terminal and the storage medium of virtual objects
CN110585712A (en) * 2019-09-20 2019-12-20 腾讯科技(深圳)有限公司 Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN112076465A (en) * 2020-08-06 2020-12-15 腾讯科技(深圳)有限公司 Virtual fort control method, device, terminal and storage medium
CN112090070A (en) * 2020-09-18 2020-12-18 腾讯科技(深圳)有限公司 Interaction method and device of virtual props and electronic equipment
CN112121434A (en) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Interaction method and device of special effect prop, electronic equipment and storage medium
CN112604279A (en) * 2020-12-29 2021-04-06 珠海金山网络游戏科技有限公司 Special effect display method and device
WO2021184806A1 (en) * 2020-03-17 2021-09-23 腾讯科技(深圳)有限公司 Interactive prop display method and apparatus, and terminal and storage medium
CN113599810A (en) * 2021-08-06 2021-11-05 腾讯科技(深圳)有限公司 Display control method, device, equipment and medium based on virtual object
CN113750531A (en) * 2021-09-18 2021-12-07 腾讯科技(深圳)有限公司 Prop control method, device, equipment and storage medium in virtual scene





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (country of ref document: HK; legal event code: DE; ref document number: 40069742)
GR01 Patent grant