CN113633969A - Data processing method, device, equipment and storage medium - Google Patents

Data processing method, device, equipment and storage medium Download PDF

Info

Publication number
CN113633969A
CN113633969A (application number CN202110930954.2A)
Authority
CN
China
Prior art keywords
game
information
scene
user interface
graphical user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110930954.2A
Other languages
Chinese (zh)
Inventor
侯文鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202110930954.2A priority Critical patent/CN113633969A/en
Publication of CN113633969A publication Critical patent/CN113633969A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/63 Methods for processing data by generating or executing the game program for controlling the execution of the game in time
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a data processing method, apparatus, device, and storage medium. In the method, a second terminal device determines orientation information in a game scene in response to an orientation marking operation on a second graphical user interface and transmits the orientation information to a first terminal device. The first terminal device receives the orientation information determined by the second terminal device and displays a trigger object corresponding to it in a first graphical user interface; in response to a trigger operation on the trigger object, the first terminal device controls a first game character according to the orientation information. When a first user triggers the trigger object through the first terminal device, the first terminal device automatically controls the first game character according to the orientation information, so that the orientation information is automatically synchronized between the first and second terminal devices, the operation steps are simplified, and the efficiency of orientation synchronization is improved.

Description

Data processing method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for data processing.
Background
In large-scale multiplayer online shooter games, players frequently need to synchronize a direction or position with teammates in order to cooperate as a team.
In current online games, players mainly synchronize a direction or position with teammates in one of two ways. The first is to announce it directly over voice chat; players who have not enabled in-game voice cannot receive the information at all. The second is a "mark point" button: a player must aim the crosshair at a scene element or terrain feature within a certain distance to set a mark point, then press the mark button to report it to teammates; the teammates must then repeatedly turn around and search for the mark on a compass scale or in the scene. Whether for the player placing the mark or for the teammates looking for it, marking takes a long time and is inefficient, which is often fatal in an intense firefight.
Disclosure of Invention
The application provides a data processing method, a data processing device, data processing equipment and a storage medium.
A first aspect of the present application provides a data processing method applied to a first terminal device. A first graphical user interface is provided through the first terminal device, the content displayed in the first graphical user interface at least partially includes a game scene of a game application, and the game scene includes a first game character. The method includes:
receiving orientation information determined in the game scene through a second terminal device;
displaying a trigger object corresponding to the orientation information in the first graphical user interface;
and in response to a trigger operation on the trigger object, controlling the first game character according to the orientation information.
A second aspect of the present application provides a data processing method applied to a second terminal device. A second graphical user interface is provided through the second terminal device, and the content displayed in the second graphical user interface at least partially includes a game scene of a game application. The method includes:
determining orientation information in the game scene in response to an orientation marking operation on the second graphical user interface;
and transmitting the orientation information to a first terminal device, so that a trigger object corresponding to the orientation information is displayed in a first graphical user interface of the first terminal device, where a trigger operation on the trigger object causes a first game character to be controlled according to the orientation information.
A third aspect of the present application provides a data processing apparatus applied to a first terminal device. A first graphical user interface is provided through the first terminal device, the content displayed in the first graphical user interface at least partially includes a game scene of a game application, and the game scene includes a first game character. The apparatus includes:
an information receiving module, configured to receive orientation information determined in the game scene through a second terminal device;
a trigger object display module, configured to display a trigger object corresponding to the orientation information in the first graphical user interface;
and a control module, configured to control the first game character according to the orientation information in response to a trigger operation on the trigger object.
A fourth aspect of the present application provides a data processing apparatus applied to a second terminal device. A second graphical user interface is provided through the second terminal device, and the content displayed in the second graphical user interface at least partially includes a game scene of a game application. The apparatus includes:
an orientation information acquisition module, configured to determine orientation information in the game scene in response to an orientation marking operation on the second graphical user interface;
and an information transmission module, configured to transmit the orientation information to a first terminal device so that a trigger object corresponding to the orientation information is displayed in a first graphical user interface of the first terminal device, where a trigger operation on the trigger object causes a first game character to be controlled according to the orientation information.
A fifth aspect of the present application provides a terminal device, including:
a processor, a memory, and a computer program stored on the memory and executable on the processor; wherein the processor, when executing the computer program, implements the method of any of the above aspects.
A sixth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the above aspects.
According to the data processing method, apparatus, device, and storage medium provided by the present application, the second terminal device determines orientation information in the game scene in response to an orientation marking operation on the second graphical user interface and transmits the orientation information to the first terminal device. The first terminal device receives the orientation information determined through the second terminal device and displays a trigger object corresponding to it in the first graphical user interface. When the first user triggers the trigger object through the first terminal device, the first terminal device automatically controls the first game character according to the orientation information that the second user determined in the game scene through the second terminal device. Orientation information is thereby synchronized between the first and second terminal devices automatically, the user's operation steps are simplified, and the efficiency of orientation synchronization is improved.
Drawings
FIG. 1 is an exemplary diagram of a system architecture of a network game provided in an embodiment of the present application;
fig. 2 is a flowchart of a data processing method according to an embodiment of the present application;
FIG. 3 is a flow chart of another method for data processing according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an interface on which a second user initiates orientation synchronization according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an interface on which a first user performs orientation synchronization according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another data processing apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
The above drawings show specific embodiments of the present application, which are described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concept in any manner, but rather to illustrate it to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application; rather, they are merely examples of apparatuses and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terms "first", "second", and so on in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features. In the description of the following examples, "plurality" means two or more unless specifically limited otherwise.
The data processing method provided by the present application is particularly applicable to synchronizing directions, positions, and the like among teammates in a large-scale multiplayer online shooter game. The system architecture of the network game, shown in FIG. 1, includes a terminal device 10 for each user and a server 20 on which the network game is deployed; any data transmitted between two users is processed and forwarded by the server.
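As an illustrative sketch (all class and field names here are hypothetical, not taken from the patent), the relay topology of FIG. 1 can be modeled as terminals that exchange data only through the server:

```python
class Server:
    """Server 20: all data between any two terminals is forwarded through it."""
    def __init__(self):
        self.terminals = {}  # user_id -> Terminal

    def register(self, terminal):
        self.terminals[terminal.user_id] = terminal

    def forward(self, sender_id, recipient_id, payload):
        # Terminals never talk to each other directly (FIG. 1 topology).
        self.terminals[recipient_id].receive(sender_id, payload)


class Terminal:
    """Terminal device 10: a player's device running the game client."""
    def __init__(self, user_id, server):
        self.user_id = user_id
        self.server = server
        self.inbox = []
        server.register(self)

    def send(self, recipient_id, payload):
        self.server.forward(self.user_id, recipient_id, payload)

    def receive(self, sender_id, payload):
        self.inbox.append((sender_id, payload))
```

In this model, a `send` from the second terminal device always reaches the first terminal device via `Server.forward`, mirroring the server-mediated data path described above.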
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 2 is a flowchart of a data processing method according to an embodiment of the present application. The method in this embodiment is applied to a terminal device equipped with a network game client, which may be a mobile terminal such as a smartphone, a tablet, or an AR device, or a fixed terminal such as a personal computer (PC). In other embodiments the method may also be applied to other devices; this embodiment uses the terminal device as an illustrative example.
As shown in fig. 2, the method comprises the following specific steps:
and step S101, responding to the orientation marking operation of the second graphical user interface, and determining orientation information in the game scene.
In this embodiment, "second terminal device" refers to the user terminal that initiates orientation synchronization, and "first terminal device" refers to the user terminal that receives it. The first terminal device provides a first graphical user interface whose displayed content at least partially includes a game scene of the game application, and the game scene includes a first game character. The second terminal device provides a second graphical user interface whose displayed content at least partially includes a game scene of the game application, and the game scene includes a second game character. The first game character and the second game character may be different game characters in the same game scene that have a designated association relationship, such as a teammate relationship. The designated association relationship may be set and adjusted according to the actual application scenario and is not specifically limited here.
In addition, the same terminal device may be the initiator of orientation synchronization at one moment and the recipient at another. "Second user" refers to the player using the second terminal device, and "first user" to the player using the first terminal device. The second user is the player who initiates orientation synchronization: the second user performs an orientation marking operation in the game scene displayed in the second graphical user interface of the second terminal device to determine orientation information in the game scene.
The determined orientation information is the information marked by the second user in the game scene and may include direction information and/or position information.
For example, the second user may synchronize to the first user the current direction or position of the second game character in the second graphical user interface, or a marker point specified by the second user.
Illustratively, when the current direction or position of the second game character is to be synchronized, a trigger control for the orientation marking operation can be displayed on the current page. By clicking or long-pressing this trigger control in the second graphical user interface, the second user synchronizes the second game character's current direction or position to the first user who has the designated association relationship with the second user, which simplifies the user's operation and improves the efficiency of orientation synchronization.
Step S102: the second terminal device transmits the orientation information to the first terminal device.
The second terminal device transmits the orientation information that the second user marked in the second graphical user interface to the first terminal device of the first user.
Illustratively, the second terminal device may transmit the orientation information to the server. The server determines the first user who has the designated association relationship with the second user, along with the address information of that first user's first terminal device, and then sends the orientation information to the first terminal device.
For example, the second terminal device sends the server an information synchronization request containing the orientation information marked by the second user. In response to receiving the request, the server determines the first user who has the designated association relationship with the second user and the address information of the first user's first terminal device, and then sends the marked orientation information to that first terminal device.
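A minimal sketch of this server-side handling (function and field names are hypothetical): given an information synchronization request, the server looks up the users associated with the sender, here modeled as teammates, and fans the orientation information out to their addresses:

```python
def handle_sync_request(teams, addresses, sender_id, orientation_info):
    """Return (address, payload) pairs for every user who has the
    designated association relationship (here: teammate) with the sender."""
    team = next(t for t in teams if sender_id in t)        # sender's team
    targets = sorted(uid for uid in team if uid != sender_id)
    payload = {"from": sender_id, "orientation": orientation_info}
    return [(addresses[uid], payload) for uid in targets]
```

The server would then transmit each payload to the corresponding first terminal device's address.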
Step S103: the first terminal device receives the orientation information determined in the game scene through the second terminal device.
Illustratively, the first terminal device receives, from the server, the orientation information that the second user marked in the game scene.
Step S104: display a trigger object corresponding to the orientation information in the first graphical user interface.
After receiving the orientation information marked by the second user in the game scene, the first terminal device may display a trigger object corresponding to it in the first graphical user interface. When the trigger object is triggered, at least one of the viewing angle, orientation, moving direction, and aim center of the first game character in the first graphical user interface is adjusted automatically, so that the first game character is automatically synchronized with the orientation information.
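One way this automatic adjustment could work, as a sketch under assumed 2-D coordinates rather than the patent's actual implementation: when the trigger object fires, turn the first game character toward a marked position, or set its yaw to a marked direction directly:

```python
import math

def yaw_towards(char_pos, marked_pos):
    """Yaw in degrees (0 = +x axis, counter-clockwise) that faces the mark."""
    dx, dy = marked_pos[0] - char_pos[0], marked_pos[1] - char_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def on_trigger(character, orientation_info):
    """Control the first game character: align its view with the received
    orientation information (a position and/or a direction)."""
    if "position" in orientation_info:
        character["yaw"] = yaw_towards(character["pos"],
                                       orientation_info["position"])
    elif "direction" in orientation_info:
        character["yaw"] = float(orientation_info["direction"])
    return character
```

A real client would also animate the camera and possibly adjust the movement direction; this sketch only shows the yaw computation.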
Step S105: in response to a trigger operation on the trigger object, control the first game character according to the orientation information.
When the second terminal device sends orientation information to be synchronized to the first terminal device, a corresponding trigger object is displayed on the first graphical user interface of the first terminal device. When the first user triggers the trigger object through the first terminal device, the first terminal device automatically controls the first game character in the first graphical user interface according to the orientation information, so that the first game character is automatically synchronized with it.
Illustratively, the trigger object displayed in the first graphical user interface may be a control such as a button. When the second terminal device synchronizes orientation information to the first terminal device, the corresponding control is displayed on the first graphical user interface; when the first user clicks or long-presses the displayed control, the first terminal device automatically controls the first game character so that it is synchronized with the orientation information.
In this embodiment, when a second user needs to synchronize orientation information to a first user, the second terminal device determines orientation information in the game scene in response to an orientation marking operation on the second graphical user interface and transmits it to the first terminal device. The first terminal device receives the orientation information, displays a corresponding trigger object in the first graphical user interface, and, in response to a trigger operation on the trigger object, controls the first game character according to the orientation information. Because the first terminal device automatically controls the first game character as soon as the first user triggers the trigger object, the first game character is automatically synchronized with the orientation information marked by the second user, which greatly simplifies the synchronization operation for the user and improves the efficiency of orientation synchronization.
Fig. 3 is a flowchart of another data processing method according to an embodiment of the present disclosure. Building on the embodiment corresponding to FIG. 2, in an optional implementation, if the trigger object is not triggered within a preset time period, it is removed from the first graphical user interface, which prevents the trigger object from occupying interface space and facilitates user operation.
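The timed removal can be sketched as follows (the lifetime value and all names are assumptions; an injected clock makes the expiry testable):

```python
import time

class TriggerObject:
    """A displayed trigger object that expires if not triggered in time."""
    def __init__(self, orientation_info, lifetime_s=10.0, now=time.monotonic):
        self.orientation_info = orientation_info
        self._now = now
        self.expires_at = now() + lifetime_s

    def expired(self):
        return self._now() >= self.expires_at

def prune_triggers(triggers):
    """Drop trigger objects whose preset display duration has elapsed,
    so they stop occupying space in the graphical user interface."""
    return [t for t in triggers if not t.expired()]
```

The client would call `prune_triggers` each UI frame (or on a timer) over the currently displayed trigger objects.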
As shown in fig. 3, the method comprises the following specific steps:
step S201, responding to the orientation mark operation of the second graphical user interface, and determining orientation information in the game scene.
In this embodiment, the second terminal device is used to refer to a user terminal that initiates the azimuth synchronization, and the first terminal device is used to refer to a user terminal that accepts the azimuth synchronization. The first terminal device provides a first graphical user interface, content displayed in the first graphical user interface at least partially contains a game scene of the game application, and the game scene comprises a first game role. The second terminal device provides a second graphical user interface, the content displayed in the second graphical user interface at least partially containing a game scene of the game application, the game scene including a second game character. The first game character and the second game character may be different game characters in the same game scene, and the first game character and the second game character have a designated association relationship, such as a teammate relationship. In addition, the designated association relationship may be set and adjusted according to an actual application scenario, and is not specifically limited herein.
In addition, the same terminal device may be the initiator of the position synchronization at one time and the recipient of the position synchronization at another time.
The second user is used to refer to the player using the second terminal device, and the first user is used to refer to the player using the first terminal device. The second user is a player initiating the direction synchronization, and the second user can perform the direction marking operation in a game scene displayed in a second graphical user interface of the second terminal device to determine direction information in the game scene.
The determined direction information is the direction information marked by the second user in the game scene, and may include direction information and/or position information.
For example, the second user may synchronize information on the current position, location, or second user-specified marker point of the second game character in the second graphical user interface with the first user.
Illustratively, when the current position or position of the second game character in the second graphical user interface is synchronized, a trigger control for position marking operation can be displayed in the current page, and the second user can synchronize the current position or the current position of the second game character to the first user having a specified association relationship with the second user by clicking or long-pressing the trigger control in the second graphical user interface, so that the user operation is simplified, and the efficiency of position synchronization is improved.
In an optional implementation, the direction information indicated by an orientation indicator is determined in response to a trigger operation on the orientation indicator of the second graphical user interface.
In a game scene, the orientation indicator typically represents an absolute direction on the world map directly, such as due north, i.e., straight up on the map.
The trigger operation on the orientation indicator may be a click or a long-press on the indicator, or another operation configured for the actual game scene; it is not specifically limited here.
For example, as shown in FIG. 4, in response to the second user clicking or long-pressing the orientation indicator on the second graphical user interface, the direction indicated by the indicator is determined and used as the marked orientation information.
By adding an orientation synchronization trigger to the orientation indicator, the second user only needs to trigger the indicator displayed on the second graphical user interface: the second terminal device then automatically determines the direction the indicator points to and sends it to the first terminal device as the marked orientation information. The first game character can synchronize to that direction as soon as the first user triggers the trigger object, which simplifies the operation steps for both parties and improves the efficiency with which players report a direction and teammates find it.
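A sketch of marking via the orientation indicator (the compass points and bearing values are assumptions): triggering the indicator yields an absolute direction rather than a position:

```python
# Absolute bearings on the world map, in degrees (0 = due north).
COMPASS_BEARINGS = {"N": 0.0, "E": 90.0, "S": 180.0, "W": 270.0}

def mark_from_indicator(point):
    """Orientation marking via the indicator: the marked orientation
    information is the absolute direction the indicator shows."""
    return {"direction": COMPASS_BEARINGS[point]}
```

The resulting dictionary is the payload transmitted to the first terminal device in step S202.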
In another optional implementation, the second user may use any scene element in the second graphical user interface as the target scene to be marked, and click or long-press the target scene to mark the orientation information corresponding to it.
In response to the click or long-press operation on the target scene in the second graphical user interface, the second terminal device determines the position information of the operation.
For example, the target scene marked by the second user in the second graphical user interface may be a static scene.
In response to a click or long-press operation on the target scene, if the target scene is a static scene, the second terminal device determines the preset position information corresponding to it and uses that preset position as the marked orientation information.
A static scene may be a fixed location point or object in the game scene, for example a vehicle, or the location point at which the click or long-press was performed. A static scene is fixed in the game scene, its position does not change, and its preset position information is stored in the map data of the game application.
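For a static scene, the preset position can simply be looked up in the map data; a sketch with hypothetical map entries:

```python
# Hypothetical map data: preset positions of static scene elements.
MAP_DATA = {"vehicle_01": (120.0, 45.0), "bridge_north": (300.0, 80.0)}

def mark_static_scene(scene_id):
    """A static scene's position never changes, so the marked orientation
    information is its preset position from the game's map data."""
    return {"position": MAP_DATA[scene_id]}
```

Because the lookup is a single read of fixed data, a static mark never needs to be re-sent after the initial synchronization.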
Alternatively, the target scene marked by the second user in the second graphical user interface may be a dynamic scene.
In response to a click or long-press operation on the target scene, if the target scene is a dynamic scene, the second terminal device determines the current position of the dynamic scene and uses it as the marked orientation information.
A dynamic scene is anything movable in the game scene, such as a game character or another movable object.
Further, if the marked target scene is a dynamic scene, then whenever its position changes the second terminal device obtains the current position in real time and, using it as the marked orientation information, synchronizes it to the first terminal device in real time.
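The real-time synchronization of a dynamic scene could look like the following sketch (the tick-driven loop and re-sending only on change are assumed details, not stated in the patent):

```python
def sync_dynamic_scene(get_current_pos, send, ticks):
    """Each tick, read the dynamic scene's current position and transmit
    it as the marked orientation information when it has changed."""
    last = None
    for _ in range(ticks):
        pos = get_current_pos()
        if pos != last:          # skip redundant transmissions
            send({"position": pos})
            last = pos
```

Here `get_current_pos` would read the dynamic scene's position from the game state and `send` would hand the payload to the server for forwarding.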
Alternatively, the dynamic scene may be the second game character in the game scene displayed in the second graphical user interface. The second user can mark the orientation information of the second game character in the second graphical user interface through the second terminal device; the second terminal device then acquires the orientation information of the second game character in real time and synchronizes it to the first terminal device in real time. The first terminal device displays the corresponding trigger object, and after the first user triggers it, the first terminal device can acquire the orientation information of the second game character in real time and control the first game character in real time, so that the first game character stays synchronized with the orientation information of the second game character.
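The real-time re-synchronization for a dynamic scene might look like the following minimal sketch. The callback names and the fixed sampling loop are assumptions for illustration only.

```python
# Hypothetical sketch: while a dynamic target (e.g. the second game
# character) keeps moving, its position is re-sampled and pushed to
# the first terminal device each time it changes.
def sync_dynamic_mark(get_current_position, send_to_first_terminal, samples):
    """Re-send the dynamic target's position whenever it changes."""
    last = None
    for _ in range(samples):
        pos = get_current_position()
        if pos != last:                   # position changed -> re-sync
            send_to_first_terminal(pos)
            last = pos
```

A real client would drive this from the game loop or a position-changed event rather than a fixed number of samples.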
Step S202: the second terminal device transmits the orientation information to the first terminal device.
The second terminal device can transmit the orientation information marked by the second user in the second graphical user interface to the first terminal device of the first user.
Illustratively, the second terminal device may transmit the orientation information to the server. The server determines the first user having the specified association relationship with the second user, along with the address information of that first user's first terminal device, and then sends the orientation information to the first terminal device.
For example, the second terminal device transmits to the server an information synchronization request containing the orientation information marked by the second user. In response to receiving the information synchronization request, the server determines the first user having the specified association relationship with the second user and the address information of the first user's first terminal device, and then sends the orientation information marked by the second user to the first terminal device.
In this step, the orientation information marked by the second user can be transmitted to the first terminal device of each first user having the specified association relationship with the second user, or only to the first terminal device of one first user designated by the second user.
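The server-side routing in this step can be sketched as follows, assuming a hypothetical association table and address book; none of these names appear in the disclosure.

```python
# Hypothetical sketch: the server forwards the marked orientation
# information either to every first user associated with the second
# user, or only to the one recipient the second user designated.
ASSOCIATIONS = {"second_user": ["first_user_a", "first_user_b"]}
ADDRESSES = {"first_user_a": "10.0.0.2", "first_user_b": "10.0.0.3"}

def route_orientation(sender, orientation, recipient=None,
                      send=lambda addr, msg: None):
    """Forward orientation info; return the addresses it was sent to."""
    targets = [recipient] if recipient else ASSOCIATIONS.get(sender, [])
    delivered = []
    for user in targets:
        addr = ADDRESSES[user]            # address info of the first terminal
        send(addr, orientation)
        delivered.append(addr)
    return delivered
```

With no designated recipient the message fans out to all associated users; naming a recipient narrows delivery to that one device.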
Step S203: the first terminal device receives the orientation information determined in the game scene through the second terminal device.
Illustratively, the first terminal device can receive, from the server, the orientation information marked by the second user in the game scene.
Step S204: a trigger object corresponding to the orientation information is displayed in the first graphical user interface.
After receiving the orientation information marked by the second user in the game scene, the first terminal device may display a trigger object corresponding to the orientation information in the first graphical user interface it provides. When the trigger object is triggered, at least one of the viewing angle, orientation, moving direction, and aiming center of the first game character in the first graphical user interface is adjusted automatically, so that the orientation of the first game character is automatically synchronized with the orientation information.
Optionally, the trigger object corresponding to the orientation information may be a control displayed in the first graphical user interface.
For example, as shown in fig. 5, an orientation synchronization button may be displayed in the first graphical user interface as the trigger object of the orientation information. When the first user clicks the orientation synchronization button, the first game character is automatically controlled.
Optionally, the processing to be performed when the trigger object is triggered may also be displayed on or near the trigger object.
For example, "turn 285" may be displayed on the orientation synchronization button shown in fig. 5 to remind the first user that clicking the button will turn the first game character's viewing angle and orientation to 285 degrees (consistent with the orientation of the second game character in fig. 4).
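Generating such a label from the received orientation information could be as simple as the following sketch; the label format and function name are illustrative assumptions.

```python
# Hypothetical sketch: build the reminder text shown on the trigger
# object (e.g. "turn 285" for an orientation of 285 degrees).
def trigger_label(orientation_degrees):
    """Format the action hint displayed on the orientation sync button."""
    return "turn %d" % round(orientation_degrees % 360)
```

Wrapping with `% 360` keeps the displayed heading in the 0-359 range even if the raw value has accumulated full turns.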
Optionally, the position of the trigger object in the first graphical user interface may be set and adjusted according to an actual application scenario, which is not specifically limited herein.
Step S205: in response to a trigger operation on the trigger object, control processing is performed on the first game character according to the orientation information.
When the second terminal device sends the orientation information to be synchronized to the first terminal device, the corresponding trigger object is displayed in the first graphical user interface of the first terminal device. When the first user triggers the trigger object through the first terminal device, the first terminal device automatically controls the first game character in the first graphical user interface according to the orientation information, thereby automatically synchronizing the first game character with the orientation information.
Illustratively, the trigger object that the first terminal device displays in the first graphical user interface may be a control such as a button.
When the second terminal device synchronizes the orientation information to the first terminal device, the corresponding control is displayed in the first graphical user interface of the first terminal device. When the first user clicks or long-presses the displayed control, the first terminal device automatically controls the first game character in the first graphical user interface so that the first game character is automatically synchronized with the orientation information.
In this embodiment, the first terminal device may adjust at least one of the following attributes of the first game character according to the orientation information:
orientation, viewing angle, moving direction, and aiming center.
For example, if the orientation information is the position information of the target scene in the game scene, the current orientation and the moving direction of the first game character may be adjusted at the same time, so that the first game character moves toward the position of the target scene.
For example, if the orientation information is the position information of the target scene in the game scene, the current viewing angle and the moving direction of the first game character may be adjusted at the same time, so that the viewing angle of the first game character is adjusted to the observation angle of the target scene's position while the character moves to that position.
For example, if the orientation information is the position information of the target scene in the game scene, the aiming center and the moving direction of the first game character may be adjusted at the same time, so that the first game character aims at the position of the target scene while moving to it.
For example, if the orientation information is the current position of the second game character, the current orientation and the moving direction of the first game character are adjusted, so that the first game character moves toward the position of the second game character.
In addition, only one of the orientation, viewing angle, moving direction, and aiming center of the first game character may be adjusted according to the orientation information, or several of them may be adjusted at the same time; the choice is configured according to the actual application scenario and is not specifically limited herein.
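One way this control processing could be realized is sketched below. The attribute names, the dictionary-based character, and the heading computation are illustrative assumptions, not the patent's implementation.

```python
import math

# Hypothetical sketch: given a target position, adjust the configured
# subset of the first game character's orientation, viewing angle,
# moving direction, and aiming center.
def control_character(character, target, fields=("orientation", "view_angle",
                                                 "move_direction", "aim_center")):
    """Point the configured attributes of `character` at `target`."""
    dx = target[0] - character["position"][0]
    dy = target[1] - character["position"][1]
    heading = math.degrees(math.atan2(dy, dx)) % 360.0
    for field in fields:                  # only the configured items change
        if field == "aim_center":
            character[field] = target     # aim directly at the target position
        else:
            character[field] = heading
    return character
```

Passing a smaller `fields` tuple models adjusting only one item, e.g. turning the orientation without touching the viewing angle or aiming center.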
Optionally, in response to the trigger operation on the trigger object, the first terminal device may also automatically send reply information corresponding to the orientation information to the second terminal device, so that the user does not need to reply manually.
Step S206: if the trigger object is not triggered within a preset duration, the trigger object is removed from the first graphical user interface.
Optionally, after the trigger object corresponding to the orientation information marked by the second user is displayed in the first graphical user interface, if no trigger operation occurs on the trigger object within the set display time, the trigger object is no longer displayed and automatically disappears from the first graphical user interface.
The set display time of the trigger object may be set and adjusted according to the actual application scenario and is not specifically limited herein. For example, the set display time may be 3 seconds.
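The display-timeout rule can be captured in a small sketch; the function shape (polled from a UI update loop) and the 3-second constant are illustrative assumptions.

```python
# Hypothetical sketch: keep the trigger object on screen only until it
# is triggered or its set display time (e.g. 3 seconds) elapses.
DISPLAY_TIMEOUT = 3.0  # seconds; configurable per application scenario

def update_trigger_object(shown_at, now, triggered):
    """Return True while the trigger object should stay displayed."""
    if triggered:
        return False              # consumed by the first user's trigger
    return (now - shown_at) < DISPLAY_TIMEOUT
```

A UI layer would call this each frame (or on a timer) and remove the control once it returns False.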
In the embodiment of the application, at the initiating end of the orientation synchronization, the second user can mark the position of a click or long-press operation on the target scene in the second graphical user interface as the orientation information through the second terminal device, or mark the direction indicated by a direction indicator of the second graphical user interface as the orientation information by a trigger operation on that indicator; the second terminal device then automatically sends the marked orientation information to the associated first terminal device, which greatly simplifies the operations needed to initiate orientation synchronization. At the receiving end, the first terminal device automatically receives the orientation information marked in the game scene through the second terminal device and displays the corresponding trigger object in the first graphical user interface. The first user only needs to trigger that trigger object, and the first terminal device controls the first game character according to the orientation information, achieving orientation synchronization. This greatly simplifies the user operations at the receiving end, improves the efficiency of orientation synchronization, and improves the user's game experience.
Fig. 6 is a schematic structural diagram of a data processing apparatus according to an embodiment of the present application. The data processing apparatus provided by this embodiment can execute the processing flow provided by the data processing method embodiments. The apparatus is applied to a first terminal device, through which a first graphical user interface is provided; the content displayed in the first graphical user interface at least partially contains a game scene of a game application, and the game scene includes a first game character.
As shown in fig. 6, the data processing apparatus 60 includes: an information receiving module 601, a trigger object display module 602 and a control module 603.
Specifically, the information receiving module 601 is configured to receive the orientation information determined in the game scene by the second terminal device.
A trigger object display module 602, configured to display a trigger object corresponding to the orientation information in the first graphical user interface.
And the control module 603 is configured to, in response to a trigger operation on the trigger object, perform control processing on the first game character according to the orientation information.
Optionally, the control module is further configured to:
adjust, according to the orientation information, at least one of the following attributes of the first game character:
orientation, viewing angle, moving direction, and aiming center.
Optionally, the control module is further configured to:
if the orientation information is the position information of the target scene in the game scene, adjust the current orientation and the moving direction of the first game character so that the first game character moves toward the position of the target scene;
or,
if the orientation information is the position information of the target scene in the game scene, adjust the current viewing angle and the moving direction of the first game character so that the viewing angle of the first game character is adjusted to the observation angle of the target scene's position while the character moves to that position;
or,
if the orientation information is the position information of the target scene in the game scene, adjust the aiming center and the moving direction of the first game character so that the first game character aims at the position of the target scene while moving to it.
Optionally, the control module is further configured to:
if the orientation information is the current position of the second game character, adjust the current orientation and the moving direction of the first game character so that the first game character moves toward the position of the second game character.
Optionally, the trigger object display module is further configured to:
and displaying a control corresponding to the orientation information in the first graphical user interface.
Optionally, the trigger object display module is further configured to:
after the trigger object corresponding to the orientation information is displayed in the first graphical user interface, if the trigger object is not triggered within the preset duration, remove the trigger object from the first graphical user interface.
Optionally, the control module is further configured to:
and in response to the trigger operation on the trigger object, send reply information corresponding to the orientation information.
The apparatus provided in this embodiment of the present application may be specifically configured to execute the method flow executed by the first terminal device in any of the method embodiments described above, and specific functions and achieved technical effects are not described herein again.
Fig. 7 is a schematic structural diagram of another data processing apparatus according to an embodiment of the present application. The data processing device provided by the embodiment of the application can execute the processing flow provided by the method embodiment of the data processing. And the second terminal device is used for providing a second graphical user interface, and the content displayed in the second graphical user interface at least partially contains the game scene of the game application.
As shown in fig. 7, the data processing apparatus 70 includes: a position information acquisition module 701 and an information transmission module 702.
Specifically, the orientation information obtaining module 701 is configured to determine orientation information in the game scene in response to an orientation marking operation of the second graphical user interface.
An information transmission module 702, configured to transmit the orientation information to the first terminal device, so that a trigger object corresponding to the orientation information is displayed in the first graphical user interface of the first terminal device, where a trigger operation of the trigger object is used to control the first game character according to the orientation information.
Optionally, the orientation information obtaining module is further configured to:
and in response to a click or long-press operation on the target scene in the second graphical user interface, determine the position information of the click or long-press operation.
Optionally, the orientation information obtaining module is further configured to:
in response to a triggering operation of the position indicator of the second graphical user interface, direction information indicated by the position indicator is determined.
Optionally, the orientation information obtaining module is further configured to:
in response to a click or long-press operation on a target scene in the second graphical user interface, if the target scene is a static scene, determine preset position information corresponding to the target scene;
or,
in response to a click or long-press operation on a target scene in the second graphical user interface, if the target scene is a dynamic scene, determine the current position of the dynamic scene.
Optionally, the orientation information obtaining module is further configured to:
and in response to a click or long-press operation on a target scene in the second graphical user interface, if the target scene is a dynamic scene, determine the current position of the dynamic scene, and thereafter, when the position of the dynamic scene changes, acquire the current position of the dynamic scene in real time.
Optionally, the dynamic scene is a second game character in the game scene displayed by the second graphical user interface.
The apparatus provided in this embodiment of the present application may be specifically configured to execute the method flow executed by the second terminal device in any of the method embodiments described above, and specific functions and achieved technical effects are not described herein again.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal device 100 includes: a processor 1001, a memory 1002, and computer programs stored on the memory 1002 and executable on the processor 1001.
When running the computer program, the processor 1001 implements a processing procedure executed by the first terminal device or the second terminal device in any of the above method embodiments.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements a processing procedure executed by the first terminal device or the second terminal device in any of the method embodiments.
An embodiment of the present application further provides a computer program product, where the program product includes: a computer program, stored in a readable storage medium, from which at least one processor of the terminal device can read the computer program, and the at least one processor executes the computer program to make the terminal device execute the processing procedure executed by the first terminal device or the second terminal device in any of the above method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes: various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (17)

1. A method of data processing, applied to a first terminal device, by which a first graphical user interface is provided, content displayed in the first graphical user interface at least partially containing a game scene of a game application, the game scene including a first game character, the method comprising:
receiving orientation information determined in the game scene through a second terminal device;
displaying a trigger object corresponding to the orientation information in the first graphical user interface;
and in response to a trigger operation on the trigger object, performing control processing on the first game character according to the orientation information.
2. The method of claim 1, wherein the performing a control process on the first game character based on the orientation information comprises:
adjusting, according to the orientation information, at least one of the following attributes of the first game character:
orientation, viewing angle, moving direction, and aiming center.
3. The method of claim 2, wherein the performing a control process on the first game character based on the orientation information comprises:
if the orientation information is the position information of the target scene in the game scene, adjusting the current orientation and the moving direction of the first game character so that the first game character moves toward the position of the target scene;
or,
if the orientation information is the position information of the target scene in the game scene, adjusting the current viewing angle and the moving direction of the first game character so that the current viewing angle of the first game character is adjusted to the observation angle of the target scene's position and the character moves to that position;
or,
if the orientation information is the position information of the target scene in the game scene, adjusting the aiming center and the moving direction of the first game character so that the first game character aims at the position of the target scene and moves to it.
4. The method of claim 2, wherein the game scene includes a second game character, and wherein performing the control process on the first game character according to the orientation information includes:
if the orientation information is the current position of the second game character, adjusting the current orientation and the moving direction of the first game character so that the first game character moves toward the position of the second game character.
5. The method of claim 1, wherein displaying the trigger object corresponding to the orientation information in the first graphical user interface comprises:
displaying a control corresponding to the orientation information in the first graphical user interface.
6. The method according to any one of claims 1-5, further comprising, after displaying the trigger object corresponding to the orientation information in the first graphical user interface:
and if the trigger object is not triggered within a preset duration, removing the trigger object from the first graphical user interface.
7. The method of claim 1, further comprising:
and in response to the trigger operation on the trigger object, sending reply information corresponding to the orientation information.
8. A method of data processing, applied to a second terminal device, by which a second graphical user interface is provided, content displayed in the second graphical user interface at least partially containing a game scene of a game application, the game scene including a second game character, the method comprising:
determining orientation information in the game scene in response to an orientation marking operation of the second graphical user interface;
and transmitting the orientation information to a first terminal device, so that a trigger object corresponding to the orientation information is displayed in a first graphical user interface of the first terminal device, wherein a trigger operation on the trigger object is used to perform control processing on a first game character according to the orientation information.
9. The method of claim 8, wherein determining orientation information in the game scene in response to the orientation marking operation of the second graphical user interface comprises:
and in response to a click or long-press operation on the target scene in the second graphical user interface, determining the position information of the click or long-press operation.
10. The method of claim 8, wherein determining orientation information in the game scene in response to the orientation marking operation of the second graphical user interface comprises:
in response to a triggering operation of a position indicator of the second graphical user interface, determining direction information indicated by the position indicator.
11. The method of claim 9, wherein the determining the location information of the click or long press operation in response to the click or long press operation on the target scene in the second graphical user interface comprises:
in response to a click or long-press operation on a target scene in the second graphical user interface, if the target scene is a static scene, determining preset position information corresponding to the target scene;
or,
in response to a click or long-press operation on a target scene in the second graphical user interface, if the target scene is a dynamic scene, determining the current position of the dynamic scene.
12. The method of claim 11, wherein in response to a click or a long press operation on a target scene in the second graphical user interface, if the target scene is a dynamic scene, after determining a current location of the dynamic scene, further comprising:
and when the position of the dynamic scene changes, acquiring the current position of the dynamic scene in real time.
13. The method of claim 11 or 12, wherein the dynamic scene is a second game character in the game scene displayed by the second graphical user interface.
14. An apparatus for data processing, applied to a first terminal device, through which a first graphical user interface is provided, content displayed in the first graphical user interface at least partially containing a game scene of a game application, the game scene including a first game character, the apparatus comprising:
an information receiving module, configured to receive orientation information determined in the game scene through a second terminal device;
a trigger object display module, configured to display a trigger object corresponding to the orientation information in the first graphical user interface;
and a control module, configured to, in response to a trigger operation on the trigger object, perform control processing on the first game character according to the orientation information.
15. An apparatus for data processing, applied to a second terminal device through which a second graphical user interface is provided, content displayed in the second graphical user interface at least partially containing a game scene of a game application, the apparatus comprising:
the orientation information acquisition module is used for responding to the orientation marking operation of the second graphical user interface and determining orientation information in the game scene;
and an information transmission module, configured to transmit the orientation information to a first terminal device, so that a trigger object corresponding to the orientation information is displayed in a first graphical user interface of the first terminal device, wherein a trigger operation on the trigger object is used to perform control processing on a first game character according to the orientation information.
16. A terminal device, comprising:
a processor, a memory, and a computer program stored on the memory and executable on the processor;
wherein the processor, when executing the computer program, implements the method of any of claims 1 to 13.
17. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 13.
CN202110930954.2A 2021-08-13 2021-08-13 Data processing method, device, equipment and storage medium Pending CN113633969A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110930954.2A CN113633969A (en) 2021-08-13 2021-08-13 Data processing method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN113633969A 2021-11-12

Family

ID=78421545

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110930954.2A Pending CN113633969A (en) 2021-08-13 2021-08-13 Data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113633969A (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107694086A (en) * 2017-10-13 2018-02-16 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, storage medium, and electronic equipment for game system
CN107789837A (en) * 2017-09-12 2018-03-13 NetEase (Hangzhou) Network Co., Ltd. Information processing method, device and computer-readable recording medium
CN107812384A (en) * 2017-09-12 2018-03-20 NetEase (Hangzhou) Network Co., Ltd. Information processing method, device and computer-readable recording medium
CN107899241A (en) * 2017-11-22 2018-04-13 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, storage medium, electronic equipment
CN108465238A (en) * 2018-02-12 2018-08-31 NetEase (Hangzhou) Network Co., Ltd. Information processing method in game, electronic equipment and storage medium
CN109908574A (en) * 2019-02-22 2019-06-21 NetEase (Hangzhou) Network Co., Ltd. Game role control method, device, equipment and storage medium
CN111359208A (en) * 2020-02-24 2020-07-03 NetEase (Hangzhou) Network Co., Ltd. Method and device for generating marking signal in game, electronic equipment and storage medium
CN111569422A (en) * 2020-05-15 2020-08-25 NetEase (Hangzhou) Network Co., Ltd. Interaction method and device between game characters, electronic equipment and storage medium
US20200330870A1 (en) * 2018-06-01 2020-10-22 Tencent Technology (Shenzhen) Company Limited Information prompting method and apparatus, storage medium, and electronic device
CN112402976A (en) * 2020-11-24 2021-02-26 NetEase (Hangzhou) Network Co., Ltd. Game role control method, terminal, readable storage medium and electronic device
CN112619167A (en) * 2020-12-21 2021-04-09 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, computer equipment and medium

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107789837A (en) * 2017-09-12 2018-03-13 NetEase (Hangzhou) Network Co., Ltd. Information processing method, device and computer-readable recording medium
CN107812384A (en) * 2017-09-12 2018-03-20 NetEase (Hangzhou) Network Co., Ltd. Information processing method, device and computer-readable recording medium
CN109248439A (en) * 2017-09-12 2019-01-22 NetEase (Hangzhou) Network Co., Ltd. Information processing method, device and computer readable storage medium
CN107694086A (en) * 2017-10-13 2018-02-16 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, storage medium, and electronic equipment for game system
CN107899241A (en) * 2017-11-22 2018-04-13 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, storage medium, electronic equipment
CN108465238A (en) * 2018-02-12 2018-08-31 NetEase (Hangzhou) Network Co., Ltd. Information processing method in game, electronic equipment and storage medium
US20200330870A1 (en) * 2018-06-01 2020-10-22 Tencent Technology (Shenzhen) Company Limited Information prompting method and apparatus, storage medium, and electronic device
CN109908574A (en) * 2019-02-22 2019-06-21 NetEase (Hangzhou) Network Co., Ltd. Game role control method, device, equipment and storage medium
CN111957032A (en) * 2019-02-22 2020-11-20 NetEase (Hangzhou) Network Co., Ltd. Game role control method, device, equipment and storage medium
CN111359208A (en) * 2020-02-24 2020-07-03 NetEase (Hangzhou) Network Co., Ltd. Method and device for generating marking signal in game, electronic equipment and storage medium
CN111569422A (en) * 2020-05-15 2020-08-25 NetEase (Hangzhou) Network Co., Ltd. Interaction method and device between game characters, electronic equipment and storage medium
CN112402976A (en) * 2020-11-24 2021-02-26 NetEase (Hangzhou) Network Co., Ltd. Game role control method, terminal, readable storage medium and electronic device
CN112619167A (en) * 2020-12-21 2021-04-09 NetEase (Hangzhou) Network Co., Ltd. Information processing method and device, computer equipment and medium

Similar Documents

Publication Publication Date Title
CN112206512B (en) Information processing method, device, electronic equipment and storage medium
CN108970116B (en) Virtual role control method and device
CN105450736B (en) Method and device for connecting with virtual reality
CN110496391B (en) Information synchronization method and device
US9433869B2 (en) Information processing device, server, and information processing system
CN110870975B (en) Method, device and equipment for processing live game and computer readable storage medium
EP3659683A1 (en) Object display method and device and storage medium
US20220379214A1 (en) Method and apparatus for a control interface in a virtual environment
CN112619140B (en) Method and device for determining position in game and method and device for adjusting path
JP2022000180A (en) Game processing program, game processing method, and game processing device
CN113633969A (en) Data processing method, device, equipment and storage medium
CN111249723A (en) Method and device for display control in game, electronic equipment and storage medium
KR20110057298A (en) Augmented reality messenger providing system and controlling method thereof
CN113886208B (en) Data processing method, device, equipment and storage medium
CN114470759A (en) Prompt message display method and device, storage medium and electronic equipment
CN110384933B (en) Deployment control method and device for virtual objects in game
CN110891200B (en) Bullet screen based interaction method, device, equipment and storage medium
CN111035926B (en) Virtual object control method, device and storage medium
CN115089959A (en) Direction prompting method and device in game and electronic terminal
CN113101646B (en) Video processing method, device and system
KR20140014903A (en) Shooting gun provided with mobile device
CN116943152A (en) Game display control method, display control device, equipment and medium
CN111437602B (en) Flight trajectory display method, device, equipment and storage medium in virtual environment
EP4154956A1 (en) Method and apparatus for controlling avatar, and device and computer-readable storage medium
CN113893523A (en) Mark display method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination