CN112516584B - Game character control method and device

Info

Publication number
CN112516584B
CN112516584B (application CN202011521118.0A)
Authority
CN
China
Prior art keywords
control
voice
client
game
template
Prior art date
Legal status
Active
Application number
CN202011521118.0A
Other languages
Chinese (zh)
Other versions
CN112516584A (en)
Inventor
罗剑嵘
Current Assignee
Shanghai Lianshang Network Technology Co Ltd
Original Assignee
Shanghai Lianshang Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Lianshang Network Technology Co Ltd filed Critical Shanghai Lianshang Network Technology Co Ltd
Priority to CN202011521118.0A priority Critical patent/CN112516584B/en
Publication of CN112516584A publication Critical patent/CN112516584A/en
Application granted granted Critical
Publication of CN112516584B publication Critical patent/CN112516584B/en
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215: Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/822: Strategy games; Role-playing games
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1081: Input via voice recognition
    • A63F2300/80: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807: Role playing or strategy games

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method and a device for controlling a game character, and relates to the technical fields of games and intelligent interaction. A specific embodiment comprises the following steps: receiving, from a client, user selection information for one of at least two control aspects of the game character, and determining that the client has control over that control aspect of the game character; in a case where the client participates in the game corresponding to the game character and the game has started, receiving voice control instructions sent to the target character by the clients among the different clients corresponding to the at least two control aspects; and executing each voice control instruction on the game character in the control aspect corresponding to that instruction. With this arrangement, the user of the client corresponding to each of the at least two control aspects can control that control aspect of the game character by speaking a voice control instruction, so that the game character is controlled by voice through the cooperation of multiple users.

Description

Game character control method and device
Technical Field
The application relates to the field of computer technology, in particular to the technical fields of games and intelligent interaction, and more particularly to a method and a device for controlling a game character.
Background
With the development of the game industry, new games emerge in an endless stream. As game genres multiply, the characters in games are also becoming increasingly varied; for example, a character may be a person, an animal, a car, a tank, and so on.
In the related art, a user generally controls the actions of a character manually, for example with a gamepad, a keyboard or a mouse. With the development of somatosensory technology, some games also use motion-sensing controls to direct a character's actions. In addition, some games capture the user with a camera to determine the user's movements and map them onto the game character; this technique can be used, for example, in dance games.
Disclosure of Invention
Provided are a game character control method, a game character control device, an electronic device and a storage medium.
According to a first aspect, there is provided a method for controlling a game character, applied to a server, the method comprising: receiving, from a client, user selection information for one of at least two control aspects of the game character, and determining that the client has control over that control aspect of the game character; in a case where the client participates in the game corresponding to the game character and the game has started, receiving a voice control instruction sent to the target character by a client among the different clients corresponding to the at least two control aspects; and executing the voice control instruction on the game character in the control aspect corresponding to the voice control instruction.
According to a second aspect, there is provided a device for controlling a game character, applied to a server, the device comprising: a determining unit configured to receive, from a client, user selection information for one of at least two control aspects of the game character, and to determine that the client has control over that control aspect of the game character; a receiving unit configured to receive, in a case where the client participates in the game corresponding to the game character and the game has started, a voice control instruction sent to the target character by a client among the different clients corresponding to the at least two control aspects; and an execution unit configured to execute the voice control instruction on the game character in the control aspect corresponding to the voice control instruction.
According to a third aspect, there is provided a method for controlling a game character, applied to a client, the method comprising: in response to detecting a user selection operation on one of at least two control aspects of a game character, generating user selection information indicating the user selection operation and transmitting it to a server, so that the server determines that the client has control over that control aspect of the game character; when the game corresponding to the game character and participated in by the client has started, in response to collecting a voice control instruction of the user, sending the voice control instruction to the server, so that the server receives the voice control instructions sent to the target character by the clients among the different clients corresponding to the at least two control aspects and executes each voice control instruction on the game character in the control aspect corresponding to that instruction.
According to a fourth aspect, there is provided a device for controlling a game character, applied to a client, the device comprising: a generation unit configured to, in response to detecting a user selection operation on one of at least two control aspects of a game character, generate user selection information indicating the user selection operation and transmit it to a server, so that the server determines that the client has control over that control aspect of the game character; and a sending unit configured to, when the game corresponding to the game character and participated in by the client has started, send a voice control instruction to the server in response to collecting the voice control instruction of the user, so that the server receives the voice control instructions sent to the target character by the clients among the different clients corresponding to the at least two control aspects and executes each voice control instruction on the game character in the control aspect corresponding to that instruction.
According to a fifth aspect, there is provided an electronic device, comprising: one or more processors; and a storage device storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a method according to any of the embodiments of the game character control method.
According to a sixth aspect, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method according to any of the embodiments of the game character control method.
According to a seventh aspect, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method according to any of the embodiments of the game character control method.
According to the scheme of the application, the user of the client corresponding to each of the at least two control aspects can control that control aspect of the game character by speaking a voice control instruction, so that the game character is controlled by voice through the cooperation of multiple users.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which some embodiments of the present application may be applied;
FIG. 2 is a flow chart of one embodiment of a method of controlling a game character according to the present application;
FIG. 3 is a schematic view of an application scenario of a control method of a game character according to the present application;
FIG. 4 is a flow chart of yet another embodiment of a method of controlling a game character according to the present application;
FIG. 5 is a flow chart of yet another embodiment of a method of controlling a game character according to the present application;
FIG. 6 is a block diagram of an electronic device for implementing a control method of a game character according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below with reference to the accompanying drawings, in which various details of the embodiments are included to facilitate understanding and should be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the application. Likewise, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
FIG. 1 illustrates an exemplary system architecture 100 to which embodiments of the game character control method or the game character control apparatus of the present application can be applied.
As shown in fig. 1, a system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as game applications, video-type applications, live applications, instant messaging tools, mailbox clients, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices with display screens, including but not limited to smartphones, tablets, e-book readers, laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they may be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (for example, multiple pieces of software or software modules for providing distributed services) or as a single piece of software or software module. No particular limitation is imposed here.
The server 105 may be a server providing various services, such as a background server providing support for the terminal devices 101, 102, 103. The background server may analyze data such as user selection information and voice control instructions, and may feed back a processing result (e.g., a game screen generated by executing the voice control instructions on the target character) to the terminal device.
It should be noted that, the method for controlling a game character provided in the embodiment of the present application may be executed by the server 105 or the terminal devices 101, 102, 103, and accordingly, the device for controlling a game character may be provided in the server 105 or the terminal devices 101, 102, 103.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a game character control method according to the present application is shown. The control method can be used for a server and may comprise the following steps:
Step 201: receiving, from a client, user selection information for one of at least two control aspects of the game character, and determining that the client has control over that control aspect of the game character.
In this embodiment, the execution subject on which the game character control method runs (for example, the server shown in FIG. 1) may receive, from a client, user selection information for one of at least two control aspects of the game character and thereby determine that the client has control over that control aspect of the game character, that is, grant the user of the client that control right. The control aspect here may be any one of the at least two control aspects.
Each game character may correspond to at least two control aspects. A control aspect of a game character indicates in which respect the game character can be controlled; for example, a game character may be controlled through the two control aspects of attack and movement, or through the three control aspects of attack, movement and direction. Specifically, the attack aspect may cover choices such as which weapon to use and when to attack. The movement aspect may include moving forward, backward, left and right, and so on. The direction aspect may include the direction in which to attack and the target to face. When the character is controlled through only the attack and movement aspects, the contents of the direction aspect described above may be merged into the attack aspect.
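As an illustration only, and not part of the patent disclosure, the division of a game character into control aspects could be modeled roughly as in the following minimal sketch; the aspect names, command sets and class names are assumptions made for the example.

```python
# Minimal sketch (assumption, not the patented implementation): a game
# character split into independently controllable aspects, each owning
# the set of commands it accepts.

from dataclasses import dataclass, field

@dataclass
class ControlAspect:
    name: str                                   # e.g. "attack", "movement", "direction"
    commands: set = field(default_factory=set)  # commands belonging to this aspect

@dataclass
class GameCharacter:
    name: str
    aspects: dict = field(default_factory=dict)  # aspect name -> ControlAspect

    def add_aspect(self, aspect: ControlAspect) -> None:
        self.aspects[aspect.name] = aspect

# A character controlled through three aspects, as in the example above.
tank = GameCharacter(name="tank")
tank.add_aspect(ControlAspect("movement", {"forward", "backward", "left", "right"}))
tank.add_aspect(ControlAspect("attack", {"shoot", "bomb"}))
tank.add_aspect(ControlAspect("direction", {"aim_left", "aim_right"}))
```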
Step 202: receiving, in a case where the client participates in the game corresponding to the game character and the game has started, a voice control instruction sent to the target character by a client among the different clients corresponding to the at least two control aspects.
In this embodiment, the execution body may receive a voice control instruction sent by a client among the different clients corresponding to the at least two control aspects, in the case where the game corresponding to the game character and participated in by that client has already started. The clients corresponding to the at least two control aspects are different from one another, that is, different control aspects correspond to different clients. A voice control instruction spoken by the user of any client to the game character can be collected by that client and sent to the server, so that the server, i.e. the execution body, receives the voice control instruction. The execution body may start the game once each of the at least two control aspects has a corresponding client.
Step 203: executing the voice control instruction on the game character in the control aspect corresponding to the voice control instruction.
In this embodiment, the execution body may execute the voice control instruction on the game character in the control aspect corresponding to the voice control instruction. Specifically, the control aspect corresponding to a voice control instruction is the control aspect over which the user of the client that sent that voice control instruction to the server has control.
In the method provided by this embodiment of the application, the user of the client corresponding to each of the at least two control aspects can control that control aspect of the game character by speaking a voice control instruction, so that the game character is controlled by voice through the cooperation of multiple users.
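A condensed server-side sketch of steps 201-203 is given below, purely as an illustration: the class and method names are invented for the example, the transport layer is omitted, and the execution of an instruction is reduced to a stub.

```python
# Illustrative sketch of the server-side flow 200 (steps 201-203).
# All names are assumptions; real networking, game logic and speech
# handling are out of scope here.

class GameServer:
    def __init__(self, aspects):
        self.aspects = set(aspects)          # e.g. {"attack", "movement", "direction"}
        self.owner_of = {}                   # control aspect -> client_id
        self.started = False

    # Step 201: grant the selecting client control over one aspect.
    def handle_selection(self, client_id, aspect):
        if aspect in self.aspects and aspect not in self.owner_of:
            self.owner_of[aspect] = client_id
        # Start the game once every aspect has a (distinct) client.
        if set(self.owner_of) == self.aspects:
            self.started = True

    # Steps 202-203: accept a voice instruction and execute it in the
    # control aspect owned by the sending client.
    def handle_voice_instruction(self, client_id, instruction):
        if not self.started:
            return
        for aspect, owner in self.owner_of.items():
            if owner == client_id:
                self.execute(aspect, instruction)
                return

    def execute(self, aspect, instruction):
        # Placeholder for the preset program that actually drives the character.
        print(f"character: {aspect} -> {instruction}")
```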
In some optional implementations of this embodiment, the user of the client corresponding to any control aspect has preset authentication information, and the preset authentication information includes preset voiceprint information. The method further comprises: for a client among the different clients, determining the voiceprint information of the voice control instruction as the current voiceprint information of the client, and obtaining current authentication information of the client that includes the current voiceprint information; detecting whether the current authentication information matches the preset authentication information of the user of the client, and generating a detection result; and if the number of consecutive detection results indicating a mismatch among the detection results obtained for the client reaches a preset number threshold, generating and outputting a suspected cheating message for the client.
In these optional implementations, the execution body may, for each of the different clients, determine the voiceprint information of a voice control instruction sent by the client as the current voiceprint information, and generate current authentication information including the current voiceprint information as the current authentication information of the client. The execution body may then match the current authentication information of the client against the preset authentication information of the client, that is, detect whether the two match, and generate a detection result. The detection result may indicate a match or a mismatch. A match in the present application may mean that the two pieces of information are identical or that their similarity is greater than a preset threshold. In practice, the current authentication information may also include, in addition to the voiceprint information (that is, the current voiceprint information), other information such as a face image.
If the number of consecutive detection results indicating a mismatch among the detection results obtained for a client reaches the preset number threshold, the execution body may generate a suspected cheating message for the client and output it. Specifically, the suspected cheating message may indicate that the current user of the client is suspected of not being the owner of the game account. The execution body may send the suspected cheating message to a preset terminal, such as a supervisor's terminal, or may send it to the client.
These implementations allow authentication by voiceprint, preventing other users from impersonating the owner of the game account.
Optionally, determining the voiceprint information of the voice control instruction as the current voiceprint information of the client may include: for a client among the different clients, in response to a preset authentication period being reached, determining the voiceprint information of the voice control instruction most recently received from the client, and taking that voiceprint information as the current voiceprint information of the client.
The execution body may, for a client (for example, each client) among the different clients, determine the voice control instruction most recently received from the client in response to the preset authentication period being reached, and take the voiceprint information of that voice control instruction as the current voiceprint information of the client. In practice, the preset authentication period may be a set duration, such as one minute or half a minute.
In these optional implementations, the user of a client is authenticated by voiceprint once per preset authentication period, which avoids the waste of resources that would result from an excessive detection workload if real-time detection were performed too frequently.
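The periodic voiceprint check and the consecutive-mismatch counter described above could be combined roughly as in the following sketch. The voiceprint extraction and comparison are represented by placeholder functions, which is an assumption; the patent does not prescribe a specific voiceprint algorithm, period length or threshold.

```python
# Sketch of voiceprint-based cheat detection (helper functions are assumed).
import time

AUTH_PERIOD_S = 60          # preset authentication period, e.g. one minute
MISMATCH_THRESHOLD = 3      # preset number threshold of consecutive mismatches

def extract_voiceprint(audio_bytes):
    """Placeholder: derive voiceprint features from a voice instruction."""
    raise NotImplementedError

def voiceprints_match(current, preset, threshold=0.8):
    """Placeholder: compare two voiceprints; True if similarity > threshold."""
    raise NotImplementedError

class VoiceprintMonitor:
    def __init__(self, preset_voiceprint):
        self.preset = preset_voiceprint
        self.last_check = 0.0
        self.consecutive_mismatches = 0

    def on_voice_instruction(self, audio_bytes):
        now = time.monotonic()
        if now - self.last_check < AUTH_PERIOD_S:
            return None                      # only check once per authentication period
        self.last_check = now
        current = extract_voiceprint(audio_bytes)
        if voiceprints_match(current, self.preset):
            self.consecutive_mismatches = 0
            return None
        self.consecutive_mismatches += 1
        if self.consecutive_mismatches >= MISMATCH_THRESHOLD:
            return "suspected cheating: voice does not match the account owner"
        return None
```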
In some optional implementations of this embodiment, the method may further comprise: in response to receiving an exchange request from a first client for a second control aspect, sending an exchange request message to a second client that has control over the second control aspect, where the first client has control over a first control aspect of the at least two control aspects; in response to receiving consent-to-exchange information fed back by the second client, determining that the first client has control over the second control aspect and that the second client has control over the first control aspect; and sending the exchanged interface display data to the first client and the second client respectively.
In these optional implementations, after receiving the exchange request sent by the first client for the second control aspect, the execution body may send an exchange request message to the second client that has control over the second control aspect. The second client can output the exchange request message to its user, for example by displaying or playing it, so that the user learns that the user of the first client wishes to exchange control rights. The first client has control over the first control aspect, and its user wants to exchange control rights over the control aspects with the user of the second client.
If the user of the second client agrees to the exchange, that user can perform a corresponding operation or speak a consent phrase, so that the second client feeds back consent-to-exchange information indicating the user's consent to the execution body. The execution body may then carry out the exchange of control rights between the first client and the second client, that is, determine that the first client has control over the second control aspect and that the second client has control over the first control aspect. After the exchange, the execution body may send interface display data indicating that the control rights have been exchanged to the two clients, so that both clients display the updated interface.
By exchanging control rights over control aspects, these implementations enhance the users' control over the game character and make settings during the game more flexible, so that the allocation of control rights better matches the users' wishes.
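As an illustrative sketch only, the exchange of control rights could be coordinated by the server roughly as follows; the message shapes, the `send_to_client` callable and the `owner_of` mapping carried over from the earlier server sketch are all assumptions.

```python
# Sketch of the control-right exchange (all names are assumptions).

class ExchangeCoordinator:
    def __init__(self, owner_of, send_to_client):
        self.owner_of = owner_of          # control aspect -> client_id
        self.send_to_client = send_to_client
        self.pending = {}                 # requester_id -> wanted aspect

    def on_exchange_request(self, requester_id, wanted_aspect):
        """The first client asks to take over `wanted_aspect` from its holder."""
        holder_id = self.owner_of.get(wanted_aspect)
        if holder_id is None or holder_id == requester_id:
            return
        self.pending[requester_id] = wanted_aspect
        self.send_to_client(holder_id, {"type": "exchange_request",
                                        "from": requester_id,
                                        "aspect": wanted_aspect})

    def on_exchange_consent(self, holder_id, requester_id):
        """The second client agreed: swap the two control rights."""
        wanted_aspect = self.pending.pop(requester_id, None)
        if wanted_aspect is None:
            return
        requester_aspect = next((a for a, c in self.owner_of.items()
                                 if c == requester_id), None)
        if requester_aspect is None:
            return
        self.owner_of[wanted_aspect] = requester_id
        self.owner_of[requester_aspect] = holder_id
        # Push the post-exchange interface display data to both clients.
        for cid in (requester_id, holder_id):
            self.send_to_client(cid, {"type": "interface_update",
                                      "owner_of": dict(self.owner_of)})
```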
In some optional implementations of this embodiment, the method may further comprise: for a client among the different clients, receiving authentication voice from the client, determining the voiceprint information of the authentication voice, and obtaining authentication information including the voiceprint information; detecting whether the authentication information matches the preset authentication information, and generating a detection result; and if the detection result indicates that the authentication information matches the preset authentication information, starting the game.
In these optional implementations, the execution body may, for each of the different clients, receive authentication voice from the client and determine the voiceprint information of the authentication voice, thereby obtaining authentication information that includes the voiceprint information. The execution body may then detect whether the authentication information matches the preset authentication information and obtain a detection result. If the detection result indicates a match, the game may be started.
These implementations authenticate whether the user is the account owner before the game starts, which prevents a game from being played with another person's account and ensures the security of the game account.
In some optional implementations of this embodiment, the method further comprises: in a case where a client participates in the game corresponding to the game character and the game has started, in response to receiving a real-time face image collected by the client among the different clients, determining whether the real-time face image matches preset face information of the user of the client; and if the real-time face image does not match the preset face information, generating and outputting a suspected cheating message for the client.
In these optional implementations, the execution body may, in response to receiving a real-time face image collected by a client (such as each client) among the different clients while the game is in progress, determine whether the real-time face image matches the preset face information. If they do not match, a suspected cheating message can be generated for the client and output to a preset terminal.
These implementations use real-time face detection to determine, in real time, whether the person operating the game is the user of the client or someone impersonating the account owner, thereby ensuring the security of the game account.
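A minimal sketch of the in-game face check is shown below; the face comparison is left as an assumed placeholder, since the patent does not specify a face-recognition algorithm, and the notification target name is invented for the example.

```python
# Sketch of the in-game face check (comparison function is a placeholder).

def faces_match(live_image_bytes, preset_face_info, threshold=0.8):
    """Placeholder: compare a live frame with the stored face information."""
    raise NotImplementedError

def check_live_frame(client_id, live_image_bytes, preset_face_info, notify):
    """Called whenever a client uploads a real-time face image during play."""
    if not faces_match(live_image_bytes, preset_face_info):
        notify("supervisor-terminal",
               f"suspected cheating: client {client_id} face mismatch")
```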
With continued reference to FIG. 3, FIG. 3 is a schematic diagram of an application scenario of the game character control method according to this embodiment. In the application scenario of FIG. 3, there are three control aspects in total, and a user may request the control right of either of the two control aspects other than their own. For example, the user holding the control right for "direction" may tap "exchange movement" or "exchange attack", causing the client to issue an exchange request for "movement" or "attack". If the user taps "exchange attack", the interface displays "waiting for exchange". While the exchange is pending, each client retains its original control right. Once the exchange is complete, the user has control over "attack", and the interface may display "exchange movement" and "exchange direction".
With further reference to FIG. 4, a flow 400 of yet another embodiment of the game character control method is shown. The flow 400 includes the following steps:
Step 401: receiving, from a client, user selection information for one of at least two control aspects of the game character, and determining that the client has control over that control aspect of the game character.
In this embodiment, the execution subject on which the game character control method runs (for example, the server shown in FIG. 1) may receive, from a client, user selection information for one of at least two control aspects of the game character and thereby determine that the client has control over that control aspect of the game character, that is, grant the user of the client that control right. The control aspect here may be any one of the at least two control aspects.
Step 402: receiving, in a case where the client participates in the game corresponding to the game character and the game has started, a voice control instruction sent to the target character by a client among the different clients corresponding to the at least two control aspects.
In this embodiment, the execution body may receive a voice control instruction sent by a client among the different clients corresponding to the at least two control aspects, in the case where the game corresponding to the game character and participated in by that client has already started. The clients corresponding to the at least two control aspects are different from one another, that is, different control aspects correspond to different clients.
Step 403: determining a voice instruction template that matches the voice control instruction, and executing, in the control aspect corresponding to the voice control instruction, a preset program corresponding to the voice instruction template on the game character.
In this embodiment, the execution body may determine the voice instruction template that matches the voice control instruction, and then execute a preset program, that is, preset code, in the control aspect corresponding to the voice control instruction. The preset program implements, within that control aspect of the game character, the character control corresponding to the voice instruction template.
A voice instruction template is a standard utterance defined for the voice instructions spoken by users, for example the utterances "forward" and "backward". If the user says exactly "forward", the instruction matches the "forward" template; if the user uses a different wording with the same meaning, it does not match.
A voice instruction template may consist of the voice used to control the game character, or of the text corresponding to that voice. Accordingly, the execution body can match the voice control instruction, as audio, directly against each voice instruction template stored as audio in order to find the matching template. Alternatively, the execution body may obtain the text corresponding to the voice control instruction, for example by performing speech recognition on the device, or by sending the voice control instruction to another electronic device and receiving the returned text; it may then match the obtained text against each voice instruction template stored as text to find the template that matches the voice control instruction.
This embodiment uses voice instruction templates to normalize the utterances spoken by users, which avoids the recognition failures caused by non-uniform voice control instructions.
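The template matching described above can run either on audio or on recognized text; a simple text-based sketch is shown below, where the speech-recognition step is an assumed external helper and the action names and template texts are invented for the example.

```python
# Sketch of matching a voice control instruction against per-action
# voice instruction templates (text variant; ASR is assumed external).

def speech_to_text(audio_bytes) -> str:
    """Placeholder: pass the audio to a speech-recognition service."""
    raise NotImplementedError

class TemplateMatcher:
    def __init__(self, templates):
        # templates: action name -> template text, e.g. {"move_forward": "forward"}
        self.templates = {action: text.strip().lower()
                          for action, text in templates.items()}

    def match(self, audio_bytes):
        """Return the action whose template matches the spoken instruction."""
        text = speech_to_text(audio_bytes).strip().lower()
        for action, template_text in self.templates.items():
            if text == template_text:
                return action
        return None                      # no template matched -> instruction ignored

# Example templates for the "movement" aspect (values are assumptions).
movement_matcher = TemplateMatcher({"move_forward": "forward",
                                    "move_backward": "backward"})
```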
In some optional implementations of this embodiment, the client that has control over a third control aspect of the at least two control aspects is a third client, and the voice instruction templates of each control aspect include voice instruction templates for at least two actions. Step 403 may include: determining the voice instruction template of the action that matches the voice control instruction; and executing, in the control aspect corresponding to the voice control instruction, the preset program corresponding to the voice instruction template of that action on the game character.
In these optional implementations, each voice control instruction may instruct the game character to perform an action; the execution body may determine the action that matches the voice control instruction and execute the preset program corresponding to the voice instruction template of that action.
For example, the actions may be "forward", "backward", "shoot" and "bomb attack", and each may have a corresponding voice instruction template, such as the utterance "forward" for moving forward, "backward" for moving backward, and short onomatopoeic utterances for shooting and bombing.
These implementations tie instructions directly to the character's actions, so that the character can be controlled precisely by voice.
In some optional implementations of this embodiment, the method may further comprise: receiving, from the third client, a change request for the voice instruction template of a first action among the voice instruction templates of the third control aspect, where the change request is sent by the third client in response to collecting a new voice instruction template and includes the new voice instruction template; and updating the voice instruction template of the first action among the voice instruction templates of the third control aspect, the updated voice instruction template being the new voice instruction template.
In these optional implementations, the execution body may receive a change request for a voice instruction template sent by the client, the change request indicating a change to the voice instruction template of the first action among the voice instruction templates of the third control aspect. The third client is any of the clients corresponding to the at least two control aspects; for example, the third client may be the first client or the second client. The execution body may then update the voice instruction template of the first action among the voice instruction templates of the third control aspect to the new voice instruction template.
Optionally, a new voice instruction template customized by a user may continue to apply after control of the control aspect is exchanged between different clients.
In these implementations, users can customize the voice instruction templates and thus speak instructions to the game character in their own words, which improves the users' control over the game character and the user experience.
Optionally, updating the voice instruction template of the first action among the voice instruction templates of the third control aspect may include: sending a re-speak notification for the new voice instruction template to the third client; receiving the re-spoken voice fed back by the third client and comparing it with the new voice instruction template; and updating the voice instruction template of the first action among the voice instruction templates of the third control aspect in response to the comparison result indicating consistency.
Specifically, the execution body may send a re-speak notification for the new voice instruction template to the third client, so that the user of the third client records the new voice instruction template once more. The execution body may compare the re-recorded voice with the voice instruction template contained in the change request. If the two are consistent, the voice instruction template of the first action can be updated to the new voice instruction template.
These implementations avoid the errors that can occur when the user records the voice instruction template only once, and thus ensure the accuracy of the template update.
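The change-request flow with a re-speak confirmation could look roughly like the following sketch; the comparison of the two recordings is an assumed placeholder, and the message and attribute names are invented for the example.

```python
# Sketch of updating a per-action voice instruction template with a
# re-speak confirmation (all helper names are assumptions).

def recordings_consistent(first_audio, second_audio) -> bool:
    """Placeholder: compare the original and re-spoken recordings."""
    raise NotImplementedError

class TemplateUpdater:
    def __init__(self, templates, send_to_client):
        self.templates = templates        # (aspect, action) -> template audio
        self.send_to_client = send_to_client
        self.pending = {}                 # client_id -> (aspect, action, new audio)

    def on_change_request(self, client_id, aspect, action, new_template_audio):
        # Ask the user to speak the new template once more before accepting it.
        self.pending[client_id] = (aspect, action, new_template_audio)
        self.send_to_client(client_id, {"type": "re_speak_notification",
                                        "aspect": aspect, "action": action})

    def on_re_spoken_voice(self, client_id, re_spoken_audio):
        entry = self.pending.pop(client_id, None)
        if entry is None:
            return
        aspect, action, new_template_audio = entry
        if recordings_consistent(new_template_audio, re_spoken_audio):
            self.templates[(aspect, action)] = new_template_audio
```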
As shown in FIG. 5, the present application further provides a game character control method for a client, the method comprising: Step 501: in response to detecting a user selection operation on one of at least two control aspects of a game character, generating user selection information indicating the user selection operation and transmitting it to a server, so that the server determines that the client has control over that control aspect of the game character. Step 502: when the game corresponding to the game character and participated in by the client has started, in response to collecting a voice control instruction of the user, sending the voice control instruction to the server, so that the server receives the voice control instructions sent to the target character by the clients among the different clients corresponding to the at least two control aspects and executes each voice control instruction on the game character in the control aspect corresponding to that instruction.
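On the client side, steps 501 and 502 amount to two messages to the server; the sketch below illustrates this with an assumed `send_to_server` callable and a placeholder microphone-capture function, neither of which is specified by the patent.

```python
# Client-side sketch of steps 501-502 (all helpers are assumptions).

def capture_voice_instruction() -> bytes:
    """Placeholder: record one voice control instruction from the microphone."""
    raise NotImplementedError

class GameClient:
    def __init__(self, client_id, send_to_server):
        self.client_id = client_id
        self.send_to_server = send_to_server
        self.game_started = False

    # Step 501: the user taps one of the control aspects on the selection screen.
    def on_aspect_selected(self, aspect):
        self.send_to_server({"type": "select_aspect",
                             "client_id": self.client_id,
                             "aspect": aspect})

    # Step 502: once the game has started, forward each spoken instruction.
    def on_user_speaks(self):
        if not self.game_started:
            return
        audio = capture_voice_instruction()
        self.send_to_server({"type": "voice_instruction",
                             "client_id": self.client_id,
                             "audio": audio})
```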
In some optional implementations of this embodiment, the method further comprises: in response to receiving, at an exchange interface, a selection of the exchange identifier of a second control aspect of the game character, generating an exchange request for the second control aspect and sending it to the server, so that the server sends an exchange request message to a second client that has control over the second control aspect, where the client has control over the first control aspect of the game character; and receiving the exchanged interface display data sent by the server.
In some optional implementations of this embodiment, the voice instruction templates of each control aspect include voice instruction templates for at least two actions, and the method further comprises: in response to collecting a new voice instruction template, generating a change request that includes the new voice instruction template; and sending the change request to the server, so that the server updates the voice instruction template of the first action among the voice instruction templates of the third control aspect, the updated voice instruction template being the new voice instruction template.
In some optional implementations of this embodiment, sending the change request to the server so that the server updates the voice instruction template of the first action among the voice instruction templates of the third control aspect includes: receiving the re-speak notification sent by the server for the new voice instruction template and outputting it to the user; and in response to collecting the re-spoken voice, feeding the re-spoken voice back to the server, so that the server compares the re-spoken voice with the new voice instruction template and, if the comparison result indicates consistency, updates the voice instruction template of the first action among the voice instruction templates of the third control aspect.
In some optional implementations of this embodiment, the method further comprises: collecting authentication voice and sending it to the server, whereupon the server determines the voiceprint information of the authentication voice, obtains authentication information including the voiceprint information, detects whether the authentication information matches the preset authentication information, generates a detection result, and starts the game if the detection result indicates a match.
In some optional implementations of this embodiment, the method further comprises: in a case where the client participates in the game corresponding to the game character and the game has started, collecting real-time face images and sending them to the server, whereupon the server, in response to receiving a real-time face image collected by a client among the different clients, determines whether the real-time face image matches the preset face information of the user of that client, and, if they do not match, generates and outputs a suspected cheating message for the client.
According to embodiments of the present application, the present application also provides an electronic device, a readable storage medium and a computer program product.
As shown in FIG. 6, a block diagram of an electronic device for the game character control method according to an embodiment of the present application is illustrated. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant to be examples only and are not meant to limit the implementations of the application described and/or claimed herein.
As shown in FIG. 6, the electronic device includes one or more processors 601, a memory 602, and interfaces for connecting the components, including high-speed interfaces and low-speed interfaces. The components are interconnected by different buses and may be mounted on a common motherboard or in other ways as required. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to an interface. In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple electronic devices may be connected, each providing part of the necessary operations (for example, as a server array, a set of blade servers, or a multiprocessor system). One processor 601 is taken as an example in FIG. 6.
The memory 602 is a non-transitory computer readable storage medium provided by the present application. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for controlling a game character provided by the present application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute the control method of the game character provided by the present application.
The memory 602, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the game character control method in the embodiments of the present application (for example, the determining unit 501, the receiving unit 502 and the executing unit 503). By running the non-transitory software programs, instructions and modules stored in the memory 602, the processor 601 executes the various functional applications and data processing of the server, that is, implements the game character control method of the above method embodiments.
The memory 602 may include a program storage area and a data storage area, where the program storage area may store an operating system and the application program required by at least one function, and the data storage area may store data created according to the use of the electronic device for game character control, and the like. In addition, the memory 602 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 602 may optionally include memory remotely located relative to the processor 601, and such remote memory may be connected via a network to the electronic device for game character control. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for the game character control method may further include an input device 603 and an output device 604. The processor 601, the memory 602, the input device 603 and the output device 604 may be connected by a bus or in other ways; connection by a bus is taken as an example in FIG. 6.
The input device 603 may receive input numeric or character information and generate key signal inputs related to the user settings and function control of the electronic device for game character control, and may be, for example, a touch screen, keypad, mouse, trackpad, touchpad, pointing stick, one or more mouse buttons, trackball, joystick or other input device. The output device 604 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, and which can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also referred to as programs, software, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic disks, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as machine-readable signals. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak business expansibility of traditional physical hosts and virtual private server (VPS) services. The server may also be a server of a distributed system or a server combined with a blockchain.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor, which may for example be described as: a processor comprising a determining unit, a receiving unit and an execution unit. The names of these units do not in some cases limit the units themselves; for example, the execution unit may also be described as "a unit that executes the voice control instruction on the game character in the control aspect corresponding to the voice control instruction".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: receive, from a client, user selection information for one of at least two control aspects of the game character, and determine that the client has control over that control aspect of the game character; receive, in a case where the client participates in the game corresponding to the game character and the game has started, a voice control instruction sent to the target character by a client among the different clients corresponding to the at least two control aspects; and execute the voice control instruction on the game character in the control aspect corresponding to the voice control instruction.
The above description is only a preferred embodiment of the present application and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the invention referred to in the present application is not limited to the specific combinations of the technical features described above, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions in which the above features are interchanged with technical features disclosed in the present application (but not limited thereto) that have similar functions.

Claims (16)

1. A method for controlling a game character, applied to a server, the method comprising:
Receiving user selection information of one of at least two control aspects of the game character from a client, determining that the client has control over the one control aspect of the game character;
receiving, in a case where the clients participate in the game corresponding to the game character and the game has started, voice control instructions sent to a target character by clients among the different clients corresponding to the at least two control aspects;
executing the voice control instruction on the game role in the control aspect corresponding to the voice control instruction;
the method further comprising:
in response to receiving an exchange request from a first client for a second control aspect, sending an exchange request message to a second client having control over the second control aspect, wherein the first client has control over a first control aspect of the at least two control aspects;
in response to receiving the consent-to-exchange information fed back by the second client, determining that the first client has control over the second control aspect, and determining that the second client has control over the first control aspect;
and respectively sending the exchanged interface display data to the first client and the second client.
2. The method of claim 1, wherein a user of the client corresponding to any control aspect has preset authentication information, the preset authentication information including preset voiceprint information;
The method further comprises the steps of:
for the clients in the different clients, determining the voiceprint information of the voice control instruction as the current voiceprint information of the client, and obtaining the current authentication information of the client, including the current voiceprint information;
Detecting whether the current authentication information is matched with the preset authentication information of the user of the client side or not, and generating a detection result;
If the number of the continuous detection results which are not matched in the detection results detected by the client reaches a preset number threshold, a suspected cheating message is generated for the client and is output.
3. The method of claim 2, wherein determining the voiceprint information of the voice control instruction as the current voiceprint information of the client comprises:
for each of the different clients, in response to a preset authentication period elapsing, determining the voiceprint information of the voice control instruction most recently received from the client; and
taking that voiceprint information as the current voiceprint information of the client.
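The periodic sampling in claim 3 might look like the following sketch, under the assumption of a fixed authentication period and in-memory bookkeeping; all names and the period length are illustrative.

```python
# Sketch of periodic voiceprint sampling: when an authentication period
# elapses, the voiceprint of the most recently received instruction becomes
# the client's current voiceprint. Names and values are assumptions.

import time

AUTH_PERIOD_SECONDS = 60          # assumed preset authentication period

last_instruction_voiceprint = {}  # client id -> voiceprint of latest instruction
last_check_time = {}              # client id -> time of last periodic sample

def on_voice_instruction(client_id, voiceprint):
    last_instruction_voiceprint[client_id] = voiceprint

def maybe_sample_current_voiceprint(client_id, now=None):
    """Return the current voiceprint if the authentication period has elapsed."""
    now = time.time() if now is None else now
    if now - last_check_time.get(client_id, 0) >= AUTH_PERIOD_SECONDS:
        last_check_time[client_id] = now
        return last_instruction_voiceprint.get(client_id)
    return None
```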
4. The method according to any one of claims 1-3, wherein executing the voice control instruction on the game character in the control aspect corresponding to the voice control instruction comprises:
determining a voice instruction template matching the voice control instruction, and executing a preset program corresponding to the voice instruction template on the game character in the control aspect corresponding to the voice control instruction.
5. The method of claim 4, wherein the client having control over a third control aspect of the at least two control aspects is a third client, and the voice instruction templates of each control aspect comprise voice instruction templates for at least two actions;
wherein determining the voice instruction template matching the voice control instruction and executing the preset program corresponding to the voice instruction template on the game character in the control aspect corresponding to the voice control instruction comprises:
determining the voice instruction template of the action matching the voice control instruction; and
in the control aspect corresponding to the voice control instruction, executing the preset program corresponding to the voice instruction template of that action on the game character.
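One way to picture the per-action template matching of claims 4-5 is the sketch below. It is not the patented matching procedure; the text-similarity metric, the template set, and all names are assumptions for illustration.

```python
# Minimal sketch of matching a recognized voice instruction to an action
# template and running its preset program. The similarity metric is an
# illustrative assumption.

import difflib

def jump(character): print(f"{character} jumps")
def attack(character): print(f"{character} attacks")

# Voice instruction templates for one control aspect, keyed by action,
# each mapped to its preset program.
TEMPLATES = {
    "jump": ("jump now", jump),
    "attack": ("attack the enemy", attack),
}

def execute_voice_instruction(recognized_text, character):
    """Pick the action whose template is most similar to the recognized text."""
    def similarity(action):
        template_text, _ = TEMPLATES[action]
        return difflib.SequenceMatcher(None, recognized_text, template_text).ratio()
    best_action = max(TEMPLATES, key=similarity)
    _, preset_program = TEMPLATES[best_action]
    preset_program(character)

execute_voice_instruction("please jump now", "game_character")
```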
6. The method of claim 5, wherein the method further comprises:
receiving, from the third client, a change request for the voice instruction template of a first action among the voice instruction templates of the third control aspect, wherein the change request is sent by the third client in response to acquiring a new voice instruction template and comprises the new voice instruction template; and
updating the voice instruction template of the first action among the voice instruction templates of the third control aspect, wherein the updated voice instruction template is the new voice instruction template.
7. The method of claim 6, wherein updating the voice instruction template of the first action among the voice instruction templates of the third control aspect comprises:
sending a re-speaking notification for the new voice instruction template to the third client;
receiving the re-spoken voice fed back by the third client, and comparing the re-spoken voice with the new voice instruction template; and
updating the voice instruction template of the first action among the voice instruction templates of the third control aspect in response to the comparison result being consistent.
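The template-change flow of claims 6-7 can be summarized as "update only after a consistent re-speak". The sketch below is illustrative only; the comparison function and all names are assumptions, and a real system would compare audio features rather than lowercase strings.

```python
# Sketch of the template-change flow: the server commits a new template only
# after the client re-speaks it and the re-spoken speech is consistent.

templates = {"third_aspect": {"dodge": "roll left"}}     # aspect -> action -> template

def similar_enough(respoken_text, new_template_text):
    # Placeholder comparison; a real system would compare audio features.
    return respoken_text.strip().lower() == new_template_text.strip().lower()

def handle_change_request(aspect, action, new_template_text, ask_to_respeak):
    """Ask the requesting client to re-speak, then update on a consistent match."""
    respoken_text = ask_to_respeak(new_template_text)    # re-speaking notification
    if similar_enough(respoken_text, new_template_text):
        templates[aspect][action] = new_template_text
        return True
    return False

# Example: the client "re-speaks" the new template correctly.
handle_change_request("third_aspect", "dodge", "tumble left",
                      ask_to_respeak=lambda t: "Tumble Left")
```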
8. The method of claim 1, wherein the method further comprises:
for each of the different clients, receiving authentication voice from the client, determining voiceprint information of the authentication voice, and obtaining authentication information comprising the voiceprint information;
detecting whether the authentication information matches preset authentication information, and generating a detection result; and
if the detection result indicates that the authentication information matches the preset authentication information, starting the game.
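A possible shape for the pre-game gate in claim 8 is sketched below; the voiceprint extraction and comparison are stand-ins, and all names are assumptions rather than the disclosed implementation.

```python
# Sketch of a pre-game voiceprint authentication gate (illustrative only).

preset_voiceprints = {"client_A": "vp_A", "client_B": "vp_B"}
authenticated = set()

def extract_voiceprint(authentication_voice):
    return authentication_voice                  # placeholder feature extraction

def on_authentication_voice(client_id, authentication_voice):
    """Mark the client as authenticated when its voiceprint matches the preset one."""
    voiceprint = extract_voiceprint(authentication_voice)
    if voiceprint == preset_voiceprints.get(client_id):
        authenticated.add(client_id)

def can_start_game(all_clients):
    return all(client in authenticated for client in all_clients)
```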
9. The method of claim 1, wherein the method further comprises:
with the client participating and the game corresponding to the game character started, in response to receiving a real-time face image collected by a client among the different clients, determining whether the real-time face image matches preset face information of the user of that client; and
if the real-time face image does not match the preset face information, generating and outputting a suspected cheating message for the client.
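The in-game face check of claim 9 follows the same pattern as the voiceprint check; the sketch below is illustrative, with face_matches standing in for a real face-comparison model and every name an assumption.

```python
# Sketch of the in-game face check (illustrative only).

preset_face_features = {"client_A": "face_A"}

def face_matches(live_image, preset_features):
    return live_image == preset_features         # placeholder comparison

def on_live_face_image(client_id, live_image, game_started=True):
    """Return a suspected-cheating message when the live image does not match."""
    if not game_started:
        return None
    if not face_matches(live_image, preset_face_features.get(client_id)):
        return f"suspected cheating on {client_id}"
    return None
```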
10. A method of controlling a game character for a client, the method comprising:
in response to detecting a user selection operation on one of at least two control aspects of a game character, generating user selection information indicating the user selection operation and sending it to a server, so that the server determines that the client has control over the one control aspect of the game character;
with the client participating and the game corresponding to the game character started, in response to collecting a voice control instruction of the user, sending the voice control instruction to the server, so that the server receives the voice control instructions sent to a target character by clients among different clients corresponding to the at least two control aspects and executes the voice control instruction on the game character in the control aspect corresponding to the voice control instruction;
the method further comprising:
in response to receiving, on an exchange interface, an exchange identification of a second control aspect of the game character, generating an exchange request for the second control aspect and sending it to the server, so that the server sends an exchange request message to a second client having control over the second control aspect, wherein the client has control over a first control aspect of the game character; and
receiving the exchanged interface display data sent by the server.
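For the client side described in claim 10, the sketch below shows one possible shape: report the selected control aspect, stream voice instructions while the game runs, and request an exchange. It is illustrative only; the class, method names, and message format are assumptions.

```python
# Client-side sketch (illustrative only): selection, voice streaming, exchange.

class GameClient:
    def __init__(self, client_id, send_to_server):
        self.client_id = client_id
        self.send = send_to_server               # e.g. a network call
        self.game_started = False

    def on_user_selects_aspect(self, aspect):
        self.send({"type": "selection", "client": self.client_id, "aspect": aspect})

    def on_voice_captured(self, voice_instruction):
        if self.game_started:
            self.send({"type": "voice", "client": self.client_id,
                       "instruction": voice_instruction})

    def on_exchange_tapped(self, wanted_aspect):
        self.send({"type": "exchange_request", "client": self.client_id,
                   "aspect": wanted_aspect})

# Example wiring with a print-based transport:
client = GameClient("client_A", send_to_server=print)
client.on_user_selects_aspect("movement")
```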
11. The method of claim 10, wherein the voice instruction templates of each control aspect comprise voice instruction templates for at least two actions;
the method further comprising:
in response to collecting a new voice instruction template, generating a change request comprising the new voice instruction template; and
sending the change request to the server, so that the server updates the voice instruction template of a first action among the voice instruction templates of a third control aspect, wherein the updated voice instruction template is the new voice instruction template and the client having control over the third control aspect is a third client.
12. The method of claim 11, wherein sending the change request to the server so that the server updates the voice instruction template of the first action among the voice instruction templates of the third control aspect comprises:
receiving a re-speaking notification sent by the server for the new voice instruction template, and outputting the re-speaking notification to the user; and
in response to collecting the re-spoken voice, feeding the re-spoken voice back to the server, so that the server compares the re-spoken voice with the new voice instruction template and, if the comparison result is consistent, updates the voice instruction template of the first action among the voice instruction templates of the third control aspect.
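The client-side counterpart of claims 11-12 can be sketched as two small handlers: submit the newly collected template, then answer the server's re-speaking notification. Illustrative only; every name and the message format are assumptions.

```python
# Client-side sketch of the template change and re-speak reply (illustrative).

def submit_new_template(send, client_id, aspect, action, new_template_text):
    send({"type": "template_change", "client": client_id,
          "aspect": aspect, "action": action, "template": new_template_text})

def on_respeak_notification(send, client_id, capture_voice, notify_user):
    """Show the server's notification, capture speech, and feed it back."""
    notify_user("Please repeat the new voice instruction.")
    respoken = capture_voice()                   # e.g. microphone capture
    send({"type": "respeak", "client": client_id, "voice": respoken})

# Example with a print-based transport:
submit_new_template(print, "client_C", "third_aspect", "dodge", "tumble left")
```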
13. The method of claim 10, wherein the method further comprises:
collecting authentication voice and sending the authentication voice to the server, wherein the server determines voiceprint information of the authentication voice to obtain authentication information comprising the voiceprint information, detects whether the authentication information matches preset authentication information, generates a detection result, and starts the game if the detection result indicates that the authentication information matches the preset authentication information.
14. The method of claim 10, wherein the method further comprises:
with the client participating and the game corresponding to the game character started, collecting a real-time face image and sending the real-time face image to the server, wherein the server, in response to receiving the real-time face image collected by the client among the different clients, determines whether the real-time face image matches preset face information of the user of the client, and if the real-time face image does not match the preset face information, generates and outputs a suspected cheating message for the client.
15. An electronic device, comprising:
one or more processors; and
a storage apparatus storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-14.
16. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method of any one of claims 1-14.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011521118.0A CN112516584B (en) 2020-12-21 2020-12-21 Game role control method and device


Publications (2)

Publication Number Publication Date
CN112516584A (en) 2021-03-19
CN112516584B (en) 2024-06-04

Family

ID=75002247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011521118.0A Active CN112516584B (en) 2020-12-21 2020-12-21 Game role control method and device

Country Status (1)

Country Link
CN (1) CN112516584B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114931747B (en) * 2022-07-25 2022-10-14 深圳市景创科技电子股份有限公司 Game controller and intelligent voice control method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1773518A (en) * 2004-11-10 2006-05-17 世嘉股份有限公司 Network game system
CN105791931A (en) * 2016-02-26 2016-07-20 深圳Tcl数字技术有限公司 Smart television and voice control method of the smart television
CN108159702A (en) * 2017-12-06 2018-06-15 广东欧珀移动通信有限公司 Based on multi-person speech game processing method and device
KR20190027563A (en) * 2017-09-07 주식회사 엔씨소프트 Apparatus and method for controlling an action of a character in online game based on vocal information
CN109992248A (en) * 2019-02-25 2019-07-09 百度在线网络技术(北京)有限公司 Implementation method, device, equipment and the computer readable storage medium of voice application
CN110548282A (en) * 2018-05-31 2019-12-10 索尼互动娱乐有限责任公司 Forking and passing control of shared control in video games
CN111530079A (en) * 2020-04-29 2020-08-14 网易(杭州)网络有限公司 Information processing method and device in game, electronic equipment and storage medium
CN112023396A (en) * 2020-09-17 2020-12-04 深圳市欢太科技有限公司 Cloud game data interaction method and device, computer readable medium and electronic equipment
CN112023393A (en) * 2020-08-27 2020-12-04 深圳创维-Rgb电子有限公司 Game control method, television device, server and storage medium


Also Published As

Publication number Publication date
CN112516584A (en) 2021-03-19

Similar Documents

Publication Publication Date Title
CN111354360A (en) Voice interaction processing method and device and electronic equipment
CN111666546B (en) Application login method and device
CN109428859B (en) Synchronous communication method, terminal and server
CN111968203B (en) Animation driving method, device, electronic equipment and storage medium
JP2021034003A (en) Human object recognition method, apparatus, electronic device, storage medium, and program
KR102642866B1 (en) Image recognition method and apparatus, electronic device, and medium
CN111709362B (en) Method, device, equipment and storage medium for determining important learning content
CN111443801B (en) Man-machine interaction method, device, equipment and storage medium
KR20230006009A (en) Facial image generation method, device, electronic device and readable storage medium
WO2022100075A1 (en) Method and apparatus for performance test, electronic device and computer-readable medium
CN112487973B (en) Updating method and device for user image recognition model
CN112235417A (en) Method and device for sending debugging instruction
CN112516584B (en) Game role control method and device
KR20210154774A (en) Image recognition method, device, electronic equipment and computer program
US20140067982A1 (en) Determining an estimation of message response time
CN112561059B (en) Method and apparatus for model distillation
CN112529181B (en) Method and apparatus for model distillation
CN113055593A (en) Image processing method and device
CN111615171A (en) Access method and device of wireless local area network
CN110968856A (en) Login method, login device, electronic equipment and storage medium
CN113743288B (en) Image recognition method, device and equipment of cloud mobile phone and storage medium
US11488384B2 (en) Method and device for recognizing product
CN113556649B (en) Broadcasting control method and device of intelligent sound box
CN112752323A (en) Method and device for changing hotspot access state
CN113419915A (en) Cloud terminal desktop stillness determination method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant