CN116999838A - Interaction control method and device of target object, storage medium and electronic equipment - Google Patents

Interaction control method and device of target object, storage medium and electronic equipment

Info

Publication number
CN116999838A
CN116999838A
Authority
CN
China
Prior art keywords
interaction
target object
gesture
interactive
progress
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310936138.1A
Other languages
Chinese (zh)
Inventor
王冠翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202310936138.1A
Publication of CN116999838A
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure belongs to the technical field of virtual games, and relates to an interaction control method and device for a target object, a computer storage medium, and an electronic device. The method comprises the following steps: in response to an interaction instruction for the target object, displaying the target object in a preset interaction gesture in a graphical user interface, and controlling the controlled virtual object to interact with the target object, where the preset interaction gesture is one of a first interaction gesture and a second interaction gesture; during the interaction of the controlled virtual object with the target object, in response to a gesture switching instruction for the target object, controlling the target object to switch from the preset interaction gesture to a target interaction gesture and continuing to interact with the target object, where the target interaction gesture is the other of the first interaction gesture and the second interaction gesture. In the present disclosure, there are two interaction gestures with different interaction operation difficulties, so the interaction process with the target object can be changed by switching the interaction gesture, which increases the interest of the virtual game.

Description

Interaction control method and device of target object, storage medium and electronic equipment
Technical Field
The disclosure relates to the technical field of virtual games, and in particular relates to an interaction control method of a target object, an interaction control device of the target object, a computer readable storage medium and electronic equipment.
Background
With the development of internet technology, games are favored by more and more game players.
In a conventional game, an interactive object is presented in only one form, and the game player can only complete in-game tasks by interacting with it in that form. Because the game player cannot control the interaction process at will, the game lacks interest, which degrades the game player's experience.
In view of this, there is a need in the art to develop a new method and apparatus for interactive control of a target object.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
The present disclosure is directed to an interaction control method for a target object, an interaction control device for a target object, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the problem of low game interest caused by the limitations of the related art.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
According to a first aspect of an embodiment of the present invention, there is provided an interactive control method for a target object, providing a graphical user interface through a terminal device, where at least part of a game scene is displayed in the graphical user interface, where the game scene includes a controlled virtual object and the target object, the method including: responding to an interaction instruction aiming at the target object, displaying the target object with a preset interaction gesture in the graphical user interface, and controlling the controlled virtual object to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; in the process of interaction between the controlled virtual object and the target object, responding to a gesture switching instruction aiming at the target object, and controlling the target object to be switched from the preset interaction gesture to the target interaction gesture and to interact with the target object; wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
According to a second aspect of an embodiment of the present invention, there is provided an interaction control apparatus for a target object, providing a graphical user interface through a terminal device, the graphical user interface having at least a part of a game scene displayed therein, the game scene including a controlled virtual object and the target object, the apparatus comprising: the first interaction module is configured to respond to an interaction instruction aiming at the target object, display the target object with a preset interaction gesture in the graphical user interface and control the controlled virtual object to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; the second interaction module is configured to respond to a gesture switching instruction aiming at the target object in the process of interaction between the controlled virtual object and the target object, and control the target object to be switched from the preset interaction gesture to a target interaction gesture and interact with the target object; wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
According to a third aspect of an embodiment of the present invention, there is provided an electronic apparatus including: a processor and a memory; wherein the memory has stored thereon computer readable instructions which, when executed by the processor, implement the interactive control method of the target object of any of the above-described exemplary embodiments.
According to a fourth aspect of embodiments of the present invention, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the interactive control method of the target object in any of the above-described exemplary embodiments.
As can be seen from the above technical solutions, the method for controlling interaction of a target object, the device for controlling interaction of a target object, the computer storage medium, and the electronic device according to the exemplary embodiments of the present invention have at least the following advantages and positive effects:
in the method and the device provided by the exemplary embodiment of the disclosure, on one hand, in response to the gesture switching instruction for the target object, the target object is controlled to be switched from the preset interaction gesture to the target interaction gesture and to interact with the target object, and the interaction gesture of the target object can be switched between the first interaction gesture and the second interaction gesture, so that the situation that the target object can be displayed only in one interaction gesture is avoided, the display effect of the target object is improved, and different interaction requirements of game players are met; on the other hand, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty, and interaction with the target object under different interaction operation difficulties can be completed through switching the interaction gesture, so that the interaction progress with the target object can be flexibly controlled, the game interestingness is increased, and the user experience is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
FIG. 1 schematically illustrates a flow chart of a method for interactive control of a target object in an embodiment of the disclosure;
FIG. 2 schematically illustrates a wood block object in a preset interaction gesture in an embodiment of the present disclosure;
FIG. 3 schematically illustrates a process of switching the interaction pose of a target object in an embodiment of the present disclosure;
FIG. 4 schematically illustrates a graphical user interface diagram showing current interaction progress information when the interaction gesture of the target object is the first interaction gesture in the present exemplary embodiment;
FIG. 5 schematically illustrates a user interface diagram when the interaction gesture of the target object is a second interaction gesture in the present exemplary embodiment;
FIG. 6 schematically illustrates a graphical user interface diagram showing display parameters in an embodiment of the present disclosure;
FIG. 7 schematically illustrates an apparatus for an interactive control method for a target object in an embodiment of the present disclosure;
FIG. 8 schematically illustrates an electronic device for an interactive control method for a target object in an embodiment of the disclosure;
fig. 9 schematically illustrates a computer-readable storage medium for an interactive control method for a target object in an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. One skilled in the relevant art will recognize, however, that the aspects of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
The terms "a," "an," "the," and "said" are used in this specification to denote the presence of one or more elements/components/etc.; the terms "comprising" and "having" are intended to be inclusive and mean that there may be additional elements/components/etc. in addition to the listed elements/components/etc.; the terms "first" and "second" and the like are used merely as labels, and are not intended to limit the number of their objects.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities.
The interactive control method of the target object in one embodiment of the disclosure may be run on a local terminal device or a server. When the interaction control method of the target object runs on the server, the method can be realized and executed based on a cloud interaction system, wherein the cloud interaction system comprises the server and the client device.
In an alternative embodiment, various cloud applications may be run under the cloud interaction system, for example, cloud games. Taking cloud games as an example, a cloud game refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and running of the interaction control method of the target object are completed on the cloud game server, while the client device serves to receive and send data and to present the game picture. For example, the client device may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a palm computer, whereas the information processing is performed by the cloud game server in the cloud. When playing the game, the player operates the client device to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game pictures, and returns the data to the client device through the network; finally, the client device decodes the data and outputs the game pictures.
In an alternative embodiment, taking a game as an example, the local terminal device stores a game program and is used to present the game screen. The local terminal device interacts with the player through the graphical user interface; that is, the game program is conventionally downloaded, installed, and run on the electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways; for example, the interface may be rendered for display on a display screen of the terminal, or provided to the player by holographic projection. For example, the local terminal device may include a display screen for presenting a graphical user interface including game visuals, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
In a possible implementation manner, the embodiment of the present invention provides an interaction control method of a target object, and a graphical user interface is provided through a terminal device, where the terminal device may be the aforementioned local terminal device or the aforementioned client device in the cloud interaction system.
To address the problems in the related art, the present disclosure proposes an interaction control method for a target object. Fig. 1 shows a flow chart of the interaction control method of the target object, in which a graphical user interface of a game is displayed through a terminal and a game virtual scene is displayed in the graphical user interface. The interaction control method of the target object comprises at least the following steps:
S110, in response to an interaction instruction for a target object, displaying the target object in a preset interaction gesture in the graphical user interface, and controlling the controlled virtual object to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, where the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty.
S120, during the interaction between the controlled virtual object and the target object, in response to a gesture switching instruction for the target object, controlling the target object to switch from the preset interaction gesture to a target interaction gesture and continuing to interact with the target object; the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
In the method and the device provided by the exemplary embodiment of the disclosure, on one hand, in response to the gesture switching instruction for the target object, the target object is controlled to be switched from the preset interaction gesture to the target interaction gesture and interact with the target object, and the interaction gesture of the target object can be switched between the first interaction gesture and the second interaction gesture, so that the situation that the target object can only be displayed in one interaction gesture is avoided, the display effect of the target object is improved, and different interaction requirements of game players are met; on the other hand, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty, and interaction with the target object under different interaction operation difficulties can be completed through switching the interaction gesture, so that the interaction progress with the target object can be flexibly controlled, the game interestingness is increased, and the user experience is improved.
The respective steps of the interactive control method of the target object are described in detail below.
In step S110, in response to the interaction instruction for the target object, the target object is displayed in the preset interaction gesture in the graphical user interface, and the controlled virtual object is controlled to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, where the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty.
In the exemplary embodiment of the present disclosure, the terminal may be a computer terminal, a mobile phone terminal, a television terminal, a tablet terminal, or any terminal that may be used by a game player, which is not particularly limited in this exemplary embodiment.
The game player can bring up a graphical user interface on the terminal through the corresponding game application on the touch terminal, and the virtual scene of the game the player has entered is displayed in the graphical user interface.
At this time, the game player may choose to perform a certain task in the game (the task is the target game task). A target object refers to an object that may be used to characterize a target game task. For example, the target game task is drilling wood, in which case the target object may be a block of wood.
In completing a target game task, interaction with a target object is required. An interaction instruction refers to an instruction for interacting with a target object. After receiving the interaction instruction, the target object is displayed in the graphical user interface, and specifically, the target object is displayed in the graphical user interface in a preset interaction gesture.
The preset interaction gesture refers to a display gesture of the target object, and specifically, the preset interaction gesture may be a first interaction gesture or a second interaction gesture. The first interaction gesture may be a stereoscopic display gesture of the target object, and the second interaction gesture may be a planar display gesture of the target object, which is not particularly limited in the present exemplary embodiment.
It should be noted that the two interaction gestures are provided with different interaction progress and interaction operation difficulty, and specifically, in the first interaction gesture, the first interaction progress may be presented through size information of the target object (for example, the first interaction progress is presented through the length of the wood object). In the second interaction gesture, a second interaction progress may be presented by a current progress percentage (e.g., 70%).
In the first interaction gesture, no matter how the user changes the interaction position, only instructions related to movement in the horizontal direction are acquired, and the task execution speed is adjusted steadily. In the second interaction gesture, the user can change the interaction position in both the horizontal and vertical directions, and as the interaction position moves in the vertical direction, the task execution speed can be flexibly controlled. Therefore, the interaction operation difficulties configured for the first interaction gesture and the second interaction gesture are different.
Fig. 2 schematically illustrates a wood block object in a preset interaction gesture in an embodiment of the present disclosure.
In step S120, in the process of interaction between the controlled virtual object and the target object, in response to the gesture switching instruction for the target object, the target object is controlled to be switched from the preset interaction gesture to the target interaction gesture and to interact with the target object; the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
In an exemplary embodiment of the present disclosure, the gesture switching instruction is an instruction for switching the interaction gesture of the target object, and if the preset interaction gesture is the first interaction gesture, after receiving the gesture switching instruction, the interaction gesture of the target object is switched from the first interaction gesture to the second interaction gesture; and if the preset interaction gesture is the second interaction gesture, switching the interaction gesture of the target object from the second interaction gesture to the first interaction gesture after receiving the gesture switching instruction. Along with the switching of the interaction gesture, the corresponding interaction progress and interaction operation difficulty are also switched.
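To make this switching logic concrete, the following is a minimal Python sketch, not the patent's actual implementation; the class and method names are hypothetical, and the gesture is modeled as a simple two-state toggle whose progress presentation and operation difficulty follow the active state.

```python
# Hypothetical sketch of the gesture-switching logic; not the patent's
# actual implementation. FIRST/SECOND stand for the first and second
# interaction gestures described above.
from enum import Enum

class InteractionGesture(Enum):
    FIRST = "first"    # e.g., the stereoscopic (3D) display gesture
    SECOND = "second"  # e.g., the planar (2D) display gesture

class TargetObject:
    def __init__(self, preset_gesture: InteractionGesture):
        # The target object is initially displayed in the preset gesture.
        self.gesture = preset_gesture

    def on_gesture_switch_instruction(self) -> None:
        # A switching instruction always selects "the other" gesture; the
        # associated interaction progress presentation and operation
        # difficulty follow the new gesture.
        self.gesture = (InteractionGesture.SECOND
                        if self.gesture is InteractionGesture.FIRST
                        else InteractionGesture.FIRST)

obj = TargetObject(InteractionGesture.FIRST)
obj.on_gesture_switch_instruction()
assert obj.gesture is InteractionGesture.SECOND  # switched to the other gesture
```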
For example, fig. 3 schematically illustrates a process of switching the interaction gesture of the target object in the embodiment of the disclosure, and as shown in fig. 3, the interaction gesture of the wood block object is adjusted gradually in the order from left to right, so as to adjust the wood block model from the first interaction gesture to the second interaction gesture. The number above the wood block model represents the current progress percentage corresponding to the current interaction progress. The lines around the wood block model in the second interaction pose represent the task execution speed at the current interaction progress.
In an alternative embodiment, the method further comprises: determining current interaction progress information of a target object; displaying current interaction progress information at a target position of the graphical user interface; there is a positional mapping relationship between the target position and the display position of the target object in the graphical user interface.
The game player can learn the current interaction progress of the target game task from the current interaction progress information. After the current interaction progress information is determined, it may be displayed at the target position. It should be noted that there is a position mapping relationship between the target position and the display position of the target object in the graphical user interface. For example, if the target object is displayed at position A of the graphical user interface, position A is the display position. The target position may be above the display position, below the display position, at its upper left corner, at its upper right corner, or at any position in the graphical user interface that has a position mapping relationship with the display position, which is not particularly limited in this exemplary embodiment.
In the present exemplary embodiment, the current interaction progress information may be displayed at the target position having a position mapping relationship with the display position, so that the game player can more intuitively learn the current interaction progress of the target game task through the current interaction progress information.
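As an illustration, one plausible way to realize such a position mapping relationship is a fixed offset from the display position; this sketch assumes y-down screen coordinates, and the offset value and function name are hypothetical.

```python
# Hypothetical sketch: derive the target position of the progress display
# from the display position of the target object via a fixed offset, which
# realizes one possible position mapping relationship.
def progress_label_position(display_pos: tuple[float, float],
                            offset: tuple[float, float] = (0.0, -40.0)):
    # With a y-down axis, a negative y offset draws the progress
    # information above the target object.
    x, y = display_pos
    dx, dy = offset
    return (x + dx, y + dy)

print(progress_label_position((320.0, 480.0)))  # -> (320.0, 440.0)
```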
In an alternative embodiment, the current interaction progress information includes a current progress percentage under the current interaction progress, and size information of the target object under the current interaction progress.
The current progress percentage refers to a value used for representing the current interaction progress, and based on the current progress percentage, a game player can acquire the current interaction progress of the target game task. The size information of the target object may be the height of the target object, the width of the target object, the size of the cross section of the target object, or any information that may describe the size of the target object, which is not particularly limited in the present exemplary embodiment.
For example, if the target game task is a drill wood block task, the target object may be as shown in fig. 2. The game player can drill the bottom surface of the wood block object shown in fig. 2 by manipulating the drill bit. Along with the change of the drilling progress, the height of the wood block object can be changed, and then the size information of the target object is the height of the wood block object.
It is worth noting that the size information of the target object always matches the current interaction progress during the execution of the target game task. For example, if the current interaction progress is 85%, the height of the wood block object is adjusted to 15% of its initial height.
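A minimal sketch of this matching rule, assuming the size information is the object's height and progress is encoded as a fraction between 0 and 1 (names and encoding are hypothetical):

```python
# Hypothetical sketch: keep the displayed height of the target object
# matched to the current interaction progress (e.g., 85% progress leaves
# 15% of the initial height).
def remaining_height(initial_height: float, progress: float) -> float:
    # progress is the current progress percentage expressed as 0.0..1.0.
    if not 0.0 <= progress <= 1.0:
        raise ValueError("progress must be between 0 and 1")
    return initial_height * (1.0 - progress)

assert round(remaining_height(100.0, 0.85), 6) == 15.0  # 85% drilled -> 15% left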
It should be noted that both the size information of the target object and the current progress percentage are related to the current interaction progress of the target game task. For example, fig. 4 schematically illustrates a graphical user interface showing current interaction progress information when the interaction gesture of the target object is the first interaction gesture in the present exemplary embodiment. As shown in fig. 4, a character 410 represents the controlled virtual object currently controlled by the game player to perform the target game task of drilling a wood block, and a virtual electric drill 420 is held in the hand of the controlled virtual object. The target object 430 is a wood block object whose interaction gesture is the first interaction gesture, the first interaction identifier 440 is the drill bit of the virtual electric drill, the number 15% above the target object is the current progress percentage under the current interaction progress, and the wood block height of the target object 430 at this moment is the size information of the target object under the current interaction progress.
In the present exemplary embodiment, the current interaction progress information includes a current progress percentage at the current interaction progress, and size information of the target object at the current interaction progress. And the game player can obtain the current interaction progress of the target object through the current progress percentage and the size information of the target object, so that the display effect of the current interaction progress is improved.
In an alternative embodiment, the target object includes a corresponding interactive operation region, and the interactive operation region includes a valid interactive region and an invalid interactive region that are differentially displayed.
The interactive operation region refers to a region belonging to the target object. The game player can apply the first interaction identifier or the second interaction identifier to the interactive operation region of the target object by controlling the controlled virtual object, so as to interact with the target object; the game player can also apply a touch operation to the interactive operation region of the target object through a touch terminal to interact with the target object. The present exemplary embodiment is not particularly limited in this respect.
For example, the target object is a wood block object as shown in fig. 2, and the game player can control the controlled virtual object to drill on the target object by means of the electric drill (i.e. the first interaction identifier) which is a virtual prop, so as to realize interaction with the wood block object, thereby completing the target game task.
It is worth noting that the interactive operation region comprises an effective interaction area and an ineffective interaction area, and the target game task can continue only when the interaction position is in the effective interaction area; when the interaction position is in the ineffective interaction area, the target game task is no longer executed and enters a failed or suspended state.
The manner in which the effective interaction area and the ineffective interaction area are divided is determined by the specific target game task. The effective interaction area and the ineffective interaction area can be two connected rectangular areas; the effective interaction area may be a circle, and the ineffective interaction area may be an area surrounding the circle, which is not particularly limited in the present exemplary embodiment.
In the present exemplary embodiment, the interactive operation region includes an effective interaction area and an ineffective interaction area. Dividing the interactive operation region in this way improves the logic for continuing, failing, or suspending the target game task.
In an alternative embodiment, the target object includes a corresponding interactive operation region, and the interactive operation region includes an effective interactive region and an ineffective interactive region; the method further comprises the steps of: determining an interaction position acting on a target object in the process of interaction between the controlled virtual object and the target object; and in response to the interaction position being located in the effective interaction area or the ineffective interaction area, the effective interaction area and/or the ineffective interaction area are displayed in a distinguishing mode.
The interaction position refers to a position acted on the target object under the current interaction progress.
When the interaction location is located in the effective interaction area, the effective interaction area can be displayed differently. For example, the interactive operation region is black, and when the interactive position is located in the effective interactive region, the color of the effective interactive region is changed to red. When the interaction location is located in the invalid interaction zone, the invalid interaction zone may be differentially displayed. For example, the interactive operation region is black, and when the interactive position is located in the invalid interactive region, the color of the invalid interactive region is changed to red.
There is also a case where the effective interaction area and the ineffective interaction area are displayed differently regardless of whether the interaction position is located in the effective interaction area or the ineffective interaction area. For example, the effective interaction area is displayed in a first color and the ineffective interaction area is displayed in a second color, the first color and the second color being two different colors. This allows the game player to distinguish the effective interaction area from the ineffective interaction area within the interactive operation region more clearly, thereby improving the efficiency of interaction with the target object.
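A minimal sketch of this differential display rule, assuming the "two connected rectangles" layout mentioned above; the split line, colors, and names are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of the differential display rule, using the
# "two connected rectangles" layout for the interactive operation region.
def in_effective_area(pos: tuple[float, float], split_x: float) -> bool:
    # Two rectangles sharing the vertical line x = split_x; the left one
    # is taken to be the effective interaction area.
    return pos[0] <= split_x

def region_colors(pos: tuple[float, float], split_x: float,
                  base: str = "black", highlight: str = "red") -> dict:
    # Recolor whichever region currently contains the interaction position.
    if in_effective_area(pos, split_x):
        return {"effective": highlight, "invalid": base}
    return {"effective": base, "invalid": highlight}

print(region_colors((5.0, 2.0), split_x=10.0))  # effective area highlighted
```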
In an alternative embodiment, the first interaction gesture is a three-dimensional gesture and the second interaction gesture is a planar gesture; controlling the controlled virtual object to interact with the target object, comprising: when the target object is in a three-dimensional gesture, controlling the controlled virtual object to perform interaction in a first direction on the target object in the three-dimensional gesture through a first interaction identifier; when the target object is in the plane gesture, the controlled virtual object is controlled to interact in the first direction and the second direction on the target object in the plane gesture through the second interaction mark.
Wherein, the three-dimensional gesture refers to the gesture of the three-dimensional form of the target object, and the plane gesture refers to the gesture of the plane form of the target object. For example, as shown in fig. 3, the left side in fig. 3 represents the three-dimensional posture of the wood block object, and the right side in fig. 3 represents the planar posture of the wood block object.
The game player can control the controlled virtual object no matter what gesture the target object is in, and move the first interaction identifier or the second interaction identifier to interact with the target object.
However, the manner of controlling the controlled virtual object to interact with the target object differs between the three-dimensional interaction pose and the planar interaction pose.
The controlled virtual object may be provided with a virtual prop, upon which the game player may manipulate the controlled virtual object to complete an interaction with the target object through the virtual prop. The first interaction identifier and the second interaction identifier correspond to a virtual prop. Assuming that the target game task is a drilling task aiming at a wood block object, when the target object is in a three-dimensional posture, the first interaction identifier can be a drill bit corresponding to a drilling tool; the second interactive identifier may be a drill point corresponding to the drill tool when the target object is in the planar pose.
It should be noted that, when the target object is in the three-dimensional posture, only the controlled virtual object is controlled to interact with the target object in the first direction through the first interaction identifier (i.e. even if the game player wants to interact with the target object in a direction other than the first direction, the terminal does not respond to the interaction); when the target object is in the plane gesture, the target object is displayed in the graphical user interface in a plane mode, so that interaction between the controlled virtual object and the target object in the plane gesture can be controlled in a first direction, and interaction between the controlled virtual object and the target object in the plane gesture can be controlled in a second direction.
The first direction may be specifically a horizontal direction or any direction, and this is not particularly limited in the present exemplary embodiment. The second direction is different from the first direction, and if the first direction is a horizontal direction, the second direction may be a vertical direction; if the first direction is a vertical direction, the second direction may be a horizontal direction; if the first direction is the upper left direction, the second direction may be the upper right direction, which is not particularly limited in the present exemplary embodiment.
For example, as shown in fig. 4, when the wood block object 430 is in a three-dimensional pose, the controlled virtual object may be controlled to perform interaction in a horizontal direction on the target object in the three-dimensional pose using the first identifier 440.
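A minimal sketch of this direction restriction, assuming screen coordinates in which the first direction is horizontal and the second vertical; the function name and signature are hypothetical.

```python
# Hypothetical sketch of the direction restriction: in the three-dimensional
# gesture only the horizontal (first-direction) component of a move is
# applied; in the planar gesture both components are applied.
def apply_move(pos: tuple[float, float],
               delta: tuple[float, float],
               is_three_dimensional: bool) -> tuple[float, float]:
    x, y = pos
    dx, dy = delta
    if is_three_dimensional:
        # The terminal does not respond to movement outside the first direction.
        return (x + dx, y)
    # Planar gesture: both the first (horizontal) and second (vertical)
    # directions change the interaction position.
    return (x + dx, y + dy)

assert apply_move((0.0, 0.0), (3.0, 7.0), is_three_dimensional=True) == (3.0, 0.0)
assert apply_move((0.0, 0.0), (3.0, 7.0), is_three_dimensional=False) == (3.0, 7.0)
```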
In an alternative embodiment, the method further comprises: when the target object is in the first interaction gesture, the interaction progress per unit time is a first value; when the target object is in the second interaction gesture, the interaction progress per unit time is a second value; the second value is greater than the first value.
The interaction progress changes over time, and the first value and the second value represent the speed at which the interaction progress of the target object changes under the different interaction gestures.
When the target object is in the first interaction gesture, the interaction progress changes by the first value per unit time. When the target object is in the second interaction gesture, the interaction progress changes by the second value per unit time.
It is worth noting that, because the second value is greater than the first value, the interaction progress changes faster when the target object is in the second interaction gesture than when it is in the first interaction gesture. This distinguishes the interaction operation difficulty in the first interaction gesture from that in the second interaction gesture and brings the game player different control experiences (namely, when the target object is in the second interaction gesture, the interaction progress changes faster, so the target game task can be completed more quickly; when the target object is in the first interaction gesture, the interaction progress changes more slowly, so more time is required to complete the target game task).
Based on this, when the game player wants to complete the target game task more quickly, a gesture switching instruction may be issued to switch the target object from the first interaction gesture to the second interaction gesture. Later, if the game player wants to complete the target game task more slowly, another gesture switching instruction may be issued to switch the target object from the second interaction gesture back to the first interaction gesture.
In an alternative embodiment, the target object includes a corresponding interactive operation region, the interactive operation region includes an effective interaction area and an ineffective interaction area, and the method further includes: determining the interaction position of the first interaction identifier or the second interaction identifier on the target object during the interaction between the controlled virtual object and the target object; in response to the interaction position being in the effective interaction area, controlling to increase the interaction progress of the controlled virtual object with the target object according to the first value or the second value; and in response to the interaction position being in the ineffective interaction area, controlling to reduce the interaction progress of the controlled virtual object with the target object.
The interaction position refers to the position where the first interaction identifier or the second interaction identifier acts on the target object. When the interaction position is in the effective interaction area, interaction with the target object should continue so that the target game task keeps executing. On this basis, the interaction progress between the controlled virtual object and the target object needs to be increased.
It is worth noting that the first interaction identifier corresponds to a first interaction gesture, and the second interaction identifier corresponds to a second interaction gesture. The interactive progress change value in the unit time corresponding to the first interactive gesture is a first numerical value, and the interactive progress change value in the unit time corresponding to the second interactive gesture is a second numerical value.
Based on the above, when the interaction identifier is a first interaction identifier and the interaction position is located in the effective interaction area, increasing the interaction progress between the controlled virtual object and the target object according to the first numerical value; and when the interaction identifier is a second interaction identifier, increasing the interaction progress between the controlled virtual object and the target object according to the second numerical value.
When the interaction position is in the ineffective interaction area, the controlled virtual object and the target object no longer continue to interact, and the target game task is not executed. On this basis, when the interaction position is located in the ineffective interaction area, the interaction progress between the controlled virtual object and the target object needs to be reduced.
For example, as shown in fig. 4, the contact position between the first interaction identifier 440 and the bottom surface of the target object 430 is the interaction position. Clearly, the interaction position is located in the effective interaction area here, so while the interaction position moves to the position shown in fig. 4, the interaction progress is increased according to the first value; correspondingly, if the interaction position moves to the boundary of the bottom surface of the wood block object 430, the interaction progress is reduced according to the first value.
Conversely, when the interactive position is located in the invalid interactive region, the target game task will not be continuously executed at this time, and thus, the interactive progress should be reduced. The process of reducing the progress of the interaction may be a slow process or a process of instantaneously resetting the progress of the interaction to 0, which is not particularly limited in the present exemplary embodiment.
For example, in a wood block drilling task, when the interactive position of the drill bit is at the boundary of the cross section, the drilling speed under the interactive progress is reduced.
In the present exemplary embodiment, in response to the interaction position being located in the effective interaction area, the interaction progress of the controlled virtual object with the target object is increased according to the first value or the second value; in response to the interaction position being located in the ineffective interaction area, the interaction progress of the controlled virtual object with the target object is reduced. In this way, the interaction progress is adjusted differently depending on the area in which the interaction position is located, which improves the logic for adjusting the interaction progress.
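The adjustment logic above can be sketched as a per-tick update; the rate and decay constants below are illustrative assumptions, not values from the disclosure, and the second rate is chosen greater than the first as stated earlier.

```python
# Hypothetical per-tick progress update combining the rules above: progress
# rises at the current gesture's rate while the interaction position is in
# the effective area, and falls while it is in the invalid area.
FIRST_RATE = 0.5   # first value: progress (%) gained per second, first gesture
SECOND_RATE = 2.0  # second value: greater than the first, second gesture
DECAY_RATE = 1.0   # progress (%) lost per second in the invalid area

def tick_progress(progress: float, dt: float,
                  in_effective_area: bool, is_second_gesture: bool) -> float:
    if in_effective_area:
        rate = SECOND_RATE if is_second_gesture else FIRST_RATE
        progress += rate * dt
    else:
        progress -= DECAY_RATE * dt
    return max(0.0, min(100.0, progress))  # clamp to 0..100 percent

assert tick_progress(10.0, 1.0, in_effective_area=True, is_second_gesture=True) == 12.0
```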
In an alternative embodiment, the method further comprises: in response to the duration for which the interaction position has continuously remained in the ineffective interaction area reaching a specified duration, controlling to reset the interaction progress of the controlled virtual object with the target object to 0.
When the interaction position is in the ineffective interaction area, the controlled virtual object and the target object no longer continue to interact. However, the interaction position may sometimes fall into the ineffective interaction area because of a player's misoperation. Therefore, to determine whether the interaction progress needs to be reset to 0 at this moment, it is necessary to determine whether the duration for which the interaction position has continuously remained in the ineffective interaction area reaches the specified duration. The specified duration is a preset threshold that limits how long the interaction position may stay in the ineffective interaction area.
If the duration of the interaction position continuously located in the invalid interaction area reaches the designated duration, resetting the interaction progress to 0; and if the duration of the interaction position continuously located in the invalid interaction area does not reach the designated duration, temporarily reducing the interaction progress of the controlled virtual object and the target object.
For example, as shown in fig. 4, if the first interactive identifier 440 is at the boundary (i.e. the invalid interactive area) of the bottom surface of the target object 430 at this time, the current progress may be reset to 0, specifically, the current progress percentage may be set to 0, and the length of the wood block object is restored to the initial length.
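A minimal sketch of this reset rule, assuming an illustrative specified duration of 1.5 seconds (the threshold and names are hypothetical); the temporary reduction before the threshold is reached is handled separately, as in the per-tick update sketched earlier.

```python
# Hypothetical sketch of the reset rule: brief slips into the invalid area
# (player misoperation) are tolerated, but once the stay reaches the
# specified duration the interaction progress is reset to 0.
SPECIFIED_DURATION = 1.5  # seconds; assumed threshold

class InvalidAreaTimer:
    def __init__(self):
        self.time_in_invalid = 0.0

    def update(self, progress: float, dt: float, in_invalid_area: bool) -> float:
        if in_invalid_area:
            self.time_in_invalid += dt
            if self.time_in_invalid >= SPECIFIED_DURATION:
                return 0.0  # duration reached: reset the interaction progress
        else:
            self.time_in_invalid = 0.0  # back in the effective area
        return progress

timer = InvalidAreaTimer()
p = timer.update(40.0, dt=1.0, in_invalid_area=True)  # 1.0 s < 1.5 s: kept
p = timer.update(p, dt=1.0, in_invalid_area=True)     # 2.0 s >= 1.5 s: reset
assert p == 0.0
```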
In an alternative embodiment, after the step of increasing the interaction progress of the controlled virtual object with the target object according to the first value or the second value in response to the interaction position being located in the effective interaction area, the method further comprises: determining the interaction completion time of the controlled virtual object with the target object according to the first value or the second value, and prompting it in the graphical user interface.
The interaction completion time refers to the time from the beginning to the end of the interaction process, i.e., the total duration required to complete the target game task. For example, if the target game task is a wood block drilling task as shown in fig. 2, the interaction completion time is the time taken to drill through the wood block object in fig. 2, for example, 2 minutes.
It should be noted that, depending on the interaction gesture of the target object, the interaction completion time may be determined based on the first value or based on the second value, which is not limited in this exemplary embodiment.
After determining the interaction completion time, the interaction completion time may be displayed in a graphical user interface to prompt the game player as to how long the target game task was completed.
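Assuming the interaction position stays in the effective interaction area, the prompt could be computed as in this hypothetical sketch, where the rate corresponds to the first or second value of the current gesture:

```python
# Hypothetical sketch: estimate the interaction completion time to prompt in
# the graphical user interface, from the remaining progress and the rate
# (the first or second value) of the current interaction gesture.
def estimated_completion_seconds(progress: float, rate_per_second: float) -> float:
    if rate_per_second <= 0:
        raise ValueError("rate must be positive")
    return (100.0 - progress) / rate_per_second

# e.g., 40% complete at the second gesture's rate of 2%/s -> 30 seconds left
assert estimated_completion_seconds(40.0, 2.0) == 30.0
```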
In an alternative embodiment, in the three-dimensional gesture the target object is displayed in a vertical pose, and in the two-dimensional gesture the target object is displayed as a cross section of the target object; the first direction is the horizontal direction, and the second direction is the vertical direction.
In the three-dimensional posture, the target object is displayed in a vertical mode. For example, as shown in fig. 2, a wood block object in a three-dimensional pose is displayed. In the two-dimensional pose, the target object is displayed in cross-section. For example, as shown in fig. 3, the right graph is a cross section of a wood block object (i.e., a target object in a two-dimensional pose).
It should be noted that the cross section may be a fixed cross section of the target object, or a cross section that changes continuously with the current interaction progress. For example, suppose the target game task is a wood drilling task. When the wood block object is in the two-dimensional pose, the bottom surface of the wood block object is displayed in the graphical user interface. As the drilling progress advances, the area, texture, and color of the bottom surface of the wood block change, and thus the cross section of the target object in the corresponding two-dimensional pose changes as well.
When the target object is in a three-dimensional posture, the controlled virtual object can only interact with the target object in a first direction; when the target object is in a two-dimensional gesture, the controlled virtual object may interact with the target object in either a first direction or a second direction. The first direction may specifically be a horizontal direction and the second direction may specifically be a vertical direction.
In an alternative embodiment, when the target object is in a three-dimensional gesture, the interaction progress of the controlled virtual object and the target object is represented through the length change of the target object in the vertical gesture; and when the target object is in a two-dimensional gesture, representing the interaction progress of the controlled virtual object and the target object through the area change of the cross section of the target object.
When the target object is in the three-dimensional gesture, the length of the target object changes as the current interaction progress advances, so the length of the target object can be used to represent the current interaction progress. For example, when a wood block object is in the three-dimensional pose, its length changes as it is continuously drilled, and thus the length of the wood block object can be used to characterize the current interaction progress.
When the target object is in the two-dimensional gesture, the area of the cross section changes as the interaction progress advances, so the cross-sectional area of the target object can be used to represent the current interaction progress. For example, if the target object is a wood block object that is not a regular cylinder, the cross-sectional area shown on the bottom surface of the wood block changes continuously with the drilling progress: for some wood block objects the cross-sectional area of the bottom surface grows as the drilling progress advances; for others it shrinks; and for still others it grows for a period of time and then shrinks. The present exemplary embodiment is not particularly limited in this respect.
When the wood block object is in a two-dimensional gesture, as the wood block object is continuously drilled, the area of the bottom surface (cross section) of the wood block object can be increased or decreased along with the drilling progress, so that the area of the cross section of the wood block object can be used for representing the current interaction progress.
In addition, the style information of the cross section can also represent the interaction progress of the target object. For example, the cross section is the bottom surface of a solid wood block; as the drilling task continues, the bottom surface of the wood block is drilled away layer by layer, and the patterns and colors of the displayed bottom surface change with the drilling depth. Owing to sun exposure and wind, the wood near the outside of the block has a deeper color and an obvious grain, while the wood near the inside is lighter in color and its grain is less obvious.
Based on the above, to improve the display logic of the interaction progress, style information of the cross section is displayed in the graphical user interface as the interaction progress advances, which improves the display effect of the target object in the two-dimensional gesture. For example, fig. 5 schematically illustrates a user interface when the interaction gesture of the target object is the second interaction gesture in an embodiment of the disclosure. As shown in fig. 5, as the drill point 520 (i.e., the second interaction identifier) remains in the effective interaction area, the color of the target object 510 becomes lighter and the pattern on the target object 510 becomes less obvious during the continued execution of the target game task.
In the present exemplary embodiment, different information is used to represent the current interaction progress of the target object in different interaction gestures, which improves the display effect of the current interaction progress.
In an alternative embodiment, the target object includes a corresponding interactive operation area, and the interactive operation area includes an effective interaction area and an invalid interaction area. When the target object is displayed in a vertical gesture, the effective interaction area is a first width range region of the target object in the horizontal direction, and the invalid interaction area is the region outside the first width range region of the target object in the horizontal direction; when the target object is displayed in a cross-sectional gesture, the effective interaction area is the inner circle region on the cross section of the target object, and the invalid interaction area is the outer ring region on the cross section of the target object.

No matter what gesture the target object is in, the interactive operation area includes an effective interaction area and an invalid interaction area. When the target object is displayed in the vertical gesture, the width range of the effective interaction area in the horizontal direction is the first width range, and all regions of the interactive operation area other than the effective interaction area belong to the invalid interaction area.

When the target object is displayed in the cross-sectional gesture, the cross section of the target object is the interactive operation area. Specifically, the effective interaction area is the inner circle region of the cross section, and the invalid interaction area is the outer ring region of the cross section outside the effective interaction area.
Specifically, when the target object is displayed in the cross-sectional gesture, the effective interaction area includes a cross section of the target object. It should be noted that this cross section may vary with the interaction progress, or may be a fixed cross section of the target object; the present exemplary embodiment is not particularly limited in this respect.
For example, if the target object is a wood block object, the game player performs a drilling task on the bottom surface of the wood block object using a virtual electric drill. When the target object is displayed in the cross-sectional gesture, the target object is displayed in the graphical user interface in the form of a cross section. Based on this, during execution of the drilling task, the bottom surface is ground away layer by layer as the drilling task progresses, and the cross section of the wood block object corresponding to the current interaction progress is displayed in the graphical user interface; this cross section is the interactive operation area at that interaction progress. In this case, the invalid interaction area is the boundary of the cross section.
When the task execution position is at the section boundary, the section boundary needs to be displayed in a differentiated manner to prompt the game player that the target game task will stop progressing. The specific differentiation may be that the section boundary is rendered in a conspicuous color, or that the section boundary is thickened, which is not particularly limited in the present exemplary embodiment. For example, as illustrated in fig. 4, when the contact position (i.e., the interaction position) of the first interaction identifier 440 with the target object 430 is at the boundary of the bottom surface of the target object 430, the boundary of the bottom surface of the target object 430 changes to red.
In the present exemplary embodiment, the form of the effective interaction area and the invalid interaction area in different gestures is described.
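The region forms described above lend themselves to a compact hit test. The following Python fragment is a hypothetical sketch only; the coordinate scheme, half-width, and radius values are assumptions, not part of the disclosure:

```python
import math

def region_of(interaction_pos, object_center, gesture,
              half_width=0.5, inner_radius=0.8):
    """Classify an interaction position as 'effective' or 'invalid'.

    In the vertical gesture the effective area is the first width range
    (a band of half-width `half_width` around the object's center in the
    horizontal direction); in the cross-sectional gesture it is the inner
    circle of radius `inner_radius`, and the outer ring, including the
    section boundary, is invalid.
    """
    x, y = interaction_pos
    cx, cy = object_center
    if gesture == "vertical":
        return "effective" if abs(x - cx) <= half_width else "invalid"
    # cross-sectional gesture: test distance against the inner circle
    return "effective" if math.hypot(x - cx, y - cy) <= inner_radius else "invalid"

print(region_of((0.3, 0.0), (0.0, 0.0), "vertical"))       # effective
print(region_of((0.9, 0.2), (0.0, 0.0), "cross-section"))  # invalid
```

A renderer could then, for example, thicken and recolor the section boundary whenever this classification at the task execution position returns "invalid", matching the prompting behavior described above.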
In an alternative embodiment, the method further comprises: responding to a movement control instruction aiming at the first interaction identifier or the second interaction identifier, and controlling and adjusting the interaction position of the first interaction identifier or the second interaction identifier on the target object; and controlling and adjusting display parameters of the first interaction identifier or the second interaction identifier based on the interaction position corresponding to the first interaction identifier or the second interaction identifier so as to enable the first interaction identifier or the second interaction identifier to present visual distinction in different interaction areas on the target object.
The game player can control the controlled virtual object to use the virtual prop so as to move the interaction position of the first interaction identifier or the second interaction identifier on the target object. The movement control instruction is an instruction for moving the interaction position of the first interaction identifier or the second interaction identifier on the target object.
According to the target position carried in the movement control instruction, the interaction position can be moved to the target position. The target position may be located in the effective interaction area or in the invalid interaction area. Depending on the area in which the target position is located, the display effects in different areas can be distinguished through the display parameters of the first interaction identifier or the second interaction identifier.
The display parameter may be the color of the first interaction identifier or the second interaction identifier, an animation corresponding to the first interaction identifier or the second interaction identifier, an audio effect, or a prompt text, which is not particularly limited in the present exemplary embodiment.
For example, as shown in fig. 4, if the game player manipulates the controlled virtual object to move the first interaction identifier 440 to the right in the horizontal direction so that the interaction position reaches the boundary of the bottom-surface cross section of the wood block object, the first interaction identifier 440 needs to be displayed in red at this time, so as to inform the game player that the task execution position is in the invalid interaction area.
Fig. 6 schematically illustrates a graphical user interface in which the display parameters are presented, in an embodiment of the present disclosure. As illustrated in fig. 6, the contact position (i.e., the interaction position) of the first interaction identifier 440 with the target object 430 is at the boundary of the bottom surface of the target object 430 (the boundary belongs to the invalid interaction area), and the spark animation 610 may be displayed at the interaction position.
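Connecting the region classification to the display parameters, a hypothetical Python fragment (the dictionary-based marker, the parameter values, and the asset name spark_610 are assumptions for illustration) might look like this:

```python
def display_parameters(region: str) -> dict:
    """Select display parameters for an interaction identifier by region.

    The concrete values (red tint, spark animation 610, prompt text) follow
    the examples around figs. 4 and 6; the exact asset and text names are
    assumptions.
    """
    if region == "invalid":
        return {"color": "red", "animation": "spark_610",
                "prompt": "the task cannot continue here"}
    return {"color": "white", "animation": None, "prompt": None}

# A movement control instruction moves the identifier; the region is then
# re-classified (e.g. with the region_of sketch above) and the display
# parameters re-applied so the identifier looks different in each region.
marker = {"position": (0.0, 0.0), **display_parameters("effective")}
marker["position"] = (0.9, 0.0)               # moved toward the boundary
marker.update(display_parameters("invalid"))  # now shown in red with sparks
print(marker)
```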
In the method and the device provided by the exemplary embodiments of the disclosure, on the one hand, in response to the gesture switching instruction for the target object, the target object is controlled to switch from the preset interaction gesture to the target interaction gesture and interaction with the target object continues; since the interaction gesture of the target object can be switched between the first interaction gesture and the second interaction gesture, the situation in which the target object can only be displayed in one interaction gesture is avoided, the display effect of the target object is improved, and the different interaction requirements of game players are met. On the other hand, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; interaction with the target object under different interaction operation difficulties can be completed by switching the interaction gesture, so that the interaction progress with the target object can be flexibly controlled, the interest of the game is increased, and the user experience is improved.
The following describes in detail the method for controlling interaction of the target object in the embodiment of the present disclosure in conjunction with an application scenario.
The target game task may be a task of drawing up the liquid in a container. The target object is a container object. When the terminal receives an interaction instruction for the container object, the container object in the first interaction gesture is displayed in the graphical user interface (at this time, the container object is displayed in a stereoscopic form in the graphical user interface). A current drawing progress (i.e., a current interaction progress) of the liquid within the container object is determined. At this time, the target object is configured with the first interaction progress and the first interaction operation difficulty.
When a gesture switching instruction for the container object is received, the interaction gesture of the container object is switched to the second interaction gesture (at this time, an object cross section of the container object is displayed in the graphical user interface). At this time, the target object is configured with the second interaction progress and the second interaction operation difficulty.
In this application scenario, on the one hand, in response to the gesture switching instruction for the target object, the target object is controlled to switch from the preset interaction gesture to the target interaction gesture and interaction with the target object continues; since the interaction gesture of the target object can be switched between the first interaction gesture and the second interaction gesture, the situation in which the target object can only be displayed in one interaction gesture is avoided, the display effect of the target object is improved, and the different interaction requirements of game players are met. On the other hand, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; interaction with the target object under different interaction operation difficulties can be completed by switching the interaction gesture, so that the interaction progress with the target object can be flexibly controlled, the interest of the game is increased, and the user experience is improved.
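The overall control flow of this scenario can be sketched as a small state machine. In the hypothetical Python fragment below, the numeric rate values and all class and variable names are assumptions; it only illustrates how a gesture switching instruction toggles between two gesture configurations, each with its own progress rate:

```python
from dataclasses import dataclass

@dataclass
class GestureConfig:
    name: str
    progress_per_second: float  # interaction progress gained per unit time

# Assumed example rates: the second gesture advances faster per unit time
# (second value > first value) but is described as harder to operate.
FIRST = GestureConfig("first (three-dimensional)", 0.02)
SECOND = GestureConfig("second (planar)", 0.05)

class TargetObjectController:
    def __init__(self):
        self.gesture = FIRST  # the preset interaction gesture
        self.progress = 0.0

    def switch_gesture(self):
        """Handle a gesture switching instruction: toggle to the other gesture."""
        self.gesture = SECOND if self.gesture is FIRST else FIRST

    def tick(self, dt: float, in_effective_area: bool):
        """Advance (or reduce) the interaction progress over `dt` seconds."""
        rate = self.gesture.progress_per_second
        delta = rate * dt if in_effective_area else -rate * dt
        self.progress = min(1.0, max(0.0, self.progress + delta))

controller = TargetObjectController()
controller.tick(10.0, in_effective_area=True)  # slow progress in FIRST
controller.switch_gesture()                    # gesture switching instruction
controller.tick(10.0, in_effective_area=True)  # faster progress in SECOND
print(controller.gesture.name, round(controller.progress, 2))  # ... 0.7
```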
In addition, in the exemplary embodiment of the disclosure, an interaction control device of the target object is also provided. Fig. 7 illustrates a schematic structure of an interaction control apparatus of a target object, and as illustrated in fig. 7, the interaction control apparatus 700 of a target object may include: a first interaction module 710 and a second interaction module 720. Wherein:
the first interaction module 710 is configured to respond to an interaction instruction for the target object, display the target object in a preset interaction gesture in the graphical user interface, and control the controlled virtual object to interact with the target object, wherein the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; the second interaction module 720 is configured to respond to a gesture switching instruction for the target object in the process of interaction between the controlled virtual object and the target object, and control the target object to switch from the preset interaction gesture to a target interaction gesture and continue interacting with the target object, wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: a progress information determining module configured to determine current interaction progress information of the target object; a third display module configured to display current interaction progress information at a target location of the graphical user interface; there is a positional mapping relationship between the target position and the display position of the target object in the graphical user interface.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the current interaction progress information includes a current progress percentage at the current interaction progress, and size information of the target object at the current interaction progress.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the target object includes a corresponding interactive operation area, and the interactive operation area includes an effective interaction area and an invalid interaction area that are displayed in a differentiated manner.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the target object includes a corresponding interactive operation region, and the interactive operation region includes a valid interactive region and an invalid interactive region; the apparatus further comprises: the interaction position determining module is configured to determine an interaction position acted on the target object in the process of interaction between the controlled virtual object and the target object; and the distinguishing display module is configured to respond to the interaction position being positioned in the effective interaction area or the ineffective interaction area and display the effective interaction area and/or the ineffective interaction area in a distinguishing way.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the first interaction gesture is a three-dimensional gesture, and the second interaction gesture is a planar gesture; for controlling the controlled virtual object to interact with the target object, the apparatus includes: the first interaction module, configured to, when the target object is in the three-dimensional gesture, control the controlled virtual object to perform interaction in a first direction on the target object in the three-dimensional gesture through a first interaction identifier; and the second interaction module, configured to, when the target object is in the planar gesture, control the controlled virtual object to perform interaction in the first direction and in a second direction, respectively, on the target object in the planar gesture through a second interaction identifier.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: a first interaction rate module configured to set the interaction progress per unit time to a first numerical value when the target object is in the first interaction gesture; and a second interaction rate module configured to set the interaction progress per unit time to a second numerical value when the target object is in the second interaction gesture; wherein the second numerical value is greater than the first numerical value.
In an exemplary embodiment of the present disclosure, based on the foregoing solution, the target object includes a corresponding interactive operation area, the interactive operation area includes an effective interaction area and an invalid interaction area, and the apparatus further includes: an interaction position determining module configured to determine the interaction position of the first interaction identifier or the second interaction identifier on the target object in the process of interaction between the controlled virtual object and the target object; an interaction progress increasing module configured to, in response to the interaction position being located in the effective interaction area, control increasing the interaction progress of the controlled virtual object with the target object according to the first numerical value or the second numerical value; and an interaction progress reducing module configured to, in response to the interaction position being located in the invalid interaction area, control reducing the interaction progress of the controlled virtual object with the target object.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: an interaction progress resetting module configured to, in response to the duration for which the interaction position remains in the invalid interaction area reaching a designated duration, control resetting the interaction progress of the controlled virtual object with the target object to 0.
In an exemplary embodiment of the present disclosure, based on the foregoing aspect, after the step of controlling, according to the first numerical value or the second numerical value, increasing the interaction progress of the controlled virtual object with the target object in response to the interaction position being located in the effective interaction area, the apparatus further includes: an interaction completion prompting module configured to determine the interaction completion time of the controlled virtual object with the target object according to the first numerical value or the second numerical value, and to display a corresponding prompt in the graphical user interface.
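The rate, reset, and completion-prompt modules just described can be combined in one hedged sketch. In the fragment below, `rate` stands in for the first or second numerical value and `reset_after` for the designated duration; both concrete values are assumptions:

```python
def update_progress(progress: float, rate: float, in_effective_area: bool,
                    invalid_elapsed: float, dt: float,
                    reset_after: float = 3.0):
    """One update step of the interaction progress.

    Returns (progress, invalid_elapsed, remaining_time). When the
    interaction position stays in the invalid area for `reset_after`
    seconds, the progress is reset to 0.
    """
    if in_effective_area:
        invalid_elapsed = 0.0
        progress = min(1.0, progress + rate * dt)       # increase progress
    else:
        invalid_elapsed += dt
        if invalid_elapsed >= reset_after:
            progress = 0.0                              # reset to 0
        else:
            progress = max(0.0, progress - rate * dt)   # reduce progress
    # Estimated time to completion, which can be prompted in the interface.
    remaining_time = (1.0 - progress) / rate if rate > 0 else float("inf")
    return progress, invalid_elapsed, remaining_time

p, inv, eta = update_progress(0.4, 0.05, True, 0.0, 1.0)
print(p, inv, eta)  # 0.45, 0.0, and 11.0 seconds to completion
```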
In an exemplary embodiment of the present disclosure, based on the foregoing, in the three-dimensional gesture the target object is displayed in a vertical gesture, and in the two-dimensional gesture a cross section of the target object is displayed; the first direction is the horizontal direction, and the second direction is the vertical direction.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, when the target object is in the three-dimensional gesture, the interaction progress of the controlled virtual object with the target object is represented by the length change of the target object in the vertical gesture; and when the target object is in the two-dimensional gesture, the interaction progress of the controlled virtual object with the target object is represented by the change in the cross-sectional area of the target object.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the target object includes a corresponding interactive operation area, and the interactive operation area includes an effective interaction area and an invalid interaction area; when the target object is displayed in the vertical gesture, the effective interaction area is a first width range region of the target object in the horizontal direction, and the invalid interaction area is the region outside the first width range region of the target object in the horizontal direction; when the target object is displayed in the cross-sectional gesture, the effective interaction area is the inner circle region on the cross section of the target object, and the invalid interaction area is the outer ring region on the cross section of the target object.
In an exemplary embodiment of the present disclosure, based on the foregoing scheme, the apparatus further includes: an interaction position adjusting module configured to, in response to a movement control instruction for the first interaction identifier or the second interaction identifier, control adjusting the interaction position of the first interaction identifier or the second interaction identifier on the target object; and a display parameter controlling module configured to control adjusting the display parameters of the first interaction identifier or the second interaction identifier based on the interaction position corresponding to the first interaction identifier or the second interaction identifier, so that the first interaction identifier or the second interaction identifier presents a visual distinction when in different interaction areas on the target object.
The specific details of the above-mentioned interaction control device 700 for a target object are already described in detail in the corresponding interaction control method for a target object, and thus will not be described herein.
It should be noted that although several modules or units of the interactive control device 700 of the target object are mentioned in the above detailed description, such division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
In addition, in an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
An electronic device 800 according to such an embodiment of the invention is described below with reference to fig. 8. The electronic device 800 shown in fig. 8 is merely an example and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 8, the electronic device 800 is embodied in the form of a general purpose computing device. Components of the electronic device 800 may include, but are not limited to: the at least one processing unit 810, the at least one storage unit 820, a bus 830 connecting the different system components (including the storage unit 820 and the processing unit 810), and a display unit 840.
Wherein the storage unit stores program code that is executable by the processing unit 810 such that the processing unit 810 performs steps according to various exemplary embodiments of the present invention described in the above section of the "exemplary method" of the present specification.
Storage unit 820 may include readable media in the form of volatile storage units such as Random Access Memory (RAM) 821 and/or cache memory unit 822, and may further include Read Only Memory (ROM) 823.
The storage unit 820 may also include a program/utility 824 having a set (at least one) of program modules 825, such program modules 825 including, but not limited to: an operating system, one or more application programs, other program modules, and program data; each or some combination of these examples may include an implementation of a networking environment.
Bus 830 may be one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 800 may also communicate with one or more external devices 870 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 800, and/or any device (e.g., router, modem, etc.) that enables the electronic device 800 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 850. Also, electronic device 800 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 860. As shown, network adapter 860 communicates with other modules of electronic device 800 over bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 800, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processor in the electronic device may implement the following operations of the interaction control method of the target object by executing machine executable instructions:
responding to an interaction instruction for the target object, displaying the target object in a preset interaction gesture in a graphical user interface, and controlling the controlled virtual object to interact with the target object, wherein the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty; in the process of interaction between the controlled virtual object and the target object, responding to a gesture switching instruction for the target object, controlling the target object to switch from the preset interaction gesture to a target interaction gesture and continue interacting with the target object, wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
Determining current interaction progress information of a target object; displaying current interaction progress information at a target position of the graphical user interface; there is a positional mapping relationship between the target position and the display position of the target object in the graphical user interface.
When the interaction gesture of the target object is the first interaction gesture, the current interaction progress information comprises the current progress percentage under the current interaction progress and the size information of the target object under the current interaction progress.
The target object comprises a corresponding interactive operation area, and the interactive operation area comprises an effective interactive area and an ineffective interactive area which are displayed in a distinguishing mode.
Determining an interaction position acting on a target object in the process of interaction between the controlled virtual object and the target object; and in response to the interaction position being located in the effective interaction area or the ineffective interaction area, the effective interaction area and/or the ineffective interaction area are displayed in a distinguishing mode.
When the target object is in a three-dimensional gesture, controlling the controlled virtual object to perform interaction in a first direction on the target object in the three-dimensional gesture through a first interaction identifier; when the target object is in the plane gesture, the controlled virtual object is controlled to interact in the first direction and the second direction on the target object in the plane gesture through the second interaction mark.
When the target object is in the first interaction gesture, the interaction progress per unit time is a first numerical value; when the target object is in the second interaction gesture, the interaction progress per unit time is a second numerical value; the second numerical value is greater than the first numerical value.
Determining the interaction position of the first interaction identifier or the second interaction identifier on the target object in the interaction process of the controlled virtual object and the target object; responding to the interaction position in the effective interaction area, and controlling and increasing the interaction progress of the controlled virtual object and the target object according to the first numerical value or the second numerical value; and controlling to reduce the interaction progress of the controlled virtual object and the target object in response to the interaction position being in the invalid interaction area.
And in response to the duration that the interaction position is continuously located in the invalid interaction area reaches the designated duration, controlling to reset the interaction progress of the controlled virtual object and the target object to 0.
And determining the interaction completion time of the controlled virtual object and the target object according to the first numerical value or the second numerical value, and prompting in a graphical user interface.
In the three-dimensional gesture, the target object is displayed in a vertical gesture, and in the two-dimensional gesture, a cross section of the target object is displayed; the first direction is the horizontal direction, and the second direction is the vertical direction.
When the target object is in a three-dimensional gesture, the interaction progress of the controlled virtual object with the target object is represented by the length change of the target object in the vertical gesture;
and when the target object is in a two-dimensional gesture, the interaction progress of the controlled virtual object with the target object is represented by the change in the cross-sectional area of the target object.
The target object includes a corresponding interactive operation area, and the interactive operation area includes an effective interaction area and an invalid interaction area; when the target object is displayed in the vertical gesture, the effective interaction area is a first width range region of the target object in the horizontal direction, and the invalid interaction area is the region outside the first width range region of the target object in the horizontal direction; when the target object is displayed in the cross-sectional gesture, the effective interaction area is the inner circle region on the cross section of the target object, and the invalid interaction area is the outer ring region on the cross section of the target object.
Responding to a movement control instruction aiming at the first interaction identifier or the second interaction identifier, and controlling and adjusting the interaction position of the first interaction identifier or the second interaction identifier on the target object; and controlling and adjusting display parameters of the first interaction identifier or the second interaction identifier based on the interaction position corresponding to the first interaction identifier or the second interaction identifier so as to enable the first interaction identifier or the second interaction identifier to present visual distinction in different interaction areas on the target object.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a terminal device, or a network device, etc.) to perform the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification is also provided. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention as described in the "exemplary methods" section of this specification, when said program product is run on the terminal device.
Referring to fig. 9, a program product 900 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, Random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (17)

1. An interactive control method of a target object, characterized in that a graphical user interface is provided through a terminal device, wherein at least part of a game scene is displayed in the graphical user interface, the game scene comprises a controlled virtual object and the target object, and the method comprises:
responding to an interaction instruction aiming at the target object, displaying the target object with a preset interaction gesture in the graphical user interface, and controlling the controlled virtual object to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty;
In the process of interaction between the controlled virtual object and the target object, responding to a gesture switching instruction aiming at the target object, and controlling the target object to be switched from the preset interaction gesture to the target interaction gesture and to interact with the target object; wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
2. The interactive control method of a target object according to claim 1, further comprising:
determining current interaction progress information of the target object, and displaying the current interaction progress information at a target position of the graphical user interface; and a position mapping relation exists between the target position and the display position of the target object in the graphical user interface.
3. The interactive control method according to claim 2, wherein the current interactive progress information includes a current progress percentage at the current interactive progress, and size information of the target object at the current interactive progress.
4. The interactive control method according to claim 1, wherein the target object includes a corresponding interactive operation region including a valid interactive region and an invalid interactive region which are differentially displayed.
5. The interactive control method according to claim 1, wherein the target object includes a corresponding interactive operation region including an effective interactive region and an ineffective interactive region;
the method further comprises the steps of:
determining an interaction position acting on the target object in the process of interaction between the controlled virtual object and the target object;
and responding to the interaction position being positioned in the effective interaction area or the ineffective interaction area, and displaying the effective interaction area and/or the ineffective interaction area in a distinguishing way.
6. The interactive control method according to claim 1, wherein the first interactive gesture is a three-dimensional gesture and the second interactive gesture is a planar gesture;
the controlling the controlled virtual object to interact with the target object includes:
when the target object is in the three-dimensional gesture, controlling the controlled virtual object to perform interaction in a first direction on the target object in the three-dimensional gesture through a first interaction identifier;
and when the target object is in the plane gesture, controlling the controlled virtual object to respectively perform interaction in the first direction and the second direction on the target object in the plane gesture through a second interaction identifier.
7. The interactive control method according to claim 6, wherein when the target object is in the first interaction gesture, the interaction progress per unit time is a first numerical value;
when the target object is in the second interaction gesture, the interaction progress per unit time is a second numerical value;
wherein the second value is greater than the first value.
8. The method of claim 7, wherein the target object includes a corresponding interactive operation region, the interactive operation region including a valid interactive region and an invalid interactive region, the method further comprising:
determining the interaction position of the first interaction identifier or the second interaction identifier on the target object in the process of interaction between the controlled virtual object and the target object;
responding to the interaction position in the effective interaction area, and controlling and increasing the interaction progress of the controlled virtual object and the target object according to the first numerical value or the second numerical value;
and controlling to reduce the interaction progress of the controlled virtual object and the target object in response to the interaction position being in the invalid interaction area.
9. The interactive control method of a target object according to claim 8, further comprising:
and in response to the duration that the interaction position is continuously located in the invalid interaction area reaches the designated duration, controlling the interaction progress of the controlled virtual object and the target object to be reset to 0.
10. The method of claim 8, further comprising, after the step of controlling, according to the first numerical value or the second numerical value, increasing the interaction progress of the controlled virtual object with the target object in response to the interaction position being located in the effective interaction area:
and determining the interaction completion time of the controlled virtual object and the target object according to the first value or the second value, and prompting on the graphical user interface.
11. The interactive control method according to claim 6, characterized in that, in the three-dimensional gesture, the target object is displayed in a vertical gesture and, in the two-dimensional gesture, a cross section of the target object is displayed;
the first direction is a horizontal direction, and the second direction is a vertical direction.
12. The interactive control method of a target object according to claim 11, characterized in that, when the target object is in the three-dimensional gesture, the interaction progress of the controlled virtual object with the target object is characterized by a change in the length of the target object in a vertical gesture;
and when the target object is in the two-dimensional gesture, the interaction progress of the controlled virtual object with the target object is characterized by a change in the cross-sectional area of the target object.
13. The interactive control method according to claim 12, wherein the target object includes a corresponding interactive operation region including a valid interactive region and an invalid interactive region;
when the target object is displayed in a vertical gesture, the effective interaction area is a first width range region of the target object in the horizontal direction, and the invalid interaction area is the region outside the first width range region of the target object in the horizontal direction;
when the target object is displayed in a cross-sectional gesture, the effective interaction area is the inner circle region on the cross section of the target object, and the invalid interaction area is the outer ring region on the cross section of the target object.
14. The interactive control method of a target object according to claim 13, further comprising:
responding to a movement control instruction aiming at the first interaction identifier or the second interaction identifier, and controlling and adjusting the interaction position of the first interaction identifier or the second interaction identifier on the target object;
and controlling and adjusting display parameters of the first interaction identifier or the second interaction identifier based on the interaction position corresponding to the first interaction identifier or the second interaction identifier so as to enable the first interaction identifier or the second interaction identifier to present visual distinction when in different interaction areas on the target object.
15. An interactive control device for a target object, characterized in that a graphical user interface is provided by a terminal device, wherein at least part of a game scene is displayed in the graphical user interface, the game scene comprises a controlled virtual object and the target object, and the device comprises:
the first interaction module is configured to respond to an interaction instruction aiming at the target object, display the target object with a preset interaction gesture in the graphical user interface and control the controlled virtual object to interact with the target object; the preset interaction gesture is one of a first interaction gesture and a second interaction gesture, the first interaction gesture is provided with a first interaction progress and a first interaction operation difficulty, and the second interaction gesture is provided with a second interaction progress and a second interaction operation difficulty;
The second interaction module is configured to respond to a gesture switching instruction aiming at the target object in the process of interaction between the controlled virtual object and the target object, and control the target object to be switched from the preset interaction gesture to a target interaction gesture and interact with the target object; wherein the target interaction gesture is the other of the first interaction gesture and the second interaction gesture.
16. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the interactive control method of the target object of any one of claims 1-14 via execution of the executable instructions.
17. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the interactive control method of a target object according to any one of claims 1-14.
CN202310936138.1A 2023-07-27 2023-07-27 Interaction control method and device of target object, storage medium and electronic equipment Pending CN116999838A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310936138.1A CN116999838A (en) 2023-07-27 2023-07-27 Interaction control method and device of target object, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310936138.1A CN116999838A (en) 2023-07-27 2023-07-27 Interaction control method and device of target object, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116999838A true CN116999838A (en) 2023-11-07

Family

ID=88564893

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310936138.1A Pending CN116999838A (en) 2023-07-27 2023-07-27 Interaction control method and device of target object, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116999838A (en)

Similar Documents

Publication Publication Date Title
CN107977141B (en) Interaction control method and device, electronic equipment and storage medium
WO2021258994A1 (en) Method and apparatus for displaying virtual scene, and device and storage medium
JP2020504851A (en) Game screen display control method, device, storage medium, and electronic device
US9350787B2 (en) Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
US20140223490A1 (en) Apparatus and method for intuitive user interaction between multiple devices
EP2750032B1 (en) Methods and systems for generation and execution of miniapp of computer application served by cloud computing system
US9437158B2 (en) Electronic device for controlling multi-display and display control method thereof
CN108776544B (en) Interaction method and device in augmented reality, storage medium and electronic equipment
CN111481923B (en) Rocker display method and device, computer storage medium and electronic equipment
CN103677597A (en) Terminal equipment and same-screen display method and system
CN107430472A (en) System and method for mobile device operation system
CN111643890A (en) Card game interaction method and device, electronic equipment and storage medium
CN113244612A (en) Locking mark display method and device, storage medium and electronic equipment
WO2022022729A1 (en) Rendering control method, device and system
CN114501108A (en) Display device and split-screen display method
CN113952709A (en) Game interaction method and device, storage medium and electronic equipment
CN113559501A (en) Method and device for selecting virtual units in game, storage medium and electronic equipment
CN113440848A (en) In-game information marking method and device and electronic device
US20230310989A1 (en) Object control method and apparatus in virtual scene, terminal device, computer-readable storage medium, and computer program product
CN114863008B (en) Image processing method, image processing device, electronic equipment and storage medium
CN116999838A (en) Interaction control method and device of target object, storage medium and electronic equipment
CN113457144B (en) Virtual unit selection method and device in game, storage medium and electronic equipment
CN110604918B (en) Interface element adjustment method and device, storage medium and electronic equipment
CN114949842A (en) Virtual object switching method and device, storage medium and electronic equipment
CN113304477A (en) Game interface display method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination