WO2021036581A1 - Control method for virtual object and related apparatus - Google Patents
Control method for virtual object and related apparatus
- Publication number: WO2021036581A1 (PCT application PCT/CN2020/103033)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- user interface
- virtual
- standing state
- operation signal
- Prior art date
Images
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/2145—Input arrangements characterised by their sensors, purposes or types for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/42—Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/537—Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD], using indicators, e.g. showing the condition of a game character on screen
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
- A63F13/837—Shooting of targets
- A63F13/87—Communicating with other players during game play, e.g. by e-mail or chat
Definitions
- the embodiments of the application relate to the fields of computer and Internet technology, and particularly relate to the control of virtual objects.
- players can control virtual objects in the game scene provided by the game match, such as controlling the movement of virtual objects.
- the user interface of shooting games is provided with buttons for controlling the movement of virtual objects.
- the user can control the movement of the virtual object by operating the button, such as controlling the virtual object to move in a certain direction in the virtual scene.
- the embodiments of the present application provide a method and related apparatus for controlling a virtual object, which can be used to solve the technical problem in the related art that continuously moving a virtual object requires complicated and inefficient operations.
- the technical solution is as follows:
- an embodiment of the present application provides a method for controlling a virtual object, and the method includes:
- displaying a user interface, the user interface including a joystick control for controlling the movement of the virtual object;
- receiving a quick click operation signal acting on a target area corresponding to the joystick control; and
- controlling, according to the quick click operation signal, the virtual object to run automatically in the virtual scene displayed on the user interface.
- an embodiment of the present application provides a method for controlling a virtual object, and the method includes:
- displaying a user interface, the user interface including a joystick control for controlling the movement of the virtual object;
- when the virtual object is in a non-standing state, controlling the virtual object to switch from the non-standing state to a standing state upon receiving a sliding operation signal whose starting position is within the joystick control; and
- after the virtual object is switched to the standing state, controlling the virtual object to run automatically in the virtual scene displayed on the user interface.
- an embodiment of the present application provides a virtual object control device, the device includes:
- An interface display module for displaying a user interface, the user interface including a joystick control for controlling the movement of the virtual object;
- a signal receiving module, configured to receive a quick click operation signal acting on the target area corresponding to the joystick control;
- a running control module, configured to control, according to the quick click operation signal, the virtual object to automatically run in the virtual scene displayed on the user interface.
- an embodiment of the present application provides a virtual object control device, the device includes:
- An interface display module for displaying a user interface, the user interface including a joystick control for controlling the movement of the virtual object;
- a posture switching module, configured to, when the virtual object is in a non-standing state, control the virtual object to switch from the non-standing state to a standing state upon receiving a sliding operation signal whose starting position is within the joystick control;
- a running control module, configured to control the virtual object to automatically run in the virtual scene displayed on the user interface after the virtual object is switched to the standing state.
- an embodiment of the present application provides a mobile terminal.
- the mobile terminal includes a processor and a memory.
- the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the virtual object control method described above.
- an embodiment of the present application provides a storage medium, where the storage medium is used to store a computer program, and the computer program is used to execute the virtual object control method in the above aspect.
- the embodiments of the present application provide a computer program product, which when the computer program product runs on a mobile terminal, causes the mobile terminal to execute the above-mentioned method for controlling virtual objects.
- in the technical solution provided by the embodiments of the present application, a joystick control is displayed in the user interface, and when a quick click operation signal acting on the target area corresponding to the joystick control is received, the virtual object is controlled to run automatically in the virtual environment displayed on the user interface. This realizes one-click triggering of automatic running of the virtual object, without requiring the user to continuously click or hold down an operation control, which improves operation efficiency.
- FIG. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application.
- FIG. 2 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
- FIG. 3 is a flowchart of a method for controlling a virtual object provided by an embodiment of the present application.
- FIG. 4 exemplarily shows a schematic diagram of a user interface.
- FIG. 5 is a flowchart of a method for controlling a virtual object provided by another embodiment of the present application.
- FIG. 6 is a flowchart of a method for controlling a virtual object provided by another embodiment of the present application.
- FIG. 7 exemplarily shows a schematic diagram of another user interface.
- FIG. 8 is a flowchart of a sliding operation signal control method provided by an embodiment of the present application.
- FIG. 9 is a flowchart of a method for controlling a virtual object provided by another embodiment of the present application.
- FIG. 10 is a schematic diagram of another user interface of the present application.
- FIG. 11 is a schematic diagram of another user interface of the present application.
- FIG. 12 is a flowchart of a method for controlling a virtual object provided by another embodiment of the present application.
- FIG. 13 is a block diagram of a virtual object control device provided by an embodiment of the present application.
- FIG. 14 is a block diagram of a virtual object control device provided by another embodiment of the present application.
- FIG. 15 is a block diagram of a virtual object control device provided by another embodiment of the present application.
- FIG. 16 is a block diagram of a virtual object control device provided by another embodiment of the present application.
- FIG. 17 is a structural block diagram of a mobile terminal provided by an embodiment of the present application.
- a virtual scene is a scene displayed (or provided) when the client of an application program (such as a game application) runs on a terminal.
- the virtual scene refers to a scene created for virtual objects to perform activities in (such as game competition), for example a virtual house, a virtual island, or a virtual map.
- the virtual scene may be a simulated scene of the real world, a semi-simulated and semi-fictional scene, or a purely fictitious scene.
- the virtual scene may be a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, which is not limited in the embodiment of the present application.
- a virtual object refers to a virtual role controlled by a user account in an application.
- a virtual object refers to a game character controlled by a user account in the game application.
- the virtual object may be in the form of a character, an animal, a cartoon, or other forms, which is not limited in the embodiment of the present application.
- the virtual object can be displayed in a three-dimensional form or a two-dimensional form, which is not limited in the embodiment of the present application.
- depending on the application, the operations that the user account can perform to control the virtual object may also differ.
- user accounts can control virtual objects to perform operations such as shooting, running, jumping, picking up firearms, replacing firearms, and adding bullets to firearms.
- FIG. 1 shows a schematic diagram of an implementation environment provided by an embodiment of the present application.
- the implementation environment may include: a mobile terminal 10 and a server 20.
- the mobile terminal 10 may be a portable electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, and a wearable device.
- a client of a game application, such as a client of a shooting game application, can be installed in the mobile terminal 10.
- the server 20 is used to provide background services for clients of applications (such as game applications) in the mobile terminal 10.
- the server 20 may be a background server of the above-mentioned application program (such as a game application program).
- the server 20 may be one server, a server cluster composed of multiple servers, or a cloud computing service center.
- the mobile terminal 10 and the server 20 can communicate with each other through the network 30.
- the network 30 may be a wired network or a wireless network.
- the execution subject of each step may be a mobile terminal.
- FIG. 2 shows a schematic structural diagram of a mobile terminal provided by an embodiment of the present application.
- the mobile terminal 10 may include a motherboard 110, an external output/input device 120, a memory 130, an external interface 140, a touch control system 150, and a power supply 160.
- the motherboard 110 integrates processing elements such as a processor and a controller.
- the external output/input device 120 may include a display component (such as a display screen), a sound playback component (such as a speaker), a sound collection component (such as a microphone), various keys, and the like.
- the memory 130 stores program codes and data.
- the external interface 140 may include an earphone interface, a charging interface, a data interface, and the like.
- the touch control system 150 may be integrated in the display components or keys of the external output/input device 120, and the touch control system 150 is used to detect touch operations performed by the user on the display components or keys.
- the power supply 160 is used to supply power to other components in the mobile terminal 10.
- the processor in the motherboard 110 can generate a user interface (such as a game interface) by executing or calling program codes and data stored in the memory, and present the generated user interface (such as a game interface) through the external output/input device 120.
- the touch control system 150 can detect a touch operation performed when the user interacts with the user interface (such as a game interface), and respond to the touch operation.
- FIG. 3 shows a flowchart of a method for controlling a virtual object provided by an embodiment of the present application.
- This method can be applied to the mobile terminal introduced above, such as the client of an application (such as a shooting game application) applied to a mobile terminal.
- the method can include the following steps:
- Step 301: a user interface is displayed, and the user interface includes a joystick control for controlling the movement of the virtual object.
- the user interface may be a display interface of a game match.
- the user interface is used to present a virtual environment of the game match to the user.
- the user interface may include elements in the virtual environment, such as virtual buildings, virtual props, and virtual objects.
- the user interface also includes some operation controls, such as buttons, sliders, icons, etc., for users to operate.
- the user interface 40 includes a joystick control 41
- the joystick control 41 is an operation control used to control the movement of a virtual object.
- the embodiment of the present application does not limit the display parameters of the joystick control 41, that is, the shape, size, position, and style of the joystick control 41 are not limited.
- for example, the joystick control 41 may be circular or square; the area of the joystick control 41 may occupy 1/10 of the area of the user interface 40, or 1/20 of the area of the user interface 40;
- the joystick control 41 can be located at the lower left corner or the lower right corner of the user interface 40, and can be displayed in a transparent form or in a color-filled form.
- the display of the joystick control 41 does not obstruct the main display elements of the user interface 40.
- the user interface includes a first view layer and a second view layer; wherein the display level of the first view layer is higher than the display level of the second view layer.
- the joystick control is located in the first view layer, and the game screen for displaying the virtual environment of the game match is located in the second view layer.
- the first view layer may also include other operation controls, such as operation controls used to control the posture of virtual objects and operation controls used to control the virtual equipment assembled on virtual objects, which are not limited in the embodiment of the present application.
- Step 302: receive a quick click operation signal acting on the target area corresponding to the joystick control.
- the target area corresponding to the joystick control refers to the area that overlaps with the joystick control to a certain extent.
- the mobile terminal can respond to a user's touch operation, such as a click or press operation; after the user performs a click or press operation on the above-mentioned target area, the mobile terminal receives a corresponding operation signal.
- the target area corresponding to the joystick control may completely overlap with the joystick control, that is, the size and shape of the target area are exactly the same as those of the joystick control, and the center position of the joystick control overlaps with the center position of the target area.
- the target area corresponding to the joystick control may also contain the joystick control, that is, the size of the target area corresponding to the joystick control is larger than the size of the joystick control.
- the target area 42 is a rectangular area whose size is larger than the size of the joystick control 41.
- the center position of the joystick control overlaps the center position of the target area.
- the size of the target area corresponding to the joystick control is smaller than the size of the joystick control.
- the center position of the joystick control overlaps the center position of the target area.
- the target area may be an area visible to the user in the user interface, or an area invisible to the user in the user interface.
- the target area may be a fully transparent area, which is not limited in this embodiment of the application.
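The target-area geometry described above (a region sharing the joystick control's center, possibly larger or smaller than the control itself) can be sketched as follows. This is an illustrative sketch only: the `Rect` class, the `target_area_for` helper, and the `scale` factor are assumptions for demonstration, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    cx: float  # center x
    cy: float  # center y
    w: float   # width
    h: float   # height

    def contains(self, x: float, y: float) -> bool:
        # A point lies inside when it is within half the width/height of the center.
        return abs(x - self.cx) <= self.w / 2 and abs(y - self.cy) <= self.h / 2

def target_area_for(joystick: Rect, scale: float = 1.5) -> Rect:
    # The target area keeps the joystick's center position; scale > 1 makes it
    # larger than the control, scale < 1 makes it smaller.
    return Rect(joystick.cx, joystick.cy, joystick.w * scale, joystick.h * scale)
```

With `scale = 1.5`, a tap just outside the joystick control but inside the enlarged target area still counts as acting on the control, which matches the larger-target-area variant described above.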
- the quick click operation signal refers to a signal generated by multiple consecutive click operations, in which the time interval between two adjacent click operations is less than a preset threshold.
- the quick click operation signal may be a double click operation signal.
- the double-click operation signal refers to an operation signal that has two consecutive clicks, and the time interval between the two consecutive clicks is less than a preset threshold.
- the quick-click operation signal may also be a three-click operation signal, a four-click operation signal, etc., which is not limited in the embodiment of the present application.
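The quick-click criterion above (N consecutive taps whose adjacent intervals are each below a preset threshold) can be sketched as a small detector. The 0.3-second threshold and the function name are assumed values for illustration; the patent does not specify the threshold.

```python
def is_quick_click(timestamps, clicks_required=2, max_interval=0.3):
    """Return True when the last `clicks_required` taps form a quick click."""
    if len(timestamps) < clicks_required:
        return False
    recent = timestamps[-clicks_required:]
    # Every adjacent pair of taps must be closer together than the threshold.
    return all(b - a < max_interval for a, b in zip(recent, recent[1:]))
```

Setting `clicks_required=2` detects the double-click signal; `clicks_required=3` or `4` covers the three-click and four-click variants mentioned above.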
- Step 303: according to the quick click operation signal, control the virtual object to automatically run in the virtual scene displayed on the user interface.
- when the quick click operation signal is received, if the virtual object is in a standing state, the virtual object is controlled to automatically run in the virtual scene in the standing state.
- when the quick click operation signal is received, if the virtual object is in a non-standing state, the virtual object is controlled to automatically run in the virtual scene in the non-standing state.
- optionally, the method further includes: receiving a posture switching instruction corresponding to the virtual object.
- the posture switching instruction refers to an operation instruction used to switch the posture of the virtual object.
- the posture switching instruction may be triggered by the user through an operation control, voice, or a gesture. If the virtual object is in the non-standing state, the virtual object is controlled to switch from the non-standing state to the standing state according to the posture switching instruction, and then the virtual object is controlled to automatically run in the virtual scene in the standing state.
- a posture switching icon 43 is displayed in the aforementioned user interface. When the user clicks the posture switching icon 43, the mobile terminal receives a posture switching instruction.
- the posture switching icon 43 may be a squat icon.
- the speed of automatic running in the non-standing state is lower than the speed of automatic running in the standing state.
- for example, when the virtual object automatically runs in the squatting state, the speed of the automatic running is 1 m/s.
- when the mobile terminal receives the posture switching instruction, the virtual object is controlled to switch from the squatting state to the standing state and to automatically run in the virtual scene in the standing state, and the speed of the automatic running at this time is 3 m/s.
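The posture-dependent auto-run behavior described above can be sketched as a minimal state holder: a quick click starts auto-run in whatever posture the object currently holds, and a posture switch during auto-run changes the speed. The 1 m/s and 3 m/s speeds come from the example above; the class and method names are illustrative assumptions.

```python
class VirtualObject:
    def __init__(self, posture="standing"):
        self.posture = posture        # "standing" or "squatting"
        self.auto_running = False

    def on_quick_click(self):
        # A quick click on the joystick's target area triggers auto-run
        # in the object's current posture.
        self.auto_running = True

    def on_posture_switch(self):
        # Posture switching instruction: toggle squatting <-> standing.
        self.posture = "standing" if self.posture == "squatting" else "squatting"

    @property
    def speed(self):
        if not self.auto_running:
            return 0.0
        # Auto-run in the non-standing state is slower than in the standing state.
        return 3.0 if self.posture == "standing" else 1.0
```

A squatting object that receives a quick click runs at 1 m/s, and accelerates to 3 m/s after a posture switch, mirroring the example above.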
- when the virtual object is in a non-standing state, first prompt information is displayed in the user interface, and the first prompt information is used to indicate that the virtual object is in the non-standing state; when the virtual object is in a standing state, second prompt information is displayed in the user interface, and the second prompt information is used to indicate that the virtual object is in the standing state.
- the first prompt information and the second prompt information may be displayed based on the same icon. For example, as shown in FIG. 4, the first prompt information and the second prompt information are both based on the posture switching icon 43 in the user interface.
- the posture switching icon 43 is used to indicate whether the virtual object is in a standing state or a squatting state.
- when the first prompt information is displayed, the posture switching icon 43 is highlighted.
- when the second prompt information is displayed, the posture switching icon 43 is not highlighted.
- the standing state refers to the state when the virtual object is standing, and the non-standing state refers to the state when the virtual object is not standing.
- the non-standing state may be a squatting state, that is, a state when the virtual object is squatting.
- the direction and speed of the virtual object when running automatically can be preset. For example, the direction of the virtual object when running automatically can be preset to be straight ahead, at a speed of 3 m/s, where straight ahead refers to the direction the virtual object faces.
- the technical solution provided by the embodiments of the present application displays a joystick control in the user interface and, when a quick click operation signal acting on the target area corresponding to the joystick control is received, controls the virtual object to run automatically in the virtual environment displayed on the user interface. This realizes one-click triggering of automatic running of the virtual object, without requiring the user to continuously click or hold down an operation control, which improves operation efficiency.
- in addition, after automatic running is triggered, the user's finger can be released, and the freed finger can then complete other operations, such as observing the virtual environment during running, changing equipment during running, and communicating with other users during running, bringing richer interactive functions.
- in addition, if the virtual object is in the non-standing state when the quick click operation signal acting on the target area is received, then after the virtual object is triggered to start running automatically in the non-standing state, the virtual object can also be controlled, through a posture switching instruction, to switch from the non-standing state to the standing state and then run automatically in the standing state. This provides users with a variety of ways to control the virtual object to run automatically, further enhancing the human-computer interaction experience.
- optionally, the above step 303 includes: obtaining the attribute value corresponding to the quick click operation signal; determining the running speed of the virtual object according to the attribute value; and controlling the virtual object to run automatically at the running speed in the virtual scene.
- the attribute value refers to the operation parameter corresponding to the quick click operation signal.
- the attribute value may be the operation time interval, the number of operations, etc., which is not limited in the embodiment of the present application.
- the attribute value is the click time interval of the double-click operation signal.
- there is a negative correlation between the click time interval of the double-click operation signal and the running speed, that is, the smaller the click time interval, the greater the running speed, and the longer the click time interval, the lower the running speed.
- the attribute value of the quick click operation signal may also be the number of clicks.
- there is a positive correlation between the number of clicks of the quick click operation signal and the running speed of the virtual object, that is, the more clicks, the greater the running speed, and the fewer clicks, the lower the running speed.
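The two attribute-value-to-speed mappings above (negative correlation with the click interval, positive correlation with the click count) can be sketched as follows. The correlations follow the text; every numeric constant (thresholds, speed range, step, cap) is an illustrative assumption, not taken from the patent.

```python
def speed_from_interval(interval, max_interval=0.3, min_speed=1.0, max_speed=5.0):
    # Smaller click interval -> greater running speed (negative correlation).
    interval = min(max(interval, 0.0), max_interval)  # clamp into [0, max_interval]
    frac = 1.0 - interval / max_interval
    return min_speed + frac * (max_speed - min_speed)

def speed_from_clicks(n_clicks, base=1.0, step=1.0, cap=5.0):
    # More clicks -> greater running speed (positive correlation), up to a cap.
    return min(base + (n_clicks - 1) * step, cap)
```

Either mapping (or both combined) could implement the speed determination in step 303; the patent leaves the concrete function open.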
- optionally, the above step 303 includes: detecting the operation position of the quick click operation signal; determining the running direction of the virtual object according to the operation position; and controlling the virtual object to automatically run in the running direction in the virtual scene.
- for example, the above target area is evenly divided into four areas, denoted as area 1, area 2, area 3, and area 4.
- the direction corresponding to area 1 is due east, the direction corresponding to area 2 is due south, the direction corresponding to area 3 is due west, and the direction corresponding to area 4 is due north.
- the embodiment of the present application does not limit how many areas the target area is divided into, or the direction corresponding to each area.
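The four-area direction mapping in the example above can be sketched as a quadrant lookup around the target area's center. The patent only states that the four areas map to due east/south/west/north; which quadrant corresponds to which area is not specified, so the layout below (and the assumption that screen y grows downward) is one possible arrangement.

```python
def run_direction(x, y, cx, cy):
    """Map the tap position (x, y) to a running direction, given the
    target area's center (cx, cy)."""
    right = x >= cx
    upper = y < cy  # screen coordinates: smaller y = closer to the top
    if right and upper:
        return "east"   # area 1 (assumed placement)
    if right and not upper:
        return "south"  # area 2
    if not right and not upper:
        return "west"   # area 3
    return "north"      # area 4
```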
- the method for controlling a virtual object may include the following steps:
- Step 501: determine whether the virtual object is in a standing state; if yes, perform the following step 502; if not, continue to perform step 501.
- Step 502: determine whether a quick click operation signal is received; if yes, perform the following step 503; if not, continue to perform step 502.
- Step 503: determine whether the quick click operation signal is located in the target area; if yes, perform the following step 504; if not, continue to perform step 503.
- Step 504: determine the running speed and running direction of the virtual object according to the attribute value and operation position of the quick click operation signal.
- Step 505: control the virtual object to automatically run at the running speed and in the running direction.
- the technical solution provided by the embodiments of the present application determines the running speed and running direction of the virtual object's automatic running according to the attribute value and operation position of the quick click operation signal, thereby giving users more ways to control the virtual object and better meeting users' operational needs.
- FIG. 6 shows a flowchart of a virtual object control method provided by another embodiment of the present application.
- This method can be applied to the mobile terminal introduced above, such as the client of an application (such as a shooting game application) applied to a mobile terminal.
- the method can include the following steps:
- step 601 a user interface is displayed, and the user interface includes a joystick control for controlling the movement of the virtual object.
- Step 602 When the virtual object is in the non-standing state, if a sliding operation signal whose starting position is located on the joystick control is received, control the virtual object to switch from the non-standing state to the standing state.
- before controlling the virtual object to switch from the non-standing state to the standing state, the method further includes: acquiring the touch position of the sliding operation signal; when the touch position of the sliding operation signal is at the display position of the quick rise icon, performing the step of controlling the virtual object to switch from the non-standing state to the standing state.
- the quick rise icon may be highlighted to indicate that the state switch was triggered successfully, which improves the efficiency of human-computer interaction.
- the size of the distance threshold is set according to the size of the user interface of the mobile terminal running the application. For example, the distance threshold is set to 5 cm when the mobile terminal is a tablet computer, and a different distance threshold is set when the mobile terminal is a mobile phone. This setting allows the distance threshold to be flexibly changed according to the size of the user interface of the mobile terminal, which further improves the human-computer interaction experience.
- the method further includes: determining the moving speed of the virtual object according to the distance. There is a negative correlation between the distance and the moving speed, that is, the greater the distance, the smaller the moving speed, and the smaller the distance, the greater the moving speed. In other words, as the touch position of the sliding operation signal gets closer to the target position, the moving speed of the virtual object increases.
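One way to realise the negative correlation between distance and moving speed described above is a clamped linear falloff. The constants below (maximum speed, minimum speed, falloff range) are assumptions for the sketch, not values from the application.

```python
def moving_speed(distance, max_speed=10.0, max_distance=100.0, min_speed=1.0):
    """Moving speed falls linearly as the touch position moves away from
    the target position: smaller distance -> larger speed, larger
    distance -> smaller speed, clamped to min_speed beyond max_distance."""
    if distance >= max_distance:
        return min_speed
    # Linear interpolation from max_speed at distance 0 to min_speed
    # at max_distance.
    t = distance / max_distance
    return max_speed - t * (max_speed - min_speed)
```

Any monotonically decreasing function (inverse, exponential decay) would satisfy the stated negative correlation equally well; the linear form is chosen here only for simplicity.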
- Step 603 After the virtual object is switched to the standing state, control the virtual object to automatically run in the virtual scene displayed on the user interface.
- the user interface 70 includes a joystick control 71 and a quick rise icon 73.
- the joystick control 71 is used to control the movement of a virtual object
- the quick rise icon 73 is used to switch the state of the virtual object, and to control the virtual object to run automatically in the virtual scene after the state is switched.
- the joystick control 71 includes a drag icon 72
- the drag icon 72 is used to change the moving speed and direction of the virtual object.
- the drag icon 72 may change its position in real time according to the sliding operation signal, and the position change of the drag icon 72 reflects the position change of the virtual object.
- the center position of the drag icon coincides with the center position of the touch position of the sliding operation signal.
- when the sliding operation signal moves toward the upper left corner of the user interface, the drag icon 72 also quickly moves to the upper left corner, which is reflected in the virtual scene displayed on the user interface as the virtual object quickly moving to the northwest.
- the quick rise icon 73 is displayed in the user interface 70.
- as shown in FIG. 7(b), when the center position of the drag icon 72 is at the display position of the quick rise icon 73, the virtual object is controlled to quickly stand up and then run automatically, and the quick rise icon 73 is controlled to be highlighted.
- the user interface 70 includes a squat icon 74, and the squat icon 74 is highlighted when the virtual object is in the squat state.
- when the virtual object is no longer in the squat state, the squat icon 74 is unhighlighted.
- the technical solution provided by the embodiments of the present application provides a way to quickly stand the virtual object up and then make it run automatically when the virtual object is in the non-standing state. This avoids the situation in which, after the virtual object is controlled to run automatically in the non-standing state, additional operations must be performed to make it stand up and run automatically, thereby further improving the efficiency of human-computer interaction and the human-computer interaction experience.
- the sliding operation signal control method in the embodiment of the present application has the following steps:
- Step 801 Receive a sliding operation signal
- Step 802 Calculate the distance between the center position of the sliding operation signal and the center position of the joystick control
- Step 803 When the distance is greater than the distance threshold, display a fast running icon
- Step 804 When the center position of the sliding operation signal is at the display position of the fast running icon, control the virtual object to run automatically.
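Steps 801-804 above amount to a distance test followed by a hit test on the icon's display position. A sketch under assumed screen coordinates follows; representing the icon's display position as an axis-aligned rectangle `(x, y, w, h)` is an assumption of this sketch.

```python
import math

def process_slide(touch_pos, stick_center, icon_rect, threshold):
    """Process one sliding-signal sample. Returns (show_icon, trigger_run):
    show_icon  -- the fast running icon should be displayed, because the
                  distance to the joystick centre exceeds the threshold
                  (steps 802-803);
    trigger_run -- the touch has reached the icon's display position, so
                  the virtual object should be controlled to run
                  automatically (step 804)."""
    dx = touch_pos[0] - stick_center[0]
    dy = touch_pos[1] - stick_center[1]
    distance = math.hypot(dx, dy)                    # step 802
    show_icon = distance > threshold                 # step 803
    x, y, w, h = icon_rect
    inside = x <= touch_pos[0] <= x + w and y <= touch_pos[1] <= y + h
    return show_icon, show_icon and inside           # step 804
```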
- the method for controlling a virtual object may include the following steps:
- Step 901 Control the virtual object to be in a non-standing state
- Step 902 determine whether a sliding operation signal is received; if yes, perform the following step 903; if not, continue to perform step 902;
- Step 903 Calculate the distance between the touch position of the sliding operation signal and the target position
- Step 904 determine whether the distance is greater than the distance threshold; if yes, perform the following step 905; if not, perform the above step 903;
- Step 905 display the quick stand up icon
- Step 906 Determine whether the touch position of the sliding operation signal is located at the display position of the quick rise icon; if yes, perform the following step 907; if not, continue to perform step 906;
- Step 907 Control the virtual object to switch from the non-standing state to the standing state
- Step 908 Control the virtual object to run automatically.
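The quick-stand-up flow of steps 901-908 above is essentially a small state machine. The state names and the shape of the per-sample input below are assumptions for the sketch.

```python
class PostureController:
    """Tracks the non-standing -> standing -> auto-running transitions
    driven by a sliding operation signal reaching the quick rise icon."""

    def __init__(self):
        self.state = "non_standing"   # step 901: object starts non-standing
        self.icon_visible = False
        self.auto_running = False

    def on_slide(self, distance, threshold, touch_on_icon):
        """Handle one sliding-signal sample (steps 902-908).
        distance      -- distance from the touch position to the target
                         position (step 903);
        touch_on_icon -- whether the touch position is at the quick rise
                         icon's display position (step 906)."""
        if distance > threshold:                 # steps 904-905
            self.icon_visible = True
        if self.icon_visible and touch_on_icon:  # step 906
            self.state = "standing"              # step 907
            self.auto_running = True             # step 908
```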
- the above method may further include the following steps:
- a user interface is displayed, and the user interface includes an automatic running icon for controlling the virtual object to run automatically; receiving a trigger signal acting on the automatic running icon; and controlling the virtual object to run automatically according to the trigger signal.
- an automatic running icon 101 is displayed in the user interface 100, and when a trigger signal corresponding to the automatic running icon 101 is received, the virtual object is controlled to run automatically.
- the auto-running icon 101 is highlighted after the virtual object runs automatically, to prompt that the virtual object is in an auto-running state.
- the trigger signal may be a single click signal or a press signal, which is not limited in the embodiment of the present application.
- the above method may further include the following steps:
- the touch position of the sliding operation signal is acquired; when the distance between the center position of the touch position of the sliding operation signal and the center position of the joystick control is greater than the distance threshold, the automatic running icon is displayed; when the touch position of the sliding operation signal is at the display position of the automatic running icon, the virtual object is controlled to run automatically.
- the aforementioned joystick control includes a drag icon for controlling the movement of the virtual object, and the center position of the drag icon coincides with the center position of the touch position of the sliding operation signal.
- the auto-running icon is highlighted after the virtual object runs automatically to prompt the user that the virtual object is in an auto-running state.
- a joystick control 111 is displayed in the user interface 110, and the joystick control 111 includes a drag icon 112, and the position of the drag icon 112 can be changed by receiving a sliding operation signal.
- as shown in FIG. 11(b), when the center position of the touch position of the sliding operation signal coincides with the center position of the joystick control 111, that is, when the center position of the touch position of the sliding operation signal overlaps the center position of the drag icon 112, the automatic running icon 113 is displayed in the user interface.
- when the center position of the drag icon 112 is at the display position of the automatic running icon 113, the virtual object is controlled to run automatically, and the automatic running icon 113 is controlled to be highlighted.
- the method for controlling a virtual object may include the following steps:
- Step 1201 controlling the virtual object to be in a standing state
- Step 1202 determine whether a sliding operation signal is received; if yes, perform the following step 1203; if not, continue to perform step 1202;
- Step 1203 Calculate the distance between the touch position of the sliding operation signal and the target position
- Step 1204 determine whether the distance is greater than the distance threshold; if yes, perform the following step 1205; if not, perform the foregoing step 1203;
- Step 1205 display an automatic running icon
- Step 1206 Determine whether the touch position of the sliding operation signal is located at the display position of the automatic running icon; if yes, execute the following step 1207; if not, continue to execute step 1206;
- Step 1207 Control the virtual object to run automatically.
- FIG. 13 shows a block diagram of a virtual object control device provided by an embodiment of the present application.
- the device has the function of realizing the above-mentioned method example, and the function can be realized by hardware, or by hardware executing corresponding software.
- the device can be a mobile terminal, or it can be set in the mobile terminal.
- the device 1300 may include: an interface display module 1310, a signal receiving module 1320, and a running control module 1330.
- the interface display module 1310 is configured to display a user interface, and the user interface includes a joystick control for controlling the movement of the virtual object.
- the signal receiving module 1320 is configured to receive a quick click operation signal acting on the target area corresponding to the joystick control.
- the running control module 1330 is configured to control the virtual object to automatically run in the virtual scene displayed on the user interface according to the quick click operation signal.
- the signal receiving module 1320 is configured to: receive a quick click operation signal acting on the target area corresponding to the joystick control; or, receive a press operation signal acting on the target area corresponding to the joystick control.
- the running control module 1330 includes: a speed control sub-module 1331, configured to: obtain the attribute value corresponding to the quick click operation signal; determine the running speed of the virtual object according to the attribute value; and control the virtual object to automatically run at the running speed in the virtual scene.
- the running control module 1330 includes: a direction control sub-module 1332, configured to: detect the operation position corresponding to the quick click operation signal; determine the running direction of the virtual object according to the operation position; and control the virtual object to automatically run in the running direction in the virtual scene.
- the running control module 1330 is further configured to: when the quick click operation signal is received, if the virtual object is in a non-standing state, control the virtual object to run automatically in the virtual scene in the non-standing state.
- the running control module 1330 is further configured to: receive a posture switching instruction corresponding to the virtual object; according to the posture switching instruction, control the virtual object to switch from the non-standing state to standing State; controlling the virtual object to automatically run in the virtual scene in the standing state.
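The module decomposition described for device 1300 can be mirrored directly in code. The class and method names below are illustrative assumptions, not identifiers from the application; the point is only the wiring of the three modules.

```python
class InterfaceDisplayModule:
    """Corresponds to module 1310: displays a UI containing a joystick."""
    def display_user_interface(self):
        return {"joystick": True}

class SignalReceivingModule:
    """Corresponds to module 1320: accepts either a quick click or a
    press operation signal on the target area."""
    def receive(self, raw_event):
        if raw_event.get("kind") in ("quick_click", "press"):
            return raw_event
        return None

class RunningControlModule:
    """Corresponds to module 1330: triggers auto-run from a valid signal."""
    def control(self, virtual_object, signal):
        if signal is not None:
            virtual_object["auto_running"] = True
        return virtual_object

class VirtualObjectControlDevice:
    """Wires the three modules together in the order the text describes."""
    def __init__(self):
        self.ui = InterfaceDisplayModule()
        self.rx = SignalReceivingModule()
        self.run = RunningControlModule()

    def handle(self, virtual_object, raw_event):
        self.ui.display_user_interface()
        return self.run.control(virtual_object, self.rx.receive(raw_event))
```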
- the technical solution provided by the embodiments of the present application displays a joystick control in the user interface, and when a quick click operation signal acting on the target area corresponding to the joystick control is received, the virtual object is controlled to run automatically in the virtual environment displayed in the user interface. This realizes one-click triggering of automatic running, without requiring the user to continuously click or hold down a certain operation control, which improves operation efficiency.
- the user's finger can be released, and other operations can then be completed with the released finger, such as observing the virtual environment during running, changing equipment during running, and communicating with other users during running, bringing richer interactive functions.
- if the virtual object is in the non-standing state before the quick click operation signal acting on the target area is received, then after the virtual object is triggered to start running automatically in the non-standing state, the virtual object can also be controlled, through a posture switching instruction, to switch from the non-standing state to the standing state and then run automatically in the standing state. This provides users with a variety of ways to control the virtual object to run automatically, and further enhances the human-computer interaction experience.
- FIG. 15 shows a block diagram of a virtual object control device provided by another embodiment of the present application.
- the device has the function of realizing the above method example, and the function can be realized by hardware, or by hardware executing corresponding software.
- the device can be a mobile terminal, or it can be set in the mobile terminal.
- the device 1500 may include: an interface display module 1510, a posture switching module 1520, and a running control module 1530.
- the interface display module 1510 is configured to display a user interface, and the user interface includes a joystick control for controlling the movement of the virtual object.
- the posture switching module 1520 is configured to: when the virtual object is in the non-standing state, if a sliding operation signal whose starting position is located on the joystick control is received, control the virtual object to switch from the non-standing state to the standing state.
- the running control module 1530 is configured to control the virtual object to automatically run in the virtual scene displayed on the user interface after the virtual object is switched to the standing state.
- the posture switching module 1520 is further configured to: acquire the touch position of the sliding operation signal; when the touch position of the sliding operation signal is at the display position of the quick rise icon, perform the step of controlling the virtual object to switch from the non-standing state to the standing state.
- the posture switching module 1520 is further configured to: obtain the distance between the touch position of the sliding operation signal and the target position; when it is detected that the distance is greater than the distance threshold, display the quick rise icon in the user interface.
- the posture switching module 1520 is further configured to: determine the moving speed of the virtual object according to the distance, and there is a negative correlation between the distance and the moving speed.
- the device 1500 further includes an information display module 1540, configured to: when the virtual object is in the non-standing state, display first prompt information in the user interface, the first prompt information being used to indicate that the virtual object is in the non-standing state; and when the virtual object is in the standing state, display second prompt information in the user interface, the second prompt information being used to indicate that the virtual object is in the standing state.
- the technical solution provided by the embodiments of the present application provides a way to quickly stand the virtual object up and then make it run automatically when the virtual object is in the non-standing state. This avoids the situation in which, after the virtual object is controlled to run automatically in the non-standing state, additional operations must be performed to make it stand up and run automatically, thereby further improving the efficiency of human-computer interaction and the human-computer interaction experience.
- FIG. 17 shows a structural block diagram of a mobile terminal 1700 according to an embodiment of the present application.
- the mobile terminal 1700 may be a portable electronic device such as a mobile phone, a tablet computer, a game console, an e-book reader, a multimedia playback device, and a wearable device.
- the mobile terminal is used to implement the virtual object control method provided in the foregoing embodiment.
- the mobile terminal may be the mobile terminal 10 in the implementation environment shown in FIG. 1. Specifically:
- the mobile terminal 1700 includes a processor 1701 and a memory 1702.
- the processor 1701 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 1701 can be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), and PLA (Programmable Logic Array).
- the processor 1701 may also include a main processor and a coprocessor.
- the main processor is a processor used to process data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
- the processor 1701 may be integrated with a GPU (Graphics Processing Unit), and the GPU is used to render and draw content that needs to be displayed on the display screen.
- the processor 1701 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
- the memory 1702 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 1702 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 1702 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by one or more processors to implement the above-mentioned virtual object control method.
- the mobile terminal 1700 may optionally further include: a peripheral device interface 1703 and at least one peripheral device.
- the processor 1701, the memory 1702, and the peripheral device interface 1703 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 1703 through a bus, a signal line, or a circuit board.
- the peripheral device includes: at least one of a radio frequency circuit 1704, a touch display screen 1705, a camera 1706, an audio circuit 1707, a positioning component 1708, and a power supply 1709.
- the structure shown in FIG. 17 does not constitute a limitation on the mobile terminal 1700, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
- a storage medium is also provided, the storage medium is used to store a computer program, and the computer program is used to execute the above-mentioned method for controlling a virtual object.
- the computer-readable storage medium may include: read-only memory (ROM), random access memory (RAM), solid state drives (SSD), optical discs, and the like.
- the random access memory may include resistive random access memory (ReRAM) and dynamic random access memory (DRAM).
- a computer program product is also provided which, when run on a terminal device, causes the terminal device to execute the above-mentioned method for controlling virtual objects.
- the "plurality" mentioned herein refers to two or more.
- "and/or" describes the association relationship of the associated objects, indicating that three kinds of relationships can exist; for example, A and/or B can mean: A exists alone, A and B exist at the same time, or B exists alone.
- the character "/" generally indicates an "or" relationship between the associated objects before and after it.
- the numbering of the steps described herein only exemplarily shows one possible order of execution among the steps. In some other embodiments, the steps may be executed out of numerical order; for example, two differently numbered steps may be executed at the same time, or two differently numbered steps may be executed in the reverse of the order shown in the figure. This is not limited in the embodiments of the present application.
Claims (15)
- A method for controlling a virtual object, the method being executed by a terminal device, the method comprising: displaying a user interface, the user interface including a joystick control for controlling movement of a virtual object; receiving a quick click operation signal acting on a target area corresponding to the joystick control; and controlling, according to the quick click operation signal, the virtual object to run automatically in a virtual scene displayed in the user interface.
- The method according to claim 1, wherein controlling, according to the quick click operation signal, the virtual object to run automatically in the virtual scene displayed in the user interface comprises: obtaining an attribute value corresponding to the quick click operation signal; determining a running speed of the virtual object according to the attribute value; and controlling the virtual object to run automatically in the virtual scene at the running speed.
- The method according to claim 1, wherein controlling, according to the quick click operation signal, the virtual object to run automatically in the virtual scene displayed in the user interface comprises: detecting an operation position corresponding to the quick click operation signal; determining a running direction of the virtual object according to the operation position; and controlling the virtual object to run automatically in the virtual scene in the running direction.
- The method according to any one of claims 1 to 3, wherein controlling, according to the quick click operation signal, the virtual object to run automatically in the virtual scene displayed in the user interface comprises: when the quick click operation signal is received, if the virtual object is in a non-standing state, controlling the virtual object to run automatically in the virtual scene in the non-standing state.
- The method according to claim 4, further comprising, after controlling the virtual object to run automatically in the virtual scene in the non-standing state: receiving a posture switching instruction corresponding to the virtual object; controlling, according to the posture switching instruction, the virtual object to switch from the non-standing state to a standing state; and controlling the virtual object to run automatically in the virtual scene in the standing state.
- A method for controlling a virtual object, the method comprising: displaying a user interface, the user interface including a joystick control for controlling movement of a virtual object; when the virtual object is in a non-standing state, if a sliding operation signal whose starting position is located on the joystick control is received, controlling the virtual object to switch from the non-standing state to a standing state; and after the virtual object switches to the standing state, controlling the virtual object to run automatically in a virtual scene displayed in the user interface.
- The method according to claim 6, further comprising, before controlling the virtual object to switch from the non-standing state to the standing state: acquiring a touch position of the sliding operation signal; and when the touch position of the sliding operation signal is at a display position of a quick rise icon, performing the step of controlling the virtual object to switch from the non-standing state to the standing state.
- The method according to claim 7, further comprising, after acquiring the touch position of the sliding operation signal: obtaining a distance between the touch position of the sliding operation signal and a target position; and when it is detected that the distance is greater than a distance threshold, displaying the quick rise icon in the user interface.
- The method according to claim 8, further comprising, after obtaining the distance between the touch position of the sliding operation signal and the target position: determining a moving speed of the virtual object according to the distance, there being a negative correlation between the distance and the moving speed.
- The method according to any one of claims 6 to 9, further comprising: when the virtual object is in the non-standing state, displaying first prompt information in the user interface, the first prompt information being used to indicate that the virtual object is in the non-standing state; and when the virtual object is in the standing state, displaying second prompt information in the user interface, the second prompt information being used to indicate that the virtual object is in the standing state.
- An apparatus for controlling a virtual object, the apparatus comprising: an interface display module, configured to display a user interface, the user interface including a joystick control for controlling movement of a virtual object; a signal receiving module, configured to receive a quick click operation signal acting on a target area corresponding to the joystick control; and a running control module, configured to control, according to the quick click operation signal, the virtual object to run automatically in a virtual scene displayed in the user interface.
- An apparatus for controlling a virtual object, the apparatus comprising: an interface display module, configured to display a user interface, the user interface including a joystick control for controlling movement of a virtual object; a posture switching module, configured to, when the virtual object is in a non-standing state, if a sliding operation signal whose starting position is located on the joystick control is received, control the virtual object to switch from the non-standing state to a standing state; and a running control module, configured to control, after the virtual object switches to the standing state, the virtual object to run automatically in a virtual scene displayed in the user interface.
- A mobile terminal, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for controlling a virtual object according to any one of claims 1 to 5, or the method for controlling a virtual object according to any one of claims 6 to 10.
- A storage medium, the storage medium being used to store a computer program, the computer program being used to execute the method for controlling a virtual object according to any one of claims 1 to 5, or the method for controlling a virtual object according to any one of claims 6 to 10.
- A computer program product comprising instructions which, when run on a computer, cause the computer to execute the method for controlling a virtual object according to any one of claims 1 to 5, or the method for controlling a virtual object according to any one of claims 6 to 10.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020217031334A KR102625233B1 (ko) | 2019-08-30 | 2020-07-20 | 가상 객체를 제어하기 위한 방법, 및 관련 장치 |
JP2021558011A JP2022527502A (ja) | 2019-08-30 | 2020-07-20 | 仮想オブジェクトの制御方法及び装置、モバイル端末及びコンピュータプログラム |
SG11202110127UA SG11202110127UA (en) | 2019-08-30 | 2020-07-20 | Method for controlling virtual object, and related apparatus |
KR1020247001068A KR20240011871A (ko) | 2019-08-30 | 2020-07-20 | 가상 객체를 제어하기 위한 방법, 및 관련 장치 |
US17/408,362 US11833426B2 (en) | 2019-08-30 | 2021-08-20 | Virtual object control method and related apparatus |
JP2023171282A JP2023171885A (ja) | 2019-08-30 | 2023-10-02 | 仮想オブジェクトの制御方法、及び関連装置 |
US18/491,056 US20240042317A1 (en) | 2019-08-30 | 2023-10-20 | Virtual object control method and related apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910812631.6A CN110523085A (zh) | 2019-08-30 | 2019-08-30 | 虚拟对象的控制方法、装置、终端及存储介质 |
CN201910812631.6 | 2019-08-30 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/408,362 Continuation US11833426B2 (en) | 2019-08-30 | 2021-08-20 | Virtual object control method and related apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021036581A1 true WO2021036581A1 (zh) | 2021-03-04 |
Family
ID=68665389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/103033 WO2021036581A1 (zh) | 2019-08-30 | 2020-07-20 | 虚拟对象的控制方法和相关装置 |
Country Status (6)
Country | Link |
---|---|
US (2) | US11833426B2 (zh) |
JP (2) | JP2022527502A (zh) |
KR (2) | KR20240011871A (zh) |
CN (1) | CN110523085A (zh) |
SG (1) | SG11202110127UA (zh) |
WO (1) | WO2021036581A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112870701A (zh) * | 2021-03-16 | 2021-06-01 | 网易(杭州)网络有限公司 | 虚拟角色的控制方法和装置 |
CN113440850A (zh) * | 2021-05-26 | 2021-09-28 | 完美世界(北京)软件科技发展有限公司 | 虚拟对象的控制方法及装置、存储介质、电子装置 |
CN113546403A (zh) * | 2021-07-30 | 2021-10-26 | 网易(杭州)网络有限公司 | 角色控制方法、装置、终端和计算机可读存储介质 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110523085A (zh) * | 2019-08-30 | 2019-12-03 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、终端及存储介质 |
CN111324253B (zh) * | 2020-02-12 | 2021-08-03 | 腾讯科技(深圳)有限公司 | 虚拟物品交互方法、装置、计算机设备及存储介质 |
CN111773677B (zh) * | 2020-07-23 | 2024-02-23 | 网易(杭州)网络有限公司 | 游戏控制方法及装置、计算机存储介质、电子设备 |
CN111760280B (zh) * | 2020-07-31 | 2023-08-25 | 腾讯科技(深圳)有限公司 | 界面显示方法、装置、终端及存储介质 |
CN111870945A (zh) * | 2020-08-10 | 2020-11-03 | 网易(杭州)网络有限公司 | 控件选择方法、装置、主机及介质 |
CN112148188A (zh) * | 2020-09-23 | 2020-12-29 | 北京市商汤科技开发有限公司 | 增强现实场景下的交互方法、装置、电子设备及存储介质 |
KR20220066244A (ko) * | 2020-11-13 | 2022-05-24 | 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 | 가상 객체 제어 방법 및 장치, 저장 매체 및 전자 기기 |
CN112817449B (zh) * | 2021-01-28 | 2023-07-21 | 北京市商汤科技开发有限公司 | 增强现实场景的交互方法、装置、电子设备及存储介质 |
CN113157180B (zh) * | 2021-03-29 | 2024-03-12 | 维沃移动通信有限公司 | 应用的触控操作方法、装置和电子设备 |
CN112965655B (zh) * | 2021-04-23 | 2023-12-08 | 努比亚技术有限公司 | 一种虚拟点触控制方法、设备及计算机可读存储介质 |
CN113318430A (zh) * | 2021-05-28 | 2021-08-31 | 网易(杭州)网络有限公司 | 虚拟角色的姿态调整方法、装置、处理器及电子装置 |
CN113304479B (zh) * | 2021-06-25 | 2023-06-20 | 腾讯科技(深圳)有限公司 | 指示信息的显示方法、装置、游戏终端设备及存储介质 |
CN113648656A (zh) * | 2021-08-18 | 2021-11-16 | 网易(杭州)网络有限公司 | 游戏中虚拟角色的控制方法、装置、电子设备和存储介质 |
CN114495476A (zh) * | 2022-01-20 | 2022-05-13 | 北京有竹居网络技术有限公司 | 控制装置及其控制方法、电子设备和存储介质 |
WO2024002255A1 (zh) * | 2022-06-29 | 2024-01-04 | 华人运通(上海)云计算科技有限公司 | 对象的控制方法、装置、设备、存储介质及车辆 |
CN117618903A (zh) * | 2022-08-18 | 2024-03-01 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、终端、存储介质及程序产品 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103252087A (zh) * | 2012-02-20 | 2013-08-21 | 富立业资讯有限公司 | 具有触控面板媒体的游戏控制方法及该游戏媒体 |
CN105446525A (zh) * | 2015-11-10 | 2016-03-30 | 网易(杭州)网络有限公司 | 一种游戏角色行为的控制方法 |
CN108469943A (zh) * | 2018-03-09 | 2018-08-31 | 网易(杭州)网络有限公司 | 奔跑操作的触发方法和装置 |
US20190046878A1 (en) * | 2010-05-20 | 2019-02-14 | John W. Howard | Touch screen with virtual joystick and methods for use therewith |
CN110523085A (zh) * | 2019-08-30 | 2019-12-03 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、终端及存储介质 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090143141A1 (en) * | 2002-08-06 | 2009-06-04 | Igt | Intelligent Multiplayer Gaming System With Multi-Touch Display |
US10180714B1 (en) * | 2008-04-24 | 2019-01-15 | Pixar | Two-handed multi-stroke marking menus for multi-touch devices |
CN111522493A (zh) * | 2008-08-22 | 2020-08-11 | 谷歌有限责任公司 | 移动设备上的三维环境中的导航 |
CN103782263B (zh) * | 2011-09-13 | 2017-02-15 | 索尼电脑娱乐公司 | 信息处理装置、信息处理方法、内容文件的数据结构、gui布局仿真器及gui布局设置辅助方法 |
US10751608B2 (en) * | 2012-08-31 | 2020-08-25 | Blue Goji Llc. | Full body movement control of dual joystick operated devices |
US9348488B1 (en) * | 2012-11-29 | 2016-05-24 | II Andrew Renema | Methods for blatant auxiliary activation inputs, initial and second individual real-time directions, and personally moving, interactive experiences and presentations |
US9227141B2 (en) * | 2013-12-31 | 2016-01-05 | Microsoft Technology Licensing, Llc | Touch screen game controller |
US9561432B2 (en) * | 2014-03-12 | 2017-02-07 | Wargaming.Net Limited | Touch control with dynamic zones |
CN105194873B (zh) * | 2015-10-10 | 2019-01-04 | 腾讯科技(成都)有限公司 | 一种信息处理方法、终端及计算机存储介质 |
CN105582670B (zh) * | 2015-12-17 | 2019-04-30 | 网易(杭州)网络有限公司 | 瞄准射击控制方法及装置 |
US10238962B2 (en) * | 2015-12-27 | 2019-03-26 | Spin Master Ltd. | System and method for recharging battery in augmented reality game system |
JP6588983B2 (ja) * | 2016-08-31 | 2019-10-09 | 任天堂株式会社 | ゲームプログラム、ゲーム処理方法、ゲームシステム、およびゲーム装置 |
KR20180068411A (ko) * | 2016-12-14 | 2018-06-22 | 삼성전자주식회사 | 무인 비행 전자 장치의 운행 제어 방법 및 이를 지원하는 전자 장치 |
CN109621411B (zh) * | 2017-09-30 | 2022-05-06 | 网易(杭州)网络有限公司 | 信息处理方法、装置、电子设备及存储介质 |
CN107930105A (zh) * | 2017-10-23 | 2018-04-20 | 网易(杭州)网络有限公司 | 信息处理方法及装置、存储介质、电子设备 |
CN107773987B (zh) * | 2017-10-24 | 2020-05-22 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
CN108379844B (zh) | 2018-03-30 | 2020-10-23 | 腾讯科技(深圳)有限公司 | 控制虚拟对象移动的方法、装置、电子装置及存储介质 |
CN108509139B (zh) * | 2018-03-30 | 2019-09-10 | 腾讯科技(深圳)有限公司 | 虚拟对象的移动控制方法、装置、电子装置及存储介质 |
US11420131B2 (en) * | 2020-05-04 | 2022-08-23 | Sony Interactive Entertainment Inc. | Systems and methods for facilitating secret communication between players during game play |
SG10202011943YA (en) * | 2020-12-01 | 2021-05-28 | Garena Online Private Ltd | Aim assist method for electronic games |
- 2019
- 2019-08-30 CN CN201910812631.6A patent/CN110523085A/zh active Pending
- 2020
- 2020-07-20 WO PCT/CN2020/103033 patent/WO2021036581A1/zh active Application Filing
- 2020-07-20 KR KR1020247001068A patent/KR20240011871A/ko not_active Application Discontinuation
- 2020-07-20 JP JP2021558011A patent/JP2022527502A/ja active Pending
- 2020-07-20 SG SG11202110127UA patent/SG11202110127UA/en unknown
- 2020-07-20 KR KR1020217031334A patent/KR102625233B1/ko active IP Right Grant
- 2021
- 2021-08-20 US US17/408,362 patent/US11833426B2/en active Active
- 2023
- 2023-10-02 JP JP2023171282A patent/JP2023171885A/ja active Pending
- 2023-10-20 US US18/491,056 patent/US20240042317A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190046878A1 (en) * | 2010-05-20 | 2019-02-14 | John W. Howard | Touch screen with virtual joystick and methods for use therewith |
CN103252087A (zh) * | 2012-02-20 | 2013-08-21 | 富立业资讯有限公司 | 具有触控面板媒体的游戏控制方法及该游戏媒体 |
CN105446525A (zh) * | 2015-11-10 | 2016-03-30 | 网易(杭州)网络有限公司 | 一种游戏角色行为的控制方法 |
CN108469943A (zh) * | 2018-03-09 | 2018-08-31 | 网易(杭州)网络有限公司 | 奔跑操作的触发方法和装置 |
CN110523085A (zh) * | 2019-08-30 | 2019-12-03 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、终端及存储介质 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112870701A (zh) * | 2021-03-16 | 2021-06-01 | 网易(杭州)网络有限公司 | Virtual character control method and apparatus |
CN112870701B (zh) * | 2021-03-16 | 2024-02-23 | 网易(杭州)网络有限公司 | Virtual character control method and apparatus |
CN113440850A (zh) * | 2021-05-26 | 2021-09-28 | 完美世界(北京)软件科技发展有限公司 | Virtual object control method and apparatus, storage medium, and electronic device |
CN113546403A (zh) * | 2021-07-30 | 2021-10-26 | 网易(杭州)网络有限公司 | Character control method, apparatus, terminal, and computer-readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
KR20210132175A (ko) | 2021-11-03 |
KR102625233B1 (ko) | 2024-01-16 |
SG11202110127UA (en) | 2021-10-28 |
CN110523085A (zh) | 2019-12-03 |
JP2023171885A (ja) | 2023-12-05 |
JP2022527502A (ja) | 2022-06-02 |
US20210379491A1 (en) | 2021-12-09 |
US20240042317A1 (en) | 2024-02-08 |
US11833426B2 (en) | 2023-12-05 |
KR20240011871A (ko) | 2024-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021036581A1 (zh) | Virtual object control method and related apparatus | |
WO2021022967A1 (zh) | Virtual object control method, apparatus, terminal, and storage medium | |
KR102602113B1 (ko) | Information interaction method and related apparatus | |
WO2022121528A1 (zh) | Interactive information processing method, apparatus, terminal, storage medium, and program product | |
CN110585731B (zh) | Method, apparatus, terminal, and medium for throwing a virtual item in a virtual environment | |
WO2021227684A1 (en) | Method for selecting virtual objects, apparatus, terminal and storage medium | |
US20230289054A1 (en) | Control mode selection to indicate whether simultaneous perspective change and function selection is enabled | |
WO2021218460A1 (zh) | Virtual object control method, apparatus, terminal, and storage medium | |
JP2022532315A (ja) | Virtual object selection method, apparatus, terminal, and program | |
EP4268913A1 (en) | Position adjustment method and apparatus for operation controls, and terminal, and storage medium | |
CN113546419A (zh) | Game map display method, apparatus, terminal, and storage medium | |
CN111475089B (zh) | Task display method, apparatus, terminal, and storage medium | |
WO2023236602A1 (zh) | Display control method and apparatus for a virtual object, storage medium, and electronic device | |
WO2023011035A1 (zh) | Virtual prop display method, apparatus, terminal, and storage medium | |
CN112221123B (zh) | Virtual object switching method, apparatus, computer device, and storage medium | |
CN111643895A (zh) | Operation response method, apparatus, terminal, and storage medium | |
WO2024037154A1 (zh) | Virtual object control method, apparatus, terminal, storage medium, and program product | |
CN113546403A (zh) | Character control method, apparatus, terminal, and computer-readable storage medium | |
CN117150166A (zh) | Page interaction method, apparatus, electronic device, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20859298; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2021558011; Country of ref document: JP; Kind code of ref document: A. Ref document number: 20217031334; Country of ref document: KR; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 20859298; Country of ref document: EP; Kind code of ref document: A1 |