WO2021043000A1 - Information interaction method and related apparatus - Google Patents
Information interaction method and related apparatus
- Publication number
- WO2021043000A1 (PCT/CN2020/110199)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- skill
- target
- effect model
- game screen
- joystick
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/2145—Input arrangements for video game devices for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/42—Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/426—Processing input control signals involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/45—Controlling the progress of the video game
- A63F13/533—Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
- A63F13/5372—Controlling the output signals using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
- A63F13/58—Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
- A63F13/798—Game security or game management aspects involving player-related data for assessing skills or for ranking players, e.g. for generating a hall of fame
- A63F13/822—Strategy games; Role-playing games
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/0338—Pointing devices displaced or positioned by the user, with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
- G06F3/038—Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- A63F2300/10—Features of games using an electronically generated display characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1068—Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
- A63F2300/1075—Input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen
- A63F2300/308—Details of the user interface
Definitions
- This application relates to the computer field, specifically to information interaction.
- Game skills can achieve specific effects for specific people, objects, areas, etc. in the game at a specific time under certain game conditions.
- the user can control the virtual characters in the game through the client to cast various game skills in the virtual scene.
- the embodiments of the present application provide an information interaction method and related devices, which can improve the accuracy of information interaction.
- an embodiment of the present application provides an information interaction method; the method is executed by a terminal and includes:
- a skill effect model of the target skill is generated at the at least one skill generation position.
- an embodiment of the present application also provides an information interaction device, including:
- the screen unit is used to display a game screen, the game screen including a candidate skill area;
- the skill unit is used to determine the target skill based on the skill selection operation for the candidate skill area
- the joystick unit is used to display a virtual joystick object on the game screen
- a position unit configured to calculate at least one skill generation position of the target skill based on the movement operation when a movement operation for the virtual joystick object is detected;
- the generating unit is configured to generate a skill effect model of the target skill at the at least one skill generating position when a casting operation on the virtual joystick object is detected.
- an embodiment of the present application also provides a storage medium, the storage medium is used to store a computer program, and the computer program is used to execute the information interaction method in the above aspect.
- an embodiment of the present application also provides a terminal, including a memory storing a plurality of instructions; the processor loads instructions from the memory to execute the information interaction method in the above aspect.
- the embodiments of the present application provide a computer program product including instructions, which when run on a computer, cause the computer to execute the information interaction method in the above aspects.
- the embodiment of the application can display the game screen, where the game screen includes the candidate skill area; determine the target skill based on the skill selection operation for the candidate skill area; display the virtual joystick object on the game screen; when a movement operation on the virtual joystick object is detected, calculate at least one skill generation position of the target skill based on the movement operation; and when a casting operation on the virtual joystick object is detected, generate a skill effect model of the target skill at the at least one skill generation position.
- the user can control and adjust the generation positions of multiple skill effect models of a game skill through the virtual joystick object in the game screen; thus, the solution can improve the accuracy of information interaction.
- Fig. 1a is a schematic diagram of a scenario of an information interaction method provided by an embodiment of the present application.
- Fig. 1b is a schematic flowchart of an information interaction method provided by an embodiment of the present application.
- Fig. 1c is a schematic structural diagram of a virtual joystick object provided by an embodiment of the present application.
- Fig. 1d is a schematic diagram of the effect of overlay display provided by an embodiment of the present application.
- Fig. 1e is a schematic diagram of the relationship between the skill generation location and the skill generation area provided by an embodiment of the present application.
- Fig. 1f is a schematic diagram of the mapping relationship between the joystick position and the skill generation position provided by an embodiment of the present application.
- Fig. 1g is a schematic diagram of changes in the skill effect distribution trajectory provided by an embodiment of the present application.
- Fig. 2a is a schematic diagram of the orientation of the first skill effect model provided by an embodiment of the present application.
- Fig. 2b is a schematic diagram of the orientation of the first skill effect model provided by an embodiment of the present application.
- Fig. 3 is a schematic structural diagram of an information interaction device provided by an embodiment of the present application.
- Fig. 4 is a schematic structural diagram of a network device provided by an embodiment of the present application.
- the embodiments of the present application provide an information interaction method and related devices.
- the information interaction device may be specifically integrated in a terminal, and the terminal may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, or a personal computer (PC) and other devices.
- the information interaction device can be specifically integrated in a smart phone.
- the smart phone can be installed with game software.
- the smart phone can run the game software.
- the smart phone can display a game screen.
- the game screen may include a candidate skill area; based on the user's skill selection operation for the candidate skill area, the target skill can be determined; the virtual joystick object is displayed on the game screen; when the user's movement operation on the virtual joystick object is detected, at least one skill generation position of the target skill is calculated based on the movement operation; when the user's casting operation on the virtual joystick object is detected, the skill effect model of the target skill is generated at the at least one skill generation position.
- an information interaction method is provided. As shown in Fig. 1b, the specific process of the information interaction method may be as follows:
- the game screen may include the game scene screen of the game software and the user interface (UI).
- the game scene screen may display the scenes in the game, virtual characters, etc.
- the user interface may include game buttons, text, windows, and other game design elements that have direct or indirect contact with game users.
- the user can interact with the game content through the game screen.
- the game screen may include a candidate skill area, and the candidate skill area may contain skill information of at least one candidate skill, where the skill information may be information such as a skill name, a skill control, and the like.
- game skills can be a series of virtual events in an electronic game. These virtual events can trigger specific effects for specific people, objects, areas, etc. in the game at a specific time, provided that certain conditions are met.
- game skills include the trigger mechanism of the game skill (the way in which the life cycle of the skill is started), skill events (atomic information describing the occurrence of the skill), and skill effects (the changes caused to the current game environment).
- the specific skills and their effects can be formulated by those skilled in the art according to their needs.
- a game skill can include its effect model and its numerical model.
- the realization of the game skill is to generate the effect model of the game skill in the game scene and apply its numerical model to the corresponding target object to achieve the skill effect.
- the effect model of the game skill can have multiple types, for example, a scene model, an architectural model, a character model, an animation model, a particle effect model, and so on.
- for example, if the game effect of game skill A is to summon 5 virtual followers at the same time around the skill cast object of game skill A, then the implementation method is to generate 5 virtual follower models around the skill cast object model of game skill A.
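As an illustrative sketch (the function name and parameters below are assumptions, not taken from the patent), placing the summoned follower models evenly around the cast object's position could look like:

```python
import math

def follower_positions(center, count=5, radius=2.0):
    """Positions for `count` summoned follower models, evenly spaced on a
    circle of `radius` around the skill cast object's 2D position."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * i / count),
             cy + radius * math.sin(2 * math.pi * i / count))
            for i in range(count)]

# 5 follower models around a cast object at the origin
positions = follower_positions((0.0, 0.0))
print(len(positions))  # 5
```

The engine would then instantiate one follower model at each returned position.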
- the user has various operation modes for the skill selection operation of the candidate skill area.
- the user can select the skill in the candidate skill area by dragging, clicking, swiping, pressing, and touching.
- the skill selection operation can also be generated by simulating a user operation, for example, simulating an operation for a designated position in the candidate skill area to generate a skill selection operation.
- other operations mentioned in this application, for example, move operations and release operations, can also be generated by simulation.
- the candidate skill area may include skill information of at least one candidate skill, and the skill information may be a skill control, and the control may be expressed in the form of icons, symbols, buttons, and the like.
- the target skill can be determined among the candidate skills.
- the candidate skill area may include at least one skill control of the candidate skill, so step 102 may specifically include the following steps:
- the target skill is determined in at least one candidate skill.
- for example, the candidate skill area may include skill icons of multiple candidate skills; the user can select one candidate skill from the skill icons of the multiple candidate skills and click it, that is, determine that candidate skill as the target skill.
- for another example, the candidate skill area may include a skill icon of one candidate skill; the user can swipe down on the skill icon in the candidate skill area to switch the candidate skill area to display the skill icon of the next candidate skill, and determine the next candidate skill as the target skill; likewise, the user can swipe up on the skill icon so that the candidate skill area switches to display the skill icon of the previous candidate skill, and the previous candidate skill is determined as the target skill.
- the virtual joystick object is a virtual component that can be used for human-computer interaction.
- the user inputs information to the terminal by controlling the movement of the virtual joystick object.
- the virtual joystick object may include a preset joystick control, a preset joystick movement range, and a range center axis of the preset joystick movement range.
- the user can control the preset joystick control to move arbitrarily within the preset joystick movement range.
- the relative distance and direction between the preset joystick control and the range center axis of the preset joystick movement range can be used as information input by the user to interact with the game.
- step 103 may include the following steps:
- overlay display means that the virtual joystick object and the skill control are layered, with the virtual joystick object placed above the skill control, and then both are displayed.
- the coverage can be full coverage, partial coverage, etc.
- for example, when the target skill is skill A and the skill control of skill A is expressed as an icon, the virtual joystick object is overlaid on the skill A icon of the target skill; that is, the virtual joystick object and the skill A icon are stacked with the virtual joystick object above the skill A icon, and then the virtual joystick object and the skill A icon are displayed.
- the game screen may also include a cancel casting control, so after step 103, the following steps may be specifically included:
- the virtual joystick object stops being displayed on the game screen.
- the skill preview effect model and the skill generation area of the target skill can also stop being displayed in the game screen; for the specific introduction of the skill preview effect model and the skill generation area, please refer to the description in step 104, which will not be repeated here.
- the user can move the virtual joystick control of the virtual joystick object by dragging, swiping, clicking, etc.
- the skill generation position refers to the position where the skill effect model is actually generated in the game scene when the game skill is triggered.
- the skill generation position can be one or multiple.
- for example, the skill effect model of skill X generates two thunderclouds, one at the easternmost end and one at the westernmost end of the game scene; the skill generation positions of skill X are the easternmost end (100, 0) and the westernmost end (-100, 0).
- the virtual joystick object may include a virtual joystick control and a preset movement range of the joystick.
- Step 104 may include the following steps:
- the skill cast object refers to the virtual object for which the skill effect model of the game skill takes effect, where the virtual object may be a character virtual object, an item virtual object, a scene virtual object, and so on.
- the skill cast object of game skill X is the virtual character that casts the skill in the game scene;
- the skill cast object of game skill Y is other virtual characters selected by the user in the game scene;
- the skill cast object of game skill Z is a certain virtual item, a certain virtual place, etc. selected by the user in the game scene.
- the current position of the skill cast object in the game screen refers to the position of the skill cast object in the game.
- the skill cast object is a tree in the game scene, and the current position of the tree in a certain scene of the game is (299, 5, 41).
- the preset skill casting range refers to the maximum casting range of the game skill, which can be range distance, range radius, range area, range volume, etc.
- the preset skill casting range can be set by the game developer to improve gameplay, maintain game balance, and so on.
- the skill cast object of the game skill "Summon Thundercloud” is a certain place in the game scene specified by the user, and its preset skill cast range is a circle with a radius of 8 meters.
- the skill generation area is displayed on the game screen.
- the skill generation area refers to the area where the skill effect model of the game skill can be generated in the game scene, that is, the area where the skill generation position is located.
- the skill generation area of the target skill in the game screen can be determined based on the current position of the skill cast object and the preset skill casting range.
- for example, the skill generation area can be a circle, fan, sphere, etc. with the current position as the center and the preset skill cast range as the radius; for another example, the skill generation area can be a square, diamond, pyramid, etc. with the current position as the center and the preset skill cast range as the side length.
- the skill generation area can be displayed in the game scene.
- for example, the skill generation area is a circular area with the current position of the skill cast object as the center and the preset skill casting range d as the radius, and the user can control the specific skill generation position (x, y) of the target skill within the skill generation area through the virtual joystick object.
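As an illustrative sketch (the function name and coordinate conventions are assumptions, not taken from the patent), constraining a candidate position to this circular skill generation area could look like:

```python
import math

def clamp_to_generation_area(pos, center, cast_range):
    """Clamp a candidate skill generation position to the circular skill
    generation area (center = cast object's current position, radius = the
    preset skill casting range d)."""
    dx, dy = pos[0] - center[0], pos[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= cast_range:
        return pos  # already inside the skill generation area
    # Project the point back onto the boundary of the circular area
    scale = cast_range / dist
    return (center[0] + dx * scale, center[1] + dy * scale)

# A point 10 units east of the center with d = 8 is clamped to the edge.
print(clamp_to_generation_area((10.0, 0.0), (0.0, 0.0), 8.0))  # (8.0, 0.0)
```

Positions inside the area pass through unchanged; positions outside are pulled onto the boundary, so the skill can never be generated outside the cast range.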
- the skill generation area may be displayed in the game scene in a highlighted form.
- for example, the edge of the skill generation area may be displayed in blue; the skill generation area may be filled in gray and the filled skill generation area displayed; and so on.
- the preset joystick control can move within the movement range of the joystick.
- the joystick coordinate system is established with the range center axis as the coordinate origin, and the position (x, y) of the preset joystick control in the joystick coordinate system is the joystick position.
- the movement range of the joystick is the maximum range within which the joystick control can move.
- the preset skill cast range can be set by the game developer to improve gameplay, maintain game balance, and so on.
- the range can be range distance, range radius, range area, range volume, etc.
- the joystick movement range can be set by the game developer or adjusted by the user.
- the number of skill generation positions can be defined in the game skills.
- the skill generation position in the circular area with the preset skill cast range as the radius (i.e., the skill generation area) is obtained by mapping the joystick position in the circular area with the preset joystick movement range as the radius (i.e., the preset joystick movement area).
- the step of "calculating at least one skill generation position of the target skill based on the preset joystick movement range, joystick position, preset skill cast range, and current position" may specifically include the following steps:
- the relative position is the relative position between the skill generation position and the skill cast object
- the preset joystick movement range is radius r
- the user moves the preset joystick control of the virtual joystick object to the joystick position (x, y)
- the preset skill cast range of the target skill is radius R
- the current position of the skill cast object is (a, b)
- the calculation method of the skill generation position (X, Y) is as follows:
- the interaction range ratio is R/r, that is, the mapping ratio between the preset joystick movement range and the preset skill casting range;
- the relative position includes an x-axis component and a y-axis component: the x-axis component is x*R/r and the y-axis component is y*R/r.
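As a minimal sketch of this mapping (the function name and example values are illustrative, not from the patent), the joystick position (x, y) can be scaled by the interaction range ratio R/r and translated by the caster's current position (a, b):

```python
def skill_generation_position(x, y, r, R, a, b):
    """Map a joystick offset (x, y) within movement radius r to a
    skill generation position within the cast range of radius R,
    centered on the skill cast object at (a, b)."""
    ratio = R / r                         # interaction range ratio
    rel_x, rel_y = x * ratio, y * ratio   # relative position
    return a + rel_x, b + rel_y           # position in the game screen

# Joystick dragged to (1, 2) in a 3.6 m joystick range, 7.2 m cast
# range, caster at (10, 10): the ratio is 2.
print(skill_generation_position(1, 2, 3.6, 7.2, 10, 10))  # (12.0, 14.0)
```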
- the effect direction of the skill effect model can be controlled according to the relative position between the skill generation position and the skill cast object.
- the skill effect of the skill "Summon Followers" is to summon virtual followers; the relative directions of the virtual followers are determined according to the relative position between the summon location controlled by the user (that is, the skill generation location) and the skill cast object, so that the fronts of the followers face these relative directions.
- the relative direction of the skill effect model relative to the skill cast object is calculated according to the relative position.
- if the relative position is Cartesian coordinates (A, B), the relative direction is arctan(B/A); and if the relative position is polar coordinates (ρ, θ), the relative direction is θ.
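A hedged sketch of this direction calculation (the function name is ours; `math.atan2` extends the arctan(B/A) form to all quadrants, which the plain quotient cannot handle when A is zero or negative):

```python
import math

def relative_direction(rel_x, rel_y):
    """Direction of the relative position in radians, measured
    from the positive x-axis; atan2 covers all four quadrants."""
    return math.atan2(rel_y, rel_x)

print(relative_direction(1.0, 1.0))  # pi/4, about 0.7854
```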
- the target skill can generate skill effect models at multiple skill generation locations.
- the skill "Summon Followers” can generate game models of 3 virtual followers at the skill generation location specified by the user, in order to further enhance the information interaction. Accuracy and reduced operational complexity.
- This embodiment provides the concept of the skill effect model distribution trajectory.
- the skill generation positions of the target skill all lie on the skill effect model distribution trajectory. Step C may include steps a, b, c, d, and e, as follows:
- the number of skill generation positions can be pre-determined in the game skill. For example, the number of skill generation positions for the skill "Summon Followers" is 3. After the skill is triggered, 3 virtual followers can be generated at the skill generation positions specified by the user.
- the distribution trajectory of the skill effect model of the target skill can be calculated.
- step b may specifically include the following steps:
- the relative distance is weighted and summed according to the preset coefficient, and the distribution radius of the skill effect model of the target skill is obtained.
- the preset coefficient K can be preset by the game developer, and the calculation formula of the relative distance d is as follows:
- where the relative position is (x, y), the relative distance is d = √(x²+y²).
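Assuming the relative distance d is the Euclidean norm of the relative position and the "weighted sum" reduces to multiplying by the single preset coefficient K (our reading of the text), the distribution-radius step can be sketched as:

```python
import math

def distribution_radius(rel_x, rel_y, k):
    """Skill effect model distribution radius: the relative distance
    d, weighted by the preset coefficient K set by the developer.
    The Euclidean form of d is an assumption, not stated verbatim."""
    d = math.hypot(rel_x, rel_y)  # relative distance
    return k * d

print(distribution_radius(3.0, 4.0, 0.5))  # d = 5.0, radius = 2.5
```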
- the large circle is the preset skill cast range
- the small circle is the skill effect model distribution trajectory with the skill effect model distribution radius as the radius
- the triangle is the relative position of the skill cast object; the farther the relative position (i.e., the larger the relative distance), the larger the small circle.
- step c may specifically include the following steps:
- when the skill effect model distribution radius is not less than the minimum skill effect model distribution radius, the relative position is taken as the center of the circle and the skill effect model distribution trajectory of the target skill is determined based on the skill effect model distribution radius;
- when the skill effect model distribution radius is less than the minimum skill effect model distribution radius, the skill effect model distribution trajectory of the target skill is determined by taking the relative position as the center of the circle and based on the minimum skill effect model distribution radius.
- the preset distribution volume of the skill effect model is the game model volume of the skill effect model.
- a game model of a virtual tree can be generated at each of the 4 skill generation positions.
- when the skill effect model distribution radius is not less than 12 meters, the relative position is taken as the center of the circle, and the skill effect model distribution trajectory of the target skill is determined based on the skill effect model distribution radius;
- when the skill effect model distribution radius is less than 12 meters, the relative position is taken as the center of the circle and the skill effect model distribution trajectory of the target skill is determined based on 12 meters.
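The minimum-radius rule above (preset distribution volume multiplied by the number of skill generation positions, e.g. 3 meters × 4 trees = 12 meters) amounts to a simple clamp; the function name is illustrative:

```python
def effective_distribution_radius(radius, model_volume, count):
    """Clamp the distribution radius to the minimum radius obtained
    by multiplying the preset model distribution volume by the
    number of skill generation positions, so that the generated
    models do not overlap."""
    min_r = model_volume * count
    return max(radius, min_r)

print(effective_distribution_radius(8, 3, 4))   # clamped up to 12
print(effective_distribution_radius(15, 3, 4))  # already large enough: 15
```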
- the skill effect model distribution trajectory is equally divided according to the number of skill generation positions, and each division point is used as a distribution point of the skill effect model.
- the effect direction of the skill effect model can be controlled according to the vertical direction of the distribution point of the skill effect model.
- the skill effect of the skill "Summon Followers" is to summon 3 virtual followers.
- the vertical direction of the distribution points is taken as the relative direction of these virtual followers, so that the fronts of these followers face these relative directions.
- after step d, the following steps may also be included:
- the vertical direction of the distribution point of the skill effect model is determined.
- if the skill effect model distribution point is (m, n), its position in the game screen, that is, the skill generation position, is (m+a, n+b).
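Combining the equal division of the distribution trajectory with the (m+a, n+b) translation, a sketch under the assumption that the trajectory is divided over the full circle (the semicircle variant appears later in the text; names are ours):

```python
import math

def distribution_points(rel_x, rel_y, radius, count, a, b):
    """Divide the distribution trajectory (a circle of the given
    radius centered at the relative position) into `count` equal
    arcs and translate each division point (m, n) into game-screen
    coordinates by the caster's current position (a, b)."""
    points = []
    for i in range(count):
        angle = 2 * math.pi * i / count
        m = rel_x + radius * math.cos(angle)  # distribution point (m, n)
        n = rel_y + radius * math.sin(angle)
        points.append((m + a, n + b))         # skill generation position
    return points

# 4 points on a unit circle around relative position (0, 0),
# caster at (10, 10):
for p in distribution_points(0, 0, 1, 4, 10, 10):
    print(round(p[0], 3), round(p[1], 3))
```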
- the skill effect of the skill "Summon Followers" is to summon 3 virtual followers.
- a virtual follower model is generated at the 3 skill generation positions in the game scene.
- step 105 may include the following steps:
- the direction of the skill effect model of the target skill is modified based on the relative direction, and the target skill with the modified direction is obtained;
- the skill effect model of the target skill with the modified direction is generated at the skill generating position.
- a virtual follower model is generated.
- the model direction of the virtual follower model is α, and the relative direction obtained in step B of step 104 is β; then the model direction of the virtual follower model is modified to α+β.
- step 105 may include the following steps:
- the direction of the skill effect model of the target skill is modified based on the vertical direction, and the target skill with the modified direction is obtained;
- the skill effect model of the target skill with the modified direction is generated at the skill generating position.
- step 105 may specifically include the following steps:
- the skill effect model of the target skill is generated at the skill generation position.
- in order to enable the user to intuitively observe the cast position of the target skill while controlling the virtual joystick object before casting the skill, so that the user can adjust the skill generation position while observing, the following steps may also be included before step 105:
- the skill preview effect model is an effect model for users to preview.
- when the skill preview effect model is generated in the game scene, the game skill usually has not yet taken effect; the game skill officially takes effect when the skill effect model is generated in the game scene.
- the user can grasp the accuracy of the game cast.
- the generation of the skill preview effect model of the target skill can be stopped at at least one skill generation position, and the skill effect model of the target skill can be generated.
- the skill effect of the game skill "Summon Followers" is to summon 3 virtual followers around the target of the skill cast, and the virtual followers can carry out long-range attacks on nearby enemies. When the user adjusts the skill generation position through the virtual joystick object, a skill preview effect model of the game skill, such as a virtual shadow model of a virtual follower, is generated at the skill generation position. When the user performs the casting operation, the virtual shadow model of the virtual follower is removed from the skill generation position, and the skill effect model of the game skill, such as the game model of the virtual follower, is generated; the virtual follower can then perform long-range attacks on nearby enemies.
- the game screen can be displayed by the method provided in the embodiments of this application, where the game screen includes candidate skill areas; the target skill is determined based on the user's skill selection operation for the candidate skill areas; the virtual joystick object is displayed on the game screen; when the user's movement operation on the virtual joystick object is detected, at least one skill generation position of the target skill is calculated based on the movement operation; and when the user's release operation on the virtual joystick object is detected, the skill effect model of the target skill is generated at the at least one skill generation position.
- this solution can control the generation positions of multiple skill effect models of the game skill through the virtual joystick object, so that the game skill is cast more accurately, thereby improving the accuracy of information interaction.
- the application of the information interaction method in a mobile phone game with a smart phone as a terminal is taken as an example, and the method in the embodiment of the present application will be described in detail.
- game skills have multiple types, such as summoning types and spell types.
- a summoning-type game skill generates a model of one or more summoned units in the game scene
- a spell-type game skill generates a spell special effect model in the game scene.
- the flow of the information interaction method is as follows:
- the game screen is displayed, and the game screen includes candidate skill areas.
- the game screen may include candidate skill areas.
- the game screen may also include the character information of the player's own virtual character, such as a nickname, the blood volume of the own virtual character, the gain effect of the own virtual character, and so on.
- the game screen may also include a battle time control, which can be used to display the duration of the player battle.
- the game screen may also include a second virtual joystick object used to control the movement of its own virtual character.
- a second virtual joystick object may be included in the lower left corner of the game screen.
- the candidate skill area may include a plurality of candidate skill controls, and the candidate skill controls may be candidate skill icons.
- the candidate skill area may include 3 candidate skill icons, which are the skill icon for the skill "Light Ball", the skill icon for the skill "Summoning: Warrior", and the skill icon for the skill "Summoning: Archer".
- the game screen includes a second virtual joystick object, and the second virtual joystick object can be used to control the movement of its own virtual character.
- the game screen includes a first virtual joystick object.
- the virtual joystick object can be used to control the casting of game skills.
- when the skill type of the target skill is a spell type, a conventional skill casting method can be used for information interaction; when the skill type of the target skill is a summoning type, this information interaction method is used for information interaction.
- the target skill is the skill "Summoning: Warrior"
- the skill effect of the skill "Summoning: Warrior" is: summon a fighter unit within 7.2 meters of the virtual character, and the fighter unit performs melee attacks on nearby hostile virtual characters.
- the preset skill cast range of the skill "Summoning: Warrior" is 7.2 meters.
- the preset joystick movement range is 3.6 meters, and the player performs a drag operation on the virtual joystick object.
- the method of calculating the skill generation position (X, Y) of the target skill based on the drag operation is as follows:
- the target skill is the skill "Summoning: Archer"
- the skill effect of the skill "Summoning: Archer" is: summon 3 archer follower units within 10 meters of the virtual character, and the archer follower units perform long-range attacks on nearby hostile virtual characters.
- the preset skill cast range of the skill "Summoning: Archer" is 10 meters, and the number of skill generation positions for the skill "Summoning: Archer" is 3.
- the preset joystick movement range is 4 meters.
- the method of calculating the skill generation position (X, Y) of the target skill based on the drag operation is as follows:
- the model volume of the archer follower unit of the skill "Summoning: Archer" is 1 meter * 1 meter.
- to prevent the models of the 3 archer follower units from overlapping, the distribution radius r of the skill effect model of the target skill is not less than the minimum skill effect model distribution radius min_r; that is, the radius of the skill effect model distribution trajectory is not less than min_r.
- the five-pointed star is its own virtual character
- the large circle is the preset skill cast range
- the small circle is the distribution trajectory of the skill effect model
- the arrow points to the relative position (1.25, 3.75)
- the distribution trajectory of the skill effect model is a circle centered at (1.25, 3.75) with a radius r of 93.75 meters.
- three archer follower units can be evenly distributed on the semicircle at the end farther from the virtual character. That is, the semicircle of the skill effect model distribution trajectory at the end farther from the current position (1, 2) is divided into (number of skill generation positions + 1) equal parts (i.e., 4 equal parts), and the distribution points of the skill effect model are the small black dots on the skill effect model distribution trajectory.
- the distribution points of the skill effect model are (1, 1), (1.5, 0), (1, -1), and the current position of the skill cast object in the game screen is (10, 10); then, by the (m+a, n+b) rule, the skill generation positions are (11, 11), (11.5, 10), (11, 9).
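The far-semicircle placement described above (dividing the semicircle into count + 1 equal arcs and keeping the interior division points) can be sketched as follows; the `away_angle` parameterization of "the end farther from the current position" and the function name are our assumptions:

```python
import math

def semicircle_points(center_x, center_y, radius, count, away_angle):
    """Place `count` points on the semicircle of the distribution
    trajectory facing `away_angle` (the direction from the caster
    toward the trajectory center, in radians), dividing that
    semicircle into count + 1 equal arcs and keeping the interior
    division points."""
    start = away_angle - math.pi / 2   # one end of the far semicircle
    step = math.pi / (count + 1)       # equal arc per division
    return [
        (center_x + radius * math.cos(start + step * i),
         center_y + radius * math.sin(start + step * i))
        for i in range(1, count + 1)
    ]

# 3 points on the far semicircle of a unit circle, away direction +x;
# the middle point lands exactly on the far pole of the circle:
for x, y in semicircle_points(0.0, 0.0, 1.0, 3, 0.0):
    print(round(x, 3), round(y, 3))
```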
- the skill preview model of the skill "Summoning: Archer" is the preview model of the archer follower unit.
- the preview model of the archer follower unit is a triangle, and its orientation is the direction indicated by the dotted arrow; that is, its orientation can be calculated from the relative position and the skill generation position.
- the orientation of the skill preview model needs to be modified to the orientation calculated above.
- three archer follower unit models can be generated at the above three skill generation positions according to the modified orientation.
- the player can accurately cast the summoning skill through the virtual joystick object.
- the player can also use the traditional roulette method to cast the spell skill.
- the operation consistency of the summoning skill and the spell skill can be improved.
- this solution can also adjust the orientation and distribution density of the skill effect models at the same time when the game skill can generate multiple skill effect models.
- this solution can display the game screen, where the game screen includes candidate skill areas; determine the target skill based on the user's skill selection operation for the candidate skill areas; display the virtual joystick object on the game screen; when the skill type of the target skill is a summoning type and the user's drag operation on the virtual joystick object is detected, calculate at least one skill generation position of the target skill based on the drag operation; and when the user's release operation on the virtual joystick object is detected, generate the skill effect of the target skill at the at least one skill generation position.
- this solution can simultaneously control the casting position and direction of the game skill effect model through the virtual joystick object, and automatically control the casting density of the game skill effect model, thereby reducing operational complexity and improving the accuracy of information interaction.
- an embodiment of the present application also provides an information interaction device.
- the information interaction device may be specifically integrated in a terminal.
- the terminal may be a smart phone, a tablet computer, a smart Bluetooth device, a notebook computer, a personal computer, or another such device.
- the integration of the information interaction device in a smart phone will be taken as an example to describe the method of the embodiment of the present application in detail.
- the information interaction apparatus may include a screen unit 301, a skill unit 302, a joystick unit 303, a position unit 304, and a generating unit 305 as follows:
- the screen unit 301 may be used to display a game screen, where the game screen may include a candidate skill area.
- the skill unit 302 may be used to determine the target skill based on the skill selection operation for the candidate skill area.
- the candidate skill area includes at least one skill control of the candidate skill, and the skill unit 302 may be specifically used for:
- the target skill is determined in at least one candidate skill.
- the joystick unit 303 may be used to display virtual joystick objects on the game screen.
- the game screen further includes a cancel cast control
- the joystick unit 303 can also be used to:
- display of the virtual joystick object in the game screen is stopped.
- the position unit 304 may be used to calculate at least one skill generation position of the target skill based on the movement operation.
- the virtual joystick object may include a virtual joystick control and a preset joystick movement range
- the position unit 304 may include a current position subunit, a joystick subunit, and a position subunit, as follows:
- the current location subunit can be used to determine the target skill's skill cast object and preset skill cast range, and obtain the current position of the skill cast object in the game screen;
- the current location subunit can also be used to:
- the skill generation area is displayed on the game screen.
- the joystick subunit can be used to obtain the joystick position of the virtual joystick control in the preset joystick movement range when the user's movement operation on the virtual joystick object is detected;
- the position subunit may be used to calculate at least one skill generation position of the target skill based on the preset joystick movement range, joystick position, preset skill cast range, and current position.
- the position sub-unit may include a proportion sub-module, a relative position sub-module, and a generating position sub-module, as follows:
- the proportion submodule can be used to determine the interaction range ratio between the preset joystick movement range and the preset skill cast range
- the relative position submodule can be used to determine the relative position according to the ratio of the interaction range and the position of the joystick.
- the relative position is the relative position between the skill generation position and the skill cast object;
- the relative position submodule can also be used to:
- generating the skill effect model of the target skill at at least one skill generation position includes:
- the skill effect model of the target skill with the modified direction is generated at the skill generating position.
- the generating position sub-module may be used to determine, according to the current position of the skill cast object in the game screen, the skill generation position in the game screen corresponding to the relative position.
- the generating location submodule can be used to:
- the generating unit 305 may be used to generate a skill effect model of the target skill at at least one skill generating position.
- the generating unit 305 may be specifically used to:
- the generating unit 305 may be specifically used to:
- the skill effect model of the target skill is generated at the skill generation position.
- the generating unit 305 may be used to:
- the generating unit 305 may also be used to:
- each of the above units can be implemented as an independent entity, or can be combined arbitrarily and implemented as the same entity or several entities.
- for the specific implementation of each of the above units, please refer to the previous method embodiments, which will not be repeated here.
- the information interaction device of this embodiment displays the game screen through the screen unit, where the game screen includes the candidate skill area; the skill unit determines the target skill based on the user's skill selection operation for the candidate skill area; the joystick unit displays the virtual joystick object on the game screen; when the user's movement operation on the virtual joystick object is detected, the position unit calculates at least one skill generation position of the target skill based on the movement operation; and when the user's release operation on the virtual joystick object is detected, the generating unit generates the skill effect model of the target skill at the at least one skill generation position.
- the embodiment of the present application also provides a terminal, which may be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, or a personal computer and other devices.
- the terminal may be a node in a distributed system, where the distributed system may be a blockchain system, and the blockchain system may be a distributed system formed by connecting the multiple nodes through network communication.
- nodes can form a peer-to-peer (P2P, Peer To Peer) network, and any form of computing equipment, such as servers, terminals, and other electronic devices, can become a node in the blockchain system by joining the peer-to-peer network.
- in this embodiment, a smart phone is taken as an example of the terminal for detailed description.
- FIG. 4 shows a schematic diagram of the structure of the terminal involved in the embodiment of the present application, specifically:
- the terminal may include one or more processors 401 with one or more processing cores, a memory 402 with one or more computer-readable storage media, a power supply 403, an input module 404, a communication module 405, and other components.
- the terminal structure shown in FIG. 4 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than those shown in the figure, combine some components, or use a different component arrangement. Among them:
- the processor 401 is the control center of the terminal. It connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the terminal as a whole.
- the processor 401 may include one or more processing cores; in some embodiments, the processor 401 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may not be integrated into the processor 401.
- the memory 402 may be used to store software programs and modules.
- the processor 401 executes various functional applications and data processing by running the software programs and modules stored in the memory 402.
- the memory 402 may mainly include a program storage area and a data storage area.
- the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data created according to the use of the terminal, and the like.
- the memory 402 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- the memory 402 may further include a memory controller to provide the processor 401 with access to the memory 402.
- the terminal also includes a power supply 403 for supplying power to various components.
- the power supply 403 may be logically connected to the processor 401 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
- the power supply 403 may also include any components such as one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, and a power status indicator.
- the terminal may also include an input module 404, which can be used to receive input digital or character information and generate keyboard, mouse, joystick, optical or trackball signal input related to user settings and function control.
- the terminal may also include a communication module 405.
- the communication module 405 may include a wireless module.
- the terminal may perform short-distance wireless transmission through the wireless module of the communication module 405, thereby providing users with wireless broadband Internet access.
- the communication module 405 can be used to help users send and receive emails, browse webpages, and access streaming media.
- the terminal may also include a display unit, etc., which will not be repeated here.
- the processor 401 in the terminal loads the executable files corresponding to the processes of one or more applications into the memory 402 according to the following instructions, and the processor 401 runs the applications stored in the memory 402, thereby implementing various functions as follows:
- the game screen includes candidate skill areas
- a skill effect model of the target skill is generated at at least one skill generation position.
- an embodiment of the present application also provides a storage medium, where the storage medium is used to store a computer program, and the computer program is used to execute the method provided in the foregoing embodiment.
- the instruction can perform the following steps:
- the game screen includes candidate skill areas
- a skill effect model of the target skill is generated at at least one skill generation position.
- the storage medium may include: read only memory (ROM, Read Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disk, etc.
- since the instructions stored in the storage medium can execute the steps in any information interaction method provided in the embodiments of the present application, they can achieve the beneficial effects that can be achieved by any information interaction method provided in the embodiments of the present application; for details, see the previous embodiments, which will not be repeated here.
- the embodiments of the present application also provide a computer program product including instructions, which when run on a computer, cause the computer to execute the method provided in the above-mentioned embodiments.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computer Security & Cryptography (AREA)
- General Business, Economics & Management (AREA)
- Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Scheme | Position | Multiple units | Multi-unit density | Operation consistency |
---|---|---|---|---|
Traditional roulette | Controls angle only | No | No | Consistent |
Drag | Yes | Yes | No | Inconsistent |
This design scheme | Yes | Yes | Yes | Consistent |
Claims (16)
- 一种信息交互方法,所述方法由终端执行,所述方法包括:显示游戏画面,所述游戏画面包括候选技能区域;基于针对所述候选技能区域的技能选取操作,确定目标技能;在所述游戏画面上显示虚拟摇杆对象;当检测到针对所述虚拟摇杆对象的移动操作时,基于所述移动操作计算所述目标技能的至少一个技能生成位置;当检测到针对所述虚拟摇杆对象的施放操作时,在所述至少一个技能生成位置生成所述目标技能的技能效果模型。
- 如权利要求1所述的信息交互方法,所述虚拟摇杆对象包括虚拟摇杆控件和预设摇杆移动范围;所述当检测到针对所述虚拟摇杆对象的移动操作时,基于所述移动操作计算所述目标技能的至少一个技能生成位置,包括:确定所述目标技能的技能施放对象和预设技能施放范围,获取所述技能施放对象在游戏画面中的当前位置;当检测到针对所述虚拟摇杆对象的移动操作时,获取所述虚拟摇杆控件在所述预设摇杆移动范围中的摇杆位置;基于所述预设摇杆移动范围、摇杆位置、预设技能施放范围和当前位置,计算所述目标技能的至少一个技能生成位置。
- 如权利要求2所述的信息交互方法,所述确定所述目标技能的预设技能施放范围和技能施放对象,获取所述技能施放对象在游戏画面中的当前位置之后,还包括:以所述技能施放对象的当前位置为中心,基于所述预设技能施放范围确定所述目标技能在所述游戏画面中的技能生成区域;在所述游戏画面上显示所述技能生成区域;在所述至少一个技能生成位置生成所述目标技能的技能效果模型,包括:当所述至少一个技能生成位置属于所述技能生成区域时,在所述至少一个技能生成位置生成所述目标技能的技能效果模型。
- 如权利要求2所述的信息交互方法,所述基于所述预设摇杆移动范围、摇杆位置、预设技能施放范围和当前位置,计算所述目标技能的至少一个技能 生成位置,包括:确定所述预设摇杆移动范围和预设技能施放范围之间的交互范围比例;根据所述交互范围比例和所述摇杆位置确定相对位置,所述相对位置为技能生成位置和技能施放对象之间相对的位置;根据所述技能施放对象在游戏画面中的当前位置,确定所述相对位置在游戏画面中的技能生成位置。
- 如权利要求4所述的信息交互方法,所述根据所述交互范围比例和摇杆位置确定相对位置之后,还包括:根据所述相对位置计算技能效果模型相对于所述技能施放对象的相对方向;所述在所述至少一个技能生成位置生成所述目标技能的技能效果模型,包括:基于所述相对方向修改所述目标技能的技能效果模型方向,得到方向修改后的目标技能;在所述至少一个技能生成位置生成所述方向修改后的目标技能的技能效果模型。
- 如权利要求4所述的信息交互方法,所述根据所述技能施放对象在游戏画面中的当前位置,确定所述相对位置在游戏画面中的技能生成位置,包括:确定所述目标技能的技能生成位置数量;根据所述相对位置计算所述目标技能的技能效果模型分布半径;以所述相对位置为圆心,基于所述技能效果模型分布半径确定所述目标技能的技能效果模型分布轨迹;基于所述至少一个技能生成位置数量,在所述技能效果模型分布轨迹上确定多个技能效果模型分布点;根据所述技能施放对象在游戏画面中的当前位置,确定所述技能效果模型分布点在游戏画面中的位置作为所述技能生成位置。
- 如权利要求6所述的信息交互方法,基于所述至少一个技能生成位置数量,在所述技能效果模型分布轨迹上确定多个技能效果模型分布点之后,还包括:基于所述技能效果模型分布半径,确定所述技能效果模型分布点的垂线方向;所述在所述至少一个技能生成位置生成所述目标技能的技能效果模型,包括:基于所述垂线方向修改所述目标技能的技能效果模型方向,得到方向修改后的目标技能;在所述至少一个技能生成位置生成所述方向修改后的目标技能的技能效果模型。
- 如权利要求6所述的信息交互方法,所述根据所述相对位置计算所述目标技能的技能效果模型分布半径,包括:获取预设系数;根据所述相对位置计算相对距离;根据所述预设系数对所述相对距离进行加权求和,得到所述目标技能的技能效果模型分布半径。
- 如权利要求6所述的信息交互方法,所述以所述相对位置为圆心,基于所述技能效果模型分布半径确定所述目标技能的技能效果模型分布轨迹,包括:获取所述目标技能的技能效果模型预设分布体积;对所述技能效果模型预设分布体积和技能生成位置数量进行相乘计算,得到最小技能效果模型分布半径;当所述目标技能的技能效果模型分布半径不小于所述最小技能效果模型分布半径时,以所述相对位置为圆心,基于所述技能效果模型分布半径确定所述目标技能的技能效果模型分布轨迹;当所述目标技能的技能效果模型分布半径小于所述最小技能效果模型分布半径时,以所述相对位置为圆心、基于所述最小技能效果模型分布半径确定所述目标技能的技能效果模型分布轨迹。
- 如权利要求1所述的信息交互方法,所述当检测到针对所述虚拟摇杆对象的移动操作时,基于所述移动操作计算所述目标技能的至少一个技能生成位置之后,还包括:在所述至少一个技能生成位置生成所述目标技能的技能预览效果模型;所述当检测到针对所述虚拟摇杆对象的施放操作时,在所述至少一个技能生成位置生成所述目标技能的技能效果模型,包括:当检测到用户针对所述虚拟摇杆对象的施放操作时,在所述至少一个技能生成位置停止生成所述目标技能的技能预览效果模型,并生成所述目标技能的技能效果模型。
- 如权利要求1所述的信息交互方法,所述候选技能区域包括至少一个候选技能的技能控件;所述基于针对所述候选技能区域的技能选取操作,确定目标技能,包括:基于针对所述技能控件的选取操作,在至少一个候选技能中确定目标技能;在所述游戏画面上显示虚拟摇杆对象,包括:在所述目标技能的技能控件上覆盖显示虚拟摇杆对象。
- The information interaction method according to claim 1, wherein the game screen further comprises a cancel-cast control; and after the displaying a virtual joystick object on the game screen, the method further comprises: stopping displaying the virtual joystick object in the game screen when a cancel-cast operation on the cancel-cast control is detected.
- An information interaction apparatus, comprising: a screen unit, configured to display a game screen, the game screen comprising a candidate skill region; a skill unit, configured to determine a target skill based on a skill selection operation on the candidate skill region; a joystick unit, configured to display a virtual joystick object on the game screen; a position unit, configured to calculate, when a movement operation on the virtual joystick object is detected, at least one skill generation position of the target skill based on the movement operation; and a generation unit, configured to generate, when a casting operation on the virtual joystick object is detected, a skill effect model of the target skill at the at least one skill generation position.
- A storage medium, configured to store a computer program, the computer program being used to perform the information interaction method according to any one of claims 1 to 12.
- A terminal, comprising a processor and a memory, the memory storing a plurality of instructions, and the processor loading the instructions from the memory to perform the information interaction method according to any one of claims 1 to 12.
- A computer program product comprising instructions which, when run on a computer, cause the computer to perform the information interaction method according to any one of claims 1 to 12.
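The geometric pipeline recited in claims 4 through 9 can be sketched in Python. This is an illustrative reconstruction, not the patent's reference implementation: all function and parameter names (`joystick_offset`, `casting_range`, `model_volume`, and the linear `a * distance + b` reading of "weighted summation with preset coefficients") are assumptions made for the sketch.

```python
import math

def relative_position(joystick_offset, joystick_radius, casting_range):
    """Claim 4: scale the joystick offset by the interaction range ratio
    (preset skill casting range / preset joystick movement range) to get
    the skill generation position relative to the skill casting object."""
    jx, jy = joystick_offset
    ratio = casting_range / joystick_radius   # interaction range ratio
    return (jx * ratio, jy * ratio)

def relative_direction(rel_pos):
    """Claim 5: direction of the skill effect model with respect to the
    skill casting object, taken as the angle of the relative position."""
    return math.atan2(rel_pos[1], rel_pos[0])  # radians from the +x axis

def distribution_radius(rel_pos, coefficients=(0.5, 1.0)):
    """Claim 8: weighted summation over the relative distance with preset
    coefficients; a linear form a * distance + b is one plausible reading."""
    distance = math.hypot(*rel_pos)           # relative distance
    a, b = coefficients                       # preset coefficients
    return a * distance + b

def clamped_radius(radius, model_volume, count):
    """Claim 9: the preset per-model distribution volume times the number
    of skill generation positions bounds the radius from below."""
    return max(radius, model_volume * count)

def skill_generation_positions(rel_pos, radius, count, caster_position):
    """Claim 6: place `count` skill effect model distribution points evenly
    on a circular distribution trajectory centered at the relative position,
    then shift them by the casting object's current position to obtain
    skill generation positions in the game screen."""
    cx, cy = rel_pos
    px, py = caster_position
    points = []
    for i in range(count):
        angle = 2 * math.pi * i / count
        points.append((px + cx + radius * math.cos(angle),
                       py + cy + radius * math.sin(angle)))
    return points
```

For example, with the joystick pushed halfway right on a unit-radius pad, a casting range of 8, and the caster at (10, 10), the relative position is (4, 0), so the effect models are ringed around screen point (14, 10).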
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202108571VA SG11202108571VA (en) | 2019-09-04 | 2020-08-20 | Information interaction method and related device |
JP2021550032A JP7242121B2 (ja) | 2019-09-04 | 2020-08-20 | Information interaction method and related apparatus |
KR1020217026753A KR102602113B1 (ko) | 2019-09-04 | 2020-08-20 | Information interaction method and related apparatus |
EP20861591.4A EP3919145A4 (en) | 2019-09-04 | 2020-08-20 | INFORMATION INTERACTION METHOD AND RELATED DEVICE |
US17/156,087 US11684858B2 (en) | 2019-09-04 | 2021-01-22 | Supplemental casting control with direction and magnitude |
US18/314,299 US20230271091A1 (en) | 2019-09-04 | 2023-05-09 | Information exchange method and related apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910833875.2A CN110559658B (zh) | 2019-09-04 | 2019-09-04 | Information interaction method, apparatus, terminal, and storage medium |
CN201910833875.2 | 2019-09-04 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/156,087 Continuation US11684858B2 (en) | 2019-09-04 | 2021-01-22 | Supplemental casting control with direction and magnitude |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021043000A1 true WO2021043000A1 (zh) | 2021-03-11 |
Family
ID=68777795
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/110199 WO2021043000A1 (zh) | 2019-09-04 | 2020-08-20 | Information interaction method and related apparatus |
Country Status (7)
Country | Link |
---|---|
US (2) | US11684858B2 (zh) |
EP (1) | EP3919145A4 (zh) |
JP (1) | JP7242121B2 (zh) |
KR (1) | KR102602113B1 (zh) |
CN (1) | CN110559658B (zh) |
SG (1) | SG11202108571VA (zh) |
WO (1) | WO2021043000A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113769391A (zh) * | 2021-09-27 | 2021-12-10 | 腾讯科技(深圳)有限公司 | Method, apparatus, device, and medium for acquiring skills in a virtual environment |
JP2023528119A (ja) * | 2021-05-14 | 2023-07-04 | 騰訊科技(深セン)有限公司 | Virtual object control method, apparatus, device, and computer program |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110559658B (zh) * | 2019-09-04 | 2020-07-28 | 腾讯科技(深圳)有限公司 | Information interaction method, apparatus, terminal, and storage medium |
CN111228805B (zh) * | 2020-01-08 | 2023-07-14 | 腾讯科技(深圳)有限公司 | Control method and apparatus for a virtual operation object, storage medium, and electronic apparatus |
CN111530075B (zh) | 2020-04-20 | 2022-04-05 | 腾讯科技(深圳)有限公司 | Picture display method, apparatus, device, and medium for a virtual environment |
CN111589131B (zh) * | 2020-04-24 | 2022-02-22 | 腾讯科技(深圳)有限公司 | Virtual character control method, apparatus, device, and medium |
CN111589134A (zh) * | 2020-04-28 | 2020-08-28 | 腾讯科技(深圳)有限公司 | Display method, apparatus, device, and storage medium for a virtual environment picture |
CN111760287B (zh) * | 2020-06-30 | 2024-02-02 | 网易(杭州)网络有限公司 | Game skill control method, apparatus, electronic device, and computer-readable medium |
CN111760283B (zh) * | 2020-08-06 | 2023-08-08 | 腾讯科技(深圳)有限公司 | Skill casting method, apparatus, terminal, and readable storage medium for a virtual object |
CN113750518A (zh) * | 2021-09-10 | 2021-12-07 | 网易(杭州)网络有限公司 | Skill button control method, apparatus, electronic device, and computer-readable medium |
CN114296597A (zh) * | 2021-12-01 | 2022-04-08 | 腾讯科技(深圳)有限公司 | Object interaction method, apparatus, device, and storage medium in a virtual scene |
CN114860148B (zh) * | 2022-04-19 | 2024-01-16 | 北京字跳网络技术有限公司 | Interaction method, apparatus, computer device, and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140302900A1 (en) * | 2011-12-29 | 2014-10-09 | Neowiz Games Corporation | Method and apparatus for manipulating character of soccer game |
CN106033340A (zh) * | 2015-03-16 | 2016-10-19 | 广州四三九九信息科技有限公司 | Visual editing method and *** for mobile game combat skills |
CN107168611A (zh) * | 2017-06-16 | 2017-09-15 | 网易(杭州)网络有限公司 | Information processing method, apparatus, electronic device, and storage medium |
CN109550241A (zh) * | 2018-09-20 | 2019-04-02 | 厦门吉比特网络技术股份有限公司 | Single-joystick control method and *** |
CN110559658A (zh) * | 2019-09-04 | 2019-12-13 | 腾讯科技(深圳)有限公司 | Information interaction method, apparatus, terminal, and storage medium |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9262073B2 (en) * | 2010-05-20 | 2016-02-16 | John W. Howard | Touch screen with virtual joystick and methods for use therewith |
CN105194873B (zh) * | 2015-10-10 | 2019-01-04 | 腾讯科技(成都)有限公司 | Information processing method, terminal, and computer storage medium |
JP6143934B1 (ja) | 2016-11-10 | 2017-06-07 | 株式会社Cygames | Information processing program, information processing method, and information processing apparatus |
KR20180111397A (ko) * | 2017-04-02 | 2018-10-11 | 둘툰 주식회사 | Method for creating and mapping a game virtual controller using an external input device |
CN107661630A (zh) | 2017-08-28 | 2018-02-06 | 网易(杭州)网络有限公司 | Control method and apparatus for a shooting game, storage medium, processor, and terminal |
CN108509139B (zh) * | 2018-03-30 | 2019-09-10 | 腾讯科技(深圳)有限公司 | Movement control method and apparatus for a virtual object, electronic apparatus, and storage medium |
CN108771869B (zh) * | 2018-06-04 | 2022-02-08 | 腾讯科技(深圳)有限公司 | Performance testing method and apparatus, storage medium, and electronic apparatus |
CN109011572B (zh) | 2018-08-27 | 2022-09-16 | 广州要玩娱乐网络技术股份有限公司 | Game magic skill processing method, storage medium, and computer device |
CN109550240A (zh) * | 2018-09-20 | 2019-04-02 | 厦门吉比特网络技术股份有限公司 | Skill release method and apparatus for a game |
CN109568938B (zh) * | 2018-11-30 | 2020-08-28 | 广州要玩娱乐网络技术股份有限公司 | Multi-resource game touch operation method, apparatus, storage medium, and terminal |
CN109745698A (zh) | 2018-12-28 | 2019-05-14 | 北京金山安全软件有限公司 | Method, apparatus, and electronic device for canceling skill release |
2019
- 2019-09-04 CN CN201910833875.2A patent/CN110559658B/zh active Active

2020
- 2020-08-20 KR KR1020217026753A patent/KR102602113B1/ko active IP Right Grant
- 2020-08-20 WO PCT/CN2020/110199 patent/WO2021043000A1/zh unknown
- 2020-08-20 SG SG11202108571VA patent/SG11202108571VA/en unknown
- 2020-08-20 JP JP2021550032A patent/JP7242121B2/ja active Active
- 2020-08-20 EP EP20861591.4A patent/EP3919145A4/en active Pending

2021
- 2021-01-22 US US17/156,087 patent/US11684858B2/en active Active

2023
- 2023-05-09 US US18/314,299 patent/US20230271091A1/en active Pending
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023528119A (ja) * | 2021-05-14 | 2023-07-04 | 騰訊科技(深セン)有限公司 | Virtual object control method, apparatus, device, and computer program |
US11865449B2 (en) | 2021-05-14 | 2024-01-09 | Tencent Technology (Shenzhen) Company Limited | Virtual object control method, apparatus, device, and computer-readable storage medium |
JP7413563B2 (ja) | 2021-05-14 | 2024-01-15 | 騰訊科技(深セン)有限公司 | Virtual object control method, apparatus, device, and computer program |
CN113769391A (zh) * | 2021-09-27 | 2021-12-10 | 腾讯科技(深圳)有限公司 | Method, apparatus, device, and medium for acquiring skills in a virtual environment |
CN113769391B (zh) * | 2021-09-27 | 2023-06-27 | 腾讯科技(深圳)有限公司 | Method, apparatus, device, and medium for acquiring skills in a virtual environment |
Also Published As
Publication number | Publication date |
---|---|
CN110559658B (zh) | 2020-07-28 |
SG11202108571VA (en) | 2021-09-29 |
EP3919145A4 (en) | 2022-05-25 |
US11684858B2 (en) | 2023-06-27 |
US20230271091A1 (en) | 2023-08-31 |
EP3919145A1 (en) | 2021-12-08 |
CN110559658A (zh) | 2019-12-13 |
KR20210117329A (ko) | 2021-09-28 |
KR102602113B1 (ko) | 2023-11-13 |
JP7242121B2 (ja) | 2023-03-20 |
JP2022522443A (ja) | 2022-04-19 |
US20210138351A1 (en) | 2021-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021043000A1 (zh) | Information interaction method and related apparatus | |
WO2021036581A1 (zh) | Virtual object control method and related apparatus | |
KR102050934B1 (ko) | Information processing method, terminal, and computer storage medium | |
EP3939681A1 (en) | Virtual object control method and apparatus, device, and storage medium | |
WO2022017094A1 (zh) | Interface display method, apparatus, terminal, and storage medium | |
JP2023162233A (ja) | Virtual object control method, apparatus, terminal, and storage medium | |
WO2021244306A1 (zh) | Virtual object selection method, apparatus, device, and storage medium | |
WO2021227684A1 (en) | Method for selecting virtual objects, apparatus, terminal and storage medium | |
JP2024519880A (ja) | Display method, apparatus, terminal, and computer program for a virtual environment screen | |
WO2023138192A1 (zh) | Method for controlling a virtual object to pick up a virtual prop, terminal, and storage medium | |
CN113546419A (zh) | Game map display method, apparatus, terminal, and storage medium | |
WO2024007606A1 (zh) | Virtual item display method, apparatus, computer device, and storage medium | |
WO2024045528A1 (zh) | Game control method, apparatus, computer device, and storage medium | |
CN113426115A (zh) | Game character display method, apparatus, and terminal | |
Mei et al. | Sightx: A 3d selection technique for xr | |
US12017141B2 (en) | Virtual object control method and apparatus, device, and storage medium | |
US11978152B2 (en) | Computer-assisted graphical development tools | |
CN113082712B (zh) | Virtual character control method, apparatus, computer device, and storage medium | |
WO2024051414A1 (zh) | Hot zone adjustment method, apparatus, device, storage medium, and program product | |
WO2024060895A1 (zh) | Group establishment method, apparatus, device, and storage medium for a virtual scene | |
CN113082712A (zh) | Virtual character control method, apparatus, computer device, and storage medium | |
CN115564916A (zh) | Virtual scene editing method, apparatus, computer device, and storage medium | |
CN115193062A (zh) | Game control method, apparatus, storage medium, and computer device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20861591 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20217026753 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021550032 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2020861591 Country of ref document: EP Effective date: 20210903 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |