CN109857303A - Interaction control method and device - Google Patents
- Publication number: CN109857303A
- Application number: CN201910105587.5A
- Authority: CN (China)
- Prior art keywords: control, track, terminal, slide, initiation event
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure relates to an interaction control method and device. The method includes: providing an interface that includes a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object; detecting whether the interaction area has an input initiation event and, when an input initiation event is detected in the interaction area, judging whether the input initiation event is located on a control; when the input initiation event is located on a control, selecting, in the scene area, the object associated with the control where the input initiation event is located, and detecting whether there is a slide operation continuous with the input initiation event; when a slide operation continuous with the input initiation event is detected, obtaining the end point of the slide operation track and judging whether the end point of the track is located in the scene area; and when the end point of the track is located in the scene area, controlling an action of the selected object according to the end point of the track. The disclosure can substantially improve the user's interactive experience.
Description
This patent application is a divisional of Chinese invention patent application No. 201610362985.1, filed on May 27, 2016 and entitled "Interaction control method and device".
Technical field
The present disclosure relates to the field of human-computer interaction and, in particular, to an interaction control method and an interaction control device.
Background technique
With the rapid development of communication technology, more and more game applications have appeared on various terminal devices. While a game application runs, the terminal device displays various game objects according to a certain layout, thereby presenting a game interface to the player.
Referring to Fig. 1, in one game application the interface 10 includes a scene area 101 and an interaction area 102. The scene area 101 mainly provides in-game elements such as the environment, buildings, machinery, and props; the interaction area 102 includes avatar controls A0–D0 and other controls for implementing interactive functions. Such a game application may involve creating objects in the scene area 101 and controlling objects that have already been created there (such as objects A1 and B1).
In one technical solution, the user first clicks an avatar control in the interaction area 102 and then clicks to select a specific position in the scene area 101; if the relevant conditions are met, the object associated with the avatar control is created at that specific position in the scene area 101. However, this solution is inefficient: the user must perform two separate click operations, which wastes operation time, increases the physical effort of the interaction, and degrades the game experience.
In another technical solution, after an object is created in the scene area 101, a touch-sensitive region whose size is proportional to the object's sprite-frame size is registered in the scene manager. Each time the scene area 101 is clicked, the sensitive regions are traversed to select the object closest to the click position; if a slide operation follows, the selected object serves as the starting point and the end point of the slide track is used to control movements of the selected object. However, in this solution the response region of an object in the scene area 101 changes with the user's zoom operations on the game scene, and the slide operation may also run beyond the scene area 101, causing placement failures or reduced operating efficiency.
It should be noted that the information disclosed in this Background section is only intended to reinforce the understanding of the background of the disclosure, and therefore may include information that does not constitute prior art known to a person of ordinary skill in the art.
Summary of the invention
An objective of the present disclosure is to provide an interaction control method and an interaction control device that overcome, at least to some extent, one or more problems caused by the limitations and defects of the related art.
According to a first aspect of embodiments of the present disclosure, an interaction control method is provided, including:
providing an interface that includes a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object;
detecting whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judging whether the input initiation event is located on a control;
when it is judged that the input initiation event is located on a control, detecting whether there is a slide operation continuous with the input initiation event;
when a slide operation continuous with the input initiation event is detected, obtaining the end point of the slide operation track and judging whether the end point of the track is located in the scene area;
when it is judged that the end point of the track is located in the scene area, creating, in the scene area, the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the disclosure, the interaction control method further includes:
when it is judged that the end point of the track is located in the interaction area, cancelling the creation, in the scene area, of the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the disclosure, the interaction control method further includes:
when a slide operation continuous with the input initiation event is detected, obtaining the current position of the slide operation and judging whether the current position of the slide operation is located in the scene area;
when it is judged that the current position of the slide operation is located in the scene area, creating, at the current position of the slide operation, a cursor object of the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the disclosure, creating the cursor object of the object associated with the control where the input initiation event is located includes:
removing the object resource on the control where the input initiation event is located from the interaction area and transferring it to the scene area, and reusing the object resource to create the cursor object.
In an exemplary embodiment of the disclosure, the interaction control method further includes:
obtaining the current position of the slide operation and judging whether the distance between the current position of the slide operation and an edge of the interface is less than a preset value;
when it is judged that the distance between the current position of the slide operation and the edge of the interface is less than the preset value, moving the game scene toward the edge closest to the current position of the slide operation.
According to a second aspect of embodiments of the present disclosure, an interaction control method is provided, including:
providing an interface that includes a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object;
detecting whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judging whether the input initiation event is located on a control;
when it is judged that the input initiation event is located on a control, selecting, in the scene area, the object associated with the control where the input initiation event is located, and detecting whether there is a slide operation continuous with the input initiation event;
when a slide operation continuous with the input initiation event is detected, obtaining the end point of the slide operation track and judging whether the end point of the track is located in the scene area;
when it is judged that the end point of the track is located in the scene area, controlling an action of the selected object according to the end point of the track.
In an exemplary embodiment of the disclosure, controlling the action of the selected object according to the end point of the track includes:
judging whether an object of a preset kind exists at the position corresponding to the end point of the track in the scene area;
when it is judged that an object of the preset kind exists at the position corresponding to the end point of the track in the scene area, controlling the selected object to execute a first action;
when it is judged that no object of the preset kind exists at the position corresponding to the end point of the track in the scene area, controlling the selected object to execute a second action.
According to a third aspect of embodiments of the present disclosure, an interaction control device is provided, including:
a display unit, configured to provide an interface that includes a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object;
a first detecting unit, configured to detect whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judge whether the input initiation event is located on a control;
a second detecting unit, configured to detect, when it is judged that the input initiation event is located on a control, whether there is a slide operation continuous with the input initiation event;
a first judging unit, configured to obtain, when a slide operation continuous with the input initiation event is detected, the end point of the slide operation track and judge whether the end point of the track is located in the scene area;
a first creating unit, configured to create, when it is judged that the end point of the track is located in the scene area, the object associated with the control where the input initiation event is located in the scene area.
In an exemplary embodiment of the disclosure, the first creating unit is further configured to:
when it is judged that the end point of the track is located in the interaction area, cancel the creation, in the scene area, of the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the disclosure, the interaction control device further includes:
a second judging unit, configured to obtain, when a slide operation continuous with the input initiation event is detected, the current position of the slide operation and judge whether the current position of the slide operation is located in the scene area;
a second creating unit, configured to create, when it is judged that the current position of the slide operation is located in the scene area, a cursor object of the object associated with the control where the input initiation event is located at the current position of the slide operation.
In an exemplary embodiment of the disclosure, creating the cursor object of the object associated with the control where the input initiation event is located includes:
removing the object resource on the control where the input initiation event is located from the interaction area and transferring it to the scene area, and reusing the object resource to create the cursor object.
In an exemplary embodiment of the disclosure, the interaction control device further includes:
a third judging unit, configured to obtain the current position of the slide operation and judge whether the distance between the current position of the slide operation and an edge of the interface is less than a preset value;
a scene control unit, configured to move the game scene toward the edge closest to the current position of the slide operation when it is judged that the distance between the current position of the slide operation and the edge of the interface is less than the preset value.
According to a fourth aspect of embodiments of the present disclosure, an interaction control device is provided, including:
a display unit, configured to provide an interface that includes a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object;
a first detecting unit, configured to detect whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judge whether the input initiation event is located on a control;
a second detecting unit, configured to select, when it is judged that the input initiation event is located on a control, the object associated with the control where the input initiation event is located in the scene area, and detect whether there is a slide operation continuous with the input initiation event;
a judging unit, configured to obtain, when a slide operation continuous with the input initiation event is detected, the end point of the slide operation track and judge whether the end point of the track is located in the scene area;
a control unit, configured to control, when it is judged that the end point of the track is located in the scene area, the action of the selected object according to the end point of the track.
In an exemplary embodiment of the disclosure, controlling the action of the selected object according to the end point of the track includes:
judging whether an object of a preset kind exists at the position corresponding to the end point of the track in the scene area;
when it is judged that an object of the preset kind exists at the position corresponding to the end point of the track in the scene area, controlling the selected object to execute a first action;
when it is judged that no object of the preset kind exists at the position corresponding to the end point of the track in the scene area, controlling the selected object to execute a second action.
In an embodiment of the present disclosure, an interaction control method between a dynamic scene control and a static interface control is provided, so that, without affecting the existing interaction framework, two originally separate operations are integrated into one smooth, complete operation. This saves the user's operating steps, shortens the user's response time, and reduces the user's physical effort, and can therefore substantially improve the user's interactive experience.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit the present disclosure.
Detailed description of the invention
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure. Evidently, the drawings in the following description show only some embodiments of the disclosure; for a person of ordinary skill in the art, other drawings can be obtained from them without creative effort.
Fig. 1 schematically shows a game interface of a game application in the prior art.
Fig. 2 schematically shows a flow chart of an interaction control method in an exemplary embodiment of the disclosure.
Fig. 3A, Fig. 3B and Fig. 3C schematically show a game interface of a game application in an exemplary embodiment of the disclosure.
Fig. 4 schematically shows a flow chart of an interaction control method in an exemplary embodiment of the disclosure.
Fig. 5A, Fig. 5B and Fig. 5C schematically show a game interface of a game application in an exemplary embodiment of the disclosure.
Fig. 6 schematically shows a block diagram of an interaction control device in an exemplary embodiment of the disclosure.
Fig. 7 schematically shows a block diagram of an interaction control device in an exemplary embodiment of the disclosure.
Specific embodiment
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be implemented in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a full understanding of the embodiments of the disclosure. Those skilled in the art will recognize, however, that the technical solutions of the disclosure may be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known solutions are not shown or described in detail, to avoid distracting from and thereby obscuring aspects of the disclosure.
Furthermore, the drawings are merely schematic illustrations of the disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and their repeated description is omitted. Some of the block diagrams shown in the drawings are functional entities and do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
This example embodiment first provides an interaction control method, which can be applied to a terminal device. The terminal device may be, for example, a PC terminal such as a laptop or desktop computer, or a touch terminal with a touch screen such as a mobile phone, tablet computer, handheld game console, or PDA. This example embodiment is described mainly with a touch-screen terminal as an example; however, those skilled in the art will readily appreciate that the technical solutions in this example embodiment are equally applicable to PC terminals controlled by a mouse, trackpad, or similar means. Referring to Fig. 2 and Fig. 3A to Fig. 3C, in this example embodiment the interaction control method may include the following steps:
Step S11: provide an interface, the interface including a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object.
With reference to shown in Fig. 3 A, game application passes through application programming interfaces (API) controlling terminal equipment of terminal device
Screen shows interface 10, and what the interface 10 in this example embodiment can be terminal device all can viewing area
Domain is displayed in full screen;Be also possible to terminal device part can display area, i.e. window shows.It can in the interface 10
To include scene areas 101 and interaction area 102.Wherein, the main environment provided in game of scene areas 101, building, machine
The elements such as tool, stage property.Interaction area 102 may include head portrait control A0~D0 and other for realizing interactive function control
Part.In this example embodiment, the interaction area 102 is located at the lower section of the interface 10, but the disclosure other
In exemplary embodiment, more layout type can also be used, do not do particular determination to this in the present exemplary embodiment.
By taking the control of the interaction area 102 is head portrait control as an example, in this example embodiment, the interaction area
102 may include multiple head portrait controls, and each head portrait control is associated with an object;For example, head portrait control A0 association pair
As A1, head portrait control B0 affiliated partner B1, head portrait control C0 affiliated partner C1, head portrait control D0 affiliated partner D1 etc..Therefore, lead to
The relevant operation of object A1 can be carried out by crossing head portrait control A0, can carry out the relevant operation of object B1 by head portrait control B0
Deng.
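The one-to-one association between avatar controls and objects described above can be sketched in code. The following Python sketch is purely illustrative; the class and field names (`GameObject`, `AvatarControl`, `created`) are assumptions, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class GameObject:
    """An object that can exist in the scene area."""
    object_id: str
    created: bool = False  # whether it has already been created in the scene

@dataclass
class AvatarControl:
    """A control shown in the interaction area, linked to exactly one object."""
    control_id: str
    obj: GameObject

# One object per control, as in the example: A0->A1, B0->B1, C0->C1, D0->D1.
controls = {cid: AvatarControl(cid, GameObject(oid))
            for cid, oid in [("A0", "A1"), ("B0", "B1"),
                             ("C0", "C1"), ("D0", "D1")]}
```

With such a mapping, an operation on avatar control C0 can be routed to its associated object C1 by a single dictionary lookup.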
Step S12: detect whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judge whether the input initiation event is located on a control.
The input events of the interaction area 102 are detected periodically. Taking a finger touch operation as an example, the input events may include a finger's touch-down action, lift action, and slide action; in all of these operations the finger must first contact the touch screen, i.e. the touch-down action occurs first. For mouse control, the input events may include a mouse press action, release action, and slide action; in all of these operations the mouse must first be pressed, i.e. the press action occurs first. Therefore, in this example embodiment, this initiating action can serve as the input initiation event described herein.
After detecting that the interaction area 102 has an input initiation event, such as a touch-down action, the coordinates of the input initiation event can further be obtained to judge whether the input initiation event is located on a control. In order to give the user feedback, in this example embodiment, after it is judged that the input initiation event is located on an avatar control, the control where the input initiation event is located is highlighted or emphasized in some other way. Referring to Fig. 3A, after it is judged that the input initiation event is located on avatar control C0, and the object C1 associated with avatar control C0 meets the conditions for being given an order (for example, it has not been eliminated), avatar control C0 is displayed highlighted. In addition, in this example embodiment, the input initiation event can also be responded to by obtaining the ID, attributes, and other contents of the object C1 associated with avatar control C0 and attaching them to the avatar of the object C1.
Step S13: when it is judged that the input initiation event is located on a control, detect whether there is a slide operation continuous with the input initiation event.
In this example embodiment, if no lift action is detected after it is judged that the input initiation event is located on a control, and a slide action exceeding a preset distance after the initiating action is detected, for example a slide action greater than 15 pixels, it can be judged that there is a slide operation continuous with the input initiation event. If, after it is judged that the input initiation event is located on the control, no slide action exceeding the preset distance is detected and a lift action is detected, it can be judged that there is a click operation or a long-press operation continuous with the input initiation event. The specific parameters of the above slide-operation judgment can be set as needed by the user, the terminal device manufacturer, or the game service provider, and are not specifically limited in this example embodiment.
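The classification described above can be sketched as a small function. This is an illustrative Python sketch only; the function name and the sampling model (a list of positions observed after the initiation event) are assumptions, while the 15-pixel threshold comes from the example in the text:

```python
import math

SLIDE_THRESHOLD = 15  # pixels, per the example given in the text

def classify_gesture(start, samples, lifted: bool) -> str:
    """Classify a touch sequence that began with an input initiation event.

    start:   position of the initiation event
    samples: positions observed after the initiation event
    lifted:  whether a lift action has been detected

    Returns 'slide' once movement exceeds the threshold, 'click' if the
    hand lifts without exceeding it, otherwise 'pending'.
    """
    for pos in samples:
        if math.dist(start, pos) > SLIDE_THRESHOLD:
            return "slide"
    return "click" if lifted else "pending"
```

A real implementation would also distinguish click from long press by elapsed time, which is omitted here for brevity.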
Step S14: when a slide operation continuous with the input initiation event is detected, obtain the end point of the slide operation track and judge whether the end point of the track is located in the scene area.
In this example embodiment, after the slide operation continuous with the input initiation event is detected, if a lift action is detected it can be judged that the slide operation has ended, and the position of the slide operation track at that moment is the end point. According to the coordinate position of the end point of the track on the screen, it can be judged whether the end point of the track is located in the scene area 101 or in the interaction area 102.
In addition, referring to Fig. 3B, in this example embodiment a cursor object of the object associated with the control where the input initiation event is located can also be created at the current position of the finger or mouse during the slide. For example, in this example embodiment the interaction control method may further include:
When a slide operation continuous with the input initiation event is detected, obtaining the current position of the slide operation and judging whether the current position of the slide operation is located in the scene area 101.
For example, in this example embodiment, after the slide operation continuous with the input initiation event is detected, if no lift action is detected it can be judged that the slide operation is still continuing, and the position of the slide operation at that moment is the current position. According to the coordinates of the current position of the slide operation on the screen, it can be judged whether the current position of the slide operation is located in the scene area 101 or in the interaction area 102.
When it is judged that the current position of the slide operation is located in the scene area 101, creating, at the current position of the slide operation, the cursor object of the object associated with the control where the input initiation event is located.
For example, in this example embodiment, when it is judged that the current position of the slide operation is located in the scene area 101, the coordinate position of the current position of the slide operation on the screen is converted into coordinates in the scene area 101, and the cursor object of the object associated with the control where the input initiation event is located is created at that coordinate position. As shown in Fig. 3B, when the current position of the slide operation is located in the scene area 101, a cursor object C2 of object C1 can be created at the current position of the slide operation; the cursor object then moves as the current position of the slide operation moves.
Further, in this example embodiment, creating the cursor object of the object associated with the control where the input initiation event is located may include removing the object resource on the control where the input initiation event is located from the interaction area 102 and transferring it to the scene area 101, and reusing the object resource to create the cursor object. For example, as shown in Fig. 3B, the script file corresponding to object C1 in avatar control C0 (such as a ccb project file) can be removed from the interaction area 102 and passed to the scene area 101, which reuses the script file to create the cursor object C2 at the coordinates corresponding to the current position of the slide operation. In this way, game resources are saved and stuttering in the game is reduced to some extent.
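The resource hand-off described here (remove from the interaction area, transfer to the scene area, reuse) can be sketched as follows. All class names, and the use of a file name string to stand in for the script resource, are illustrative assumptions; the patent names only the general mechanism:

```python
class InteractionArea:
    """Holds the object resource (e.g. a script file) for each control."""
    def __init__(self):
        self.resources = {"C0": "C1.ccb"}  # illustrative resource per control

    def release(self, control_id: str) -> str:
        # Remove the resource so the interaction area no longer owns it.
        return self.resources.pop(control_id)

class SceneArea:
    """Creates cursor objects from transferred resources."""
    def __init__(self):
        self.cursor_objects = {}

    def create_cursor(self, resource: str, position) -> None:
        # Reuse the transferred resource instead of loading a fresh copy.
        self.cursor_objects[resource] = position

def transfer_and_create(ia: InteractionArea, sa: SceneArea,
                        control_id: str, position) -> str:
    """Move the control's resource to the scene and build the cursor there."""
    resource = ia.release(control_id)
    sa.create_cursor(resource, position)
    return resource
```

The point of the design is single ownership: the resource exists in exactly one area at a time, which is what saves memory and reduces stutter.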
Step S15: when it is judged that the end point of the track is located in the scene area, create, in the scene area, the object associated with the control where the input initiation event is located.
For example, in this example embodiment, when it is judged that the end point of the track is located in the scene area 101, the coordinate position of the end point of the track on the screen is converted into coordinates in the scene area 101, and the object associated with the control where the input initiation event is located is created at that coordinate position. As shown in Fig. 3C, when the end point of the track is located in the scene area 101, the object C1 associated with avatar control C0 can be created at the end point of the track, and components such as AI are given to object C1.
Correspondingly, when it is judged that the end point of the track is located in the interaction area 102, the creation, in the scene area 101, of the object associated with the control where the input initiation event is located can be cancelled. For example, if the end point of the track in Fig. 3B is located in the interaction area 102, the display of the cursor object C2 can be cancelled, the avatar of object C1 shown again in avatar control C0, and the creation of the object C1 associated with avatar control C0 in the scene area 101 cancelled.
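The create-or-cancel decision at the end of the drag can be summarized in one function. This is an illustrative Python sketch; the rectangle representation and the default behavior for points outside both areas are assumptions:

```python
def finish_drag(end_point, scene_rect, interaction_rect) -> str:
    """Decide what happens when the slide ends (lift action detected).

    Returns 'create' when the track end point lies in the scene area,
    and 'cancel' when it falls back inside the interaction area.
    Rectangles are (x, y, width, height).
    """
    def inside(rect, p):
        x, y, w, h = rect
        return x <= p[0] <= x + w and y <= p[1] <= y + h

    if inside(scene_rect, end_point):
        return "create"
    # End point in the interaction area (or anywhere outside the scene
    # area, by assumption): cancel creation and restore the avatar.
    return "cancel"
```

On 'create' the screen coordinates would then be converted to scene-area coordinates and the object built there; on 'cancel' the cursor object is removed and the control's avatar restored.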
In most game applications, the scene area 101 can display only part of the game scene. In this example embodiment, the interaction control method may therefore further include:
Obtaining the current position of the slide operation and judging whether the distance between the current position of the slide operation and an edge of the interface 10 is less than a preset value (for example, less than 20% of the screen size). When the distance between the current position of the slide operation and the edge of the interface 10 is less than the preset value, a timer can be started, and in each frame the game scene is moved toward the edge closest to the current position of the slide operation; for example, if the current position of the slide operation is closest to the right edge of the interface, the game scene is moved to the right. Moreover, the closer the current position of the slide operation is to the edge of the interface 10, the larger the amplitude of each move of the game scene. When the distance between the current position of the slide operation and the edge of the interface 10 becomes greater than the preset value, the timer can be stopped and movement of the game scene halted.
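The edge-scrolling rule (scroll toward the nearest edge, faster the closer the position is to that edge) can be sketched as a per-frame function. The sign convention and the linear speed ramp are illustrative assumptions; the patent specifies only the direction and the "closer means larger amplitude" relation:

```python
def edge_scroll(pos, screen_w, screen_h, margin):
    """Per-frame scene scroll (dx, dy) for a slide position near an edge.

    Returns (0.0, 0.0) when the position is farther than `margin` from
    every edge. The amplitude grows linearly from 0 at the margin
    boundary to 1 at the edge itself (assumed ramp).
    """
    x, y = pos
    dx = dy = 0.0
    if x < margin:                       # near left edge
        dx = -(margin - x) / margin
    elif x > screen_w - margin:          # near right edge
        dx = (x - (screen_w - margin)) / margin
    if y < margin:                       # near top edge
        dy = -(margin - y) / margin
    elif y > screen_h - margin:          # near bottom edge
        dy = (y - (screen_h - margin)) / margin
    return dx, dy
```

A per-frame timer, as in the text, would call this each tick while the slide continues and stop once both components return to zero.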
Further, this example embodiment also provides another interaction control method, for controlling the object created above. Referring to Fig. 4, the interaction control method in this example embodiment may include:
Step S21: provide an interface, the interface including a scene area and an interaction area, the interaction area including at least one control, each control being associated with an object.
Step S22: detect whether the interaction area has an input initiation event and, when the interaction area has the input initiation event, judge whether the input initiation event is located on a control.
Step S23: when it is judged that the input initiation event is located on a control, select, in the scene area, the object associated with the control where the input initiation event is located.
For example, with reference to Fig. 5C, when it is judged that the input initiation event is located on the avatar control C0, if the object C1 associated with the avatar control C0 has already been created, the object C1 can be selected and feedback can be presented to the user, for example by means of a halo effect; if the object C1 associated with the avatar control C0 has not yet been created, the above steps S13 to S15 can be executed.
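The select-or-create decision at steps S22–S23 can be sketched as a hit test over the controls of the interaction area. The control/object data model and function names below are illustrative assumptions, not the patent's implementation:

```python
def on_input_initiation(point, controls, scene):
    """Handle an input initiation event in the interaction area.

    `controls` is a list of controls (e.g. the avatar control C0), each with
    a bounding rect and an associated object; `scene` records which objects
    have been created and which is selected.  All names are illustrative.
    """
    for control in controls:
        x, y, w, h = control["rect"]
        # Hit test: is the input initiation event located on this control?
        if x <= point[0] <= x + w and y <= point[1] <= y + h:
            obj = control["object"]
            if obj in scene["created"]:
                scene["selected"] = obj  # select C1 and show halo feedback
                return "selected"
            # Not yet created: fall through to steps S13-S15 (creation flow).
            return "start_creation"
    return "ignored"  # the event is not on any control
```

A controller would call this on every touch-down in the interaction area and branch into the selection flow or the creation flow accordingly.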
Step S24. when a slide operation continuous with the input initiation event is detected, obtaining the terminal of the track of the slide operation, and judging whether the terminal of the track is located in the scene area.
Step S25. when it is judged that the terminal of the track is located in the scene area, controlling the action of the selected object according to the terminal of the track.
For example, in this example embodiment, when it is judged that the terminal of the track is located in the scene area 101, the coordinate position of the terminal of the track on the screen is converted into a coordinate in the scene area 101, and it is judged whether an object of a preset kind exists within the range of that coordinate; the object of the preset kind can be, for example, an enemy object, a friendly object, or the like. With reference to Fig. 5B, when it is judged that an object of the preset kind (for example, the enemy object E1) exists at the position corresponding to the terminal of the track in the scene area 101, the selected object C1 is controlled to execute a first action, such as an attack; if the object of the preset kind is a friendly object, the first action can be a gain (buff) action or the like. With reference to Fig. 5C, when no object of the preset kind exists at the position corresponding to the terminal of the track in the scene area 101, the selected object C1 is controlled to execute a second action, such as moving to the position corresponding to the terminal of the track.
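The coordinate conversion and dispatch logic of step S25 can be sketched as follows. The simple translation used for screen-to-scene conversion, the circular hit radius, and the object model are assumptions made for illustration:

```python
def dispatch_action(track_end, scene_origin, objects_in_scene, radius=1.0):
    """Decide which action the selected object performs, based on what lies
    at the terminal of the slide track.  A minimal illustrative sketch."""
    # Convert the screen coordinate of the track terminal into a scene-area
    # coordinate (assumed here to be a simple translation by the scene origin).
    sx = track_end[0] - scene_origin[0]
    sy = track_end[1] - scene_origin[1]
    for obj in objects_in_scene:
        ox, oy = obj["pos"]
        # Is an object of a preset kind within range of the terminal?
        if (ox - sx) ** 2 + (oy - sy) ** 2 <= radius ** 2:
            if obj["kind"] in ("enemy", "friend"):
                # First action: e.g. an attack on an enemy object, or a
                # gain/buff action on a friendly object.
                return ("first_action", obj)
    # No object of a preset kind at the terminal: second action,
    # e.g. move the selected object to the terminal position.
    return ("second_action", (sx, sy))
```

Dragging onto the enemy object E1 thus yields the first action (attack), while dragging onto empty ground yields the second action (move to that position).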
Steps of the interaction control method of Fig. 4 that are identical to steps of the interaction control method of Fig. 2 have been described in detail in the exemplary embodiment corresponding to the interaction control method of Fig. 2, and are therefore not repeated here.
The above two interaction control methods, which operate between a dynamic scene control and a static interface control, can, without affecting the existing interaction framework, integrate what were originally two separate operations into one smooth and complete operation, thereby saving the user's operation steps, reducing the user's reaction time, and reducing the user's physical effort, and can therefore greatly improve the user's interactive experience.
Further, this example embodiment additionally provides an interaction control device, applied to a terminal device. With reference to Fig. 6, the interaction control device 1 may include:
a display unit 11, which may be configured to provide an interface, where the interface includes a scene area and an interaction area, the interaction area includes at least one control, and each control is associated with an object;
a first detecting unit 12, which may be configured to detect whether the interaction area has an input initiation event, and when it is detected that the interaction area has the input initiation event, judge whether the input initiation event is located on a control;
a second detecting unit 13, which may be configured to, when it is judged that the input initiation event is located on a control, detect whether there is a slide operation continuous with the input initiation event;
a first judging unit 14, which may be configured to, when a slide operation continuous with the input initiation event is detected, obtain the terminal of the track of the slide operation, and judge whether the terminal of the track is located in the scene area;
a first creating unit 15, which may be configured to, when it is judged that the terminal of the track is located in the scene area, create, in the scene area, the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the present disclosure, the first creating unit 15 may be further configured to:
when it is judged that the terminal of the track is located in the interaction area, cancel the creation, in the scene area, of the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the present disclosure, the interaction control device may further include:
a second judging unit, which may be configured to, when a slide operation continuous with the input initiation event is detected, obtain the current position of the slide operation, and judge whether the current position of the slide operation is located in the scene area;
a second creating unit, which may be configured to, when it is judged that the current position of the slide operation is located in the scene area, create, at the current position of the slide operation, a cursor object of the object associated with the control where the input initiation event is located.
In an exemplary embodiment of the present disclosure, creating the cursor object of the object associated with the control where the input initiation event is located includes:
removing the object resource on the control where the input initiation event is located from the interaction area and transferring it to the scene area, and reusing the object resource to create the cursor object.
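The resource-reuse step described above can be sketched as a transfer between two resource containers; the container model and names below are hypothetical assumptions for illustration only:

```python
def create_cursor_object(control, interaction_area, scene_area):
    """Create the cursor object by moving the control's existing object
    resource from the interaction area into the scene area, instead of
    allocating a new resource.  A minimal sketch with an assumed model."""
    resource = control.pop("resource")            # remove from the control
    interaction_area["resources"].remove(resource)
    scene_area["resources"].append(resource)      # transfer to the scene area
    # Reuse the transferred resource as the cursor object (e.g. C2),
    # which then follows the touch point during the slide operation.
    return {"resource": resource, "follows_touch": True}
```

Reusing the already-loaded resource avoids allocating and loading a second copy of the object's assets while the cursor object is displayed.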
In an exemplary embodiment of the present disclosure, the interaction control device further includes:
a third judging unit, which may be configured to obtain the current position of the slide operation, and judge whether the distance between the current position of the slide operation and the edge of the interface is less than a preset value;
a scene control unit, which may be configured to, when it is judged that the distance between the current position of the slide operation and the edge of the interface is less than the preset value, move the game scene in the direction of the edge closest to the current position of the slide operation.
Further, this example embodiment additionally provides another interaction control device, applied to a terminal device. With reference to Fig. 7, the interaction control device 2 may include:
a display unit 21, which may be configured to provide an interface, where the interface includes a scene area and an interaction area, the interaction area includes at least one control, and each control is associated with an object;
a first detecting unit 22, which may be configured to detect whether the interaction area has an input initiation event, and when it is detected that the interaction area has the input initiation event, judge whether the input initiation event is located on a control;
a second detecting unit 23, which may be configured to, when it is judged that the input initiation event is located on a control, select, in the scene area, the object associated with the control where the input initiation event is located, and detect whether there is a slide operation continuous with the input initiation event;
a judging unit 24, which may be configured to, when a slide operation continuous with the input initiation event is detected, obtain the terminal of the track of the slide operation, and judge whether the terminal of the track is located in the scene area;
a control unit 25, which may be configured to, when it is judged that the terminal of the track is located in the scene area, control the action of the selected object according to the terminal of the track.
In an exemplary embodiment of the present disclosure, controlling the action of the selected object according to the terminal of the track includes:
judging whether an object of a preset kind exists at the position corresponding to the terminal of the track in the scene area;
when it is judged that an object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a first action;
when it is judged that no object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a second action.
The details of each module of the above interaction control devices have been described in detail in the corresponding interaction control methods, and are therefore not repeated here.
It should be noted that although several modules or units of a device for performing actions are mentioned in the above detailed description, this division is not mandatory. In fact, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided and embodied by multiple modules or units.
In addition, although the steps of the methods of the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that these steps must be executed in that particular order, or that all of the illustrated steps must be executed to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein can be implemented by software, or by software in combination with the necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to cause a computing device (which can be a personal computer, a server, a touch terminal, a network device, etc.) to execute the method according to the embodiments of the present disclosure.
Those skilled in the art, after considering the specification and practicing the invention disclosed herein, will readily conceive of other embodiments of the present disclosure. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or conventional technical means in the art not disclosed by the present disclosure. The specification and examples are to be considered exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures that have been described above and shown in the drawings, and
And various modifications and changes may be made without departing from the scope thereof.The scope of the present disclosure is only limited by the accompanying claims.
Claims (6)
1. An interaction control method, comprising:
providing an interface, where the interface includes a scene area and an interaction area, the interaction area includes at least one control, and each control is associated with an object;
detecting whether the interaction area has an input initiation event, and when it is detected that the interaction area has the input initiation event, judging whether the input initiation event is located on a control;
when it is judged that the input initiation event is located on a control, selecting, in the scene area, the object associated with the control where the input initiation event is located, and detecting whether there is a slide operation continuous with the input initiation event;
when a slide operation continuous with the input initiation event is detected, obtaining the terminal of the track of the slide operation, and judging whether the terminal of the track is located in the scene area;
when it is judged that the terminal of the track is located in the scene area, controlling the action of the selected object according to the terminal of the track.
2. The interaction control method according to claim 1, wherein controlling the action of the selected object according to the terminal of the track comprises:
judging whether an object of a preset kind exists at the position corresponding to the terminal of the track in the scene area;
when it is judged that an object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a first action;
when it is judged that no object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a second action.
3. The interaction control method according to claim 1 or 2, wherein the interaction control method further comprises:
obtaining the current position of the slide operation, and judging whether the distance between the current position of the slide operation and the edge of the interface is less than a preset value;
when it is judged that the distance between the current position of the slide operation and the edge of the interface is less than the preset value, moving the game scene in the direction of the edge closest to the current position of the slide operation.
4. An interaction control device, comprising:
a display unit, configured to provide an interface, where the interface includes a scene area and an interaction area, the interaction area includes at least one control, and each control is associated with an object;
a first detecting unit, configured to detect whether the interaction area has an input initiation event, and when it is detected that the interaction area has the input initiation event, judge whether the input initiation event is located on a control;
a second detecting unit, configured to, when it is judged that the input initiation event is located on a control, select, in the scene area, the object associated with the control where the input initiation event is located, and detect whether there is a slide operation continuous with the input initiation event;
a judging unit, configured to, when a slide operation continuous with the input initiation event is detected, obtain the terminal of the track of the slide operation, and judge whether the terminal of the track is located in the scene area;
a control unit, configured to, when it is judged that the terminal of the track is located in the scene area, control the action of the selected object according to the terminal of the track.
5. The interaction control device according to claim 4, wherein controlling the action of the selected object according to the terminal of the track comprises:
judging whether an object of a preset kind exists at the position corresponding to the terminal of the track in the scene area;
when it is judged that an object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a first action;
when it is judged that no object of the preset kind exists at the position corresponding to the terminal of the track in the scene area, controlling the selected object to execute a second action.
6. The interaction control device according to claim 4 or 5, wherein the interaction control device further comprises:
a third judging unit, configured to obtain the current position of the slide operation, and judge whether the distance between the current position of the slide operation and the edge of the interface is less than a preset value;
a scene control unit, configured to, when it is judged that the distance between the current position of the slide operation and the edge of the interface is less than the preset value, move the game scene in the direction of the edge closest to the current position of the slide operation.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910105587.5A CN109857303B (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610362985.1A CN106020633A (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
CN201910105587.5A CN109857303B (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610362985.1A Division CN106020633A (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109857303A true CN109857303A (en) | 2019-06-07 |
CN109857303B CN109857303B (en) | 2021-04-02 |
Family
ID=57094518
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610362985.1A Pending CN106020633A (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
CN201910105587.5A Active CN109857303B (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610362985.1A Pending CN106020633A (en) | 2016-05-27 | 2016-05-27 | Interaction control method and device |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN106020633A (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106422329A (en) * | 2016-11-01 | 2017-02-22 | 网易(杭州)网络有限公司 | Game control method and device |
CN106453638B (en) * | 2016-11-24 | 2018-07-06 | 腾讯科技(深圳)有限公司 | Information interacting method and system in a kind of applied business |
WO2018196552A1 (en) | 2017-04-25 | 2018-11-01 | 腾讯科技(深圳)有限公司 | Method and apparatus for hand-type display for use in virtual reality scene |
CN107168530A (en) * | 2017-04-26 | 2017-09-15 | 腾讯科技(深圳)有限公司 | Object processing method and device in virtual scene |
CN111279303B (en) * | 2017-08-29 | 2024-03-19 | 深圳传音通讯有限公司 | Control triggering method and terminal equipment |
CN110046008B (en) * | 2018-11-20 | 2021-11-23 | 创新先进技术有限公司 | Associated control interaction method and device |
CN110193190B (en) * | 2019-06-03 | 2023-02-28 | 网易(杭州)网络有限公司 | Game object creating method, touch terminal device, electronic device and medium |
CN114237413A (en) * | 2020-09-09 | 2022-03-25 | 华为技术有限公司 | Method and device for processing interaction event |
CN112631492A (en) * | 2020-12-30 | 2021-04-09 | 北京达佳互联信息技术有限公司 | Task creation method and device |
CN113599825B (en) * | 2021-08-10 | 2023-06-20 | 腾讯科技(深圳)有限公司 | Method and related device for updating virtual resources in game |
CN115826828B (en) * | 2023-02-23 | 2023-07-14 | 天津联想协同科技有限公司 | Network disk file operation method, device, terminal and storage medium |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103785170A (en) * | 2012-10-26 | 2014-05-14 | 株式会社得那 | Game providing device |
CN104436657A (en) * | 2014-12-22 | 2015-03-25 | 青岛烈焰畅游网络技术有限公司 | Method and device for controlling game and electronic equipment |
CN105025061A (en) * | 2014-04-29 | 2015-11-04 | 中国电信股份有限公司 | Method and server for constructing cloud-end shared game scene |
CN105194871A (en) * | 2015-09-14 | 2015-12-30 | 网易(杭州)网络有限公司 | Method for controlling game role |
JP2016004500A (en) * | 2014-06-18 | 2016-01-12 | 株式会社コロプラ | User interface program |
CN105582670A (en) * | 2015-12-17 | 2016-05-18 | 网易(杭州)网络有限公司 | Aimed-firing control method and device |
CN105597310A (en) * | 2015-12-24 | 2016-05-25 | 网易(杭州)网络有限公司 | Game control method and device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019426A (en) * | 2011-09-28 | 2013-04-03 | 腾讯科技(深圳)有限公司 | Interacting method and interacting device in touch terminal |
CN104182880B (en) * | 2014-05-16 | 2015-10-28 | 孙锋 | A kind of net purchase method and system based on true man and/or 3D model in kind |
Also Published As
Publication number | Publication date |
---|---|
CN109857303B (en) | 2021-04-02 |
CN106020633A (en) | 2016-10-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109857303A (en) | Interaction control method and device | |
CA2822812C (en) | Systems and methods for adaptive gesture recognition | |
CN105630374A (en) | Virtual character control mode switching method and device | |
CN106325668B (en) | Touch event response processing method and system | |
US20060288314A1 (en) | Facilitating cursor interaction with display objects | |
CN105824531B (en) | Numerical value method of adjustment and device | |
CN108829327B (en) | Writing method and device of interactive intelligent equipment | |
CN103399640B (en) | Method and device for controlling according to user gesture and client | |
JPWO2015040861A1 (en) | Electronic device, control method and program for electronic device | |
CN104360798A (en) | Method and terminal for desktop arrangement | |
KR102205283B1 (en) | Electro device executing at least one application and method for controlling thereof | |
US20150363037A1 (en) | Control method of touch panel | |
CN107797722A (en) | Touch screen icon selection method and device | |
JP2016528600A (en) | How to select parts of a graphical user interface | |
CN106775213B (en) | A kind of method and terminal switching desktop | |
CN107870720A (en) | View switching method, device and client device for touch-screen | |
CN108228020A (en) | A kind of information processing method and terminal | |
CN105760077B (en) | Game control method and device | |
CN104898880A (en) | Control method and electronic equipment | |
CN103809793B (en) | Information processing method and electronic device | |
CN103634631B (en) | Icon method for selecting based on remote control touch screen and system | |
WO2017044669A1 (en) | Controlling a device | |
CN103092615A (en) | Task preview method and device | |
CN104615342B (en) | A kind of information processing method and electronic equipment | |
CN109426424A (en) | A kind of operating method of terminal device, device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||