CN108415570A - Control selection method and device based on augmented reality - Google Patents

Control selection method and device based on augmented reality

Info

Publication number
CN108415570A
Authority
CN
China
Prior art keywords
target widget
terminal
target
control
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810187008.1A
Other languages
Chinese (zh)
Other versions
CN108415570B (en)
Inventor
吴志武
雷月雯
申文迪
姜帆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN201810187008.1A priority Critical patent/CN108415570B/en
Publication of CN108415570A publication Critical patent/CN108415570A/en
Application granted granted Critical
Publication of CN108415570B publication Critical patent/CN108415570B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control selection method and device based on augmented reality. The method includes: displaying a target control in a virtual scene; and, during movement of a terminal toward the target control, controlling the target control to be in a selected state when a first predetermined condition is met. The invention solves the technical problem in the related art that arranging UI controls on the display screen of the terminal in AR applications causes the AR scene to be displayed incompletely.

Description

Control selection method and device based on augmented reality
Technical field
The present invention relates to the field of augmented reality, and in particular to a control selection method and device based on augmented reality.
Background technology
Augmented reality (AR) is a technology that "seamlessly" integrates real-world information with virtual-world information. Entity information that would otherwise be difficult to experience within a certain time and space of the real world (visual information, sound, taste, touch, and the like) is simulated by computers and other technologies and then superimposed, so that virtual information is applied to the real world and perceived by human senses, achieving a sensory experience beyond reality; the real environment and virtual objects are thus added to the same picture or space in real time. AR is widely applied. At present, AR on mobile terminal devices (such as mobile phones) requires the device to be held firmly in the user's hand, and the user must look through the phone screen to see the virtual image, so selection and interaction are generally still performed through the phone touch screen. This greatly limits the spatial advantage of AR applications and confines user interaction to the phone screen, which not only causes the AR scene to be displayed incompletely, but also makes operation through the phone screen less efficient and more error-prone.
For the problem in the related art that arranging UI controls on the display screen of the terminal in AR applications causes the AR scene to be displayed incompletely, no effective solution has yet been proposed.
Summary of the invention
The embodiments of the present invention provide a control selection method and device based on augmented reality, so as to at least solve the technical problem in the related art that arranging UI controls on the display screen of the terminal in AR applications causes the AR scene to be displayed incompletely.
According to one aspect of the embodiments of the present invention, a control selection method based on augmented reality is provided, including: displaying a target control in a virtual scene; and, during movement of a terminal toward the target control, controlling the target control to be in a selected state when a first predetermined condition is met.
Further, meeting the first predetermined condition includes: the terminal is located within a first predetermined angle range in front of the target control, the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within a second predetermined angle range, and the distance between the terminal and the target control is less than a first threshold, where the direction of the normal vector of the plane where the terminal is located is the negative direction of the orientation of the terminal's display screen, the direction of the normal vector of the plane where the target control is located is the direction the front of the target control faces, and the front of the target control is the side facing the terminal.
Further, when the result of the dot product of a relative position vector and the normal vector of the plane where the target control is located is greater than a second threshold, it is determined that the terminal is located within the first predetermined angle range in front of the target control, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal; when the result of the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than a third threshold, it is determined that the angle between the two normal vectors is within the second predetermined angle range; and when the length of the relative position vector is less than the first threshold, it is determined that the distance between the terminal and the target control is less than the first threshold.
Further, when the target control is in the selected state, the method further includes: marking the target control as selected, and determining display parameters of the target control according to the distance between the terminal and the target control.
Further, when the target control is in the selected state, the method further includes: controlling the target control to be in an activated state when a second predetermined condition is met.
Further, meeting the second predetermined condition includes: the distance between the terminal and the target control is less than a fourth threshold, where the fourth threshold is less than the first threshold.
Further, when the target control is in the activated state, the method further includes: controlling the target control to be in a triggered state when a third predetermined condition is met.
Further, meeting the third predetermined condition includes: the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within a third predetermined angle range, and the distance between the terminal and the target control is less than a fifth threshold, where the motion direction vector of the terminal is the vector pointing from the position of the terminal in the previous frame to its position in the current frame, and the fifth threshold is less than the fourth threshold.
Further, when the result of the dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is less than a sixth threshold, it is determined that the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within the third predetermined angle range; and when the length of the relative position vector is less than the fifth threshold, it is determined that the distance between the terminal and the target control is less than the fifth threshold, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal.
Further, when the target control is in the triggered state, the method further includes: prompting the user about the target control that has been successfully triggered.
Further, when the target control is in the selected state, the method further includes: controlling the target control to be in a triggered state when a fourth predetermined condition is met.
Further, meeting the fourth predetermined condition includes: the distance between the terminal and the target control is less than a seventh threshold, where the seventh threshold is less than the first threshold.
Further, when the target control is in the selected state, the method further includes: binding the terminal to the target control; and controlling the target control with the terminal to move in the virtual scene.
Further, controlling the target control with the terminal to move in the virtual scene includes at least one of the following: the target control moves with the terminal along a predetermined trajectory; the target control rotates with the terminal; and the target control shakes with the terminal at the position of the target control.
Further, during the process of controlling the target control with the terminal to move in the virtual scene, the method further includes: obtaining a control operation performed on the terminal, where the control operation indicates releasing the binding relationship between the terminal and the target control; releasing the binding relationship between the terminal and the target control in response to the control operation, and obtaining the position to which the target control has currently been moved; when the position to which the target control has currently been moved is a first position, updating the position of the target control to the first position, where the first position is a position where the target control is allowed to be located; and when the position to which the target control has currently been moved is a second position, restoring the position of the target control to its original position, where the second position is a position where the target control is not allowed to be located.
According to another aspect of the embodiments of the present invention, a control selection device based on augmented reality is further provided, including: a display unit configured to display a target control in a virtual scene; and a selecting unit configured to, during movement of a terminal toward the target control, control the target control to be in a selected state when a first predetermined condition is met.
According to another aspect of the embodiments of the present invention, a storage medium is further provided, the storage medium including a stored program, where the program, when run, executes the control selection method based on augmented reality described in any one of the above.
According to another aspect of the embodiments of the present invention, a processor is further provided, the processor being configured to run a program, where the program, when run, executes the control selection method based on augmented reality described in any one of the above.
According to another aspect of the embodiments of the present invention, a terminal is further provided, including a memory and a processor, where a computer program is stored in the memory, and the processor is arranged to run the computer program to execute the control selection method based on augmented reality described in any one of the above.
In the embodiments of the present invention, a target control is displayed in a virtual scene, and, during movement of the terminal toward the target control, the target control is controlled to be in a selected state when a first predetermined condition indicating that selection of the target control should be triggered is met. This achieves the purpose of no longer needing to operate the target control on the terminal screen, thereby solving the technical problem in the related art that arranging UI controls on the display screen of the terminal in AR applications causes the AR scene to be displayed incompletely, ensuring that the terminal displays the complete AR scene, and improving the efficiency of operating the target control.
Description of the drawings
The drawings described herein are provided for a further understanding of the present invention and constitute a part of this application. The illustrative embodiments of the present invention and their description are used to explain the present invention and do not constitute improper limitations on the present invention. In the drawings:
Fig. 1 is a schematic diagram of a hardware environment of the control selection method based on augmented reality according to an embodiment of the present invention;
Fig. 2 is a flowchart of the control selection method based on augmented reality according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the normal vector of the plane where the target control is located according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of the relative position vector according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the normal vector of the plane where the terminal is located according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the first predetermined angle according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the second predetermined angle according to an embodiment of the present invention;
Fig. 8 is a flowchart of a method for touch selection in a mobile device AR application according to a preferred embodiment of the present invention; and
Fig. 9 is a schematic diagram of the control selection device based on augmented reality according to an embodiment of the present invention.
Detailed description of the embodiments
In order to enable those skilled in the art to better understand the solution of the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of protection of the present invention.
It should be noted that the terms "first", "second", and the like in the description, claims, and drawings of this specification are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data used in this way may be interchanged where appropriate, so that the embodiments of the present invention described herein can be implemented in orders other than those illustrated or described herein. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units that are not explicitly listed or that are inherent to such a process, method, product, or device.
According to an embodiment of the present invention, a method embodiment of a control selection method based on augmented reality is provided. It should be noted that the steps shown in the flowcharts of the drawings may be executed in a computer system such as a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed in an order different from that described herein.
Optionally, in this embodiment, the above control selection method based on augmented reality may be applied to the hardware environment shown in Fig. 1, which consists of a server 102 and a terminal 104. As shown in Fig. 1, the server 102 is connected to one or more terminals 104 (Fig. 1 shows only one terminal) through a network, where the network includes but is not limited to a wide area network, a metropolitan area network, or a local area network. The terminal 104 is not limited to a PC, a mobile phone, a tablet computer, or the like, and may be a terminal with AR functions. The control selection method based on augmented reality of the embodiment of the present invention may be executed by the terminal 104, or may be executed by a client installed on the terminal 104.
Fig. 2 is a flowchart of the control selection method based on augmented reality according to an embodiment of the present invention. As shown in Fig. 2, the method includes the following steps:
Step S102: display a target control in a virtual scene;
Step S104: during movement of a terminal toward the target control, control the target control to be in a selected state when a first predetermined condition is met.
Through the above steps, a target control is displayed in a virtual scene, and, during movement of the terminal toward the target control, the target control is controlled to be in a selected state when a first predetermined condition indicating that selection of the target control should be triggered is met. This achieves the purpose of operating the target control without using the terminal screen, thereby solving the technical problem in the related art that arranging UI controls on the display screen of the terminal in AR applications causes the AR scene to be displayed incompletely, ensuring that the terminal displays the complete AR scene, and improving the efficiency of operating the target control.
In the solution provided in step S102, the target control may be a UI control through which the user performs relevant control over the augmented reality. The embodiment of the present invention does not specifically limit the shape of the target control; the target control may be rectangular, circular, or the like. The target control in the embodiment of the present invention may include at least one target element, where each target element may correspond to a UI function. The target control may be displayed in a virtual scene, where the virtual scene here is a three-dimensional virtual space. It should be noted that the target control is essentially an interactable solid, so the embodiment of the present invention may establish the target control in the three-dimensional virtual space; the embodiment of the present invention does not specifically limit the technical means used to display the target control in the virtual scene. In the embodiment of the present invention, the target control is included in the virtual scene rather than in the terminal, so that the display screen of the terminal can present the AR scene completely, thereby improving the user experience.
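For illustration only, one way to represent such a control placed in the three-dimensional virtual space is sketched below; the class name, field names, and default values are assumptions made for this sketch rather than anything specified in this disclosure:

```python
from dataclasses import dataclass, field
from enum import Enum, auto
import numpy as np


class ControlState(Enum):
    IDLE = auto()       # not yet selected
    SELECTED = auto()   # first predetermined condition met
    ACTIVATED = auto()  # second predetermined condition met
    TRIGGERED = auto()  # third (or fourth) predetermined condition met


@dataclass
class TargetControl:
    """A UI control placed in the 3D virtual scene rather than on the terminal screen."""
    center: np.ndarray                             # center point O of the control plane
    normal: np.ndarray                             # normal vector OM, facing the terminal side
    size: tuple = (0.2, 0.1)                       # width and height in scene units (illustrative)
    state: ControlState = ControlState.IDLE
    elements: list = field(default_factory=list)   # target elements, one per UI function
```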
In the solution provided in step S104, the embodiment of the present invention may use the terminal to manipulate the target control displayed in the virtual scene. Specifically, during movement of the terminal toward the target control, the embodiment of the present invention may judge in real time whether a first predetermined condition is met, where the first predetermined condition may be used to indicate that selection of the target control should be triggered. It should be noted that the first predetermined condition may be set according to actual demands.
Optionally, in the embodiment of the present invention, it can be determined that the first predetermined condition is met only when the terminal is located within a first predetermined angle range in front of the target control, the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within a second predetermined angle range, and the distance between the terminal and the target control is less than a first threshold. Optionally, the normal vector of the plane where the terminal is located may be the normal vector passing through the center point of the terminal, and its direction may be the negative direction of the orientation of the terminal's display screen. Optionally, the terminal may be rectangular, in which case the center point of the terminal is the intersection of the diagonals of the rectangle. Optionally, the normal vector of the plane where the target control is located may be the normal vector passing through the center point of the target control, and its direction may be the direction the front of the target control faces, where the front of the target control is the side facing the terminal. Optionally, the target control may also be rectangular, in which case its center point is the intersection of the diagonals of the rectangle.
Optionally, when the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than a second threshold, it may be determined that the terminal is located within the first predetermined angle range in front of the target control, where the relative position vector may be the direction vector pointing from the center point of the target control to the position of the terminal; when the result of the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than a third threshold, it may be determined that the angle between the two normal vectors is within the second predetermined angle range; and when the length of the relative position vector is less than the first threshold, it is determined that the distance between the terminal and the target control is less than the first threshold.
That is, the embodiment of the present invention may determine whether the first predetermined condition is met by checking whether the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than the second threshold, whether the result of the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than the third threshold, and whether the length of the relative position vector is less than the first threshold; only when all three conditions are satisfied can it be determined that the first predetermined condition is met. It should be noted here that the first threshold, the second threshold, and the third threshold may be set or adjusted according to actual demands and are not specifically limited herein.
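A minimal sketch of these three checks, written in Python with NumPy (the function name, argument names, and the packaging into a single function are assumptions for illustration, not part of this disclosure):

```python
import numpy as np


def first_condition_met(widget_center, widget_normal, terminal_pos, terminal_normal,
                        first_threshold, second_threshold, third_threshold):
    """Return True when the first predetermined condition (selection) is met.

    widget_normal:   normal vector OM of the plane where the target control lies,
                     pointing toward the terminal side (the control's front).
    terminal_normal: normal vector QN of the plane where the terminal lies,
                     i.e. the negative of the display-screen orientation.
    """
    # Relative position vector OQ: from the control's center point to the terminal.
    relative_position = terminal_pos - widget_center

    # 1) Terminal lies within the first predetermined angle range in front of the control.
    in_front = np.dot(relative_position, widget_normal) > second_threshold
    # 2) Angle between the two plane normals lies within the second predetermined angle range.
    facing = np.dot(terminal_normal, widget_normal) < third_threshold
    # 3) Terminal is close enough to the control.
    near = np.linalg.norm(relative_position) < first_threshold

    return in_front and facing and near
```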
To facilitate understanding of the first predetermined condition, each vector and each angle involved in the first predetermined condition is described in detail below with reference to the specific drawings.
As shown in Fig. 3, the diagonal intersections of the target control 10 and the terminal 20 are the center point O and the center point Q, respectively. The normal vector of the plane where the target control is located is the normal vector passing through the center point of the target control 10, namely the vector OM shown in Fig. 3; its direction is the direction the front of the target control 10 faces, and the front of the target control 10 is the side facing the terminal 20.
As shown in Fig. 4, the diagonal intersections of the target control 10 and the terminal 20 are the center point O and the center point Q, respectively. The relative position vector is the direction vector pointing from the center point of the target control 10 to the position of the terminal 20; optionally, the relative position vector is the direction vector pointing from the center point of the target control 10 to the center point of the terminal 20, namely the vector OQ shown in Fig. 4.
As shown in Fig. 5, the diagonal intersections of the target control 10 and the terminal 20 are the center point O and the center point Q, respectively. The normal vector of the plane where the terminal is located is the normal vector passing through the center point of the terminal 20, namely the vector QN shown in Fig. 5; its direction is the direction the front of the terminal 20 faces, and the front of the terminal 20 is the side facing the target control 10.
Regarding the terminal being located within the first predetermined angle in front of the target control, it should be noted that "in front of the target control" refers to the side of the target control facing the terminal, and the first predetermined angle is the angle between the relative position vector and the normal vector of the plane where the target control is located, i.e. the angle α between the vector OQ and the vector OM shown in Fig. 6. As shown in Fig. 6, the smaller α is allowed to be, the more directly in front of the target control the terminal must be in order to meet the first predetermined condition.
Regarding the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located being within the second predetermined angle range, it should be noted that the second predetermined angle is the angle β between the vector QN and the vector OM shown in Fig. 7. The larger β is required to be, the closer to parallel the plane where the terminal 20 is located must be to the plane where the target control 10 is located in order to meet this condition.
It should be noted that, in the embodiments of the present invention, the first predetermined angle and the second predetermined angle are not specifically limited and may be set or adjusted according to actual demands. For example, the first predetermined angle may be 0-90 degrees, and the second predetermined angle may be 90-180 degrees.
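Purely as an illustration (and under the assumption, not stated in this embodiment, that the vectors entering the dot products are first normalized to unit length), the thresholds of the preceding paragraphs can be read as bounds on the angles α and β:

\[
\widehat{OQ}\cdot\widehat{OM}=\cos\alpha > T_2 \;\Longleftrightarrow\; \alpha < \arccos T_2,
\qquad
\widehat{QN}\cdot\widehat{OM}=\cos\beta < T_3 \;\Longleftrightarrow\; \beta > \arccos T_3,
\]

where T_2 and T_3 denote the second and third thresholds. Choosing, for example, T_2 = 0 and T_3 = 0 would correspond to the example ranges above of 0-90 degrees for the first predetermined angle and 90-180 degrees for the second predetermined angle.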
During movement of the terminal toward the target control, if it is judged that the first predetermined condition is met, the embodiment of the present invention may control the target control to be in the selected state; if it is judged that the first predetermined condition is not met, selection of the target control will not be triggered.
In the embodiment of the present invention, the target control is displayed in the virtual scene, and movement of the terminal toward the target control is used to trigger selection of the target control. In this way, the terminal can display the AR scene completely, and, compared with the user manually clicking on the terminal screen to select the target control, using terminal movement to trigger selection of the target control can effectively reduce the error rate and also improve operating efficiency to a certain extent.
As an optional embodiment, when the target control is in the selected state, the terminal may continue to move toward the target control, and during the movement it may be judged in real time whether a second predetermined condition is met, where the second predetermined condition may be used to indicate that the target control should be controlled to be in an activated state. The embodiment of the present invention does not specifically limit the second predetermined condition, which may be set or adjusted according to the demands of the actual application scenario.
Optionally, it can be determined that the second predetermined condition is met only when the terminal is located within the first predetermined angle range in front of the target control, the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within the second predetermined angle range, and the distance between the terminal and the target control is less than a fourth threshold. Here, when the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than the second threshold, it may be determined that the terminal is located within the first predetermined angle range in front of the target control; when the result of the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than the third threshold, it may be determined that the angle between the two normal vectors is within the second predetermined angle range; and when the length of the relative position vector is less than the fourth threshold, it is determined that the distance between the terminal and the target control is less than the fourth threshold. The fourth threshold here is less than the first threshold and may also be set or adjusted according to the demands of the actual application scenario, which is not specifically limited herein.
When the target control is in the selected state, if it is judged that the second predetermined condition is met, the embodiment of the present invention may control the target control to be in the activated state, thereby achieving the purpose of manipulating the target control with the terminal.
As an optional embodiment, when the target control is in the selected state, if it is judged that the terminal is located within the first predetermined angle range in front of the target control, that the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within the second predetermined angle range, and that the distance between the terminal and the target control is still greater than the fourth threshold, the embodiment of the present invention may mark the target control as selected, where the display parameters of the selected state are determined by the distance between the terminal and the target control.
It should be noted that, after it is judged that the first predetermined condition is met, the target control is selected. The target control may then be marked as selected, for example by becoming highlighted or by increasing in size; the form of expression of the selected state of the target control is not specifically limited herein. After the target control is marked as selected, while the terminal continues to move toward the target control, if the distance between the terminal and the target control is not yet less than the fourth threshold, that is, before the second predetermined condition for activating the touch event of the target control is met, the display parameters of the selected state of the target control may change correspondingly with the distance between the terminal and the target control. For example, as the terminal gets closer to the target control, the target control may appear brighter and brighter, or may appear larger and larger; correspondingly, if the terminal moves farther from the target control, the target control may appear darker and darker, or may appear smaller and smaller.
By changing the selected state of the target control with the distance between the terminal and the target control as described above, the user can be shown intuitively and clearly how to activate the touch event of the target control, and the interactive experience between the user and the target control can be improved.
As an optional embodiment, when the target control is in the activated state, the terminal may continue to move toward the target control, and during the movement it may be judged in real time whether a third predetermined condition is met, where the third predetermined condition may be used to indicate that a target element in the target control should be triggered, so as to trigger a certain UI function. The embodiment of the present invention does not specifically limit the third predetermined condition, which may be set or adjusted according to the demands of the actual application scenario.
Optionally, it can be determined that the third predetermined condition is met only when the terminal is located within the first predetermined angle range in front of the target control, the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within a third predetermined angle range, and the distance between the terminal and the target control is less than a fifth threshold. Here, the motion direction vector of the terminal is the vector pointing from the position of the terminal in the previous frame to its position in the current frame.
Optionally, when the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than the second threshold, it may be determined that the terminal is located within the first predetermined angle range in front of the target control; when the result of the dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is less than a sixth threshold, it may be determined that the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within the third predetermined angle range; and when the length of the relative position vector is less than the fifth threshold, it may be determined that the distance between the terminal and the target control is less than the fifth threshold.
That is, the embodiment of the present invention may determine whether the third predetermined condition is met by checking whether the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than the second threshold, whether the result of the dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is less than the sixth threshold, and whether the length of the relative position vector is less than the fifth threshold; only when all three conditions are satisfied can it be determined that the third predetermined condition is met. It should be noted here that the fifth threshold may be less than the fourth threshold, and the fifth threshold and the sixth threshold may be set or adjusted according to actual demands and are not specifically limited herein.
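A minimal sketch of this third-condition check (names are again assumptions; the motion direction vector is taken, as described above, from the previous-frame position to the current-frame position, and the comparison with the sixth threshold follows the convention stated in this embodiment):

```python
import numpy as np


def third_condition_met(widget_center, widget_normal, prev_terminal_pos, curr_terminal_pos,
                        second_threshold, fifth_threshold, sixth_threshold):
    """Return True when the third predetermined condition (trigger) is met."""
    relative_position = curr_terminal_pos - widget_center
    # Motion direction vector: previous-frame terminal position -> current-frame position.
    motion_direction = curr_terminal_pos - prev_terminal_pos

    in_front = np.dot(relative_position, widget_normal) > second_threshold
    # Moving toward the control: the dot product of the motion direction vector and the
    # control-plane normal must fall below the sixth threshold, as described above.
    moving_toward = np.dot(motion_direction, widget_normal) < sixth_threshold
    near = np.linalg.norm(relative_position) < fifth_threshold

    return in_front and moving_toward and near
```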
When the target control is in the activated state and the terminal continues to move toward the target control, if it is judged that the third predetermined condition is met, the embodiment of the present invention may control the target control to be in a triggered state, thereby triggering the target element in the target control; if it is judged that the third predetermined condition is not met, the target element in the target control will not be triggered.
As an optional embodiment, after the target control is controlled to be in the triggered state, the method may further include: prompting the user about the target control that has been successfully triggered.
It should be noted that, if a certain target element of the target control in the virtual scene has been successfully triggered, feedback may be prompted to the user. The embodiment of the present invention does not specifically limit the form of the prompt; for example, the user may be informed that a certain target element in the target control has been successfully triggered by terminal vibration, by the terminal playing a sound effect, by scaling of the target control, or the like. By giving the user feedback after a target element in the target control is successfully triggered, this embodiment can inform the user that the trigger operation has succeeded and that the terminal can be taken away, thereby improving the user experience.
As an optional embodiment, when the target control is in the selected state, the terminal may continue to move toward the target control, and during the movement it may be judged in real time whether a fourth predetermined condition is met, where the fourth predetermined condition may be used to indicate that a target element in the target control should be triggered, so as to trigger a certain UI function. The embodiment of the present invention does not specifically limit the fourth predetermined condition, which may be set or adjusted according to the demands of the actual application scenario.
Optionally, it can be determined that the fourth predetermined condition is met only when the terminal is located within the first predetermined angle range in front of the target control, the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within the second predetermined angle range, and the distance between the terminal and the target control is less than a seventh threshold, where the seventh threshold is less than the first threshold.
When the target control is in the selected state and the terminal continues to move toward the target control, if it is judged that the fourth predetermined condition is met, the embodiment of the present invention may directly control the target control to be in the triggered state, thereby triggering the target element in the target control. Compared with selecting the target control, then activating it, and only then triggering the target element in it, directly triggering the target element after the target control is selected can save time and simplify the operation.
As an optional embodiment, when the target control is in the selected state, the method may further include: binding the terminal to the target control; and controlling the target control with the terminal to move in the virtual scene.
It should be noted that, after the terminal is used to trigger selection of the target control, the embodiment of the present invention may establish a binding relationship between the terminal and the target control, that is, a parent-child relationship between the terminal and the target control, in which the target control becomes a child object of the terminal. Optionally, the binding relationship between the terminal and the target control may be established automatically after the terminal triggers selection of the target control; alternatively, after the terminal triggers selection of the target control, the user may establish the binding relationship between the terminal and the target control by performing a touch operation on the terminal, where the embodiment of the present invention does not specifically limit the type of touch operation performed by the user on the terminal, for example a click, a long press, or a gesture performed on the terminal screen, or an operation such as shaking the terminal.
After the terminal and the target control are bound, the target control can be controlled with the terminal to move in the virtual scene. Optionally, controlling the target control with the terminal to move in the virtual scene may include at least one of the following: the target control moves with the terminal along a predetermined trajectory; the target control rotates with the terminal; and the target control shakes with the terminal at the position of the target control. It should be noted here that, after the binding relationship between the terminal and the target control is established, the target control moves along with the terminal as the terminal moves. It should also be noted that rotation of the target control may include two trigger modes: in one mode the terminal is rotated and the target control rotates with it; in the other mode the terminal screen is slid and the target control rotates about an axis, where the axis is parallel to the plane of the terminal screen, passes through the axis center of the target control, and is perpendicular to the sliding trace of each frame, and the rotation direction is established according to the sliding direction.
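One possible realization of the second rotation mode (rotation about an axis that lies in the screen plane, passes through the control's center, and is perpendicular to the per-frame sliding trace) is sketched below; constructing the axis with a cross product, applying Rodrigues' rotation formula, and making the angle proportional to the swipe length are all assumptions of this sketch rather than requirements of the embodiment:

```python
import numpy as np


def rotate_widget_from_swipe(widget_vertices, widget_center, screen_normal,
                             swipe_direction, swipe_length, degrees_per_unit=90.0):
    """Rotate the target control about an axis perpendicular to the swipe trace.

    widget_vertices is an (N, 3) array of points belonging to the control.
    The axis is perpendicular to both the screen normal and the swipe direction,
    so it lies in the screen plane, and it passes through the control's center.
    The rotation angle is taken to be proportional to the swipe length (illustrative).
    """
    axis = np.cross(screen_normal, swipe_direction)
    axis = axis / np.linalg.norm(axis)
    angle = np.radians(degrees_per_unit * swipe_length)

    # Rodrigues' rotation formula: R = I + sin(a) K + (1 - cos(a)) K^2.
    k = axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    R = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

    # Rotate each vertex about the axis passing through the control's center.
    return (widget_vertices - widget_center) @ R.T + widget_center
```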
By establishing the binding relationship between the terminal and the target control, this embodiment can control the target control with the terminal to move, which makes it convenient to control the target control in the virtual scene, greatly facilitates user operation, and improves operating efficiency.
As an optional embodiment, during the process of controlling the target control with the terminal to move in the virtual scene, this optional embodiment may further include: obtaining a control operation performed on the terminal, where the control operation indicates releasing the binding relationship between the terminal and the target control; releasing the binding relationship between the terminal and the target control in response to the control operation, and obtaining the position to which the target control has currently been moved; when the position to which the target control has currently been moved is a first position, updating the position of the target control to the first position, where the first position is a position where the target control is allowed to be located; and when the position to which the target control has currently been moved is a second position, restoring the position of the target control to its original position, where the second position is a position where the target control is not allowed to be located.
It should be noted that, during the process of controlling the target control with the terminal to move in the virtual scene, the embodiment of the present invention may release the binding relationship between the terminal and the target control according to actual demands, specifically through a control operation performed on the terminal. This embodiment also does not specifically limit the control operation performed on the terminal, which may for example be a click, a long press, or a gesture performed on the terminal screen, or shaking the terminal. After the user performs the control operation on the terminal, the embodiment may respond to it, release the binding relationship between the terminal and the target control, and obtain the position to which the target control has currently been moved. If the position to which the target control has currently been moved is a first position, the embodiment may update the position of the target control to the first position, where the first position is a position where the target control is allowed to be located; if the position to which the target control has currently been moved is a second position, the embodiment may restore the position of the target control to its original position, where the second position is a position where the target control is not allowed to be located. Optionally, this embodiment may also mark the first position and the second position with different colors, for example marking the first position in green and the second position in red, or making the target control turn green when it moves to the first position and turn red when it moves to the second position.
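A hedged sketch of this unbinding logic follows; the object attributes, the is_position_allowed predicate, and the color marking are illustrative placeholders, since how allowed and disallowed positions are defined is left to the application:

```python
def release_binding(terminal, widget, is_position_allowed):
    """Unbind the target control from the terminal and settle its position.

    If the position the control has been moved to is allowed (a "first position"),
    keep it there; otherwise (a "second position") restore the original position.
    """
    terminal.bound_widget = None            # break the parent-child relationship
    current_pos = widget.position           # position the control was dragged to

    if is_position_allowed(current_pos):
        widget.position = current_pos       # first position: keep the new placement
        widget.color = "green"              # optional marking of an allowed position
    else:
        widget.position = widget.original_position  # second position: snap back
        widget.color = "red"                # optional marking of a disallowed position
```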
In this embodiment, movement of the target control in the virtual scene can be achieved using the terminal, and by judging whether the position currently moved to is valid, the positions to which the target control can be moved can be managed effectively, which greatly facilitates user operation and effectively improves the user experience.
The present invention further provides a preferred embodiment, which provides a method for touch selection in a mobile device AR application.
Fig. 8 is a flowchart of the method for touch selection in a mobile device AR application according to the preferred embodiment of the present invention. As shown in Fig. 8, the method may specifically include the following steps:
Step S302: a mobile device (such as a mobile phone or a tablet computer) starts to approach the front of a UI control.
Step S304: judge whether the mobile device is within the angle threshold range A directly in front of the UI control, whether the angle between the mobile device normal vector and the UI control normal vector is within the threshold range B, and whether the distance between the mobile device and the UI control is less than a threshold D1. If the judgment result is yes, step S306 is executed to trigger selection of the UI control. It should be noted here that the mobile device normal vector is the normal vector of the plane where the above terminal is located, and the UI control normal vector is the normal vector of the plane where the above target control is located.
Step S306: trigger selection of the UI control.
Step S308: judge whether the mobile device is within the angle threshold range A directly in front of the UI control, whether the angle between the mobile device normal vector and the UI control normal vector is within the threshold range B, and whether the distance between the mobile device and the UI control is less than a threshold D2. If the judgment result is yes, step S310 is executed to activate the touch UI event.
Step S310: activate the touch UI event.
Step S312: judge whether the mobile device is within the angle threshold range A directly in front of the UI control, whether the angle between the motion direction vector and the UI control normal vector is within the threshold range C, and whether the distance between the mobile device and the UI control is less than a threshold D3. If the judgment result is yes, step S314 is executed to trigger the UI function.
Step S314: trigger the UI function.
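Read as a whole, steps S302 to S314 form a small state machine for the UI control. The sketch below (in Python; the enum names and the abstraction of the three checks into boolean inputs are assumptions made for illustration) is one possible way to organize that flow:

```python
from enum import Enum, auto


class UiControlState(Enum):
    IDLE = auto()
    SELECTED = auto()   # after step S306
    ACTIVATED = auto()  # after step S310
    TRIGGERED = auto()  # after step S314


def step(state, selection_ok, activation_ok, trigger_ok):
    """Advance the UI-control state for one frame.

    selection_ok, activation_ok and trigger_ok are booleans corresponding to the
    checks in steps S304, S308 and S312 respectively (angle range A, angle range B
    or motion-direction range C, and distances D1/D2/D3).
    """
    if state is UiControlState.IDLE and selection_ok:
        return UiControlState.SELECTED          # S306: trigger selection of the UI control
    if state is UiControlState.SELECTED and activation_ok:
        return UiControlState.ACTIVATED         # S310: activate the touch UI event
    if state is UiControlState.ACTIVATED and trigger_ok:
        return UiControlState.TRIGGERED         # S314: trigger the UI function
    return state
```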
For the above step S304, triggering the selected state requires the following three conditions to be met simultaneously: the mobile device is within a certain angle range directly in front of the UI control, the mobile device is looking toward the UI control, and the distance between the mobile device and the UI control is within a certain range. As the distance between the mobile device and the UI control decreases, the UI control presents a selection effect whose intensity is linearly related to the distance.
Optionally, judging whether to trigger the selected state requires detecting whether the following three conditions are met simultaneously: the result of the dot product of the relative position vector and the UI control normal vector is greater than A1; the result of the dot product of the mobile device normal vector and the UI control normal vector is less than A2; and the length of the relative position vector is less than D1.
Here, the relative position vector refers to the relative position vector between the mobile device position and the UI control center point, defined as the vector starting at the UI control center point and ending at the mobile device position; the direction of the relative position vector points from the UI control center point to the mobile device position. The UI control normal vector refers to the normal vector passing through the UI control center point; its direction is the same as the direction the front of the UI control faces.
It should be noted that A1, A2, and D1 may be chosen according to different demands. A1 constrains the region, relative to the UI control, in which the mobile device can activate the touch condition: the larger A1 is, the more the selected state can only be triggered in the region directly in front of the UI control; conversely, the selected state can also be triggered from positions to the side. A2 constrains the direction the mobile device must be looking, relative to the UI control, when the selected state can be triggered: the smaller A2 is, the more the condition can only be triggered when looking toward the UI control; otherwise the UI control may not even be in the device's field of view when it is triggered. D1 constrains the distance between the mobile device and the UI control at which the selected state can be triggered: the smaller D1 is, the closer the mobile device must be to the UI control before the selected state can be triggered.
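For illustration only, the three parameters could be collected into a configuration object; the field names and the concrete default values below are placeholders invented for this sketch and are not values given in this disclosure:

```python
from dataclasses import dataclass


@dataclass
class SelectionThresholds:
    # Larger A1 confines selection to the region directly in front of the UI control;
    # smaller A1 also allows selection from positions off to the side.
    A1: float = 0.5
    # Smaller A2 requires the device to be looking toward the UI control when selecting.
    A2: float = -0.5
    # Smaller D1 requires the device to be closer to the UI control before selection.
    D1: float = 0.6   # scene units
```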
For the above step S308, the touch UI event is activated. With the UI control in the selected state, the mobile device continues to move toward the UI control; when it comes close to the UI control, the touch UI event can be activated.
Optionally, the condition for judging whether the touch UI event can be activated is: with the UI control in the selected state, detect whether the following three conditions are met simultaneously: the result of the dot product of the relative position vector and the UI control normal vector is greater than A1; the result of the dot product of the mobile device normal vector and the UI control normal vector is less than A2; and the length of the relative position vector is less than D2.
It should be noted that D2 may be chosen according to different demands. D2 constrains the distance between the mobile device and the UI control element at which the touch UI event can be activated: the smaller D2 is, the closer the mobile device must be to the UI control before activation can be triggered.
For the above step S312, the UI function is triggered. When the condition for activating the touch UI event is met, the mobile device continues to move toward the UI control to touch the UI element, and the corresponding UI function can then be triggered.
Optionally, an implementation when the mobile device triggers a scene UI function may be as follows: when the mobile device meets the three conditions for activating the touch UI event, the program starts to detect the motion direction of the mobile device and its distance from the UI control center point. When the dot product result of the motion direction vector and the UI control normal vector is greater than A3 and the length of the relative position vector is less than D3, the UI function is considered to be triggered.
Here, the motion direction vector refers to the device motion direction calculated by the program from the device's position in each frame.
It should be noted that A3 and D3 may be chosen according to different demands. A3 constrains the motion direction required for the mobile device to trigger the UI function: the larger A3 is, the more directly the mobile device needs to move toward the UI control in order to trigger the UI function. D3 constrains the distance at which the mobile device triggers the UI function: when the distance between the mobile device and the UI control is less than D3 and all of the above conditions are met, the UI function is triggered.
Optionally, when the condition for selecting the UI control is met and the mobile device is at a distance between D2 and D1, the selection effect of the UI control changes linearly with the length of the relative position vector. For example, when the length of the relative position vector is D1, the glow effect of the UI control is at 50%; when the length of the relative position vector is D2, the glow effect of the UI control is at 100%. The glow effect is optionally one kind of selection effect of the UI control; according to demand, this selection effect can be changed to other forms of expression.
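A minimal sketch of the linear relationship described here (50% glow at distance D1, 100% at distance D2), assuming D2 is smaller than D1 and clamping the relative-position length to the interval [D2, D1]; the intensity value is just an illustrative stand-in for whatever selection effect is used:

```python
def selection_effect_intensity(relative_distance, D1, D2,
                               intensity_at_D1=0.5, intensity_at_D2=1.0):
    """Linearly map the device-to-control distance to a selection-effect intensity.

    At distance D1 (the farthest distance at which the control is selected) the
    effect is 50%; at distance D2 (where the touch UI event is activated) it is 100%.
    Assumes D2 < D1.
    """
    d = max(min(relative_distance, D1), D2)          # clamp to [D2, D1]
    t = (D1 - d) / (D1 - D2)                         # 0 at D1, 1 at D2
    return intensity_at_D1 + t * (intensity_at_D2 - intensity_at_D1)
```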
Optionally, when the selection condition is met but the point at which the touch UI event can be activated has not yet been reached, and the mobile device moves away from the UI control, the selection effect of the UI control fades out linearly.
Optionally, the presentation logic when the condition for triggering the UI function is met is as follows: the UI control presents a trigger feedback prompt, such as phone vibration, a click sound effect, or scaling of the UI control, to inform the user that this trigger operation has succeeded and that the mobile device can be taken away.
After the UI function is triggered, the mobile device can move away from the UI control in the negative direction, and the whole process completes one full interaction of touch selection in AR.
The present invention can realize the following functions.
UI controls can be established in the three-dimensional virtual space. Three-dimensional scene UI controls break the limitation of UI controls being fixed on the mobile device screen, so the entire screen of the mobile device can be used to present the AR scene, and there are no longer permanently resident UI buttons blocking the screen and degrading the AR experience. The trigger effect is achieved by the mobile device contacting the UI controls in the three-dimensional virtual space, so interaction with the UI controls in the three-dimensional virtual space can be performed quickly without going through the mobile device screen, which improves operating efficiency, provides good fault tolerance, and reduces the user's operation burden. By recognizing the distance between the mobile device and the UI in space and making threshold judgments, conflicts with the response modes of existing controls can be avoided; by recognizing the angle between the mobile device and the UI control and making threshold judgments, misoperations can be reduced.
Because the present invention triggers UI controls directly with the mobile device, it shortens operation paths, reduces burden, can improve operating efficiency, and enhances the user experience. Moreover, by displaying UI controls in the three-dimensional virtual space, the present invention can reduce the buttons resident on the screen; since no controls need to reside on the mobile device screen, the utilization rate of the mobile terminal screen interface is improved.
According to an embodiment of the present invention, a device embodiment of a control selection device based on augmented reality is further provided. It should be noted that this control selection device based on augmented reality may be used to execute the control selection method based on augmented reality in the embodiments of the present invention; that is, the control selection method based on augmented reality in the embodiments of the present invention may be executed in this control selection device based on augmented reality.
Fig. 9 is a schematic diagram of the control selection device based on augmented reality according to an embodiment of the present invention. As shown in Fig. 9, the device may include:
a display unit 12 configured to display a target control in a virtual scene; and a selecting unit 14 configured to, during movement of a terminal toward the target control, control the target control to be in a selected state when a first predetermined condition is met.
It should be noted that the display unit 12 in this embodiment may be used to execute step S102 in the embodiment of the present invention, and the selecting unit 14 in this embodiment may be used to execute step S104 in the embodiment of the present invention. The examples and application scenarios realized by the above modules and the corresponding steps are the same, but are not limited to the contents disclosed in the above embodiments.
Optionally, meeting the first predetermined condition may include: the terminal is located within a first predetermined angle range in front of the target control, the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within a second predetermined angle range, and the distance between the terminal and the target control is less than a first threshold, where the direction of the normal vector of the plane where the terminal is located is the negative direction of the orientation of the terminal's display screen, the direction of the normal vector of the plane where the target control is located is the direction the front of the target control faces, and the front of the target control is the side facing the terminal.
Optionally, when the result of the dot product of the relative position vector and the normal vector of the plane where the target control is located is greater than a second threshold, it is determined that the terminal is located within the first predetermined angle range in front of the target control, where the relative position vector is the direction vector pointing from the center point of the target control to the position of the terminal; when the result of the dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than a third threshold, it is determined that the angle between the two normal vectors is within the second predetermined angle range; and when the length of the relative position vector is less than the first threshold, it is determined that the distance between the terminal and the target control is less than the first threshold.
As an optional embodiment, the device may further include: a marking unit, configured to, when the target control is in the selected state, mark the target control as selected and determine display parameters of the target control according to the distance between the terminal and the target control.
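As a purely hypothetical illustration of such distance-dependent display parameters (the embodiment does not prescribe a particular mapping), the distance could be normalised against the first threshold and used to drive, for example, a scale factor and a highlight opacity:

```python
def display_params(distance, first_threshold):
    """Hypothetical mapping from terminal-to-control distance to display parameters."""
    # 0.0 at the selection boundary, 1.0 when the terminal reaches the control.
    t = max(0.0, min(1.0, 1.0 - distance / first_threshold))
    return {
        "scale": 1.0 + 0.25 * t,   # the control grows slightly as the terminal approaches
        "highlight_alpha": t,      # and its highlight becomes more opaque
    }
```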
As an optional embodiment, the device may further include: an activation unit, configured to, when the target control is in the selected state, control the target control to be in an activated state when a second predetermined condition is met.
Optionally, meeting the second predetermined condition includes: the distance between the terminal and the target control is less than a fourth threshold, where the fourth threshold is less than the first threshold.
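Illustratively (with the same hypothetical naming as above), the activation test reduces to a single distance comparison against a radius that is strictly smaller than the selection radius:

```python
import numpy as np

def second_condition_met(terminal_pos, widget_pos, fourth_threshold, first_threshold):
    """Illustrative check of the second predetermined condition (activation)."""
    # The embodiment states that the fourth threshold is less than the first threshold,
    # so the activation radius lies inside the selection radius.
    assert fourth_threshold < first_threshold
    distance = np.linalg.norm(np.asarray(terminal_pos, dtype=float) -
                              np.asarray(widget_pos, dtype=float))
    return bool(distance < fourth_threshold)
```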
As an optional embodiment, the device may further include: a first triggering unit, configured to, when the target control is in the activated state, control the target control to be in a triggered state when a third predetermined condition is met.
Optionally, meeting the third predetermined condition includes: the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within a third predetermined angular range, and the distance between the terminal and the target control is less than a fifth threshold, where the motion direction vector of the terminal is the vector pointing from the position of the terminal in the previous frame to the position of the terminal in the current frame, and the fifth threshold is less than the fourth threshold.
Optionally, when the dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is less than a sixth threshold, it is determined that the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within the third predetermined angular range; and when the length of the relative position vector is less than the fifth threshold, it is determined that the distance between the terminal and the target control is less than the fifth threshold.
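For illustration only, the third predetermined condition can be sketched in the same style: the terminal's per-frame motion vector is tested against the control's plane normal, and the remaining distance against the fifth threshold (Python with NumPy; names are illustrative):

```python
import numpy as np

def third_condition_met(prev_terminal_pos, terminal_pos, widget_pos, widget_normal,
                        sixth_threshold, fifth_threshold):
    """Illustrative check of the third predetermined condition (trigger by pushing)."""
    # Motion direction vector: from the previous-frame position to the current-frame position.
    motion = np.asarray(terminal_pos, dtype=float) - np.asarray(prev_terminal_pos, dtype=float)

    # Pushing towards the control: the dot product with the control's normal is below the
    # sixth threshold (strongly negative when moving straight at the control's front).
    pushing = np.dot(motion, widget_normal) < sixth_threshold

    # The terminal is within the fifth threshold of the control.
    rel = np.asarray(terminal_pos, dtype=float) - np.asarray(widget_pos, dtype=float)
    close = np.linalg.norm(rel) < fifth_threshold

    return bool(pushing and close)
```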
As an optional embodiment, the device may further include: a prompting unit, configured to, when the target control is in the triggered state, prompt that the target control has been successfully triggered.
As an optional embodiment, the device may further include: a second triggering unit, configured to, when the target control is in the selected state, control the target control to be in the triggered state when a fourth predetermined condition is met.
Optionally, meeting the fourth predetermined condition includes: the distance between the terminal and the target control is less than a seventh threshold, where the seventh threshold is less than the first threshold.
As an optional embodiment, the device may further include: a binding unit, configured to bind the terminal and the target control when the target control is in the selected state; and a moving unit, configured to use the terminal to control the movement of the target control in the virtual scene.
Optionally, the moving unit may include at least one of the following: a moving module, configured to move the target control with the terminal along a predetermined trajectory; a rotating module, configured to rotate the target control with the terminal; and a shaking module, configured to shake the target control with the terminal at the position of the target control.
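For illustration only, one possible realisation of these behaviours is sketched below: a bound control keeps a fixed offset in the terminal's frame (which covers both moving along the terminal's trajectory and rotating with it), and a small oscillation stands in for the shaking behaviour. The pose representation and the shake model are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    position: np.ndarray   # 3-vector in world space
    rotation: np.ndarray   # 3x3 rotation matrix

def follow_terminal(terminal_pose: Pose, bind_offset: np.ndarray) -> Pose:
    """While bound, the control moves and rotates with the terminal at a fixed offset."""
    return Pose(position=terminal_pose.position + terminal_pose.rotation @ bind_offset,
                rotation=terminal_pose.rotation.copy())

def shake_in_place(base_position: np.ndarray, t: float,
                   amplitude: float = 0.01, frequency: float = 8.0) -> np.ndarray:
    """The control shakes about its own position; a sinusoid stands in for the terminal's shake."""
    return base_position + amplitude * np.sin(2.0 * np.pi * frequency * t) * np.array([1.0, 0.0, 0.0])
```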
As an optional embodiment, the device may further include: an acquiring unit, configured to, while the terminal is used to control the movement of the target control in the virtual scene, acquire a control operation executed on the terminal, where the control operation is used to indicate releasing the binding relationship between the terminal and the target control; a response unit, configured to release the binding relationship between the terminal and the target control in response to the control operation, and to acquire the position to which the target control has currently been moved; an updating unit, configured to, when the position to which the target control has currently been moved is a first position, update the position of the target control to the first position, where the first position is a position where the target control is allowed to be located; and a restoring unit, configured to, when the position to which the target control has currently been moved is a second position, restore the position of the target control to the original position of the target control, where the second position is a position where the target control is not allowed to be located.
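For illustration only, the release logic described above can be sketched as follows, with a hypothetical is_allowed predicate standing in for whatever rule decides whether a position is permitted:

```python
import numpy as np

def release_binding(dropped_position, original_position, is_allowed):
    """Illustrative release of the binding: keep the control at the dropped position if it
    is an allowed ("first") position; otherwise restore the control to its original position
    because the dropped position is a disallowed ("second") position."""
    if is_allowed(dropped_position):
        return np.asarray(dropped_position, dtype=float)
    return np.asarray(original_position, dtype=float)
```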
It should be noted that the examples and application scenarios implemented by the above units or modules and their corresponding steps are the same, but are not limited to the content disclosed in the above embodiments.
By means of the above device, the purpose of operating the target control without touching the terminal screen is achieved, which solves the technical problem in the related art that arranging UI controls on the display screen of the terminal prevents the AR scene from being displayed completely, thereby achieving the technical effects of ensuring that the terminal displays the complete AR scene and improving the efficiency of operating the target control.
To achieve the above goals, according to another aspect of the present invention, an embodiment of the present invention further provides a storage medium. The storage medium includes a stored program, wherein, when the program runs, the device on which the storage medium is located is controlled to execute the control selection method based on augmented reality described above.
To achieve the above goals, according to another aspect of the present invention, an embodiment of the present invention further provides a processor. The processor is configured to run a program, wherein the program, when running, executes the control selection method based on augmented reality described above.
To achieve the above goals, according to another aspect of the present invention, an embodiment of the present invention further provides a terminal, including a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to execute the control selection method based on augmented reality described above.
The serial numbers of the above embodiments of the present invention are for description only and do not represent the relative merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by this application, it should be understood that the disclosed technical content may be implemented in other ways. The device embodiments described above are merely exemplary; for example, the division of units may be a division of logical functions, and there may be other division manners in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the existing technology, or all or part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), a removable hard disk, a magnetic disk, or an optical disc.
The above are only preferred embodiments of the present invention. It should be noted that, for those of ordinary skill in the art, various improvements and modifications may be made without departing from the principle of the present invention, and these improvements and modifications shall also be regarded as falling within the protection scope of the present invention.

Claims (19)

1. A control selection method based on augmented reality, characterized by comprising:
displaying a target control in a virtual scene; and
while a terminal moves towards the target control, controlling the target control to be in a selected state when a first predetermined condition is met.
2. The method according to claim 1, characterized in that meeting the first predetermined condition comprises:
the terminal is located in front of the target control within a first predetermined angular range, an angle between a normal vector of a plane where the terminal is located and a normal vector of a plane where the target control is located is within a second predetermined angular range, and a distance between the terminal and the target control is less than a first threshold, wherein a direction of the normal vector of the plane where the terminal is located is a negative direction of the facing of a display screen of the terminal, a direction of the normal vector of the plane where the target control is located is a direction faced by a front of the target control, and the front of the target control is the side facing the terminal.
3. The method according to claim 2, characterized in that:
when a dot product of a relative position vector and the normal vector of the plane where the target control is located is greater than a second threshold, it is determined that the terminal is located in front of the target control within the first predetermined angular range, wherein the relative position vector is a direction vector pointing from a centre point of the target control to a position of the terminal;
when a dot product of the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is less than a third threshold, it is determined that the angle between the normal vector of the plane where the terminal is located and the normal vector of the plane where the target control is located is within the second predetermined angular range; and
when a length of the relative position vector is less than the first threshold, it is determined that the distance between the terminal and the target control is less than the first threshold.
4. The method according to claim 2, characterized in that, when the target control is in the selected state, the method further comprises:
marking the target control as selected, and determining display parameters of the target control according to the distance between the terminal and the target control.
5. The method according to claim 2, characterized in that, when the target control is in the selected state, the method further comprises:
controlling the target control to be in an activated state when a second predetermined condition is met.
6. The method according to claim 5, characterized in that meeting the second predetermined condition comprises:
the distance between the terminal and the target control is less than a fourth threshold, wherein the fourth threshold is less than the first threshold.
7. The method according to claim 6, characterized in that, when the target control is in the activated state, the method further comprises:
controlling the target control to be in a triggered state when a third predetermined condition is met.
8. The method according to claim 7, characterized in that meeting the third predetermined condition comprises:
an angle between a motion direction vector of the terminal and the normal vector of the plane where the target control is located is within a third predetermined angular range, and the distance between the terminal and the target control is less than a fifth threshold, wherein the motion direction vector of the terminal is a vector pointing from the position of the terminal in a previous frame to the position of the terminal in a current frame, and the fifth threshold is less than the fourth threshold.
9. The method according to claim 8, characterized in that:
when a dot product of the motion direction vector of the terminal and the normal vector of the plane where the target control is located is less than a sixth threshold, it is determined that the angle between the motion direction vector of the terminal and the normal vector of the plane where the target control is located is within the third predetermined angular range; and
when a length of a relative position vector is less than the fifth threshold, it is determined that the distance between the terminal and the target control is less than the fifth threshold, wherein the relative position vector is a direction vector pointing from a centre point of the target control to a position of the terminal.
10. The method according to claim 7, characterized in that, when the target control is in the triggered state, the method further comprises:
prompting that the target control has been successfully triggered.
11. The method according to claim 2, characterized in that, when the target control is in the selected state, the method further comprises:
controlling the target control to be in a triggered state when a fourth predetermined condition is met.
12. The method according to claim 11, characterized in that meeting the fourth predetermined condition comprises:
the distance between the terminal and the target control is less than a seventh threshold, wherein the seventh threshold is less than the first threshold.
13. The method according to any one of claims 1 to 12, characterized in that, when the target control is in the selected state, the method further comprises:
binding the terminal and the target control; and
using the terminal to control the target control to move in the virtual scene.
14. The method according to claim 13, characterized in that using the terminal to control the target control to move in the virtual scene comprises at least one of the following:
the target control moves with the terminal along a predetermined trajectory;
the target control rotates with the terminal; and
the target control shakes with the terminal at the position of the target control.
15. The method according to claim 13, characterized in that, in the process of using the terminal to control the target control to move in the virtual scene, the method further comprises:
acquiring a control operation executed on the terminal, wherein the control operation is used to indicate releasing a binding relationship between the terminal and the target control;
releasing the binding relationship between the terminal and the target control in response to the control operation, and acquiring a position to which the target control has currently been moved;
when the position to which the target control has currently been moved is a first position, updating the position of the target control to the first position, wherein the first position is a position where the target control is allowed to be located; and
when the position to which the target control has currently been moved is a second position, restoring the position of the target control to an original position of the target control, wherein the second position is a position where the target control is not allowed to be located.
16. A control selection device based on augmented reality, characterized by comprising:
a display unit, configured to display a target control in a virtual scene; and
a selection unit, configured to, while a terminal moves towards the target control, trigger selection of the target control when a first predetermined condition is met, wherein the first predetermined condition is used to indicate triggering selection of the target control.
17. A storage medium, characterized in that the storage medium comprises a stored program, wherein, when the program runs, a device on which the storage medium is located is controlled to execute the method according to any one of claims 1 to 15.
18. A processor, characterized in that the processor is configured to run a program, wherein the program, when running, executes the method according to any one of claims 1 to 15.
19. A terminal, comprising a memory and a processor, characterized in that a computer program is stored in the memory, and the processor is configured to run the computer program to execute the method according to any one of claims 1 to 15.
CN201810187008.1A 2018-03-07 2018-03-07 Control selection method and device based on augmented reality Active CN108415570B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810187008.1A CN108415570B (en) 2018-03-07 2018-03-07 Control selection method and device based on augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810187008.1A CN108415570B (en) 2018-03-07 2018-03-07 Control selection method and device based on augmented reality

Publications (2)

Publication Number Publication Date
CN108415570A true CN108415570A (en) 2018-08-17
CN108415570B CN108415570B (en) 2021-08-24

Family

ID=63130472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810187008.1A Active CN108415570B (en) 2018-03-07 2018-03-07 Control selection method and device based on augmented reality

Country Status (1)

Country Link
CN (1) CN108415570B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436663A (en) * 2010-08-12 2012-05-02 株式会社泛泰 User equipment, server, and method for selectively filtering augmented reality
EP2802962A1 (en) * 2012-01-12 2014-11-19 Qualcomm Incorporated Augmented reality with sound and geometric analysis
CN107430442A (en) * 2015-05-26 2017-12-01 谷歌公司 For entering and exiting the multi-dimensional graphic method of the application in immersion media and activity
KR101806864B1 (en) * 2016-10-05 2017-12-08 연세대학교 산학협력단 Apparatus for controlling 3d object in augmmented reality environment and method thereof
CN106843498A (en) * 2017-02-24 2017-06-13 网易(杭州)网络有限公司 Dynamic interface exchange method and device based on virtual reality
CN107092492A (en) * 2017-04-27 2017-08-25 广州四三九九信息科技有限公司 The control method and device of virtual objects

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110478901A (en) * 2019-08-19 2019-11-22 Oppo广东移动通信有限公司 Exchange method and system based on augmented reality equipment
WO2021031755A1 (en) * 2019-08-19 2021-02-25 Oppo广东移动通信有限公司 Interactive method and system based on augmented reality device, electronic device, and computer readable medium
CN110478901B (en) * 2019-08-19 2023-09-22 Oppo广东移动通信有限公司 Interaction method and system based on augmented reality equipment
WO2022048428A1 (en) * 2020-09-07 2022-03-10 北京字节跳动网络技术有限公司 Method and apparatus for controlling target object, and electronic device and storage medium
US11869195B2 (en) 2020-09-07 2024-01-09 Beijing Bytedance Network Technology Co., Ltd. Target object controlling method, apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN108415570B (en) 2021-08-24

Similar Documents

Publication Publication Date Title
CN105159687B (en) A kind of information processing method, terminal and computer-readable storage medium
KR102270766B1 (en) creative camera
CN109557998B (en) Information interaction method and device, storage medium and electronic device
CN108273265A (en) The display methods and device of virtual objects
CN110523085A (en) Control method, device, terminal and the storage medium of virtual objects
CN107551537A (en) The control method and device of virtual role, storage medium, electronic equipment in a kind of game
CN110064193A (en) Manipulation control method, device and the mobile terminal of virtual objects in game
CN110075522A (en) The control method of virtual weapons, device and terminal in shooting game
TW202218723A (en) Method of controlling virtual object, device, storage medium, electrical equipment and computer program product
CN106445157A (en) Method and device for adjusting image display orientation
CN111389003B (en) Game role control method, device, equipment and computer readable storage medium
CN108310768B (en) Virtual scene display method and device, storage medium and electronic device
CN111957041A (en) Map viewing method in game, terminal, electronic equipment and storage medium
CN111494935A (en) Method and device for controlling virtual object in game
CN108415570A (en) Control selection method based on augmented reality and device
CN113318428A (en) Game display control method, non-volatile storage medium, and electronic device
CN110404257B (en) Formation control method and device, computer equipment and storage medium
CN108434728A (en) Operation control adaptation method and device, electronic equipment and storage medium
CN113262476B (en) Position adjusting method and device of operation control, terminal and storage medium
CN110448903A (en) Determination method, apparatus, processor and the terminal of control strategy in game
CN111766989B (en) Interface switching method and device
CN107479902A (en) Control processing method and processing device, storage medium, processor and terminal
CN114995713B (en) Display control method, display control device, electronic equipment and readable storage medium
CN115619484A (en) Method for displaying virtual commodity object, electronic equipment and computer storage medium
CN109284058A (en) Mobile terminal, display content processing method and device thereof, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant