CN109828660A - A method and device for controlling application operation based on augmented reality - Google Patents
- Publication number
- CN109828660A (application number CN201811645324.5A)
- Authority
- CN
- China
- Prior art keywords
- user
- gesture information
- application
- information
- preset area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
An embodiment of the present application provides a method for controlling application operation based on augmented reality, comprising: receiving a request sent by a user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information; parsing the user's initial gesture information to obtain the user's gesture information; and performing operation control on a first application according to the user's gesture information. In the embodiment of the present application, the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Description
Technical field
This application relates to the field of augmented reality (AR) technology, and in particular to a method and device for controlling application operation based on augmented reality.
Background technique
With the development and progress of artificial intelligence, gesture recognition has begun to be applied in social life, greatly facilitating people's work while adding enjoyment to it. Existing applications, however, generally require the user to trigger interaction by pressing a screen or keyboard, which is not only cumbersome but also causes a certain amount of harm to the eyes. A technique is therefore needed that can control an application through gestures made at a certain distance from the screen.
Summary of the invention
The embodiments of the present application provide a method and device for controlling application operation based on augmented reality, enabling the user to operate an application conveniently without pressing a screen or keyboard.
A first aspect of the embodiments of the present application provides a method for controlling application operation based on augmented reality, comprising:
receiving a request sent by a user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
parsing the user's initial gesture information to obtain the user's gesture information; and
performing operation control on a first application according to the user's gesture information.
A second aspect of the embodiments of the present application provides a device for controlling application operation based on augmented reality, comprising:
a gesture acquisition module, configured to receive a request sent by a user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
a gesture parsing module, configured to parse the user's initial gesture information to obtain the user's gesture information; and
an application control module, configured to perform operation control on a first application according to the user's gesture information.
A third aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program, wherein the computer program is executed by a processor to implement the above method.
Implementing the embodiments of the present application yields at least the following beneficial effects:
The device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Detailed description of the invention
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the accompanying drawings required by the embodiments or the prior art are briefly introduced below. Evidently, the drawings in the following description show only some embodiments of the present application; those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is an interaction diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
Fig. 2 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
Fig. 3 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
Fig. 4 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
Fig. 5 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
Fig. 6 is a structural diagram of a terminal according to an embodiment of the present invention;
Fig. 7 is a structural diagram of a device for controlling application operation based on augmented reality according to an embodiment of the present invention.
Specific embodiment
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Evidently, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description, claims, and drawings of the present application are used to distinguish different objects, not to describe a particular order. Moreover, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device comprising a series of steps or units is not limited to the listed steps or units, but optionally further comprises unlisted steps or units, or optionally further comprises other steps or units inherent to the process, method, product, or device.
Reference to "an embodiment" in this application means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor to separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
The embodiments of this scheme may be carried out in a device, terminal, or the like. The device or terminal detects the user's initial gesture information and parses it to obtain gesture information with which application operation can be controlled. The user's initial gesture information may be compared with preset gesture information to confirm the operation the user intends to make, after which the device may control a mouse pointer, or control the application directly, to perform the corresponding operation.
Specifically, referring to Fig. 1, Fig. 1 is an interaction diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in Fig. 1, it involves a user 101 and a terminal 102, as follows:
The terminal 102 receives a request sent by the user 101, wherein the request carries the initial gesture information of the user 101 and instructs the terminal to obtain that information; the terminal 102 parses the initial gesture information of the user 101 to obtain the gesture information of the user 101; and the terminal 102 performs operation control on a first application according to the gesture information of the user 101.
Preferably, before the terminal 102 obtains the initial gesture information of the user 101, the method further comprises:
The terminal 102 receives a face recognition request sent by the user 101, the request instructing the terminal to obtain the face information of the user 101; the terminal 102 compares the acquired face information of the user 101 with the face information in a preset grade database to confirm the grade of the user 101; and the terminal 102 displays a first application matching the grade of the user 101.
In the embodiment of the present application, the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Referring to Fig. 2, Fig. 2 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in Fig. 2, it comprises steps 201-203, as follows:
201. Receive a request sent by a user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
The request may be received by a device, a terminal, or the like: for example, when the user is detected within a given area, the user's initial gesture is recognized and the user's initial gesture information is obtained. The initial gesture information may be any of various gesture motions;
202. Parse the user's initial gesture information to obtain the user's gesture information;
After the device or terminal obtains the user's initial gesture information, it parses that information to obtain the user's gesture information;
Preferably, the user's initial gesture information may be compared with the gesture information stored in a preset gesture information library to confirm the user's gesture information;
203. Perform operation control on the first application according to the user's gesture information.
By parsing the user's initial gesture information, the gesture information as embodied on the screen or the like is obtained, which allows the device or terminal to control the application accordingly.
In the embodiment of the present application, the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Referring to Fig. 3, Fig. 3 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in Fig. 3, it comprises steps 301-306, as follows:
301. Receive a face recognition request sent by the user, the request instructing the device to obtain the user's face information;
For example, when the device detects that the user is within a preset face-capture area, it receives the face recognition request sent by the user and triggers a face information acquisition module, such as a camera, to acquire the user's face information;
302. Compare the acquired face information of the user with the face information in a preset grade database to confirm the user's grade;
Preferably, when the screen is located in a shopping mall, for example, the preset grade database may be built according to the acquired user's spending amount, or according to the approximate age of the acquired user;
Preferably, step 302 may comprise S3021-S3025, as follows:
S3021. Confirm the user's gender, face-shape information, and facial-feature information according to the user's face information;
For example, after the device has been trained, through deep learning, to recognize face information of different genders, face shapes, and facial features, it obtains and identifies the user's gender, face-shape information, and facial-feature information;
S3022. Obtain the face information k1 in the preset grade database whose gender matches that of the user;
After the user's gender has been confirmed as, for example, female, the device obtains the face information k1 of all females in the preset grade database;
S3023. From the face information k1, obtain the face information k2 whose face-shape information matches that of the user;
Based on the above female face information k1, the device searches for the face information k2 whose face shape matches the user's. For example, if the device determines the user's face shape to be round, it searches the face information k1 for round-face entries;
S3024. From the face information k2, obtain the face information k whose facial-feature information matches that of the user;
Further, the face information whose facial features match the user's is obtained from the face information k2, the facial features including the ears, nose, eyes, mouth, eyebrows, and so on;
S3025. Confirm that the grade corresponding to the face information k is the user's grade.
That is, the face information k is the face information of the user;
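The filtering in S3021-S3025 can be pictured as a cascade of successively narrower database queries. The sketch below is illustrative only; the record fields (`gender`, `face_shape`, `features`, `grade`) and the sample entries are hypothetical names invented for the example, not part of the patent.

```python
# Illustrative sketch of the S3021-S3025 cascade: filter a preset grade
# database by gender, then face shape, then facial features.
# Field names and sample values are hypothetical.

def confirm_user_grade(user, database):
    # S3022: keep entries whose gender matches the user's (k1)
    k1 = [r for r in database if r["gender"] == user["gender"]]
    # S3023: narrow to entries whose face shape matches (k2)
    k2 = [r for r in k1 if r["face_shape"] == user["face_shape"]]
    # S3024: narrow to entries whose facial features match (k)
    k = [r for r in k2 if r["features"] == user["features"]]
    # S3025: the grade of the surviving entry is the user's grade
    return k[0]["grade"] if k else None

database = [
    {"gender": "female", "face_shape": "round",
     "features": "set-A", "grade": "gold"},
    {"gender": "female", "face_shape": "oval",
     "features": "set-B", "grade": "silver"},
    {"gender": "male", "face_shape": "round",
     "features": "set-C", "grade": "bronze"},
]
user = {"gender": "female", "face_shape": "round", "features": "set-A"}
print(confirm_user_grade(user, database))  # gold
```

Narrowing by gender first, then face shape, then features mirrors the order of S3022-S3024; each stage only ever shrinks the candidate set.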
303. Display a first application matching the user's grade;
Alternatively, multiple applications matching the user's grade may be displayed for the user to choose from;
304. Receive a request sent by the user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
305. Parse the user's initial gesture information to obtain the user's gesture information;
Preferably, the user's initial gesture information may be compared with the gesture information stored in a preset gesture information library to confirm the user's gesture information;
For example, the preset gesture information library may record that when the user's initial gesture is clapping, the corresponding gesture information of the user is a double click on the application;
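A preset gesture information library of this kind can be sketched as a simple lookup table mapping a recognized initial gesture to an operation. The gesture names and operations below are drawn from the examples in the embodiments; the dictionary itself is an illustrative assumption, not the patent's data structure.

```python
# Illustrative preset gesture information library: map a recognized
# initial gesture to the gesture information (operation) it stands for.
PRESET_GESTURES = {
    "clap": "double_click",  # clapping -> double-click the application
    "fist_held": "click",    # fist held for a preset time -> click
    "scissors": "exit",      # scissors shape -> exit
}

def parse_initial_gesture(initial_gesture):
    # Compare against the library; None means the gesture is unknown
    # and should be reacquired (see steps 504-505 below).
    return PRESET_GESTURES.get(initial_gesture)

print(parse_initial_gesture("clap"))  # double_click
print(parse_initial_gesture("wave"))  # None -> prompt the user again
```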
306. Perform operation control on the first application according to the user's gesture information.
In the embodiment of the present application, before the user's initial gesture information is obtained, the user's face information is acquired and an application corresponding to the user's grade is displayed; the initial gesture information is then obtained and parsed, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Referring to Fig. 4, Fig. 4 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in Fig. 4, it comprises steps 401-406, as follows:
401. Receive a face recognition request sent by the user, the request instructing the device to obtain the user's face information;
402. Compare the acquired face information of the user with the face information in a preset grade database to confirm the user's grade;
403. Display a first application matching the user's grade;
For example, the first application is displayed on the device's display screen;
404. Receive a request sent by the user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
405. When the user's initial gesture information is a first gesture held continuously for a preset time, confirm that the user's gesture information is a click;
For example, the first gesture may be a fist: when the user holds the fist for 5 s, the device determines that the user is, for instance, clicking the first application;
Alternatively, when the user's initial gesture information is a second gesture, confirm that the user's gesture information is an exit;
For example, the second gesture may be a scissors shape: when the user's initial gesture is a scissors shape, the user's gesture information is confirmed to be an exit;
Alternatively, when the user's initial gesture information points at a first position in a first preset area, obtain the coordinates (x1, y1) of the first position; call a preset mapping algorithm to process the coordinates of the first position, obtaining the coordinates (x2, y2) of a second position corresponding to the coordinates of the first position; and confirm that the user's gesture information points at the second position in a second preset area;
Preferably, the first preset area is a region at a specific position in front of the user. The coordinates of the user's initial gesture within the first preset area are obtained and then mapped, by a coordinate transformation, to the second preset area, yielding the coordinates mapped onto the screen;
The preset mapping algorithm may comprise:
confirming that the plane of the first preset area is parallel to the plane of the second preset area;
obtaining the distance L between the center point of the first preset area and the center point of the second preset area;
obtaining the angle θ between the horizontal and the line connecting the center point of the first preset area to the center point of the second preset area;
That is, when the plane of the first preset area is parallel to the plane of the second preset area, the distance between the center points of the two planes is obtained, and the coordinates can then be converted according to the angle θ;
The coordinates of the second position may be expressed as:
x2 = x1 + L·cosθ, y2 = y1 + L·sinθ.
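Under the stated assumption that the two planes are parallel, the preset mapping algorithm amounts to translating the pointing coordinates by the offset between the two area centers, decomposed into horizontal and vertical components using L and θ. A minimal sketch:

```python
import math

def map_to_screen(x1, y1, L, theta):
    """Map a point in the first preset area to the second preset area.

    Assumes the plane of the first preset area is parallel to that of
    the second; L is the distance between the two center points and
    theta the angle between the horizontal and the line connecting them.
    """
    x2 = x1 + L * math.cos(theta)  # horizontal component of the offset
    y2 = y1 + L * math.sin(theta)  # vertical component of the offset
    return x2, y2

# With theta = 0 the offset is purely horizontal:
print(map_to_screen(1.0, 2.0, 10.0, 0.0))  # (11.0, 2.0)
```

Note this is a pure translation: it shifts the gesture coordinates onto the screen plane but does not rescale them, so it implicitly assumes the two preset areas are the same size.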
406. Perform operation control on the first application according to the user's gesture information.
In the embodiment of the present application, before the user's initial gesture information is obtained, the user's face information is acquired and an application corresponding to the user's grade is displayed; the initial gesture information is then obtained and parsed, so that the device controls the corresponding application according to the parsed gesture information. Further, with this scheme the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Referring to Fig. 5, Fig. 5 is a flow diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present application. As shown in Fig. 5, it may comprise steps 501-507, as follows:
501. Receive a request sent by the user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
502. Confirm whether the user's initial gesture information has been obtained successfully;
503. If so, parse the user's initial gesture information to obtain the user's gesture information, and perform step 507; if not, perform step 504;
504. Prompt the user to show the initial gesture again, so as to reacquire the user's initial gesture information;
The user may be reminded to reproduce the initial gesture by a voice prompt, by a message shown on the screen, or the like;
505. If acquisition fails again, obtain the user's eye-expression information;
The eye-expression information may include opening the eyes, closing the eyes, blinking, rolling the eyes, staring, and so on;
506. Perform operation control on the first application according to the eye-expression information;
For example, when the user blinks twice within a preset time, it is judged as a click to enter the application; when the user blinks four times within the preset time, it is judged as exiting the application.
Further, the user's head information may also be obtained, such as shaking the head, nodding, or turning the head, to realize different specific controls;
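The eye-expression fallback of steps 505-506 can be sketched as counting blinks inside a preset time window and mapping the count to an operation. The 2-blink and 4-blink thresholds follow the example in the text; representing blinks as a list of timestamps is an assumption made for the sketch.

```python
# Illustrative sketch of steps 505-506: count blinks inside a preset
# time window and map the count to an application operation.
# blink_times is a hypothetical list of blink timestamps in seconds.

def blink_operation(blink_times, window_start, window_len):
    count = sum(window_start <= t < window_start + window_len
                for t in blink_times)
    if count == 2:
        return "click_enter"  # blink twice -> click to enter the app
    if count == 4:
        return "exit"         # blink four times -> exit the app
    return None               # otherwise: no eye-expression operation

print(blink_operation([0.2, 0.9], 0.0, 2.0))            # click_enter
print(blink_operation([0.1, 0.4, 0.8, 1.5], 0.0, 2.0))  # exit
```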
507. Perform operation control on the first application according to the user's gesture information.
Specifically, when the user's initial gesture information is two clasped hands, it is parsed as a double click on the first application; when the user's initial gesture information is a cross made with both hands, it is parsed as an exit operation on the first application; when the user's initial gesture information is an open palm, it is parsed as a one-handed operation on the first application, and so on.
For example, when the first application is a video playing application: if the acquired initial gesture information of the user is a raising motion, the volume of the video playing application is turned up; and/or if the acquired initial gesture information of the user is a sideways waving motion, the video in the video playing application is fast-forwarded;
When the first application is a game application: if the acquired initial gesture information of the user is a fist, the game application is clicked; and/or if the acquired initial gesture information of the user is two crossed hands, the game application is controlled to enter a two-player mode.
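The application-specific mappings above can be sketched as a per-application dispatch table: the same initial gesture may mean different operations depending on which first application is in focus. Application and gesture names follow the examples in the text; the table layout itself is an illustrative assumption.

```python
# Illustrative per-application gesture dispatch, following the video
# player and game examples above.
APP_GESTURE_MAP = {
    "video_player": {
        "raise": "volume_up",            # raising motion -> volume up
        "sideways_wave": "fast_forward", # sideways wave -> fast-forward
    },
    "game": {
        "fist": "click",                      # fist -> click the game
        "crossed_hands": "two_player_mode",   # crossed hands -> 2P mode
    },
}

def control_application(app, initial_gesture):
    # Unknown app or gesture yields None (no operation performed).
    return APP_GESTURE_MAP.get(app, {}).get(initial_gesture)

print(control_application("video_player", "raise"))  # volume_up
print(control_application("game", "crossed_hands"))  # two_player_mode
```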
In the embodiment of the present application, the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. When acquisition of the user's initial gesture information fails, the gesture information can be acquired again, or the user's eye-expression information can be obtained, so that the application can still be controlled. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
Consistent with the above embodiments, referring to Fig. 6, Fig. 6 is a structural diagram of a terminal according to an embodiment of the present application. As shown, the terminal comprises a processor, an input device, an output device, and a memory, which are connected to one another. The memory stores a computer program comprising program instructions; the processor is configured to call the program instructions, and the program comprises instructions for performing the following steps:
receiving a request sent by a user, wherein the request carries the user's initial gesture information and instructs the terminal to obtain that information;
parsing the user's initial gesture information to obtain the user's gesture information; and
performing operation control on a first application according to the user's gesture information.
In the embodiment of the present application, the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
The above describes the scheme of the embodiments of the present application mainly from the perspective of the method execution process. It can be understood that, to realize the above functions, the terminal comprises corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that, in combination with the exemplary units and algorithm steps described in the embodiments presented herein, the present application may be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present application.
The embodiments of the present application may divide the terminal into functional units according to the above method examples. For example, each function may be assigned its own functional unit, or two or more functions may be integrated in one processing unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is only a logical function division; other division manners are possible in actual implementation.
Consistent with the above, referring to Fig. 7, Fig. 7 is a structural diagram of a device for controlling application operation based on augmented reality according to an embodiment of the present application. The device comprises a gesture acquisition module 701, a gesture parsing module 702, and an application control module 703, as follows:
the gesture acquisition module 701 is configured to receive a request sent by a user, wherein the request carries the user's initial gesture information and instructs the device to obtain that information;
the gesture parsing module 702 is configured to parse the user's initial gesture information to obtain the user's gesture information; and
the application control module 703 is configured to perform operation control on a first application according to the user's gesture information.
Preferably, the device further comprises a face recognition module configured to:
receive a face recognition request sent by the user, the request carrying the user's face information and instructing the device to obtain the user's face information; compare the acquired face information of the user with the face information in a preset grade database to confirm the user's grade; and display a first application matching the user's grade.
Preferably, the gesture parsing module is further configured to:
when the user's initial gesture information is a first gesture held continuously for a preset time, confirm that the user's gesture information is a click; alternatively, when the user's initial gesture information is a second gesture, confirm that the user's gesture information is an exit; alternatively, when the user's initial gesture information points at a first position in a first preset area, obtain the coordinates (x1, y1) of the first position, call a preset mapping algorithm to process the coordinates of the first position to obtain the coordinates (x2, y2) of a corresponding second position, and confirm that the user's gesture information points at the second position in a second preset area.
Preferably, the gesture parsing module is further configured to:
when the user's initial gesture information points at the first position in the first preset area, confirm that the plane of the first preset area is parallel to the plane of the second preset area; obtain the distance L between the center point of the first preset area and the center point of the second preset area; and obtain the angle θ between the horizontal and the line connecting the center point of the first preset area to the center point of the second preset area; the coordinates of the second position are then: x2 = x1 + L·cosθ, y2 = y1 + L·sinθ.
As can be seen, in the embodiment of the present application the device obtains the user's initial gesture information and parses it, so that the device controls the corresponding application according to the parsed gesture information. With this scheme, the user can operate an application without pressing a screen or keyboard, so the application can be controlled from a distance. This is convenient and engaging, is easier on the user's eyes, and improves the user experience.
An embodiment of the present application further provides a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps of any method for controlling application operation based on augmented reality as recorded in the above method embodiments.
An embodiment of the present application further provides a computer program product, the computer program product comprising a non-transitory computer-readable storage medium storing a computer program, the computer program causing a computer to execute some or all of the steps of any method for controlling application operation based on augmented reality as recorded in the above method embodiments.
It should be noted that, for the sake of simple description, the foregoing method embodiments are all expressed as a series of action combinations; however, those skilled in the art should understand that the present application is not limited by the described order of actions, because according to the present application, some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present application.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; the division of the units is only a logical functional division, and there may be other division manners in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. Furthermore, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware or in the form of a software program module.
If the integrated unit is implemented in the form of a software program module and sold or used as an independent product, it may be stored in a computer-readable memory. Based on this understanding, the technical solution of the present application — in essence, or the part contributing to the prior art, or all or part of the technical solution — may be embodied in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of each embodiment of the present application. The aforementioned memory includes various media that can store program code, such as a USB flash drive, read-only memory (ROM), random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.
Those of ordinary skill in the art will understand that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware; the program may be stored in a computer-readable memory, which may include a flash disk, read-only memory, random access memory, a magnetic disk, an optical disc, or the like.
The embodiments of the present application are described in detail above. Specific examples are used herein to expound the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method of the present application and its core ideas. At the same time, those skilled in the art may, according to the ideas of the present application, make changes to the specific implementations and the scope of application. In summary, the contents of this specification should not be construed as limiting the present application.
Claims (10)
1. A method for controlling application operation based on augmented reality, characterized by comprising:
receiving a request sent by a user, wherein the request carries initial gesture information of the user, and the request indicates that the initial gesture information of the user is to be obtained;
parsing the initial gesture information of the user to obtain gesture information of the user; and
performing operation control on a first application according to the gesture information of the user.
2. The method according to claim 1, characterized in that, before the receiving of the request sent by the user, the method further comprises:
receiving a face recognition request sent by the user, the face recognition request carrying face information of the user, and the face recognition request indicating that the face information of the user is to be obtained;
comparing the obtained face information of the user with face information in a preset grade database to confirm a grade of the user; and
displaying the first application matching the grade of the user.
3. The method according to claim 2, characterized in that the parsing of the initial gesture information of the user to obtain the gesture information of the user comprises:
when the initial gesture information of the user is a first gesture held continuously for a preset time, confirming that the gesture information of the user is a click;
or, when the initial gesture information of the user is a second gesture, confirming that the gesture information of the user is an exit;
or, when the initial gesture information of the user points to a first position in a first preset area, obtaining coordinates of the first position; invoking a preset mapping algorithm to process the coordinates of the first position, to obtain coordinates of a second position corresponding to the coordinates of the first position; and confirming that the gesture information of the user points to the second position in a second preset area.
4. The method according to claim 3, characterized in that, when the initial gesture information of the user points to the first position in the first preset area, the method comprises:
confirming that the plane of the first preset area is parallel to the plane of the second preset area;
obtaining the distance between the center point of the first preset area and the center point of the second preset area;
obtaining the angle between the horizontal direction and the line connecting the center point of the first preset area and the center point of the second preset area; and
calculating the coordinates of the second position according to the coordinates of the first position, the distance between the center point of the first preset area and the center point of the second preset area, and the angle.
5. The method according to claim 4, characterized in that the performing of operation control on the first application according to the gesture information of the user comprises:
when the first application is a video playing application, if the obtained initial gesture information of the user is a raising motion, controlling the volume of the video playing application to increase, and/or, if the obtained initial gesture information of the user is a lateral waving motion, controlling the video in the video playing application to fast-forward; or,
when the first application is a game-themed application, if the obtained initial gesture information of the user is a clenched fist, clicking the game-themed application, and/or, if the obtained initial gesture information of the user is two crossed hands, controlling the game-themed application to enter a two-player mode.
6. A device for controlling application operation based on augmented reality, characterized by comprising:
a gesture obtaining module, configured to receive a request sent by a user, wherein the request carries initial gesture information of the user, and the request indicates that the initial gesture information of the user is to be obtained;
a gesture parsing module, configured to parse the initial gesture information of the user to obtain gesture information of the user; and
an application control module, configured to perform operation control on a first application according to the gesture information of the user.
7. The device according to claim 6, characterized by further comprising a face recognition module, configured to:
receive a face recognition request sent by the user, the face recognition request carrying face information of the user, and the face recognition request indicating that the face information of the user is to be obtained; compare the obtained face information of the user with face information in a preset grade database to confirm a grade of the user; and display the first application matching the grade of the user.
8. The device according to claim 7, characterized in that the gesture parsing module is further configured to:
when the initial gesture information of the user is a first gesture held continuously for a preset time, confirm that the gesture information of the user is a click; or, when the initial gesture information of the user is a second gesture, confirm that the gesture information of the user is an exit; or, when the initial gesture information of the user points to a first position in a first preset area, obtain coordinates of the first position; invoke a preset mapping algorithm to process the coordinates of the first position, to obtain coordinates of a second position corresponding to the coordinates of the first position; and confirm that the gesture information of the user points to the second position in a second preset area.
9. The device according to claim 8, characterized in that the gesture parsing module is further configured to:
confirm that the plane of the first preset area is parallel to the plane of the second preset area; obtain the distance between the center point of the first preset area and the center point of the second preset area; obtain the angle between the horizontal direction and the line connecting the center point of the first preset area and the center point of the second preset area; and calculate the coordinates of the second position according to the coordinates of the first position, the distance between the center point of the first preset area and the center point of the second preset area, and the angle.
10. A computer-readable storage medium storing a computer program, wherein the computer program is executed by a processor to implement the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811645324.5A CN109828660B (en) | 2018-12-29 | 2018-12-29 | Method and device for controlling application operation based on augmented reality |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109828660A true CN109828660A (en) | 2019-05-31 |
CN109828660B CN109828660B (en) | 2022-05-17 |
Family
ID=66861505
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811645324.5A Active CN109828660B (en) | 2018-12-29 | 2018-12-29 | Method and device for controlling application operation based on augmented reality |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109828660B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110727345A (en) * | 2019-09-19 | 2020-01-24 | 北京耐德佳显示技术有限公司 | Method and system for realizing man-machine interaction through finger intersection point movement |
CN111580652A (en) * | 2020-05-06 | 2020-08-25 | Oppo广东移动通信有限公司 | Control method and device for video playing, augmented reality equipment and storage medium |
CN114967484A (en) * | 2022-04-20 | 2022-08-30 | 海尔(深圳)研发有限责任公司 | Method and device for controlling household appliance, household appliance and storage medium |
CN115277143A (en) * | 2022-07-19 | 2022-11-01 | 中天动力科技(深圳)有限公司 | Data secure transmission method, device, equipment and storage medium |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019559A (en) * | 2012-11-27 | 2013-04-03 | 海信集团有限公司 | Gesture control projection display device and control method thereof |
US20140358302A1 (en) * | 2013-05-31 | 2014-12-04 | Pixart Imaging Inc. | Apparatus having gesture sensor |
CN204347750U (en) * | 2014-10-17 | 2015-05-20 | 李妍 | head-mounted display apparatus |
CN104656903A (en) * | 2015-03-04 | 2015-05-27 | 联想(北京)有限公司 | Processing method for display image and electronic equipment |
CN104914985A (en) * | 2014-03-13 | 2015-09-16 | 扬智科技股份有限公司 | Gesture control method and system and video flowing processing device |
CN106200916A (en) * | 2016-06-28 | 2016-12-07 | 广东欧珀移动通信有限公司 | The control method of augmented reality image, device and terminal unit |
US20160370860A1 (en) * | 2011-02-09 | 2016-12-22 | Apple Inc. | Gaze detection in a 3d mapping environment |
CN106502424A (en) * | 2016-11-29 | 2017-03-15 | 上海小持智能科技有限公司 | Based on the interactive augmented reality system of speech gestures and limb action |
US20170140216A1 (en) * | 2015-11-17 | 2017-05-18 | Huawei Technologies Co., Ltd. | Gesture-Based Object Measurement Method and Apparatus |
CN107272890A (en) * | 2017-05-26 | 2017-10-20 | 歌尔科技有限公司 | A kind of man-machine interaction method and device based on gesture identification |
CN107679860A (en) * | 2017-08-09 | 2018-02-09 | 百度在线网络技术(北京)有限公司 | A kind of method, apparatus of user authentication, equipment and computer-readable storage medium |
US20180260189A1 (en) * | 2015-09-28 | 2018-09-13 | Baidu Online Network Technology (Beijing) Co., Ltd. | Interactive control method and device for voice and video communications |
CN108595005A (en) * | 2018-04-20 | 2018-09-28 | 深圳市天轨年华文化科技有限公司 | Exchange method, device based on augmented reality and computer readable storage medium |
CN109086590A (en) * | 2018-08-13 | 2018-12-25 | 广东小天才科技有限公司 | The interface display method and electronic equipment of a kind of electronic equipment |
2018-12-29: Application CN201811645324.5A filed; granted as CN109828660B (status: Active).
Non-Patent Citations (2)
Title |
---|
CHANGHYUN JEON et al.: "Hand-Mouse Interface Using Virtual Monitor Concept for Natural Interaction", IEEE * |
TANG Zhibo et al.: "Gesture Recognition in Virtual Maintenance Training" (虚拟维修训练中的手势识别), Computer Engineering and Design (计算机工程与设计) * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110727345A (en) * | 2019-09-19 | 2020-01-24 | 北京耐德佳显示技术有限公司 | Method and system for realizing man-machine interaction through finger intersection point movement |
CN110727345B (en) * | 2019-09-19 | 2023-12-26 | 北京耐德佳显示技术有限公司 | Method and system for realizing man-machine interaction through finger intersection movement |
CN111580652A (en) * | 2020-05-06 | 2020-08-25 | Oppo广东移动通信有限公司 | Control method and device for video playing, augmented reality equipment and storage medium |
CN111580652B (en) * | 2020-05-06 | 2024-01-16 | Oppo广东移动通信有限公司 | Video playing control method and device, augmented reality equipment and storage medium |
CN114967484A (en) * | 2022-04-20 | 2022-08-30 | 海尔(深圳)研发有限责任公司 | Method and device for controlling household appliance, household appliance and storage medium |
CN115277143A (en) * | 2022-07-19 | 2022-11-01 | 中天动力科技(深圳)有限公司 | Data secure transmission method, device, equipment and storage medium |
CN115277143B (en) * | 2022-07-19 | 2023-10-20 | 中天动力科技(深圳)有限公司 | Data security transmission method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109828660B (en) | 2022-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109828660A (en) | Method and device for controlling application operation based on augmented reality | |
JP6616288B2 (en) | Method, user terminal, and server for information exchange in communication | |
EP2709357B1 (en) | Conference recording method and conference system | |
CN104520849B (en) | Search user interface using external physical expression | |
US20180300037A1 (en) | Information processing device, information processing method, and program | |
EP3628381A1 (en) | Game picture display method and apparatus, storage medium and electronic device | |
CA3051060A1 (en) | Automatic control of wearable display device based on external conditions | |
CN105335465B (en) | Method and apparatus for displaying an anchor account | |
CN110460799A (en) | Intention camera | |
CN107463247A (en) | Text reading processing method, apparatus and terminal | |
CN105573536A (en) | Touch interaction processing method, device and system | |
WO2018076622A1 (en) | Image processing method and device, and terminal | |
CN109543633A (en) | Face recognition method and device, robot, and storage medium | |
CN106990840A (en) | Control method and control system | |
CN106774824B (en) | Virtual reality interaction method and device | |
CN111580661A (en) | Interaction method and augmented reality device | |
CN108616712A (en) | Camera-based interface operation method, apparatus, device and storage medium | |
US20110029889A1 (en) | Selective and on-demand representation in a virtual world | |
CN109144245B (en) | Equipment control method and related product | |
CN108681390A (en) | Information interaction method and device, storage medium and electronic device | |
CN108829239A (en) | Terminal control method and device, and terminal | |
CN109803109A (en) | Wearable augmented reality remote video system and video call method | |
CN110287925A (en) | Reading page-turning control method and related product | |
US20160234461A1 (en) | Terminal, system, display method, and recording medium storing a display program | |
CN109240786A (en) | Theme replacement method and electronic device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||