CN109496289A - Terminal control method and device - Google Patents

Terminal control method and device (Download PDF)

Info

Publication number
CN109496289A
CN109496289A (application CN201880001138.XA)
Authority
CN
China
Prior art keywords
user
expression
terminal
virtual
expressive features
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880001138.XA
Other languages
Chinese (zh)
Inventor
张霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba China Co Ltd
Original Assignee
Ucweb Inc
Ucweb Singapore Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ucweb Inc. and Ucweb Singapore Pte Ltd
Publication of CN109496289A
Legal status: Pending

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168 Feature extraction; Face representation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a terminal control method and device. In one specific embodiment of the method, in certain specific scenarios a user can issue a contactless instruction to a terminal, such as a motion, a facial expression, or a sound, and the terminal, after receiving the contactless instruction issued by the user, can execute the corresponding operation. This embodiment reduces, to a certain extent, the contact-based interaction between the user and the terminal, thereby improving the convenience of interaction.

Description

Terminal control method and device
Technical field
This application relates to the field of computer technology, and in particular to the field of Internet technology; more particularly, it relates to a terminal control method and apparatus.
Background art
With the development of information technology, terminals offer increasingly rich functionality and are widely used in people's daily lives and work. The convenience of human-computer interaction has already become an important research topic and development direction.
In the prior art, a terminal generally requires the user to interact with it through contact in order to realize a corresponding function, for example: the user taps an application (Application, App) icon on the touch screen to start the application, or slides a finger on the touch screen to unlock the terminal.
Given the prior art, a more convenient way of controlling a terminal is needed.
Summary of the invention
The purpose of the present application is to provide a terminal control method and device, so as to solve the problem that, in certain application scenarios, controlling a terminal through contact-based interaction is inconvenient.
In a first aspect, an embodiment of the present application provides a terminal control method, comprising:
a terminal receiving a contactless instruction of a user in a particular state;
according to the received contactless instruction, executing a preset operation corresponding to the contactless instruction.
In some embodiments, the terminal receiving the contactless instruction of the user in the particular state comprises:
the terminal calling an image acquisition unit to acquire the expressive features of the user in the particular state;
determining the expressive features as the contactless instruction.
In some embodiments, the terminal calling the image acquisition unit to acquire the expressive features of the user in the particular state comprises:
the terminal receiving an expression trigger-collection operation issued by the user in an instant-messaging interface;
after the expression trigger-collection operation is received, calling the image acquisition unit to acquire the expressive features of the user.
In some embodiments, executing, according to the received contactless instruction, the preset operation corresponding to the contactless instruction comprises:
according to the expressive features, searching among existing virtual expressions for a virtual expression that matches the expressive features;
presenting the found virtual expression to the user.
In some embodiments, presenting the found virtual expression to the user comprises:
displaying the found virtual expression in a specified region of the terminal interface for the user to select.
In a second aspect, an embodiment of the present application further provides a terminal control device, comprising:
a receiving-and-processing module, configured to receive a contactless instruction of a user in a particular state;
an execution module, configured to execute, according to the received contactless instruction, a preset operation corresponding to the contactless instruction.
In some embodiments, the receiving-and-processing module is configured to call an image acquisition unit to acquire the expressive features of the user in the particular state, and to determine the expressive features as the contactless instruction.
In some embodiments, the receiving-and-processing module is configured to receive an expression trigger-collection operation issued by the user in an instant-messaging interface and, after the expression trigger-collection operation is received, to call the image acquisition unit to acquire the expressive features of the user.
In some embodiments, the execution module is configured to search, according to the expressive features, among existing virtual expressions for a virtual expression that matches the expressive features, and to present the found virtual expression to the user.
In some embodiments, the execution module is configured to display the found virtual expression in a specified region of the terminal interface for the user to select.
With the terminal control method and device provided by the present application, in certain specific scenarios a user can issue a contactless instruction to a terminal, such as a motion, a facial expression, or a sound, and the terminal, after receiving the contactless instruction issued by the user, can execute the corresponding operation. Such a control mode can reduce, to a certain extent, the contact-based interaction between the user and the terminal, thereby improving the convenience of interaction. Moreover, the contactless-instruction mode requires no wearable smart device bound to the terminal, which reduces the corresponding cost.
Specifically, in a scenario where expressions are sent during social interaction, the terminal can capture the user's facial expression and, based on it, search the existing virtual expressions for and select a virtual expression that matches the user's facial expression. With the above method of the present application, the user no longer needs to slide through pages in a virtual-expression candidate region to find a virtual expression; instead, a more convenient interaction mode can be used: the user simply makes the corresponding expression, and the terminal finds a matching virtual expression for the user based on that expression. This reduces, to a certain extent, the interactive operations the user performs in order to find particular virtual expressions, making interaction more convenient while also shortening the time spent searching for a virtual expression.
Brief description of the drawings
Other features, objects, and advantages of the present application will become more apparent from the following detailed description of non-restrictive embodiments, read in conjunction with the accompanying drawings:
Fig. 1 is a schematic diagram of the interaction between a user and a terminal provided by an embodiment of the present application;
Fig. 2 is a flow diagram of a terminal control method provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of an expression candidate region provided by an embodiment of the present application;
Fig. 4 is a flow diagram of the terminal control method, provided by an embodiment of the present application, in a scenario where virtual expressions are looked up by expression control;
Fig. 5a is a schematic diagram of an expression control provided by an embodiment of the present application;
Fig. 5b is a schematic diagram of the interaction, provided by an embodiment of the present application, in a scenario where virtual expressions are looked up by expression control;
Fig. 5c is a schematic diagram, provided by an embodiment of the present application, of sending an expression;
Fig. 6a is a schematic diagram of one way, provided by an embodiment of the present application, of presenting virtual expressions found based on the user's expression;
Fig. 6b is a schematic diagram of another way, provided by an embodiment of the present application, of presenting virtual expressions found based on the user's expression;
Fig. 7 is a flow diagram of the terminal control method, provided by an embodiment of the present application, in a scenario where photographing is controlled by expression;
Fig. 8 is a structural schematic diagram of a terminal control device provided by an embodiment of the present application.
Specific embodiment
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are used only to explain the related invention and are not a restriction of the invention. It should also be noted that, for convenience of description, only the parts relevant to the related invention are shown in the drawings.
It should be noted that, in the absence of conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
As mentioned above, a user usually needs to control a terminal through contact-based interaction in order to have it perform certain operations. The contact-based interaction here may include, but is not limited to: clicks, presses, drags, slides, and the like performed on the terminal screen, on physical buttons, or on an external device connected to the terminal.
In some cases, however, the aforementioned contact-based interaction is inconvenient. For example, when a user takes a selfie with a tablet computer, the user needs to trigger the shot through a virtual shutter button displayed on the shooting interface; but because a tablet's screen is large, controlling the shot through the virtual shutter button on the shooting interface is awkward, and may change the photo angle or introduce shake, degrading the result. As another example, the instant messaging (Instant Messaging, IM) function on a communication terminal provides the user with a large number of virtual expressions (e.g., emoji); while chatting through the IM function the user may want to use a particular virtual expression, but because the virtual expressions are numerous and the candidate region is limited, the user usually has to slide through pages in the virtual-expression candidate region to find the desired one.
For this reason, an embodiment of the present application provides a terminal control method that, in some cases, allows the user to control the terminal to execute a corresponding operation through contactless interaction, further improving the convenience of interaction.
Referring to Fig. 1, the interactive control mode between the user and the terminal in an embodiment of the present application is shown. As shown in Fig. 1, the user can control the terminal through a contactless instruction. It should be understood that, while issuing a contactless instruction, the user may not touch the terminal's touch screen, its physical buttons, or an external device connected to the terminal; of course, controlling the terminal with contactless instructions should not be construed as the user never contacting the terminal at all. Specifically, in some embodiments the user may hold the terminal and issue contactless instructions to it through expressions, gestures, sounds, and the like; in other embodiments the user may be completely separate from the terminal and issue contactless instructions to control it remotely, in which case the distance between the user and the terminal should not prevent the terminal from receiving the contactless instructions the user issues.
It should be noted that, in the embodiments of the present application, the contactless instruction may specifically take the form of an expression, a limb action, a sound, or another contactless interactive instruction, and the contactless instruction enables control of the terminal without any other terminal-related external device (such as a smart wearable device bound to the terminal).
Generally, the terminal has image acquisition and/or sound acquisition functions, and may specifically include, but is not limited to: mobile terminals such as mobile phones, tablet computers, laptop computers, smartwatches, and cameras, or computers with image acquisition and sound acquisition functions, which are not enumerated one by one here. Of course, a terminal device that does not itself have image or sound acquisition functions but realizes them through a connected external device should also be understood as falling within the scope of the terminal described in the embodiments of the present application.
In addition, among the above mobile terminals, terminals with communication functions, such as mobile phones, tablet computers, laptop computers, and smartwatches, may be collectively referred to in the subsequent descriptions of the embodiments of the present application as: mobile communication terminals.
Based on the above framework shown in Fig. 1, the technical solution provided in the embodiments of the present application is described in detail below.
Continuing with Fig. 2, a terminal control method provided in an embodiment of the present application is shown, which specifically includes the following steps:
Step S201: a terminal receives a contactless instruction of a user in a particular state.
The particular state can be regarded as a state in which the terminal runs certain applications or enables certain functions, for example: a camera is framing a scene but has not yet shot, or an IM application running on a mobile phone is displaying the expression candidate region. It should be noted here that, in existing implementations, in such a particular state the interaction between the user and the terminal is typically limited to contact-based interaction, which is, to a certain extent, inconvenient for the user.
In the embodiments of the present application, however, the terminal can receive and identify the user's contactless instruction in the above particular state. Specifically, in practical applications there are many ways the user may issue a contactless instruction, such as: the user's facial expression, a limb action, or an uttered sound. In other words, the terminal's reception of the contactless instruction can be realized through its acquisition units or components, such as a camera or a microphone. Of course, the acquisition unit or component mentioned here may also be an external device, which is not specifically limited herein.
The terminal's identification of the contactless instruction can be realized through a corresponding identification function, which may be provided by the operating system running on the terminal or by an application (Application, App) installed on the terminal. Generally, the identification function may specifically include at least one of: sound detection, natural-language semantic recognition, posture identification, action recognition, and facial-feature identification (which may further include expression recognition); this should not be construed as a restriction of the present application.
Step S203: according to the received contactless instruction, executing a preset operation corresponding to the contactless instruction.
The contactless instruction is used to instruct the terminal to execute a corresponding operation. It should be understood that the operation corresponding to a contactless instruction may be preset by the device manufacturer or function provider, or set by the user. In one example, when a mobile phone is in photographing mode, the user can control the phone to take a photo by saying the instruction "take a photo". In another example, the user can record a phrase of their own, such as "click wiping", as an instruction to control the phone to take a photo.
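The mapping from a recognized contactless instruction to a terminal operation, whether factory-preset or user-defined, can be modeled as a simple lookup table. The sketch below is a minimal illustration under that assumption; the phrase strings, action names, and the `CommandDispatcher` class are all hypothetical, not taken from the patent:

```python
# Hypothetical dispatcher mapping recognized phrases to terminal actions.
class CommandDispatcher:
    def __init__(self):
        # Instructions preset by the device manufacturer or function provider.
        self.commands = {"take a photo": "shutter", "unlock": "unlock_screen"}

    def register(self, phrase, action):
        # A phrase recorded by the user themselves, bound to an operation.
        self.commands[phrase] = action

    def dispatch(self, recognized_phrase):
        # Look up the preset operation for the instruction; None if unknown.
        return self.commands.get(recognized_phrase)

d = CommandDispatcher()
d.register("click wiping", "shutter")    # user-defined custom trigger
print(d.dispatch("take a photo"))        # shutter
print(d.dispatch("click wiping"))        # shutter
```

The same table-driven shape would accommodate gesture or expression labels as keys just as well as spoken phrases.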
Obviously, through the above steps, in certain specific scenarios the user can issue contactless instructions such as motions, expressions, and sounds to the terminal, and the terminal, after receiving the contactless instruction the user issued, can execute the corresponding operation. Such a control mode can reduce, to a certain extent, the contact-based interaction between the user and the terminal, thereby improving the convenience of interaction. Moreover, the contactless-instruction mode requires no wearable smart device bound to the terminal, which reduces the corresponding cost.
For the above steps, the executing subject can usually be the terminal itself, but in certain embodiments it may also be an App running on the terminal. Moreover, the executing subject may change during the execution of the above steps; for example, the executing subject of step S201 may be the terminal itself, while the executing subject of step S203 may be an App running on the terminal. Of course, this will be determined by the circumstances of the practical application and should not be construed as a restriction of the present application.
The terminal control method in the embodiments of the present application is described in detail below in conjunction with specific application scenarios.
Scene one
When a user engages in instant messaging, virtual expressions can make the chat more interesting. Accordingly, the chat function on a communication terminal, or a third-party IM application, provides abundant virtual expressions for users to use. However, the size of the virtual-expression candidate region in the chat interface is limited, and when virtual expressions are numerous it is difficult to display them all.
Continuing with Fig. 3, a schematic diagram of the virtual-expression candidate region in an IM application is shown. As shown in Fig. 3, the virtual-expression candidate region cannot display all virtual expressions at once. Under the existing interaction mode, the user has to slide through pages in the virtual-expression candidate region to browse and find the virtual expressions not currently shown. Obviously, in this scenario the existing interaction mode is inconvenient for the user, and finding a virtual expression consumes a certain amount of time.
To this end, the terminal control method in this scenario is shown with continued reference to Fig. 4. As shown in Fig. 4, the method may specifically include the following steps:
Step S401: the terminal acquires the user's expressive features in the expression candidate state.
It should be noted that the virtual expressions may include at least: character expressions, picture expressions, or full-animation expressions. A virtual expression may be presented statically or dynamically, which is not specifically limited here.
In the embodiments of the present application, the expression candidate state can be regarded as a state, triggered by the user, in which the terminal prepares to provide candidate virtual expressions for the user.
It should be understood that, for the above step S401, when the terminal is in the expression candidate state it calls the image acquisition unit to acquire the user's expressive features and determines the collected expressive features as the contactless instruction. Of course, the image acquisition unit may be built into the terminal or be an external camera connected to it.
In general, the expression candidate state is triggered by the user issuing an expression trigger-collection operation, and this operation can take many forms:
In one possible embodiment, the expression candidate state is triggered when the user checks the virtual expressions. Specifically, the user can tap the virtual-expression candidate control in the IM interface (the user's tapping of this control can be considered an expression trigger-collection operation); the virtual-expression candidate region (the region shown in Fig. 3) is then displayed in the IM interface, and the terminal is simultaneously triggered into the expression candidate state.
In another possible embodiment, the user can tap an expression-acquisition control in the IM interface (similarly, the user's tapping of this control can also be considered an expression trigger-collection operation). The expression-acquisition control is used to invoke the terminal's image acquisition unit (e.g., a camera) to acquire the user's expression; at this point, the terminal can also be considered to be in the expression candidate state.
Thus, when the terminal is in the expression candidate state, it calls the image acquisition unit to acquire the user's expressive features. This process can be: the terminal receives the expression trigger-collection operation issued by the user in the instant-messaging interface and, after receiving the operation, calls the image acquisition unit to acquire the user's expressive features.
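The trigger-then-capture sequence just described can be sketched as a small state machine: the terminal sits idle until it receives an expression trigger-collection operation, then enters the expression candidate state, calls its image acquisition unit, and treats the returned expressive features as a contactless instruction. The class and method names below are assumptions for illustration only, and the camera is stubbed out with a plain callable:

```python
# Illustrative sketch of the trigger-collection flow; not an actual device API.
class Terminal:
    def __init__(self, image_unit):
        self.image_unit = image_unit   # stand-in for the front camera wrapper
        self.state = "idle"

    def on_trigger_collection(self):
        # The user taps the expression-acquisition control in the IM interface.
        self.state = "expression_candidate"
        features = self.image_unit()   # acquire the user's expressive features
        # The collected features are determined as a contactless instruction.
        return {"type": "contactless", "features": features}

fake_camera = lambda: {"mouth": "up", "eyes": "open"}  # stubbed capture result
t = Terminal(fake_camera)
instr = t.on_trigger_collection()
print(t.state)             # expression_candidate
print(instr["features"])   # {'mouth': 'up', 'eyes': 'open'}
```

Swapping `fake_camera` for a real capture routine would leave the control flow unchanged, which is the point of the sketch.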
Both of the above modes can trigger the expression candidate state; which one applies depends on the actual situation. For example: the user holding the mobile phone with its front camera facing the user can also be considered an expression trigger-collection operation; or the user unlocking the terminal can also be considered an expression trigger-collection operation. These examples should not be construed as restrictions of the present application.
In the embodiments of the present application, the acquisition and identification of the user's expression can be realized through a facial-feature identification function, especially an expression recognition function. Of course, the specific expression recognition function can be realized through technologies such as deep-learning algorithms and neural-network models, which are not repeated here.
Step S403: according to the collected expressive features, searching the virtual expressions for a virtual expression that matches the expressive features, to be sent as an instant-messaging message.
In this scenario, the expressive features the terminal obtains by capturing the expression presented on the user's face can be regarded as a kind of contactless instruction, and the terminal's searching the virtual expressions for a virtual expression matching the expressive features, or sending that virtual expression, can be regarded as executing the operation corresponding to the contactless instruction.
As one possible embodiment of the present application, the corresponding virtual expression can be found according to the category of the expression. Specifically, in practical applications, the user's expressions can be divided into different categories according to the state of the facial features, such as: smiling, pouting, frowning, closing the eyes; of course, expressions can also be divided based on mood, such as: angry, happy. It should be understood that which category an expression is actually assigned to generally depends on the specific expression recognition function; in other words, the expression recognition function can identify the category to which the user's actual facial expression belongs.
In addition, a virtual expression is typically provided with corresponding identification information, such as: "/smile", "/frown", "/stare" (the expression identification information is usually set by the provider, such as the device manufacturer or a third-party IM application); of course, this is only one possible form of identification information and should not be construed as a restriction of the present application. Obviously, once the category of the user's expression and the identification information of the virtual expressions are known, the virtual expression that matches the user's expression can also be determined.
As another possible embodiment of the present application, the corresponding virtual expression can be found according to the similarity of the expression. Specifically, the terminal can extract the user's expressive features and calculate the similarity between the user's expressive features and the features of the virtual expressions, thereby finding the virtual expression most similar to the user's expression. For example: according to the features of the user's mouth, eyes, and face while laughing, the terminal can find similar but distinct virtual expressions such as smiling, squinting with laughter, and laughing out loud.
The found virtual expression can be sent directly to other users as an instant-messaging message.
For example, as shown in Fig. 5a, when the user is chatting on a mobile phone and wants to send an expression, the user can select the expression control in the current chat interface, and the virtual-expression candidate region is then displayed in the current chat interface. At this point the phone is in the expression candidate state, its front camera is activated, and it captures the user's expression. As shown in Fig. 5b, suppose the phone captures and identifies the user's facial expression as "smile"; it can then search the virtual expressions for and select the virtual expression identified as "/smile". As shown in Fig. 5c, the found "smile" virtual expression can be sent directly to other users as an instant-messaging message.
It can be seen from the above discussion that, in the scenario of sending expressions during social interaction, the terminal can capture the user's facial expression and, based on it, search the existing virtual expressions for and select a virtual expression that matches the user's facial expression. With the above method of the present application, the user no longer needs to slide through pages in the virtual-expression candidate region to find a virtual expression; instead, a more convenient interaction mode can be used: the user makes the corresponding expression, and the terminal finds a matching virtual expression for the user based on the user's expression. This reduces, to a certain extent, the interactive operations the user performs in order to find particular virtual expressions, making interaction more convenient while also shortening the time spent searching for a virtual expression.
It should be noted that the terminal's identifying the category of the user's expression, as described above, can specifically be: the terminal captures an image of the user's face, extracts expressive-feature data, and determines the expression category described by the expressive-feature data, thereby determining the category of the user's expression. Of course, the process of extracting expressive-feature data can locate and extract facial-feature points, texture regions, and predefined characteristic points through a facial-feature extraction algorithm; the detailed process is not repeated here.
Once the category of the user's expression is determined, the identification information of the virtual expressions can be traversed and searched to find the virtual expressions of the same category.
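The category-based traversal can be a plain scan over the identification strings attached to the virtual expressions. A minimal sketch, assuming identifiers of the form "/smile" as in the examples above (the asset filenames and the library contents are made up for illustration):

```python
# Find every virtual expression whose identification info matches the
# category recognized from the user's face. All entries are illustrative.
virtual_expressions = [
    {"id": "/smile", "asset": "smile.png"},
    {"id": "/frown", "asset": "frown.png"},
    {"id": "/smile", "asset": "big_smile.gif"},
    {"id": "/stare", "asset": "stare.png"},
]

def find_by_category(category, expressions):
    # Traverse the identifiers, keeping those in the recognized category.
    tag = "/" + category
    return [e for e in expressions if e["id"] == tag]

matches = find_by_category("smile", virtual_expressions)
print([m["asset"] for m in matches])   # ['smile.png', 'big_smile.gif']
```

A linear traversal suffices here; a real implementation with a large expression library could index identifiers in a dictionary instead.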
The terminal's determining the similarity between the user's expression and a virtual expression can specifically be: the terminal captures an image of the user's face, extracts expressive-feature data, and calculates the similarity according to the extracted expressive-feature data and the feature data corresponding to the virtual expression. Of course, the similarity can be calculated with similarity algorithms such as Euclidean distance or cosine distance, which are not specifically limited here.
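The similarity step can, for instance, use cosine similarity over the extracted feature vectors, one of the algorithms the text mentions. The sketch below is a pure-Python illustration under that assumption; the three-dimensional feature vectors and candidate names are invented for the example:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_match(user_features, candidates):
    # Return the virtual expression whose feature data is most similar
    # to the user's extracted expressive-feature data.
    return max(candidates,
               key=lambda c: cosine_similarity(user_features, c["features"]))

candidates = [
    {"name": "smile", "features": [0.9, 0.1, 0.8]},
    {"name": "frown", "features": [0.1, 0.9, 0.2]},
    {"name": "laugh", "features": [1.0, 0.0, 1.0]},
]
user = [0.85, 0.15, 0.75]          # hypothetical extracted features
print(best_match(user, candidates)["name"])   # smile
```

Euclidean distance would work the same way, with `min` over distances instead of `max` over similarities.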
In the embodiments of the present application, the number of virtual expressions found may be 1 or greater than 1, with different processing for each case:

In one feasible embodiment, if the number of virtual expressions found is 1, that virtual expression can be sent directly as an instant messaging message.

In another feasible embodiment, the number of virtual expressions found may be greater than 1. In this case, each virtual expression found can be presented to the user, who decides which one to send. Specifically, when the number of virtual expressions found is greater than 1, the presentation method shown in Fig. 6a may be used: in Fig. 6a, each virtual expression matching the user's actual expression is presented in a separate row above the virtual expression candidate region. If the user selects a virtual expression, the selected virtual expression is sent as an instant messaging message and the row disappears.

In practical applications, the presentation method shown in Fig. 6b may also be used: in Fig. 6b, each virtual expression matching the user's actual expression is displayed, as a newly added layer, over the location of the original virtual expression candidate region. If the user selects a virtual expression, the newly added layer disappears and the original virtual expression candidate region is displayed again.

Of course, the presentation methods in Figs. 6a and 6b are only examples; other presentation methods, such as pop-ups or floating layers, may be used in practical applications, depending on the needs of the application.
Scene two
When a user takes a group photo with other users using a terminal with a shooting function, such as a mobile phone, camera or tablet computer, existing interaction methods can be rather inconvenient:

If the user shoots with the terminal held in hand, factors such as the terminal's size and shape may make it awkward to tap the virtual shutter button or press the physical shutter key, which is likely to affect the shooting result. If the user shoots with a timed-shooting function, the preparation time provided by that function is usually fixed and cannot be adjusted by the user, which is unfavorable for group photos.

Although the user can resort to external devices such as a selfie stick or a smartwatch, using external devices undoubtedly increases the cost of shooting.

Therefore, for this scenario, an embodiment of the present application provides a terminal control method which, as shown in Fig. 7, may include the following steps:
Step S701: the terminal, in a pre-shooting state, captures and monitors the user's expression.

The pre-shooting state can be regarded as the state in which the terminal has started framing but has not yet shot. Specifically, if the terminal is a communication terminal such as a mobile phone or tablet computer, the pre-shooting state may be that the terminal has opened its built-in camera function or a third-party shooting app, so that a shooting interface is displayed on the screen but no shot has been taken; if the terminal is a camera, the pre-shooting state may be that the camera has started framing but the user has not yet pressed the shutter.

It should be noted that, in actual shooting, the user usually adjusts his or her facial expression and holds a state suitable for shooting; generally, the user keeps such an expression for some time. Therefore, in the embodiments of the present application, the terminal monitoring the user's expression can be understood as monitoring whether the user's expression changes within a specified time.

Monitoring of the user's expression can be implemented by a corresponding monitoring function. In one feasible embodiment, the user's face is located by a face recognition function, and the terminal monitors whether the texture, shading, preset anchor points and the like of the face change, thereby judging whether the user's expression has changed. This approach does not require an overly complicated recognition algorithm.
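The anchor-point comparison described in this embodiment can be sketched as a simple per-frame check; the (x, y) landmark format and the 2-pixel tolerance are assumptions for illustration, not values specified by the present application.

```python
def expression_changed(prev_landmarks, curr_landmarks, tolerance=2.0):
    """Judge whether the expression changed by comparing the displacement
    of tracked face anchor points between two consecutive frames.
    Each landmark is an (x, y) pixel coordinate."""
    for (px, py), (cx, cy) in zip(prev_landmarks, curr_landmarks):
        if abs(cx - px) > tolerance or abs(cy - py) > tolerance:
            return True  # at least one anchor point moved noticeably
    return False
```

In a real pipeline, the landmark lists would come from a face recognition function run on successive camera frames.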
Step S703: when it is monitored that the expression does not change within a set duration, a preset shooting operation is executed.

In this scenario, the contactless instruction can be regarded as the photographed user's expression remaining unchanged within the set duration; correspondingly, the preset shooting operation executed by the terminal can be regarded as the operation corresponding to that contactless instruction.

The preset shooting operation may include triggering the shutter, burst shooting, switching shooting modes and the like, and may be configured according to the needs of the practical application.

The preset duration may be set to 1 to n seconds; it may be preset by the device manufacturer or a third-party shooting app provider as different time gears, or it may be user-defined. Once the terminal monitors that the user's facial expression has not changed within the set time, it can execute the above shooting operation, thereby realizing contactless, interactive shooting control.
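Steps S701 and S703 together amount to a debounce loop: count consecutive frames in which the expression is judged unchanged, and fire the shutter once the count covers the set duration. A minimal sketch under assumed inputs (a frame stream, an injected change-detection callable and a frame rate — none of these names come from the present application):

```python
def monitor_and_shoot(frames, changed, hold_seconds=1.0, fps=30):
    """Scan a stream of landmark frames; return the index of the frame at
    which the shutter would fire, i.e. the first frame at which the
    expression has remained unchanged for `hold_seconds`, or None."""
    needed = int(hold_seconds * fps)  # frames the expression must hold
    stable = 0
    prev = None
    for i, frame in enumerate(frames):
        if prev is not None and not changed(prev, frame):
            stable += 1
        else:
            stable = 0  # expression moved: restart the countdown
        prev = frame
        if stable >= needed:
            return i  # trigger the preset shooting operation here
    return None
```

For multiple photographed users, `changed` would simply report a change if any face oriented toward the lens moved.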
Obviously, through the above steps, the user can control shooting during a selfie or group photo through the interaction of keeping his or her expression unchanged for a certain duration. This approach is not limited by the terminal's size and does not require the user to enable a timed-shooting function; the user only needs to put the terminal into the pre-shooting state, which effectively improves the convenience of interactive control. Meanwhile, the emphasis of the above method is on monitoring whether the user's expression changes within a certain duration; it does not need to recognize the specific expression the user makes, and therefore does not require an overly complicated expression recognition algorithm.

In actual shooting with the above method, the number of photographed users may be greater than 1. In this case, to realize shooting control, the expression of every photographed user whose face is oriented toward the lens is required to remain unchanged within the set duration.

As a possible embodiment of the present application, the terminal provides the user, as the shooting result, with the images captured within a period of time before the shutter is triggered and/or after the shutter is triggered. For example: when the terminal monitors that the user's expression has not changed within the set duration (assumed to be 1 s), it executes the shooting operation (the terminal automatically triggers the shutter). Assuming the terminal automatically triggers the shutter at time t, the shooting result provided to the user may be the image content captured by the terminal from time t-1 to time t, from time t to time t+1, or from time t-1 to time t+1.

Afterwards, the user can select the corresponding shot content in the above shooting result and save it (as a photo or a video). Of course, this should not be construed as a limitation of the present application.
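The "from t-1 to t+1" result described above can be sketched with a small pre-trigger ring buffer: keep the most recent frames at all times, and once the shutter fires, append post-trigger frames until the window is complete. Frame counts stand in for the 1 s windows, and the class layout is purely illustrative.

```python
from collections import deque

class ShotBuffer:
    """Keep the most recent `pre_frames` frames so that, when the shutter
    fires at time t, frames from t - pre up to t + post can be offered to
    the user as the shooting result."""
    def __init__(self, pre_frames, post_frames):
        self.pre = deque(maxlen=pre_frames)  # rolling pre-trigger window
        self.post_needed = post_frames
        self.post = None                     # None until the shutter fires
        self.result = None

    def feed(self, frame):
        if self.post is not None:            # collecting after the trigger
            self.post.append(frame)
            if len(self.post) >= self.post_needed:
                self.result = list(self.pre) + self.post
        else:
            self.pre.append(frame)           # still before the trigger

    def trigger(self):
        """Called at time t, when the stable-expression condition is met."""
        self.post = []
```

The user would then choose which part of `result` to save as a photo or video.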
It should be noted that, in the above scenarios, the user's expression is used as the contactless instruction; in fact, the above terminal control method in the embodiments of the present application is not limited to this. In some application modes, the contactless instruction issued by the user can be diverse; for example, the user may also make a corresponding limb action while making a facial expression.

The limb actions described here include, but are not limited to, the user's gestures, postures, body movements and the like; the user's limb action may be dynamic or static, and is not specifically limited here.

Thus, in scene one above, in addition to the application mode of searching for a virtual expression based on the user's expression, the above method in the embodiments of the present application can also search for a corresponding virtual expression based on the user's expression together with the limb action the user makes. Specifically, in this application mode, the terminal captures the user's expression and the user's limb action, and takes the collected expression features and limb action features as the contactless instruction issued by the user.

As for virtual expressions, some virtual expressions are jointly formed by the expression and the action of an avatar; in that case, a virtual expression matching the aforementioned features can be searched for, among the virtual expressions, according to the user's expression features and action features. Of course, the search method can be similar to the above: the category to which the user's features (including the expression features and the action features) belong can be determined in order to search for virtual expressions of the same category, or the search can be realized by comparing the similarity between the user's expression and action and the virtual expression; this is not elaborated further here.
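Matching on expression features and action features together can be sketched by concatenating the two vectors before comparison. The library layout, the injected similarity function and the 0.8 threshold are assumptions for illustration only.

```python
def match_combined(user_expr, user_action, virtual_expressions, similarity,
                   threshold=0.8):
    """Match against virtual expressions that are jointly formed by an avatar
    expression and an avatar action, by comparing concatenated feature
    vectors; return the best match above the threshold, or None."""
    user_vec = list(user_expr) + list(user_action)
    best, best_score = None, threshold
    for v in virtual_expressions:
        vec = list(v["expr_features"]) + list(v["action_features"])
        score = similarity(user_vec, vec)
        if score >= best_score:
            best, best_score = v, score
    return best
```

Any of the similarity measures mentioned earlier (Euclidean distance, cosine distance) could be passed in as `similarity`.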
Of course, in practical applications, the capture and recognition of the user's limb actions can be realized by corresponding motion recognition models such as posture recognition, gesture recognition and limb action recognition.

In scene two above, shooting control of the terminal can likewise be realized by combining the user's expression with the user's limb action. Specifically, in one possible application mode, the terminal, in the pre-shooting state, captures and monitors the user's expression features and action features; when it monitors that the user's expression features and action features have not changed within the set duration (or that their variation amplitude is within a preset range), it executes the preset shooting operation.

In another possible application mode, the user can control the terminal to trigger the shutter by keeping an expression unchanged for a certain time, while controlling the terminal to switch among different shooting modes through limb actions such as gestures. For example: in the pre-shooting state, the user extends an index finger and makes a motion simulating pressing a camera shutter, and the terminal takes a photo; as another example: in the pre-shooting state, the user extends the left hand (or right hand) with four fingers together and bent, and the terminal shoots a video. Of course, these are only possible examples, and the specifics depend on the needs of the practical application.
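The gesture-to-mode switching described above reduces to a lookup from a recognized gesture label to a shooting mode. The labels below are hypothetical outputs of a gesture-recognition model, not names defined in the present application.

```python
# Hypothetical gesture labels, assumed to come from a gesture-recognition model.
GESTURE_MODES = {
    "index_finger_press": "photo",            # simulate pressing the shutter
    "left_hand_four_fingers_bent": "video",   # four fingers together and bent
}

def shooting_mode_for(gesture_label, default="photo"):
    """Map a recognized limb-action label to a shooting mode; unrecognized
    gestures fall back to the default mode."""
    return GESTURE_MODES.get(gesture_label, default)
```

The stable-expression condition would still trigger the shot; the gesture merely selects which preset shooting operation runs.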
Of course, the above content should not be construed as a limitation of the present application.
The above is the terminal control method provided by the embodiments of the present application. Based on the same idea, the embodiments of the present application also provide a corresponding terminal control apparatus.

Specifically, with continued reference to Fig. 8, which shows the terminal control apparatus provided in the embodiments of the present application, the apparatus includes:

a receiving processing module 801, configured to receive a contactless instruction of a user in a particular state;

an execution module 802, configured to execute, according to the received contactless instruction, a preset operation corresponding to the contactless instruction.

The terminal control apparatus shown in Fig. 8 can be applied in different scenarios.

Scene one: realizing terminal control through expressions during social interaction.

In this scenario, the particular state includes at least an expression candidate state, where the expression candidate state is used to provide virtual expressions to the user.

Further, the receiving processing module 801 is configured to call an image acquisition unit, in the particular state, to capture the user's expression features, and to determine the expression features as the contactless instruction.

Further, the receiving processing module 801 is configured to receive an expression-triggered acquisition operation issued by the user in an instant messaging interface and, after receiving the expression-triggered acquisition operation, to call the image acquisition unit to capture the user's expression features.

Further, the execution module 802 is configured to search, according to the expression features and among the existing virtual expressions, for a virtual expression matching the expression features, and to present the found virtual expression to the user.

Further, the execution module 802 is configured to display the found virtual expression in a specified region of the terminal interface for the user to select.
Scene two: shooting control.

In this scenario, the particular state includes at least a pre-shooting state, where the pre-shooting state can be regarded as the state in which the terminal has started framing but has not yet shot.

The receiving processing module 801 is configured to capture and monitor the user's expression in the pre-shooting state.

The execution module 802 is configured to execute a preset shooting operation when it is monitored that the expression does not change within a set duration.
The apparatus shown in Fig. 8 can be realized in practical applications by a physical electronic device. Specifically, the terminal includes one or more processors, and a storage device for storing one or more programs;

when the one or more programs are executed by the one or more processors, the one or more processors implement the process described above.

The apparatus shown in Fig. 8 can also be realized in practical applications by a computer-readable storage medium. Specifically, a computer program is stored on the computer-readable storage medium, and the program, when executed by a processor, implements the process described above.
The various embodiments in the present application are described in a progressive manner; the same or similar parts between the embodiments may be cross-referenced, and each embodiment focuses on its differences from the other embodiments. In particular, for the apparatus, device and medium embodiments, since they are substantially similar to the method embodiments, their descriptions are relatively brief; for related details, refer to the corresponding parts of the method embodiments, which are not repeated here.

Thus, specific embodiments of the subject matter have been described. Other embodiments are within the scope of the appended claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desired results. In addition, the processes depicted in the drawings do not necessarily require the particular order shown, or a sequential order, to achieve desired results. In some embodiments, multitasking and parallel processing can be advantageous.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, the embodiments of the present disclosure include a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for executing the methods shown in the flowcharts. In such embodiments, the computer program may be downloaded and installed from a network via a communication portion, and/or installed from a removable medium. When the computer program is executed by a central processing unit (CPU), the above-described functions defined in the methods of the present application are executed. It should be noted that the computer-readable medium described herein may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. A computer-readable storage medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus or device. In the present application, a computer-readable signal medium may include a data signal propagated in a baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate or transmit a program for use by or in connection with an instruction execution system, apparatus or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to: wireless, wire, optical cable, RF, or any suitable combination of the above.

Computer program code for carrying out the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the drawings illustrate the architectures, functions and operations of possible implementations of the systems, methods and computer program products according to the various embodiments of the present application. In this regard, each box in a flowchart or block diagram may represent a module, a program segment or a part of code, which contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the boxes may occur in a different order than noted in the drawings. For example, two boxes shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each box in the block diagrams and/or flowcharts, and combinations of boxes therein, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.

The units described in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor; for example, they may be described as: a processor including a receiving unit, a parsing unit, an information extraction unit and a generation unit. The names of these units do not, under certain circumstances, constitute a limitation of the units themselves; for example, the receiving unit may also be described as "a unit that receives a user's web page browsing request".

The above description is only a preferred embodiment of the present application and an explanation of the applied technical principles. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover, without departing from the above inventive concept, other technical solutions formed by any combination of the above technical features or their equivalent features — for example, technical solutions formed by replacing the above features with (but not limited to) technical features disclosed herein having similar functions.

Claims (12)

1. A terminal control method, characterized by comprising:
receiving, by a terminal, a contactless instruction of a user in a particular state;
executing, according to the received contactless instruction, a preset operation corresponding to the contactless instruction.
2. The method according to claim 1, characterized in that the terminal receiving the contactless instruction of the user in the particular state comprises:
calling, by the terminal in the particular state, an image acquisition unit to capture expression features of the user;
determining the expression features as the contactless instruction.
3. The method according to claim 2, characterized in that the terminal calling, in the particular state, the image acquisition unit to capture the expression features of the user comprises:
receiving, by the terminal, an expression-triggered acquisition operation issued by the user in an instant messaging interface;
after receiving the expression-triggered acquisition operation, calling the image acquisition unit to capture the expression features of the user.
4. The method according to claim 2, characterized in that executing, according to the received contactless instruction, the preset operation corresponding to the contactless instruction comprises:
searching, according to the expression features and among existing virtual expressions, for a virtual expression matching the expression features;
presenting the found virtual expression to the user.
5. The method according to claim 4, characterized in that presenting the found virtual expression to the user comprises:
displaying the found virtual expression in a specified region of a terminal interface for the user to select.
6. A terminal control apparatus, characterized by comprising:
a receiving processing module, configured to receive a contactless instruction of a user in a particular state;
an execution module, configured to execute, according to the received contactless instruction, a preset operation corresponding to the contactless instruction.
7. The apparatus according to claim 6, characterized in that the receiving processing module is configured to call, in the particular state, an image acquisition unit to capture expression features of the user, and to determine the expression features as the contactless instruction.
8. The apparatus according to claim 7, characterized in that the receiving processing module is configured to receive an expression-triggered acquisition operation issued by the user in an instant messaging interface and, after receiving the expression-triggered acquisition operation, to call the image acquisition unit to capture the expression features of the user.
9. The apparatus according to claim 7, characterized in that the execution module is configured to search, according to the expression features and among existing virtual expressions, for a virtual expression matching the expression features, and to present the found virtual expression to the user.
10. The apparatus according to claim 9, characterized in that the execution module is configured to display the found virtual expression in a specified region of a terminal interface for the user to select.
11. An electronic device, comprising: one or more processors;
a storage device for storing one or more programs;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 5.
12. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1 to 5.
CN201880001138.XA 2018-06-20 2018-06-20 A kind of terminal control method and device Pending CN109496289A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/091927 WO2019241920A1 (en) 2018-06-20 2018-06-20 Terminal control method and device

Publications (1)

Publication Number Publication Date
CN109496289A true CN109496289A (en) 2019-03-19

Family

ID=65713836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880001138.XA Pending CN109496289A (en) 2018-06-20 2018-06-20 A kind of terminal control method and device

Country Status (2)

Country Link
CN (1) CN109496289A (en)
WO (1) WO2019241920A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023124200A1 (en) * 2021-12-27 2023-07-06 北京荣耀终端有限公司 Video processing method and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010273280A (en) * 2009-05-25 2010-12-02 Nikon Corp Imaging apparatus
CN104753766A (en) * 2015-03-02 2015-07-01 小米科技有限责任公司 Expression sending method and device
CN106454071A (en) * 2016-09-09 2017-02-22 捷开通讯(深圳)有限公司 Terminal and automatic shooting method based on gestures
CN107153496A (en) * 2017-07-04 2017-09-12 北京百度网讯科技有限公司 Method and apparatus for inputting emotion icons
CN107315488A (en) * 2017-05-31 2017-11-03 北京安云世纪科技有限公司 A kind of searching method of expression information, device and mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101311882A (en) * 2007-05-23 2008-11-26 华为技术有限公司 Eye tracking human-machine interaction method and apparatus
WO2014136521A1 (en) * 2013-03-06 2014-09-12 Necカシオモバイルコミュニケーションズ株式会社 Imaging device, imaging method and program
CN105068662B (en) * 2015-09-07 2018-03-06 哈尔滨市一舍科技有限公司 A kind of electronic equipment for man-machine interaction
TWI587994B (en) * 2016-11-18 2017-06-21 Hiwin Tech Corp Non-contact gestures teach robots


Also Published As

Publication number Publication date
WO2019241920A1 (en) 2019-12-26


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20200514

Address after: 310051 room 508, floor 5, building 4, No. 699, Wangshang Road, Changhe street, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Alibaba (China) Co.,Ltd.

Address before: 10, galley quay, 10-01, offshore financial center, Singapore

Applicant before: YOUSHI TECHNOLOGY SINGAPORE Co.,Ltd.

Applicant before: UC MOBILE Co.,Ltd.

RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190319