CN104699377A - Control method and electronic equipment - Google Patents


Info

Publication number
CN104699377A
CN104699377A (application CN201310651523.8A)
Authority
CN
China
Prior art keywords
display screen
action
parameter
operation data
operation action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310651523.8A
Other languages
Chinese (zh)
Inventor
刘宸寰 (Liu Chenhuan)
马昱弘 (Ma Yuhong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310651523.8A
Publication of CN104699377A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control method and electronic equipment. The electronic equipment comprises a first display screen and a second display screen. The control method comprises the following steps: obtaining operation data of an operating body in the region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters; comparing the operation action and action attribute parameters of the operation data with those of a preset operation model to obtain a comparison result; and, when the comparison result shows that the operation data match the operation model, controlling the display screen corresponding to the operation data to switch off. By detecting the action of the user's operating body, the embodiments switch the corresponding display screen into the off state, which improves switching-control efficiency, spares the user a complex shut-off procedure, and improves the user experience.

Description

Control method and electronic equipment
Technical field
This application relates to the field of electronic control technology, and in particular to a control method and electronic equipment.
Background technology
With the development of science and technology, terminals with a projection function are becoming increasingly widespread. Such a terminal has two display screens: an ordinary display screen and a projection screen.
To switch off the projection function of such a terminal, the user usually has to perform further operations on a remote-control device or on the terminal itself.
Thus, existing schemes for switching off projection add extra user operations, reduce switching-control efficiency, and degrade the user experience.
Summary of the invention
The technical problem to be solved by this application is to provide a control method and electronic equipment, so as to solve the technical problem that, in prior-art schemes for switching off projection, extra user operations are added and switching-control efficiency is reduced.
This application provides a control method applied to electronic equipment, the electronic equipment comprising a first display screen and a second display screen, the method comprising:
obtaining operation data of an operating body in the region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters;
comparing the operation action and action attribute parameters of the operation data with those of a preset operation model to obtain a comparison result;
when the comparison result shows that the operation data match the operation model, controlling the display screen corresponding to the operation data to be in the off state.
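The three claimed steps can be illustrated with a minimal sketch. All names, field layouts, and thresholds below are illustrative assumptions, not part of the patent:

```python
# Minimal sketch of the claimed control flow (all names illustrative):
# 1) obtain operation data, 2) compare with a preset operation model,
# 3) switch off the screen the matching data correspond to.

def compare_with_model(operation_data, model):
    """Return True when both the action and its attribute
    parameters satisfy the preset operation model."""
    action_ok = operation_data["action"] in model["allowed_actions"]
    speed = operation_data["attributes"]["speed"]
    lo, hi = model["speed_range"]
    return action_ok and lo <= speed <= hi

def control(operation_data, model, screens):
    """Switch off the screen the operation data belong to on a match."""
    if compare_with_model(operation_data, model):
        screens[operation_data["screen"]] = "off"
    return screens

screens = {"first": "on", "second": "on"}
model = {"allowed_actions": {"block", "swing"}, "speed_range": (0.2, 3.0)}
data = {"screen": "second", "action": "block",
        "attributes": {"speed": 1.0}}
print(control(data, model, screens))  # second screen switched off
```

Non-matching operation data (a different action, or a speed outside the model's range) leave both screens untouched, mirroring the claim that only a positive comparison result triggers the off state.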
In the above method, preferably, the second display screen comprises a projection display screen.
In the above method, preferably, the electronic equipment further comprises an image acquisition unit corresponding to the region of the first display screen or the second display screen;
wherein obtaining the operation data of the operating body in the region corresponding to the first display screen or the second display screen comprises:
acquiring, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, with an image correlation parameter between every two adjacent operation images;
identifying the operation action of the operating body in the operation images, and generating the action attribute parameters corresponding to the operation action according to the image correlation parameters;
generating the operation data according to the operation action and its action attribute parameters.
In the above method, preferably, the operation model comprises model action constraint parameters and model action attribute constraint parameters;
wherein comparing the operation action and action attribute parameters of the operation data with those of the preset operation model to obtain the comparison result comprises:
comparing the operation action in the operation data with the model action constraint parameters of the operation model to obtain a first comparison result;
comparing the action attribute parameters in the operation data with the model action attribute constraint parameters of the operation model to obtain a second comparison result;
generating the comparison result of the operation data and the operation model according to the first comparison result and the second comparison result.
In the above method, preferably, the operation action comprises multiple sub-actions, and the action attribute parameters comprise a sub-action attribute parameter between every two adjacent sub-actions;
wherein identifying the operation action of the operating body in the operation images comprises:
identifying the sub-action corresponding to each operation image in turn;
composing the operation action from the sub-actions according to the sequential relationship between the operation images;
wherein generating the action attribute parameters corresponding to the operation action according to the image correlation parameters comprises:
generating, for each image correlation parameter between two adjacent operation images, the sub-action attribute parameter between the two corresponding sub-actions;
generating the action attribute parameters of the operation action according to the sub-action attribute parameters.
In the above method, preferably, the operation action is a gesture action, the gesture action comprises multiple gesture postures, and there is a change-rate parameter or a time-interval parameter between every two adjacent gesture postures.
In the above method, preferably, the operation model is generated from preset action samples.
This application also provides electronic equipment comprising a first display screen and a second display screen, the electronic equipment further comprising:
a data acquisition unit, configured to obtain operation data of an operating body in the region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters;
a comparing unit, configured to compare the operation action and action attribute parameters of the operation data with those of a preset operation model to obtain a comparison result, and to trigger the display control unit when the comparison result shows that the operation data match the operation model;
a display control unit, configured to control the display screen corresponding to the operation data to be in the off state.
In the above electronic equipment, preferably, the second display screen comprises a projection display screen.
In the above electronic equipment, preferably, the electronic equipment further comprises an image acquisition unit corresponding to the region of the first display screen or the second display screen;
wherein the data acquisition unit comprises:
an image acquisition subunit, configured to acquire, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, with an image correlation parameter between every two adjacent operation images;
an action recognition subunit, configured to identify the operation action of the operating body in the operation images;
a parameter generation subunit, configured to generate the action attribute parameters corresponding to the operation action according to the image correlation parameters;
a data generation subunit, configured to generate the operation data according to the operation action and its action attribute parameters.
In the above electronic equipment, preferably, the operation model comprises model action constraint parameters and model action attribute constraint parameters;
wherein the comparing unit comprises:
a first comparison subunit, configured to compare the operation action in the operation data with the model action constraint parameters of the operation model to obtain a first comparison result;
a second comparison subunit, configured to compare the action attribute parameters in the operation data with the model action attribute constraint parameters of the operation model to obtain a second comparison result;
a result generation subunit, configured to generate the comparison result of the operation data and the operation model according to the first comparison result and the second comparison result.
In the above electronic equipment, preferably, the operation action comprises multiple sub-actions, and the action attribute parameters comprise a sub-action attribute parameter between every two adjacent sub-actions;
wherein the action recognition subunit comprises:
a sub-action recognition module, configured to identify the sub-action corresponding to each operation image in turn;
an action composition module, configured to compose the operation action from the sub-actions according to the sequential relationship between the operation images;
wherein the parameter generation subunit comprises:
a sub-action parameter generation module, configured to generate, for each image correlation parameter between two adjacent operation images, the sub-action attribute parameter between the two corresponding sub-actions;
an attribute parameter generation module, configured to generate the action attribute parameters of the operation action according to the sub-action attribute parameters.
In the above electronic equipment, preferably, the operation action recognized by the action recognition subunit is a gesture action, the gesture action comprises multiple gesture postures, and there is a change-rate parameter or a time-interval parameter between every two adjacent gesture postures.
In the above electronic equipment, preferably, the electronic equipment further comprises:
a model generation unit, configured to generate the operation model from preset action samples.
It can be seen from the above solutions that the control method and electronic equipment provided by this application acquire the operation data of an operating body in the region corresponding to a display screen of the electronic equipment, compare the operation data with a preset operation model, and, when the two match, switch that display screen into the off state, thereby achieving the purpose of this application. By using a preset operation model and switching off the corresponding display screen whenever matching operation data are obtained, this application avoids requiring the user to operate the electronic equipment directly: the off-state switching of the display screen is controlled simply by detecting the action of the user's operating body, which improves switching-control efficiency. At the same time, this application spares the user a complex shut-off procedure and improves the user experience.
Accompanying drawing explanation
In order to explain the technical solutions in the embodiments of this application more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of this application, and those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of Embodiment 1 of a control method provided by this application;
Fig. 2 is an application example diagram of Embodiment 1;
Fig. 3 is another application example diagram of Embodiment 1;
Fig. 4 is another application example diagram of Embodiment 1;
Fig. 5 is another application example diagram of Embodiment 1;
Fig. 6 is another application example diagram of Embodiment 1;
Fig. 7 is a partial flowchart of Embodiment 2 of a control method provided by this application;
Fig. 8 is an application example diagram of Embodiment 2;
Fig. 9 is another application example diagram of Embodiment 2;
Fig. 10 is another partial flowchart of Embodiment 2;
Fig. 11 is another application example diagram of Embodiment 2;
Fig. 12 is another partial flowchart of Embodiment 2;
Fig. 13 is an application example diagram of an embodiment of this application;
Fig. 14 is a partial flowchart of Embodiment 3 of a control method provided by this application;
Fig. 15 is an application example diagram of Embodiment 3;
Fig. 16 is another application example diagram of Embodiment 3;
Fig. 17 is an application example diagram of an embodiment of this application;
Fig. 18 is a structural diagram of Embodiment 4 of electronic equipment provided by this application;
Fig. 19 is a partial structural diagram of Embodiment 5 of electronic equipment provided by this application;
Fig. 20 is another partial structural diagram of Embodiment 5;
Fig. 21 is another partial structural diagram of Embodiment 5;
Fig. 22 is a partial structural diagram of Embodiment 6 of electronic equipment provided by this application.
Embodiment
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of this application, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of this application without creative effort fall within the scope of protection of this application.
Referring to Fig. 1, a flowchart of Embodiment 1 of a control method provided by this application: the method is applied to electronic equipment comprising a first display screen and a second display screen, and may comprise the following steps.
Step 101: obtain the operation data of an operating body in the region corresponding to the first display screen or the second display screen.
The operation data comprise an operation action and its action attribute parameters.
It should be noted that the first display screen displays first content and the second display screen displays second content, and the first content may be identical to or different from the second content. The electronic equipment may be a dual-screen mobile terminal such as a mobile phone or a tablet.
The first display screen or the second display screen presents its content to the user, and the user can react to that content with an action. For example, when the user wants a display screen to stop showing its content so that others cannot read it, the user can make a continuous waving action or a blocking action expressing refusal. The operating body may be a body part of the user, such as a hand.
The region corresponding to the first display screen or the second display screen is the region in which the user performs this reaction with the operating body, as shown in Fig. 2.
The operation action has its own characteristics and parameters: the action attribute parameters are the attribute parameters of the operation action, e.g. its motion parameters such as a speed parameter.
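As a concrete illustration of such an attribute parameter, a speed parameter can be estimated from two successive positions of the operating body. This is a hedged sketch under the assumption of 2-D positions and known capture times; the patent does not prescribe a formula:

```python
import math

def speed_parameter(p1, p2, dt):
    """Estimate a speed attribute parameter (units per second) from two
    successive operating-body positions captured dt seconds apart."""
    if dt <= 0:
        raise ValueError("time difference must be positive")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return math.hypot(dx, dy) / dt  # Euclidean distance over elapsed time

# A hand moving 30 units in 0.5 s yields a speed parameter of 60.
print(speed_parameter((0, 0), (30, 0), 0.5))  # 60.0
```

A comparing step could then test whether this value falls inside the speed range that a preset operation model allows.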
Step 102: compare the operation action and action attribute parameters of the operation data with those of a preset operation model to obtain a comparison result.
The operation model can be set in advance by the user according to need; for example, it can be set as an operating-body blocking model or an operating-body swinging model.
Step 103: when the comparison result shows that the operation data match the operation model, control the display screen corresponding to the operation data to be in the off state.
It should be noted that when the comparison result shows that the operation data match the operation model, the operation action in the operation data and its action attribute parameters both satisfy the screen-off switching condition preset by the user; therefore, step 103 controls the display screen corresponding to the operation data to be in the off state.
In step 103, the display screen corresponding to the operation data refers to the first display screen or the second display screen of step 101.
For example, if the operation data obtained in step 101 come from the region corresponding to the first display screen, step 103 controls the first display screen to be in the off state; if they come from the region corresponding to the second display screen, step 103 controls the second display screen to be in the off state.
It can be seen from the above solution that Embodiment 1 of the control method provided by this application acquires the operation data of an operating body in the region corresponding to a display screen of the electronic equipment, compares the operation data with a preset operation model, and, when the two match, switches that display screen into the off state, thereby achieving the purpose of this application. By switching off the corresponding display screen whenever matching operation data are obtained, this embodiment avoids requiring the user to operate the electronic equipment directly: the off-state switching of the display screen is controlled by detecting the action of the user's operating body, which improves switching-control efficiency, spares the user a complex shut-off procedure, and improves the user experience.
In the above embodiment, the second display screen may be a projection display screen corresponding to a projector, which can be arranged at the bottom of the electronic equipment near the microphone, as shown in Fig. 3. Electronic equipment with a projection display screen can be used in various modes or scenarios. For example, when the user needs to record information such as a telephone number during a call, the projector of the electronic equipment presents a memo or phonebook editing interface on the first display screen, so that the first display screen shows the memo or phonebook editing interface while the second display screen shows the call interface, as in the call-enhancement mode in Fig. 4. As another example, the user can lay the equipment flat on a desktop, with the first display screen on a vertical plane facing the equipment, as in the direct-projection mode in Fig. 5. In addition, the user can stand the equipment upright on the desktop with a support and arrange a light-reflecting device above the projector, so that the projected content is shown on a first display screen between the user and the equipment, as in the platform mode in Fig. 6.
In the above embodiment, the electronic equipment may further comprise an image acquisition unit corresponding to the region of the first display screen or the second display screen.
Referring to Fig. 7, a flowchart of step 101 in Embodiment 2 of a control method provided by this application, obtaining the operation data of the operating body in the region corresponding to the first display screen or the second display screen can be realized by the following steps.
Step 701: acquire, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, with an image correlation parameter between every two adjacent operation images.
For example, when the first display screen is an ordinary display screen and the second display screen is a projection display screen, the image acquisition unit may be a depth camera. It can be arranged parallel to the first display screen, as shown in Fig. 8, or opposite the second display screen, e.g. adjacent to the corresponding projector, as shown in Fig. 9. The image acquisition unit can thus collect multiple operation images of the region corresponding to the first display screen or to the second display screen.
It should be noted that the operation images may be image data captured at consecutive time points. Two operation images are adjacent when their capture time points are adjacent, and the image correlation parameter between them can be, for example, the time difference between their capture time points.
Step 702: identify the operation action of the operating body in the operation images, and generate the action attribute parameters corresponding to the operation action according to the image correlation parameters.
It should be noted that the operation action identified in step 702 can comprise multiple sub-actions, and the generated action attribute parameters then comprise the sub-action attribute parameter between every two adjacent sub-actions. Identifying the operation action of the operating body in the operation images can be realized by the steps shown in Fig. 10.
Step 721: identify the sub-action corresponding to each operation image in turn.
In step 721, a clustering ellipse model can be applied to each operation image to identify its corresponding sub-action, such as the various gesture postures shown in Fig. 11.
Step 722: compose the operation action from the sub-actions according to the sequential relationship between the operation images.
The sequential relationship between the operation images in step 722 refers to the order in which the operation images were captured.
Accordingly, generating the action attribute parameters corresponding to the operation action according to the image correlation parameters in step 702 can be realized by the steps shown in Fig. 12.
Step 723: for each image correlation parameter between two adjacent operation images, generate the sub-action attribute parameter between the two corresponding sub-actions.
The two sub-actions corresponding to an image correlation parameter are the sub-actions of the two operation images to which that parameter belongs. For example, when the image correlation parameter is the time difference between the capture time points of its two operation images, step 723 generates from that time difference the sub-action attribute parameter between the two corresponding sub-actions, such as a movement speed parameter between them.
Step 724: generate the action attribute parameters of the operation action according to the sub-action attribute parameters.
In step 724, the sub-action attribute parameters can simply be combined to obtain the action attribute parameters of the operation action.
For example, each operation image corresponds in turn to a gesture posture; these gesture postures compose a gesture action, i.e. the operation action, and between every two adjacent gesture postures there is a change-rate parameter or time-interval parameter as the image correlation parameter, from which the action attribute parameters of the gesture action are formed.
Step 703: generate the operation data according to the operation action and its action attribute parameters.
In step 703, the operation action and its action attribute parameters can simply be combined to obtain the operation data.
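Steps 701–703 can be sketched end to end: recognize one sub-action per operation image, take the capture-time difference between adjacent images as the image correlation parameter, and combine everything into the operation data. The recognizer and field names below are illustrative assumptions, not the patent's implementation:

```python
def build_operation_data(frames, recognize):
    """frames: list of (timestamp, image); recognize: image -> sub-action label.
    Returns operation data: the composed action plus per-gap attribute parameters."""
    sub_actions = [recognize(img) for _, img in frames]        # step 721
    gaps = [t2 - t1 for (t1, _), (t2, _) in zip(frames, frames[1:])]
    sub_attrs = [{"interval": dt} for dt in gaps]              # step 723
    return {"action": sub_actions,                             # step 722
            "attributes": sub_attrs}                           # steps 724 and 703

# Toy recognizer: each "image" here is already a posture label.
frames = [(0.0, "open"), (0.2, "half-closed"), (0.4, "closed")]
data = build_operation_data(frames, lambda img: img)
print(data["action"])      # ['open', 'half-closed', 'closed']
print(data["attributes"])  # [{'interval': 0.2}, {'interval': 0.2}]
```

In a real system the recognizer would run on depth-camera frames; here a pass-through lambda stands in so the data flow of steps 721–724 stays visible.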
It should be noted that when this embodiment is applied to switching off the second display screen, a projection display screen, as shown in Fig. 13, the electronic equipment is in direct-projection mode and the second display screen shows the projected content; to switch this second display screen off, the user only needs to perform a one-hand swinging or blocking action (or cover with both hands) in the region corresponding to the second display screen, whereupon the second display screen is controlled to switch off and no longer shows the projected content.
In actual applications, the described operation model of each embodiment above-mentioned can comprise model action constraint parameter and model action attributes constrained parameters thereof, thus, in described step 102, described service data is compared with the operation model preset about its operational motion and action attributes parameter, when obtaining comparison result, partial process view in a kind of control method embodiment three that can provide with reference to the application in such as Figure 14, described step 102 can comprise:
Step 121: the model action constraint parameter of the operational motion in described service data and described operation model is compared, obtains the first comparer result;
It should be noted that, described operation model can be generated by the sample action pre-set.When the embodiment of the present application is applied to the closedown switching to display screen, described operation model can be set in advance as gesture and block operation model.Such as, in the embodiment of the present application, by in advance to researchs such as human body natural reaction and hand models (the various human hand model as in Figure 15), sample action is set to two hands cover to block or swing on the other hand and block, thus, according to these sample action generating run models, this operation model comprises model action constraint parameter and model action attributes constrained parameters thereof, described model action constraint parameter can be operational motion parameters for shape characteristic etc., as the finger angle span parameter etc. of finger gesture each in gesture motion, described model action attributes constrained parameters can be the kinematic constraint parameter etc. of operational motion, as kinematic constraint parameter in gesture motion etc., as shown in Figure 16.
Step 122: the action attributes parameter in described service data and the model action attributes constrained parameters in described operation model are compared, obtains the second comparer result;
Concrete, described operational motion can comprise multiple operator action, the action attributes parameter of described operational motion can comprise the sub-action attributes parameter often between adjacent two operator actions, and it is relative, described model action constraint parameter can comprise the action constraint parameter of multiple sub-action, described model action attributes constrained parameters can comprise the action attributes constrained parameters of often adjacent two sub-actions, and thus, described step 121 specifically can realize in the following manner:
Compare each sub-action in the operation action with the action constraint parameters of the corresponding sub-action in the operation model to obtain the first comparison result.
Correspondingly, step 122 can be implemented as follows:
Compare each sub-action attribute parameter in the operation data with the corresponding action attribute constraint parameter in the model action attribute constraint parameters to obtain the second comparison result.
Step 123: generate the comparison result of the operation data against the operation model according to the first comparison result and the second comparison result.
The first comparison result shows whether the operation action matches the model action constraint parameters, and the second comparison result shows whether the action attribute parameters match the model action attribute constraint parameters. In step 123, if either the first comparison result or the second comparison result shows a mismatch, the comparison result shows that the operation data does not match the operation model. That is, only when the first comparison result shows that the operation action matches the model action constraint parameters and the second comparison result shows that the action attribute parameters match the model action attribute constraint parameters does the comparison result generated in step 123 show that the operation data matches the operation model.
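The two-stage matching of steps 121 to 123 can be sketched as follows. This is a minimal illustration under assumed simplifications: each constraint parameter is modelled as a numeric range, and the parameter values and thresholds below are hypothetical, not taken from the patent.

```python
def in_range(value, lo, hi):
    """Check whether a measured parameter falls inside a constraint range."""
    return lo <= value <= hi

def compare_operation_data(sub_actions, sub_attrs, model):
    """Two-stage comparison: sub-action shape parameters (step 121) and
    inter-sub-action attribute parameters (step 122), combined as in step 123."""
    # First comparison result: each sub-action against its action constraint range.
    first = all(in_range(a, *c) for a, c in zip(sub_actions, model["action_constraints"]))
    # Second comparison result: each sub-action attribute against its attribute constraint range.
    second = all(in_range(a, *c) for a, c in zip(sub_attrs, model["attr_constraints"]))
    # Step 123: the operation data matches only if both comparison results match.
    return first and second

# Hypothetical "cover with both hands" model: finger-angle spans and speed spans.
model = {"action_constraints": [(10, 40), (10, 40)],   # degrees per posture
         "attr_constraints": [(0.2, 1.5)]}             # metres/second between postures
print(compare_operation_data([20, 35], [0.8], model))  # → True
```

Only when both stages succeed is the corresponding display screen switched off, mirroring the "either mismatch means no match" rule above.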
For example, consider an operation model preset for gesture actions. Because gesture interaction is dexterous and natural, the user can naturally extend the fingers; by tracking the shapes the user's fingers form, the intended operation can be judged, and by computing the time taken to complete the operation together with the action attribute parameters, it can be determined whether the resulting operation data matches the predefined operation model, and hence whether to perform the privacy-screen switch, i.e. to control the second display screen into the off state. The gesture recognition process can be: first, apply a filtering technique to the operation images to track the visible joints and thereby determine the visible fingers; then compute the palm direction and locate the five fingers; next, compute the affine transform (M, T) of the palm and use (M, T) to transform the invisible joints; finally, output the gesture through local matching. The final recognition result is as shown in Figure 17.
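The recognition flow just described (track visible joints, find the fingers, compute a palm affine transform (M, T), map the invisible joints, then locally match) can be sketched as a skeleton. Every stage below is a toy stand-in: the patent does not disclose the actual filtering or matching algorithms, and the joint names and the "five visible fingers" rule are illustrative assumptions.

```python
def recognize_gesture(frame):
    """Toy skeleton of the described flow. `frame` maps a joint name to
    ((x, y), visible). All stages are illustrative stand-ins."""
    # 1. Track the visible joints (stands in for the filtering step).
    visible = {n: p for n, (p, v) in frame.items() if v}
    # 2. Determine the visible fingers from the tracked joints.
    fingers = [n for n in visible if n.startswith("finger")]
    # 3. Palm affine transform (M, T); an identity stand-in here.
    M, T = ((1.0, 0.0), (0.0, 1.0)), (0.0, 0.0)
    # 4. Map the invisible joints with (M, T) (identity: positions unchanged).
    mapped = {n: p for n, (p, v) in frame.items() if not v}
    # 5. Local matching: call a hand with all five fingers visible "open_hand".
    return "open_hand" if len(fingers) == 5 else "other"

frame = {f"finger{i}": ((float(i), 1.0), True) for i in range(5)}
frame["wrist"] = ((0.0, 0.0), True)
print(recognize_gesture(frame))  # → open_hand
```

A real implementation would replace stages 3 to 5 with the palm transform estimation and local template matching the text alludes to.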
Referring to Figure 18, which is a structural schematic diagram of embodiment four of an electronic device provided by the present application, the electronic device comprises a first display screen and a second display screen, and further comprises:
A data acquisition unit 1801, configured to acquire the operation data of an operating body in the region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters.
It should be noted that the first display screen displays first content and the second display screen displays second content; the first content may be the same as or different from the second content. The electronic device can be a dual-screen mobile terminal such as a mobile phone or a tablet.
The first display screen or the second display screen presents its display content to the user, and the user can react to that content with an action. For example, when the user wants a display screen turned off so that its content is no longer shown and cannot be read by others, the user can make a continuous waving action expressing refusal, a blocking action, or the like. The operating body can be a body part of the user, such as a hand.
The region corresponding to the first display screen or the second display screen is the region in which the user performs the reaction with the operating body, as shown in Figure 2.
The operation action has its own nature and characteristic parameters, and the action attribute parameters are the property parameters of the operation action, e.g. motion parameters of the operation action such as a speed parameter.
A comparing unit 1802, configured to compare the operation data with a preset operation model with respect to the operation action and its action attribute parameters to obtain a comparison result, and to trigger a display control unit 1803 when the comparison result shows that the operation data matches the operation model.
The operation model can be set in advance by the user according to the user's needs; for example, it can be set as an operating-body blocking operation model or an operating-body swinging operation model.
A display control unit 1803, configured to control the display screen corresponding to the operation data to be in the off state.
It should be noted that when the comparison result shows that the operation data matches the operation model, the operation action in the operation data and its action attribute parameters both satisfy the screen-off switching condition preset by the user; the display control unit 1803 is therefore triggered to control the display screen corresponding to the operation data into the off state.
In the display control unit 1803, the display screen corresponding to the operation data refers to the first display screen or the second display screen in the data acquisition unit 1801.
For example, if the operation data acquired by the data acquisition unit 1801 is from the region corresponding to the first display screen, the display control unit 1803 controls the first display screen into the off state; if it is from the region corresponding to the second display screen, the display control unit 1803 controls the second display screen into the off state.
As can be seen from the above, embodiment four of the electronic device provided by the present application acquires the operation data of the operating body in the region corresponding to a display screen of the electronic device, compares the operation data with the preset operation model, and, when they match, switches that display screen into the off state, thereby achieving the object of the present application. By using a preset operation model and controlling the corresponding display screen into the off state whenever matching operation data is acquired, embodiment four avoids a direct manipulation flow on the electronic device: the off state of a display screen is switched by detecting the action of the user's operating body, which improves switching efficiency, spares the user a complex shut-off procedure, and improves the user experience.
In the above embodiments, the second display screen can be a projection display screen corresponding to a projector, which can be arranged at the bottom of the electronic device near the microphone, as shown in Figure 3. An electronic device with a projection display screen can be used in various modes or scenes. For example, when the user needs to record information such as a telephone number during a call, the projector of the electronic device presents a memo or phonebook editing interface on the first display screen; the second display screen then shows the call interface while the first display screen shows the memo or phonebook editing interface, as in the call-enhancement mode in Figure 4. As another example, the user can lay the electronic device flat on a desktop with the first display screen on a vertical plane facing the user, as in the direct-projection mode in Figure 5. In addition, the user can stand the electronic device upright on the desktop with a bracket and arrange a light-reflecting device above the projector, so that the projected content is shown on a first display screen between the user and the electronic device, as in the platform mode in Figure 6.
In the above embodiments, the electronic device can further comprise an image acquisition unit corresponding to the region of the first display screen or the second display screen.
Referring to Figure 19, which is a structural schematic diagram of the data acquisition unit 1801 in embodiment five of an electronic device provided by the present application, the data acquisition unit 1801 can comprise:
An image acquisition subunit 1811, configured to acquire, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, each pair of adjacent operation images having an image correlation parameter.
For example, when the first display screen is a conventional display and the second display screen is a projection display screen, the image acquisition unit can be a depth camera. It can be arranged parallel to the first display screen, as shown in Figure 8, or opposite the second display screen, e.g. adjacent to the corresponding projector, as shown in Figure 9. The image acquisition unit can thereby collect multiple operation images of the region corresponding to the first display screen or of the region corresponding to the second display screen.
It should be noted that the operation images can be image data corresponding to consecutive time points. Two adjacent operation images are those whose acquisition time points are adjacent, and the image correlation parameter between them can be the time difference between their respective acquisition time points, etc.
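Taking the time difference between acquisition time points as the image correlation parameter, as the paragraph above suggests, the correlation parameters for a sequence of operation images can be sketched as follows (timestamps in milliseconds are an illustrative assumption):

```python
def correlation_params(timestamps):
    """Image correlation parameter between each pair of adjacent operation
    images: here, the time difference between their acquisition time points."""
    return [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]

# Four operation images captured at 0, 40, 80 and 160 ms.
print(correlation_params([0, 40, 80, 160]))  # → [40, 40, 80]
```

Each resulting interval is later turned into a sub-action attribute parameter (e.g. a speed) between the two sub-actions the adjacent images contain.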
An action recognition subunit 1812, configured to identify the operation action of the operating body in the operation images.
It should be noted that the operation action can comprise multiple sub-actions, and the generated action attribute parameters comprise the sub-action attribute parameters between adjacent sub-actions. Referring to Figure 20, which is a structural schematic diagram of the action recognition subunit 1812 in the embodiment of the present application, the action recognition subunit 1812 can comprise:
A sub-action recognition module 2001, configured to identify, in turn, the sub-action corresponding to each operation image.
The sub-action recognition module 2001 can use a clustering ellipse model on each operation image to identify its corresponding sub-action, such as various gesture postures, as shown in Figure 11.
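The patent names a "clustering ellipse model" without further detail. One crude stand-in, shown here purely as an assumption, is to fit an ellipse to the 2-D point cluster of the hand via its covariance matrix and classify the posture by the ellipse's elongation; the "extended vs compact" rule and the threshold are hypothetical.

```python
import math

def ellipse_axes(points):
    """Fit an ellipse to a 2-D point cluster via its covariance matrix and
    return the major/minor semi-axis lengths (square roots of eigenvalues)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    # Closed-form eigenvalues of the 2x2 covariance matrix.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return math.sqrt(tr / 2 + disc), math.sqrt(max(tr / 2 - disc, 0.0))

def classify_posture(points, elongation=2.0):
    """Hypothetical rule: a strongly elongated cluster is an extended
    (pointing) hand posture, otherwise a compact (fist-like) posture."""
    a, b = ellipse_axes(points)
    return "extended" if b == 0 or a / b >= elongation else "compact"

print(classify_posture([(x, 0.1 * x) for x in range(10)]))  # → extended
print(classify_posture([(0, 0), (1, 0), (0, 1), (1, 1)]))   # → compact
```

In practice the cluster would come from segmenting the hand region in a depth image; only the ellipse-fitting idea is taken from the text.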
An action composing module 2002, configured to compose the sub-actions into the operation action according to the sequential relation between the operation images.
The sequential relation between the operation images in the action composing module 2002 refers to the order in which the operation images were acquired.
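The composing step can be sketched as ordering the per-image sub-actions by acquisition time and concatenating them. The frame dictionaries and sub-action labels are illustrative assumptions.

```python
def compose_operation_action(images):
    """Action composing module 2002 sketch: order the per-image sub-actions
    by acquisition time and join them into one operation action."""
    ordered = sorted(images, key=lambda img: img["t"])  # sequential relation = capture order
    return [img["sub_action"] for img in ordered]

frames = [{"t": 2, "sub_action": "hands_together"},
          {"t": 0, "sub_action": "hands_raised"},
          {"t": 1, "sub_action": "hands_closing"}]
print(compose_operation_action(frames))  # → ['hands_raised', 'hands_closing', 'hands_together']
```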
A parameter generation subunit 1813, configured to generate the action attribute parameters corresponding to the operation action according to the image correlation parameters.
Correspondingly, referring to Figure 21, which is a structural schematic diagram of the parameter generation subunit 1813 in the embodiment of the present application, the parameter generation subunit 1813 can comprise:
A sub-action parameter generation module 2003, configured to generate, for each image correlation parameter between two adjacent operation images, the sub-action attribute parameter between the two sub-actions corresponding to that image correlation parameter.
The two sub-actions corresponding to an image correlation parameter are the sub-actions corresponding to the two operation images associated with that parameter. For example, the image correlation parameter can be the time difference between the acquisition time points of the two operation images; the sub-action parameter generation module 2003 then generates, from that time difference, the sub-action attribute parameter between the two corresponding sub-actions, such as a movement speed parameter between them.
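Turning a time difference into a movement-speed parameter can be sketched as follows; the positions are the operating body's location in the two adjacent operation images, and the units are illustrative assumptions.

```python
def sub_action_attr(pos1, pos2, dt):
    """Sub-action parameter generation module 2003 sketch: convert the image
    correlation parameter (time difference dt) between two adjacent operation
    images into a movement-speed parameter between their sub-actions."""
    dist = ((pos2[0] - pos1[0]) ** 2 + (pos2[1] - pos1[1]) ** 2) ** 0.5
    return dist / dt  # speed of the operating body between the two frames

# Hand moves 5 units in 0.5 s between two adjacent operation images.
print(sub_action_attr((0.0, 0.0), (3.0, 4.0), 0.5))  # → 10.0
```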
A property parameter generation module 2004, configured to generate the action attribute parameters of the operation action according to the sub-action attribute parameters.
The property parameter generation module 2004 can directly combine the sub-action attribute parameters to obtain the action attribute parameters of the operation action.
For example, each operation image corresponds in turn to a gesture posture; these gesture postures compose the gesture action, i.e. the operation action, and each pair of adjacent gesture postures has a change-rate parameter or an interval-time parameter as their image correlation parameter, from which the action attribute parameters of the gesture action are formed.
A data generation subunit 1814, configured to generate the operation data according to the operation action and its action attribute parameters.
The data generation subunit 1814 can directly combine the operation action and its action attribute parameters to obtain the operation data.
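The "directly combine" steps of modules 2004 and 1814 can be sketched minimally; the dictionary layout is an assumption, not a format given in the patent.

```python
def build_operation_data(sub_actions, sub_attrs):
    """Sketch of modules 2004 and 1814: combine the sub-action attribute
    parameters into the action attribute parameters, then combine the
    operation action with those attributes into the operation data."""
    action_attrs = list(sub_attrs)  # property parameter generation module 2004
    return {"action": sub_actions, "attributes": action_attrs}  # data generation subunit 1814

data = build_operation_data(["raised", "together"], [10.0])
print(data)  # → {'action': ['raised', 'together'], 'attributes': [10.0]}
```

The resulting operation data is what the comparing unit 1802 checks against the preset operation model.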
It should be noted that, when the embodiment of the present application is applied to switching off a second display screen that is a projection display screen, as shown in Figure 13, the electronic device is in the direct-projection mode and the second display screen shows projected content. To switch this second display screen off, the user only needs to perform, in the region corresponding to the second display screen, a covering action with both hands or a swinging action with one hand; this controls the second display screen to turn off so that it no longer shows the projected content.
In practical applications, the operation model in each of the above embodiments can comprise model action constraint parameters and model action attribute constraint parameters. Referring to Figure 22, which is a structural schematic diagram of the comparing unit 1802 in embodiment six of an electronic device provided by the present application, the comparing unit 1802 can comprise:
A first comparison subunit 1821, configured to compare the operation action in the operation data with the model action constraint parameters in the operation model to obtain a first comparison result.
It should be noted that the operation model can be generated from preset sample actions. When the embodiment of the present application is applied to switching a display screen off, the operation model can be preset as a gesture-blocking operation model. For example, in the embodiment of the present application, based on prior study of natural human reactions and hand models (such as the various hand models in Figure 15), the sample actions are set as covering with both hands or swinging one hand to block, and operation models are generated from these sample actions. Each operation model comprises model action constraint parameters and model action attribute constraint parameters: the model action constraint parameters can be shape feature parameters of the operation action, such as the span of finger angles for each finger posture in a gesture action; the model action attribute constraint parameters can be motion constraint parameters of the operation action, such as the motion constraint parameters of a gesture action, as shown in Figure 16.
A second comparison subunit 1822, configured to compare the action attribute parameters in the operation data with the model action attribute constraint parameters in the operation model to obtain a second comparison result.
Specifically, the operation action can comprise multiple sub-actions, and the action attribute parameters of the operation action can comprise sub-action attribute parameters between each pair of adjacent sub-actions. Correspondingly, the model action constraint parameters can comprise action constraint parameters for multiple sub-actions, and the model action attribute constraint parameters can comprise action attribute constraint parameters for each pair of adjacent sub-actions. The first comparison subunit 1821 can therefore operate as follows:
Compare each sub-action in the operation action with the action constraint parameters of the corresponding sub-action in the operation model to obtain the first comparison result.
Correspondingly, the second comparison subunit 1822 can operate as follows:
Compare each sub-action attribute parameter in the operation data with the corresponding action attribute constraint parameter in the model action attribute constraint parameters to obtain the second comparison result.
A result generation subunit 1823, configured to generate the comparison result of the operation data against the operation model according to the first comparison result and the second comparison result.
The first comparison result shows whether the operation action matches the model action constraint parameters, and the second comparison result shows whether the action attribute parameters match the model action attribute constraint parameters. In the result generation subunit 1823, if either comparison result shows a mismatch, the comparison result shows that the operation data does not match the operation model; only when the first comparison result shows that the operation action matches the model action constraint parameters and the second comparison result shows that the action attribute parameters match the model action attribute constraint parameters does the comparison result generated by the result generation subunit 1823 show that the operation data matches the operation model.
For example, consider an operation model preset for gesture actions. Because gesture interaction is dexterous and natural, the user can naturally extend the fingers; by tracking the shapes the user's fingers form, the intended operation can be judged, and by computing the time taken to complete the operation together with the action attribute parameters, it can be determined whether the resulting operation data matches the predefined operation model, and hence whether to perform the privacy-screen switch, i.e. to control the second display screen into the off state. The gesture recognition process can be as shown in Figure 17, and the final recognition result is as shown in Figure 18.
It should be noted that the embodiments in this specification are described progressively; each embodiment focuses on its differences from the others, and for identical or similar parts the embodiments can be referred to one another.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relation or order between those entities or operations. Moreover, the terms "comprise", "include" and any variants thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device comprising a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or device that comprises the element.
The control method and electronic device provided by the present application have been described in detail above. Specific examples are used herein to explain the principles and embodiments of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. Meanwhile, a person of ordinary skill in the art can, according to the idea of the present application, make changes to the specific embodiments and the scope of application. In summary, this description should not be construed as limiting the present application.

Claims (14)

1. A control method, characterized in that the method is applied to an electronic device comprising a first display screen and a second display screen, and the method comprises:
acquiring operation data of an operating body in a region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters;
comparing the operation data with a preset operation model with respect to the operation action and its action attribute parameters to obtain a comparison result;
when the comparison result shows that the operation data matches the operation model, controlling the display screen corresponding to the operation data to be in an off state.
2. The method according to claim 1, characterized in that the second display screen comprises a projection display screen.
3. The method according to claim 1 or 2, characterized in that the electronic device further comprises an image acquisition unit corresponding to the region of the first display screen or the second display screen;
wherein acquiring the operation data of the operating body in the region corresponding to the first display screen or the second display screen comprises:
acquiring, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, each pair of adjacent operation images having an image correlation parameter;
identifying the operation action of the operating body in the operation images, and generating the action attribute parameters corresponding to the operation action according to the image correlation parameters;
generating the operation data according to the operation action and its action attribute parameters.
4. The method according to claim 1, characterized in that the operation model comprises model action constraint parameters and model action attribute constraint parameters;
wherein comparing the operation data with the preset operation model with respect to the operation action and its action attribute parameters to obtain the comparison result comprises:
comparing the operation action in the operation data with the model action constraint parameters in the operation model to obtain a first comparison result;
comparing the action attribute parameters in the operation data with the model action attribute constraint parameters in the operation model to obtain a second comparison result;
generating the comparison result of the operation data against the operation model according to the first comparison result and the second comparison result.
5. The method according to claim 3, characterized in that the operation action comprises multiple sub-actions, and the action attribute parameters comprise sub-action attribute parameters between adjacent sub-actions;
wherein identifying the operation action of the operating body in the operation images comprises:
identifying, in turn, the sub-action corresponding to each operation image;
composing the sub-actions into the operation action according to the sequential relation between the operation images;
and wherein generating the action attribute parameters corresponding to the operation action according to the image correlation parameters comprises:
generating, for each image correlation parameter between two adjacent operation images, the sub-action attribute parameter between the two sub-actions corresponding to that image correlation parameter;
generating the action attribute parameters of the operation action according to the sub-action attribute parameters.
6. The method according to claim 5, characterized in that the operation action is a gesture action, the gesture action comprises multiple gesture postures, and each pair of adjacent gesture postures has a change-rate parameter or an interval-time parameter.
7. The method according to claim 1 or 4, characterized in that the operation model is generated according to preset sample actions.
8. An electronic device, characterized in that the electronic device comprises a first display screen and a second display screen, and further comprises:
a data acquisition unit, configured to acquire operation data of an operating body in a region corresponding to the first display screen or the second display screen, the operation data comprising an operation action and its action attribute parameters;
a comparing unit, configured to compare the operation data with a preset operation model with respect to the operation action and its action attribute parameters to obtain a comparison result, and to trigger a display control unit when the comparison result shows that the operation data matches the operation model;
the display control unit, configured to control the display screen corresponding to the operation data to be in an off state.
9. The electronic device according to claim 8, characterized in that the second display screen comprises a projection display screen.
10. The electronic device according to claim 8 or 9, characterized in that the electronic device further comprises an image acquisition unit corresponding to the region of the first display screen or the second display screen;
wherein the data acquisition unit comprises:
an image acquisition subunit, configured to acquire, through the image acquisition unit, multiple operation images of the region corresponding to the first display screen or the second display screen, each pair of adjacent operation images having an image correlation parameter;
an action recognition subunit, configured to identify the operation action of the operating body in the operation images;
a parameter generation subunit, configured to generate the action attribute parameters corresponding to the operation action according to the image correlation parameters;
a data generation subunit, configured to generate the operation data according to the operation action and its action attribute parameters.
11. The electronic device according to claim 8, characterized in that the operation model comprises model action constraint parameters and model action attribute constraint parameters;
wherein the comparing unit comprises:
a first comparison subunit, configured to compare the operation action in the operation data with the model action constraint parameters in the operation model to obtain a first comparison result;
a second comparison subunit, configured to compare the action attribute parameters in the operation data with the model action attribute constraint parameters in the operation model to obtain a second comparison result;
a result generation subunit, configured to generate the comparison result of the operation data against the operation model according to the first comparison result and the second comparison result.
12. The electronic device according to claim 10, characterized in that the operation action comprises multiple sub-actions, and the action attribute parameters comprise sub-action attribute parameters between adjacent sub-actions;
wherein the action recognition subunit comprises:
a sub-action recognition module, configured to identify, in turn, the sub-action corresponding to each operation image;
an action composing module, configured to compose the sub-actions into the operation action according to the sequential relation between the operation images;
and wherein the parameter generation subunit comprises:
a sub-action parameter generation module, configured to generate, for each image correlation parameter between two adjacent operation images, the sub-action attribute parameter between the two sub-actions corresponding to that image correlation parameter;
a property parameter generation module, configured to generate the action attribute parameters of the operation action according to the sub-action attribute parameters.
13. The electronic device according to claim 12, characterized in that the operation action recognized by the action recognition subunit is a gesture action, the gesture action comprises multiple gesture postures, and each pair of adjacent gesture postures has a change-rate parameter or an interval-time parameter.
14. The electronic device according to claim 8 or 11, characterized in that the electronic device further comprises:
a model generation unit, configured to generate the operation model according to preset sample actions.
CN201310651523.8A 2013-12-04 2013-12-04 Control method and electronic equipment Pending CN104699377A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310651523.8A CN104699377A (en) 2013-12-04 2013-12-04 Control method and electronic equipment


Publications (1)

Publication Number Publication Date
CN104699377A true CN104699377A (en) 2015-06-10

Family

ID=53346567

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310651523.8A Pending CN104699377A (en) 2013-12-04 2013-12-04 Control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN104699377A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106708388A (en) * 2016-12-29 2017-05-24 宇龙计算机通信科技(深圳)有限公司 Gesture interaction method and terminal
WO2018072637A1 (en) * 2016-10-20 2018-04-26 深圳市光峰光电技术有限公司 Projection apparatus
CN108040189A * 2017-11-02 2018-05-15 广州中山大学出版社有限公司 Video-based digital watermarking implementation method
CN113411316A (en) * 2021-06-04 2021-09-17 深圳市华磊迅拓科技有限公司 MES system data communication method and system based on WCF protocol

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101807114A (en) * 2010-04-02 2010-08-18 浙江大学 Natural interactive method based on three-dimensional gestures
EP2290508A2 (en) * 2009-08-25 2011-03-02 Promethean Limited Interactive whiteboard for private use
CN102253765A (en) * 2011-04-13 2011-11-23 南昊(北京)科技有限公司 Interactive display system with emphasis mode
CN102902356A (en) * 2012-09-18 2013-01-30 华南理工大学 Gesture control system and control method thereof


Similar Documents

Publication Publication Date Title
Saquib et al. Interactive body-driven graphics for augmented video performance
US10325407B2 (en) Attribute detection tools for mixed reality
US20200329214A1 (en) Method and system for providing mixed reality service
Akaoka et al. DisplayObjects: prototyping functional physical interfaces on 3d styrofoam, paper or cardboard models
CN105378624B (en) Interaction is shown when interaction comes across on blank
Underkoffler et al. Emancipated pixels: real-world graphics in the luminous room
CN104699377A (en) Control method and electronic equipment
CN103247004A (en) Information management method and system based on electromechanical integrated BIM (building information model)
CN105653071A (en) Information processing method and electronic device
WO2020151255A1 (en) Display control system and method based on mobile terminal
CN110162258A (en) The processing method and processing device of individual scene image
Dhule et al. Computer vision based human-computer interaction using color detection techniques
CN105872201A (en) Method for remotely controlling document display through intelligent terminal, intelligent terminal and computer equipment
CN107908281A (en) Virtual reality exchange method, device and computer-readable recording medium
Shajideen et al. Hand gestures-virtual mouse for human computer interaction
US20180260031A1 (en) Method for controlling distribution of multiple sub-screens and device using the same
CN117292097B (en) AR try-on interactive experience method and system
CN111492396A (en) Mixed reality service providing method and system
CN107527334A (en) Human face light moving method and device
CN107463256A (en) Based on the user of virtual reality towards control method and device
CN109542430A (en) For realizing the method, apparatus and electronic equipment of interface alternation effect
CN108509137A (en) Redefine the method and device of the manipulation display area of screen
Lee et al. Tangible user interface of digital products in multi-displays
Lim et al. Interactive augmented reality system using projector-camera system and smart phone
Yang et al. Interactive augmented reality authoring system using mobile device as input method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20150610