CN104714736A - Control method and terminal for quitting full screen lock-out state - Google Patents

Control method and terminal for quitting full screen lock-out state Download PDF

Info

Publication number
CN104714736A
CN104714736A CN201510138161.1A CN201510138161A CN 104714736 A
Authority
CN
China
Prior art keywords
full-screen lock-out state
gesture
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510138161.1A
Other languages
Chinese (zh)
Inventor
邓俊杰 (Deng Junjie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meizu Technology China Co Ltd
Original Assignee
Meizu Technology China Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meizu Technology China Co Ltd filed Critical Meizu Technology China Co Ltd
Priority to CN201510138161.1A priority Critical patent/CN104714736A/en
Publication of CN104714736A publication Critical patent/CN104714736A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention disclose a control method and a terminal for quitting a full-screen lock-out state. The control method for quitting the full-screen lock-out state includes the following steps: in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region is received; whether the gesture operation instruction is an instruction for quitting the full-screen lock-out state is judged; and if so, the full-screen lock-out state is quit. With the control method and terminal for quitting the full-screen lock-out state, the full-screen lock-out state can be quit conveniently, hardware cost is saved, screen space is not occupied by a virtual key, and the user experience is improved.

Description

Control method and terminal for exiting a full-screen lock-out state
Technical field
The present invention relates to the field of electronic technology, and in particular to a control method and a terminal for exiting a full-screen lock-out state.
Background
In recent years, terminals such as mobile phones and tablet computers have become increasingly popular and make it easy to surf the Internet, play games, watch videos, read magazines, and so on. When a user operates a touch-screen tablet or mobile phone, for example while watching a video or playing a game in full-screen mode, accidental touches often occur. For instance, when a user watching a film while lying in bed changes posture and accidentally triggers a fast-forward or rewind control, the user has to adjust the playback position manually; likewise, accidental touches are common when the terminal is used on a crowded, swaying subway or bus. Users therefore commonly lock the full screen with a physical button or a virtual key to keep accidental touches from causing interference.
In the prior art, when the user locks the full screen with a physical button, the button adds hardware cost, its position is fixed, and it is itself prone to accidental presses. When the user locks the full screen with a virtual key, the virtual key is displayed in the application interface of the terminal; the user submits a screen-lock signal to the terminal through the virtual key, and the terminal locks the full screen according to that signal. The virtual key occupies screen space and obstructs the user's view while watching videos, playing games, or reading magazines, so the user experience is poor.
Summary of the invention
Embodiments of the present invention provide a control method and a terminal for exiting a full-screen lock-out state, which make it convenient to exit the full-screen lock-out state, save hardware cost, avoid having a virtual key occupy screen space, and improve the user experience.
An embodiment of the present invention provides a control method for exiting a full-screen lock-out state, comprising:
in the full-screen lock-out state, receiving a gesture operation instruction submitted by a user in a preset region;
judging whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state;
if so, exiting the full-screen lock-out state.
Correspondingly, an embodiment of the present invention further provides a terminal, comprising:
a receiving unit, configured to receive, in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region;
an instruction judging unit, configured to judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state;
a switch unit, configured to exit the full-screen lock-out state when the instruction judging unit judges yes.
By implementing the embodiments of the present invention, the terminal receives, in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region and, when it judges that the gesture operation instruction is an instruction for exiting the full-screen lock-out state, exits the full-screen lock-out state. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the accompanying drawings described below show merely some embodiments of the present invention, and a person of ordinary skill in the art may further obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a first embodiment of the present invention;
Fig. 2 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a second embodiment of the present invention;
Fig. 3 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a third embodiment of the present invention;
Fig. 4 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a fourth embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a terminal according to a first embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a terminal according to a second embodiment of the present invention;
Fig. 7 is a schematic diagram of an unlock interface of a terminal according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of a first application interface displayed by a terminal according to an embodiment of the present invention;
Fig. 9 is a schematic diagram of a second application interface displayed by a terminal according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
In the control method for exiting a full-screen lock-out state in the embodiments of the present invention, a terminal may, in the full-screen lock-out state, receive a gesture operation instruction submitted by a user in a preset region and, when it judges that the gesture operation instruction is an instruction for exiting the full-screen lock-out state, exit the full-screen lock-out state. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
The terminal mentioned in the embodiments of the present invention may include a smart phone, a personal digital assistant, a tablet computer, or the like; it is not specifically limited by the embodiments of the present invention.
Referring to Fig. 1, Fig. 1 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a first embodiment of the present invention. As shown in the figure, the control method for exiting the full-screen lock-out state in this embodiment may comprise the following steps.
S101: in the full-screen lock-out state, receive a gesture operation instruction submitted by a user in a preset region.
The terminal may receive, in the full-screen lock-out state, a gesture operation instruction submitted by the user in a preset region. Specifically, the touch panel in the preset region of the terminal's display screen may remain in a working state, and the terminal may use the touch panel to monitor, in the full-screen lock-out state, whether a gesture operation instruction is submitted by the user in the preset region. As shown in Fig. 8, which is a schematic diagram of a first application interface displayed by the terminal, the terminal may display the first application interface in the full-screen lock-out state, and the first application interface displays only the current video.
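As a minimal, non-authoritative sketch of the step just described (not the patent's own implementation), the following Java fragment keeps only a preset region of the touch panel responsive while the terminal is in the full-screen lock-out state; the region coordinates, the string representation of a gesture operation instruction, and the onGesture callback are illustrative assumptions.

```java
import java.util.function.Consumer;

// Illustrative sketch: monitor a preset region of the touch panel while the
// full screen is locked, and forward gestures detected there for judging (S102).
public class PresetRegionMonitor {
    private final int left, top, right, bottom;      // preset region bounds (assumed values)
    private final Consumer<String> onGesture;        // receives a gesture operation instruction
    private boolean fullScreenLocked = true;

    public PresetRegionMonitor(int left, int top, int right, int bottom,
                               Consumer<String> onGesture) {
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        this.onGesture = onGesture;
    }

    // Called with a recognised gesture and the position where it was submitted.
    public void onTouchGesture(int x, int y, String gestureInstruction) {
        if (!fullScreenLocked) {
            return;                                   // normal handling outside the lock-out state
        }
        boolean insidePresetRegion =
                x >= left && x <= right && y >= top && y <= bottom;
        if (insidePresetRegion) {
            onGesture.accept(gestureInstruction);     // hand over to the judging step
        }
        // Touches outside the preset region are ignored while locked.
    }

    public void setFullScreenLocked(boolean locked) {
        this.fullScreenLocked = locked;
    }

    public static void main(String[] args) {
        PresetRegionMonitor monitor = new PresetRegionMonitor(0, 0, 200, 100,
                instruction -> System.out.println("received: " + instruction));
        monitor.onTouchGesture(50, 40, "EXIT_FULL_SCREEN_LOCK");   // inside the preset region
    }
}
```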
In an alternative embodiment, the terminal may, in the full-screen lock-out state, collect gesture information submitted by the user in the preset region through a camera, analyze the collected gesture information, and identify the gesture operation instruction according to the analysis result. For example, the terminal may turn on its front-facing camera through a camera module and collect the gesture information with the front-facing camera; the terminal may then analyze the gesture information with a preset image processing algorithm and identify the gesture operation instruction according to the analysis result. The gesture information may include gestures such as a fist, a V shape, an OK shape, or a horn shape. The user's five fingers include the thumb, index finger, middle finger, ring finger, and little finger, and each finger being extended or bent may represent a corresponding gesture operation instruction: in the fist gesture, all five fingers are bent; in the V-shape gesture, the index and middle fingers are extended while the thumb, ring finger, and little finger are bent; in the OK-shape gesture, the index finger, ring finger, and little finger are extended while the thumb and middle finger are bent; in the horn-shape gesture, the thumb and little finger are extended while the index, middle, and ring fingers are bent. It should be noted that the gesture information in the embodiments of the present invention includes but is not limited to the four gestures of fist, V shape, OK shape, and horn shape; developers may modify the gestures for different scenarios, and the present invention does not limit this.
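The following Java sketch shows one possible way to map the finger states enumerated above to the four example gestures; the boolean finger flags, the gesture names, and the finger ordering are assumptions made for illustration, not the patent's implementation.

```java
// Illustrative mapping from finger states (true = extended) to the four example gestures.
public final class FingerGestureClassifier {

    public enum Gesture { FIST, V_SHAPE, OK_SHAPE, HORN_SHAPE, UNKNOWN }

    // Finger order assumed: thumb, index, middle, ring, little.
    public static Gesture classify(boolean thumb, boolean index, boolean middle,
                                   boolean ring, boolean little) {
        if (!thumb && !index && !middle && !ring && !little) {
            return Gesture.FIST;                       // all fingers bent
        }
        if (!thumb && index && middle && !ring && !little) {
            return Gesture.V_SHAPE;                    // index and middle fingers extended
        }
        if (!thumb && index && !middle && ring && little) {
            return Gesture.OK_SHAPE;                   // as enumerated in the description
        }
        if (thumb && !index && !middle && !ring && little) {
            return Gesture.HORN_SHAPE;                 // thumb and little finger extended
        }
        return Gesture.UNKNOWN;
    }

    public static void main(String[] args) {
        // Example: thumb and little finger extended -> horn gesture.
        System.out.println(classify(true, false, false, false, true));
    }
}
```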
In an alternative embodiment, the terminal may, in the full-screen lock-out state, receive the gesture operation instruction submitted by the user in the preset region through a distance sensor. For example, the terminal may detect how many times the distance sensor senses the gesture information. As another example, when the distance sensor includes a first distance sensor and a second distance sensor, the terminal may obtain the interval between the times at which the first and second distance sensors sense the gesture information: if the sensing interval is greater than zero and within a preset duration, the gesture information is identified as a first gesture operation instruction; if the sensing interval equals zero, the gesture information is identified as a second gesture operation instruction; and so on.
In an alternative embodiment, the gesture operation instruction submitted by the user may include sliding-trace information. The terminal may then, in the full-screen lock-out state, receive the sliding-trace information submitted by the user in the preset region, for example a circle or an S shape, a square or a V shape, or a straight slide from top to bottom or from left to right. Optionally, the sliding-trace information may further be a closed figure or an open figure, and so on.
In an alternative embodiment, before receiving the gesture operation instruction submitted by the user in the preset region, the terminal may receive a custom gesture setting request submitted by the user, obtain, according to the custom gesture setting request, the gesture information entered by a gesture input device, and store the gesture information and its corresponding gesture operation instruction in a preset gesture information database. The gesture input device may include a stylus, a handwriting pen, the user's fingertip, or the like.
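Below is a minimal Java sketch of the preset gesture information database described above, assuming gestures and instructions are represented as simple string descriptors; those representations and the method names are illustrative assumptions rather than the patent's own data model.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Illustrative sketch of a preset gesture information database that maps stored
// gesture information to its corresponding gesture operation instruction.
public class GestureInfoDatabase {
    private final Map<String, String> gestureToInstruction = new HashMap<>();

    // Handle a custom gesture setting request: store the entered gesture and its instruction.
    public void registerCustomGesture(String gestureInfo, String instruction) {
        gestureToInstruction.put(gestureInfo, instruction);
    }

    // Look up the instruction previously stored for a piece of gesture information, if any.
    public Optional<String> lookup(String gestureInfo) {
        return Optional.ofNullable(gestureToInstruction.get(gestureInfo));
    }

    public static void main(String[] args) {
        GestureInfoDatabase db = new GestureInfoDatabase();
        // Example from the description: the OK gesture is set as the exit instruction.
        db.registerCustomGesture("OK_SHAPE", "EXIT_FULL_SCREEN_LOCK");
        System.out.println(db.lookup("OK_SHAPE").orElse("no instruction"));
    }
}
```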
S102: judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state.
After receiving the gesture operation instruction submitted by the user in the preset region, the terminal may judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state. Specifically, the terminal may obtain the instruction for exiting the full-screen lock-out state prestored in the gesture information database and judge whether the received gesture operation instruction is that instruction. If the gesture operation instruction is judged to be the instruction for exiting the full-screen lock-out state, step S103 is performed; otherwise, step S104 is performed.
In an alternative embodiment, after receiving the gesture operation instruction submitted by the user in the preset region, the terminal may compare the gesture operation instruction with the instruction for exiting the full-screen lock-out state. If they match, the terminal determines that the gesture operation instruction is the instruction for exiting the full-screen lock-out state; if they do not match, the terminal determines that it is not. For example, the terminal may compare the two to obtain a similarity between the received gesture operation instruction and the instruction for exiting the full-screen lock-out state; if the similarity is greater than a preset threshold, a match is detected, and the terminal can then determine that the gesture operation instruction is the instruction for exiting the full-screen lock-out state.
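A brief sketch of the threshold comparison is given below. The patent does not specify how the similarity is computed, so representing gestures as feature vectors and using cosine similarity, along with the threshold value, are assumptions made only to illustrate the "similarity greater than a preset threshold" rule.

```java
// Illustrative similarity check between a received gesture and the stored exit instruction.
public final class GestureMatcher {
    private static final double PRESET_THRESHOLD = 0.9;   // assumed threshold value

    // Cosine similarity between two equal-length feature vectors.
    static double similarity(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        if (normA == 0 || normB == 0) {
            return 0;
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    // S102: judge whether the received gesture matches the stored exit instruction.
    static boolean isExitInstruction(double[] received, double[] storedExitGesture) {
        return similarity(received, storedExitGesture) > PRESET_THRESHOLD;
    }

    public static void main(String[] args) {
        double[] stored = {1.0, 0.0, 1.0, 0.0, 1.0};
        double[] received = {0.9, 0.1, 1.0, 0.0, 0.95};
        System.out.println(isExitInstruction(received, stored));   // true for this example
    }
}
```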
S103: exit the full-screen lock-out state.
When the terminal judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state, it may exit the full-screen lock-out state according to the received gesture operation instruction.
As shown in Fig. 9, which is a schematic diagram of a second application interface displayed by the terminal, the terminal may display the second application interface after exiting the full-screen lock-out state. The second application interface may be partly or entirely identical to the first application interface; it is no longer in the full-screen lock-out state and displays the current video. Optionally, the second application interface may further include a status bar, for example a video label such as "Ice and Snow World, Season 14, Episode 05", a settings icon, a video progress bar, and a pause or play button.
S104: ignore the gesture operation instruction.
When the terminal judges that the gesture operation instruction is not the instruction for exiting the full-screen lock-out state, it may make no response to the gesture operation instruction, that is, ignore it.
In the control method for exiting a full-screen lock-out state shown in Fig. 1, the terminal receives, in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region and, if it judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state, exits the full-screen lock-out state according to that instruction. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
Referring to Fig. 2, Fig. 2 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a second embodiment of the present invention. As shown in the figure, the control method for exiting the full-screen lock-out state in this embodiment may comprise the following steps.
S201: in the full-screen lock-out state, collect, through a camera, gesture information submitted by the user in a preset region.
The terminal may, in the full-screen lock-out state, collect gesture information submitted by the user in the preset region through a camera. Specifically, the touch panel in the preset region of the terminal's display screen may remain in a working state, and the terminal may use the touch panel to monitor, in the full-screen lock-out state, whether a gesture operation instruction is submitted by the user in the preset region. As shown in Fig. 8, which is a schematic diagram of a first application interface displayed by the terminal, the terminal may display the first application interface in the full-screen lock-out state, and the first application interface displays only the current video.
The gesture information may include gestures such as a fist, a V shape, an OK shape, or a horn shape; optionally, the gesture information may also include motion-track information entered by the gesture input device. For example, the terminal may turn on its front-facing camera through a camera module; when the user shows an OK-shape gesture above the display interface of the terminal, the terminal may collect gesture information containing the OK-shape gesture with the front-facing camera. The gesture input device may include the user's finger or the like.
It should be noted that the gesture information in the embodiments of the present invention includes but is not limited to the four gestures of fist, V shape, OK shape, and horn shape. The user's five fingers include the thumb, index finger, middle finger, ring finger, and little finger, and each finger being extended or bent may represent corresponding gesture information; developers may modify the gestures for different scenarios, and the present invention does not limit this.
In an alternative embodiment, before collecting, through the camera, the gesture information submitted by the user in the preset region, the terminal may receive a custom gesture setting request submitted by the user, obtain, according to the request, the gesture information entered by the gesture input device, and store the gesture information and its corresponding gesture operation instruction in a preset gesture information database. For example, if the user wants to set the OK-shape gesture as the instruction for exiting the full-screen lock-out state, the terminal may receive the custom gesture setting request submitted by the user, obtain the OK-shape gesture entered by the gesture input device according to the request, and store the OK-shape gesture and its corresponding instruction for exiting the full-screen lock-out state in the gesture information database.
S202: analyze the gesture information, and identify the gesture operation instruction according to the analysis result.
The terminal may analyze the gesture information with a preset image processing algorithm and identify the gesture operation instruction according to the analysis result. For example, the terminal may obtain the gesture information through a gesture segmentation technique; the gesture information has deformation, motion, and texture features, so the terminal may recognize the gesture in the gesture information through a gesture recognition technique and then obtain the gesture operation instruction corresponding to the recognized gesture.
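The following skeletal Java sketch traces the segment–extract–recognise–look-up pipeline described above. The placeholder segmentation, feature extraction, and recognition bodies, the feature layout, and the string labels are assumptions; they only mark where a real image processing algorithm would plug in.

```java
import java.util.Map;

// Skeletal sketch of the recognition pipeline described above:
// segment the hand region, extract features, recognise the gesture, look up its instruction.
public class CameraGesturePipeline {
    private final Map<String, String> gestureInfoDatabase;   // gesture -> instruction

    public CameraGesturePipeline(Map<String, String> gestureInfoDatabase) {
        this.gestureInfoDatabase = gestureInfoDatabase;
    }

    // Placeholder: a real system would segment the hand region out of the camera frame.
    byte[] segmentHandRegion(byte[] cameraFrame) {
        return cameraFrame;
    }

    // Placeholder: extract deformation, motion and texture features from the segmented region.
    double[] extractFeatures(byte[] handRegion) {
        return new double[] {0.0, 0.0, 0.0};
    }

    // Placeholder: map a feature vector to a gesture label such as "OK_SHAPE".
    String recogniseGesture(double[] features) {
        return "OK_SHAPE";
    }

    // S202: analyse a camera frame and return the corresponding gesture operation instruction.
    public String identifyInstruction(byte[] cameraFrame) {
        byte[] handRegion = segmentHandRegion(cameraFrame);
        double[] features = extractFeatures(handRegion);
        String gesture = recogniseGesture(features);
        return gestureInfoDatabase.getOrDefault(gesture, "NO_INSTRUCTION");
    }

    public static void main(String[] args) {
        CameraGesturePipeline pipeline =
                new CameraGesturePipeline(Map.of("OK_SHAPE", "EXIT_FULL_SCREEN_LOCK"));
        System.out.println(pipeline.identifyInstruction(new byte[0]));
    }
}
```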
S203: judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state.
After receiving the gesture operation instruction submitted by the user in the preset region, the terminal may judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state. Specifically, the terminal may obtain the instruction for exiting the full-screen lock-out state prestored in the gesture information database and judge whether the received gesture operation instruction is that instruction. If the gesture operation instruction is judged to be the instruction for exiting the full-screen lock-out state, step S204 is performed; otherwise, step S205 is performed.
In an alternative embodiment, after receiving the gesture operation instruction submitted by the user in the preset region, the terminal may compare the gesture operation instruction with the instruction for exiting the full-screen lock-out state. If they match, the terminal determines that the gesture operation instruction is the instruction for exiting the full-screen lock-out state; if they do not match, the terminal determines that it is not. For example, the terminal may compare the two to obtain a similarity between the received gesture operation instruction and the instruction for exiting the full-screen lock-out state; if the similarity is greater than a preset threshold, a match is detected, and the terminal can then determine that the gesture operation instruction is the instruction for exiting the full-screen lock-out state.
For example, suppose the terminal determines, from the gesture information in the gesture information database, that the gesture information corresponding to the instruction for exiting the full-screen lock-out state is the OK-shape gesture. After collecting, through the camera, the gesture information submitted by the user, the terminal may judge whether that gesture information is the OK-shape gesture: if it is, the received gesture operation instruction is determined to be the instruction for exiting the full-screen lock-out state; if it is not, the received gesture operation instruction is determined not to be that instruction.
S204: exit the full-screen lock-out state.
When the terminal judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state, it may exit the full-screen lock-out state according to the received gesture operation instruction.
As shown in Fig. 9, which is a schematic diagram of a second application interface displayed by the terminal, the terminal may display the second application interface after exiting the full-screen lock-out state. The second application interface may be partly or entirely identical to the first application interface; it is no longer in the full-screen lock-out state and displays the current video. Optionally, the second application interface may further include a status bar, for example a video label such as "Ice and Snow World, Season 14, Episode 05", a settings icon, a video progress bar, and a pause or play button.
S205: ignore the gesture operation instruction.
When the terminal judges that the gesture operation instruction is not the instruction for exiting the full-screen lock-out state, it may make no response to the gesture operation instruction, that is, ignore it.
In the control method for exiting a full-screen lock-out state shown in Fig. 2, the terminal collects gesture information through a camera, analyzes the collected gesture information, identifies the gesture operation instruction according to the analysis result, and exits the full-screen lock-out state when it judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
Referring to Fig. 3, Fig. 3 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a third embodiment of the present invention. As shown in the figure, the control method for exiting the full-screen lock-out state in this embodiment may comprise the following steps.
Step S301: in the full-screen lock-out state, receive, through a distance sensor, a gesture operation instruction submitted by the user in a preset region.
The terminal may, in the full-screen lock-out state, receive, through a distance sensor, the gesture operation instruction submitted by the user in the preset region. Specifically, the touch panel in the preset region of the terminal's display screen may remain in a working state, and the terminal may use the touch panel to monitor, in the full-screen lock-out state, whether a gesture operation instruction is submitted by the user in the preset region. As shown in Fig. 8, which is a schematic diagram of a first application interface displayed by the terminal, the terminal may display the first application interface in the full-screen lock-out state, and the first application interface displays only the current video.
In an alternative embodiment, the terminal may detect how many times the distance sensor senses the gesture information and look up, in the preset gesture information database, the gesture operation instruction corresponding to that number of sensing events.
In another embodiment, when the distance sensors that sense the gesture information include a first distance sensor and a second distance sensor, the terminal may obtain the interval between the times at which the first and second distance sensors sense the gesture information: if the sensing interval is greater than zero and within a preset duration, the gesture information is identified as a first gesture operation instruction; if the sensing interval equals zero, the gesture information is identified as a second gesture operation instruction.
In another embodiment, when the distance sensors that sense the gesture information include a first distance sensor, a second distance sensor, and a third distance sensor, the terminal may obtain the sensing interval between the first and second distance sensors and the sensing interval between the second and third distance sensors; if both intervals are within the preset duration and both are greater than zero, the gesture information is identified as a third gesture operation instruction. The three distance sensors are on the same side of the terminal, with the second distance sensor located between the first and the third.
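The Java sketch below restates the two- and three-sensor timing rules described above. Sensing timestamps in milliseconds, the preset duration value, and the instruction names are assumptions made for illustration.

```java
// Illustrative sketch of the distance-sensor timing rules described above.
public final class DistanceSensorGesture {
    private static final long PRESET_DURATION_MS = 500;   // assumed preset duration

    public enum Instruction { FIRST, SECOND, THIRD, UNKNOWN }

    // Two sensors: interval > 0 and within the preset duration -> first instruction,
    // interval == 0 -> second instruction.
    public static Instruction fromTwoSensors(long firstSensedAt, long secondSensedAt) {
        long interval = Math.abs(secondSensedAt - firstSensedAt);
        if (interval == 0) {
            return Instruction.SECOND;
        }
        if (interval <= PRESET_DURATION_MS) {
            return Instruction.FIRST;
        }
        return Instruction.UNKNOWN;
    }

    // Three sensors on the same side (the second between the first and the third): both
    // intervals greater than zero and within the preset duration -> third instruction.
    public static Instruction fromThreeSensors(long firstSensedAt, long secondSensedAt,
                                               long thirdSensedAt) {
        long firstInterval = Math.abs(secondSensedAt - firstSensedAt);
        long secondInterval = Math.abs(thirdSensedAt - secondSensedAt);
        boolean bothPositive = firstInterval > 0 && secondInterval > 0;
        boolean bothWithinDuration = firstInterval <= PRESET_DURATION_MS
                && secondInterval <= PRESET_DURATION_MS;
        return (bothPositive && bothWithinDuration) ? Instruction.THIRD : Instruction.UNKNOWN;
    }

    public static void main(String[] args) {
        System.out.println(fromTwoSensors(1000, 1200));          // FIRST
        System.out.println(fromThreeSensors(1000, 1150, 1300));  // THIRD
    }
}
```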
In an alternative embodiment, before detecting the gesture information through the distance sensor, the terminal may receive a custom gesture setting request submitted by the user, detect, according to the request, the gesture information entered by the gesture input device, and store the gesture information and its corresponding gesture operation instruction in the preset gesture information database. The gesture input device may include the user's finger or the like.
Step S302: judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state.
After receiving the gesture operation instruction submitted by the user in the preset region, the terminal may judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state. Specifically, the terminal may obtain the instruction for exiting the full-screen lock-out state prestored in the gesture information database and judge whether the received gesture operation instruction is that instruction. If the gesture operation instruction is judged to be the instruction for exiting the full-screen lock-out state, step S303 is performed; otherwise, step S304 is performed.
In an alternative embodiment, after receiving the gesture operation instruction submitted by the user in the preset region, the terminal may compare the gesture operation instruction with the instruction for exiting the full-screen lock-out state. If they match, the terminal determines that the gesture operation instruction is the instruction for exiting the full-screen lock-out state; if they do not match, the terminal determines that it is not. For example, the terminal may compare the two to obtain a similarity between the received gesture operation instruction and the instruction for exiting the full-screen lock-out state; if the similarity is greater than a preset threshold, a match is detected, and the terminal can then determine that the gesture operation instruction is the instruction for exiting the full-screen lock-out state.
Step S303: exit the full-screen lock-out state.
When the terminal judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state, it may exit the full-screen lock-out state according to the received gesture operation instruction.
As shown in Fig. 9, which is a schematic diagram of a second application interface displayed by the terminal, the terminal may display the second application interface after exiting the full-screen lock-out state. The second application interface may be partly or entirely identical to the first application interface; it is no longer in the full-screen lock-out state and displays the current video. Optionally, the second application interface may further include a status bar, for example a video label such as "Ice and Snow World, Season 14, Episode 05", a settings icon, a video progress bar, and a pause or play button.
Step S304: ignore the gesture operation instruction.
When the terminal judges that the gesture operation instruction is not the instruction for exiting the full-screen lock-out state, it may make no response to the gesture operation instruction, that is, ignore it.
In the control method for exiting a full-screen lock-out state shown in Fig. 3, the terminal receives, in the full-screen lock-out state and through a distance sensor, the gesture operation instruction submitted by the user in a preset region, and exits the full-screen lock-out state when it judges that the received gesture operation instruction is the instruction for exiting the full-screen lock-out state. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
Referring to Fig. 4, Fig. 4 is a schematic flowchart of a control method for exiting a full-screen lock-out state according to a fourth embodiment of the present invention. As shown in the figure, the control method for exiting the full-screen lock-out state in this embodiment may comprise the following steps.
S401: in the full-screen lock-out state, receive sliding-trace information submitted by the user in a preset region.
The terminal may, in the full-screen lock-out state, receive sliding-trace information submitted by the user in the preset region. Specifically, the touch panel in the preset region of the terminal's display screen may remain in a working state, and the terminal may use the touch panel to monitor, in the full-screen lock-out state, whether sliding-trace information is submitted by the user in the preset region. As shown in Fig. 8, which is a schematic diagram of a first application interface displayed by the terminal, the terminal may display the first application interface in the full-screen lock-out state, and the first application interface displays only the current video.
The sliding-trace information may include a circle or an S shape, a square or a V shape, a straight slide from top to bottom or from left to right, and so on. Optionally, the sliding-trace information may further be a closed figure or an open figure; it is not specifically limited by the embodiments of the present invention.
Taking the unlock interface of the terminal shown in Fig. 7 as an example, in the full-screen lock-out state the touch panel in the preset region of the terminal's display screen is in a working state, and the terminal may use the touch panel to monitor whether the user submits sliding-trace information in the preset region. If the terminal detects sliding-trace information submitted in the preset region, step S402 is performed.
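As a rough illustration of how a recorded slide might be classified into the trace categories mentioned above, the Java sketch below inspects only the first and last touch points; the point format, the tolerances, and the trace categories are assumptions, and a real implementation would examine the whole trace.

```java
import java.util.List;

// Illustrative sketch that classifies a recorded sliding trace from its touch points.
public final class SlidingTraceClassifier {

    public record Point(float x, float y) {}

    public enum Trace { LEFT_TO_RIGHT_LINE, TOP_TO_BOTTOM_LINE, CLOSED_FIGURE, OTHER }

    public static Trace classify(List<Point> points) {
        if (points.size() < 2) {
            return Trace.OTHER;
        }
        Point first = points.get(0);
        Point last = points.get(points.size() - 1);
        float dx = last.x() - first.x();
        float dy = last.y() - first.y();

        // A trace whose end point returns close to its start point is treated as closed.
        if (Math.hypot(dx, dy) < 20) {
            return Trace.CLOSED_FIGURE;
        }
        // Predominantly horizontal, moving right: left-to-right straight slide.
        if (Math.abs(dx) > 3 * Math.abs(dy) && dx > 0) {
            return Trace.LEFT_TO_RIGHT_LINE;
        }
        // Predominantly vertical, moving down: top-to-bottom straight slide.
        if (Math.abs(dy) > 3 * Math.abs(dx) && dy > 0) {
            return Trace.TOP_TO_BOTTOM_LINE;
        }
        return Trace.OTHER;
    }

    public static void main(String[] args) {
        List<Point> slide = List.of(new Point(10, 100), new Point(60, 102), new Point(200, 105));
        System.out.println(classify(slide));   // LEFT_TO_RIGHT_LINE
    }
}
```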
S402: judge whether the sliding-trace information is an instruction for exiting the full-screen lock-out state.
After receiving the sliding-trace information submitted by the user in the preset region, the terminal may judge whether the sliding-trace information is an instruction for exiting the full-screen lock-out state. Specifically, the terminal may obtain the instruction for exiting the full-screen lock-out state prestored in the gesture information database and judge whether the received sliding-trace information is that instruction. If the sliding-trace information is judged to be the instruction for exiting the full-screen lock-out state, step S403 is performed; otherwise, step S404 is performed.
In an alternative embodiment, after receiving the sliding-trace information submitted by the user in the preset region, the terminal may compare the sliding-trace information with the instruction for exiting the full-screen lock-out state. If they match, the terminal determines that the sliding-trace information is the instruction for exiting the full-screen lock-out state; if they do not match, the terminal determines that it is not. For example, the terminal may compare the two to obtain a similarity between the received sliding-trace information and the instruction for exiting the full-screen lock-out state; if the similarity is greater than a preset threshold, a match is detected, and the terminal can then determine that the sliding-trace information is the instruction for exiting the full-screen lock-out state.
S403: exit the full-screen lock-out state.
When the terminal judges that the sliding-trace information is the instruction for exiting the full-screen lock-out state, it may exit the full-screen lock-out state according to the received sliding-trace information.
As shown in Fig. 9, which is a schematic diagram of a second application interface displayed by the terminal, the terminal may display the second application interface after exiting the full-screen lock-out state. The second application interface may be partly or entirely identical to the first application interface; it is no longer in the full-screen lock-out state and displays the current video. Optionally, the second application interface may further include a status bar, for example a video label such as "Ice and Snow World, Season 14, Episode 05", a settings icon, a video progress bar, and a pause or play button.
S404: ignore the sliding-trace information.
When the terminal judges that the sliding-trace information is not the instruction for exiting the full-screen lock-out state, it may make no response to the sliding-trace information, that is, ignore it.
In the control method for exiting a full-screen lock-out state shown in Fig. 4, the terminal receives, in the full-screen lock-out state, sliding-trace information submitted by the user in a preset region, and exits the full-screen lock-out state when it judges that the received sliding-trace information is the instruction for exiting the full-screen lock-out state. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
Referring to Fig. 5, Fig. 5 is a schematic structural diagram of a terminal 500 according to a first embodiment of the present invention. The terminal 500 may include a smart phone, a personal digital assistant, a tablet computer, or the like. As shown in the figure, the terminal 500 in this embodiment may at least include a receiving unit 510, an instruction judging unit 520, and a switch unit 530, wherein:
the receiving unit 510 is configured to receive, in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region. Specifically, the touch panel in the preset region of the display screen of the terminal 500 may remain in a working state, and the receiving unit 510 may use the touch panel to monitor, in the full-screen lock-out state, whether a gesture operation instruction is submitted by the user in the preset region. As shown in Fig. 8, which is a schematic diagram of a first application interface displayed by the terminal 500, the terminal 500 may display the first application interface in the full-screen lock-out state, and the first application interface displays only the current video.
The instruction judging unit 520 is configured to judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state.
The switch unit 530 is configured to exit the full-screen lock-out state when the instruction judging unit 520 judges yes.
As an alternative embodiment, the receiving unit 510 in this embodiment of the present invention may, as shown in Fig. 6, further comprise:
an information acquisition module 610, configured to collect, in the full-screen lock-out state and through a camera, the gesture information submitted by the user in the preset region. The gesture information may include gestures such as a fist, a V shape, an OK shape, or a horn shape. The user's five fingers include the thumb, index finger, middle finger, ring finger, and little finger, and each finger being extended or bent may represent a corresponding gesture operation instruction: in the fist gesture, all five fingers are bent; in the V-shape gesture, the index and middle fingers are extended while the thumb, ring finger, and little finger are bent; in the OK-shape gesture, the index finger, ring finger, and little finger are extended while the thumb and middle finger are bent; in the horn-shape gesture, the thumb and little finger are extended while the index, middle, and ring fingers are bent;
an instruction identification module 620, configured to analyze the gesture information and identify the gesture operation instruction according to the analysis result.
As an alternative embodiment, the receiving unit 510 is configured to receive, in the full-screen lock-out state and through a distance sensor, the gesture operation instruction submitted by the user in the preset region. For example, the receiving unit 510 may detect how many times the distance sensor senses the gesture information. As another example, when the distance sensor includes a first distance sensor and a second distance sensor, the receiving unit 510 may obtain the interval between the times at which the first and second distance sensors sense the gesture information: if the sensing interval is greater than zero and within a preset duration, the gesture information is identified as a first gesture operation instruction; if the sensing interval equals zero, the gesture information is identified as a second gesture operation instruction; and so on.
As an alternative embodiment, the receiving unit 510 is configured to receive, in the full-screen lock-out state, sliding-trace information submitted by the user in the preset region, for example a circle or an S shape, a square or a V shape, or a straight slide from top to bottom or from left to right. Optionally, the sliding-trace information may further be a closed figure or an open figure, and so on.
As an alternative embodiment, the instruction judging unit 520 is configured to compare the gesture operation instruction with the instruction for exiting the full-screen lock-out state and, if they match, determine that the gesture operation instruction is the instruction for exiting the full-screen lock-out state.
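The following Java sketch shows one possible way to wire the three units described for terminal 500 together into the receive–judge–exit/ignore flow of Fig. 1; the interface names and the string representation of an instruction are assumptions made only for illustration.

```java
// Illustrative sketch of how the receiving unit, instruction judging unit and switch unit
// could cooperate; not the patent's own implementation.
public class Terminal500Sketch {

    interface ReceivingUnit {
        String receiveGestureInstruction();                 // receiving unit 510
    }

    interface InstructionJudgingUnit {
        boolean isExitInstruction(String instruction);      // instruction judging unit 520
    }

    interface SwitchUnit {
        void exitFullScreenLockOutState();                  // switch unit 530
    }

    private final ReceivingUnit receivingUnit;
    private final InstructionJudgingUnit judgingUnit;
    private final SwitchUnit switchUnit;

    Terminal500Sketch(ReceivingUnit r, InstructionJudgingUnit j, SwitchUnit s) {
        this.receivingUnit = r;
        this.judgingUnit = j;
        this.switchUnit = s;
    }

    // One pass of the receive -> judge -> exit/ignore flow from Fig. 1.
    void handleGesture() {
        String instruction = receivingUnit.receiveGestureInstruction();
        if (judgingUnit.isExitInstruction(instruction)) {
            switchUnit.exitFullScreenLockOutState();
        }
        // Otherwise the instruction is ignored (no response is made).
    }

    public static void main(String[] args) {
        Terminal500Sketch terminal = new Terminal500Sketch(
                () -> "EXIT_FULL_SCREEN_LOCK",
                "EXIT_FULL_SCREEN_LOCK"::equals,
                () -> System.out.println("exited full-screen lock-out state"));
        terminal.handleGesture();
    }
}
```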
In the terminal shown in Fig. 5, the receiving unit 510 receives, in the full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region; when the instruction judging unit 520 judges that the gesture operation instruction is the instruction for exiting the full-screen lock-out state, the switch unit 530 exits the full-screen lock-out state according to the received gesture operation instruction. This makes it convenient to exit the full-screen lock-out state, saves hardware cost, avoids having a virtual key occupy screen space, and improves the user experience.
A person of ordinary skill in the art will understand that all or part of the flows in the methods of the above embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
It should be noted that the description of each of the above embodiments has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments. In addition, a person skilled in the art should understand that the embodiments described in this specification are preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
The steps in the methods of the embodiments of the present invention may be reordered, combined, or deleted as required.
The units in the apparatuses of the embodiments of the present invention may be combined, divided, or deleted as required.
The units described in the embodiments of the present invention may be implemented by a general-purpose integrated circuit, for example a CPU (Central Processing Unit), or by an ASIC (Application Specific Integrated Circuit).
The control method and terminal for exiting a full-screen lock-out state provided by the embodiments of the present invention are described in detail above. Specific examples are used herein to explain the principles and implementations of the present invention, and the descriptions of the above embodiments are only intended to help understand the method of the present invention and its core idea. Meanwhile, a person of ordinary skill in the art may make changes to the specific implementations and application scope according to the idea of the present invention. In summary, the content of this specification should not be construed as a limitation on the present invention.

Claims (10)

1. A control method for exiting a full-screen lock-out state, characterized in that the method comprises:
in the full-screen lock-out state, receiving a gesture operation instruction submitted by a user in a preset region;
judging whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state;
if so, exiting the full-screen lock-out state.
2. The method according to claim 1, characterized in that receiving, in the full-screen lock-out state, the gesture operation instruction submitted by the user in the preset region comprises:
in the full-screen lock-out state, collecting, through a gesture information collection device, gesture information submitted by the user in the preset region;
analyzing the gesture information, and identifying the gesture operation instruction according to the analysis result of the gesture information.
3. The method according to claim 1, characterized in that receiving, in the full-screen lock-out state, the gesture operation instruction submitted by the user in the preset region comprises:
in the full-screen lock-out state, controlling the preset region in a touch panel to be in a working state;
receiving, through the touch panel, the gesture operation instruction submitted by the user in the preset region.
4. The method according to claim 1, characterized in that exiting the full-screen lock-out state comprises:
releasing the display lock of a target key; or
returning to a full-screen unlocked mode.
5. The method according to any one of claims 1 to 4, characterized in that the instruction for exiting the full-screen lock-out state comprises preset image information.
6. A terminal, characterized in that the terminal comprises:
a receiving unit, configured to receive, in a full-screen lock-out state, a gesture operation instruction submitted by a user in a preset region;
an instruction judging unit, configured to judge whether the gesture operation instruction is an instruction for exiting the full-screen lock-out state;
a switch unit, configured to exit the full-screen lock-out state when the instruction judging unit judges yes.
7. The terminal according to claim 6, characterized in that the receiving unit comprises:
an information acquisition module, configured to collect, in the full-screen lock-out state and through a gesture information collection device, gesture information submitted by the user in the preset region;
an instruction identification module, configured to analyze the gesture information and identify the gesture operation instruction according to the analysis result of the gesture information.
8. The terminal according to claim 6, characterized in that the receiving unit comprises:
a touch panel control module, configured to control, in the full-screen lock-out state, the preset region in a touch panel to be in a working state;
an instruction receiving module, configured to receive, through the touch panel, the gesture operation instruction submitted by the user in the preset region.
9. The terminal according to claim 6, characterized in that
the switch unit is configured to release the display lock of a target key or return to a full-screen unlocked mode.
10. The terminal according to any one of claims 6 to 9, characterized in that the instruction for exiting the full-screen lock-out state comprises preset image information.
CN201510138161.1A 2015-03-26 2015-03-26 Control method and terminal for quitting full screen lock-out state Pending CN104714736A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510138161.1A CN104714736A (en) 2015-03-26 2015-03-26 Control method and terminal for quitting full screen lock-out state

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510138161.1A CN104714736A (en) 2015-03-26 2015-03-26 Control method and terminal for quitting full screen lock-out state

Publications (1)

Publication Number Publication Date
CN104714736A true CN104714736A (en) 2015-06-17

Family

ID=53414128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510138161.1A Pending CN104714736A (en) 2015-03-26 2015-03-26 Control method and terminal for quitting full screen lock-out state

Country Status (1)

Country Link
CN (1) CN104714736A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892846A (en) * 2015-12-21 2016-08-24 乐视致新电子科技(天津)有限公司 Method and device for preventing wrong touch control in full screen playing mode
CN106527899A (en) * 2016-11-24 2017-03-22 珠海市魅族科技有限公司 Full-screen playing mode control method and mobile terminal
CN107562355A (en) * 2017-08-15 2018-01-09 深圳市沃特沃德股份有限公司 The control method and device of touch screen terminal
CN111078111A (en) * 2019-12-27 2020-04-28 联想(北京)有限公司 Information processing method, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020051018A1 (en) * 2000-10-26 2002-05-02 Nan-Ting Yeh Apparatus and method for browser interface operation
US20050107126A1 (en) * 2003-11-18 2005-05-19 Lg Electronics, Inc. Mobile device and method for preventing undesired key depression in the same
US20080165144A1 (en) * 2007-01-07 2008-07-10 Scott Forstall Portrait-Landscape Rotation Heuristics for a Portable Multifunction Device
CN102929522A (en) * 2011-08-05 2013-02-13 诺基亚公司 Controlling responsiveness to user inputs
CN103488960A (en) * 2012-06-14 2014-01-01 华为终端有限公司 Misoperation preventing method and touch screen terminal equipment
CN103793177A (en) * 2014-02-28 2014-05-14 广州视源电子科技股份有限公司 Interface processing method for touch screen device and touch screen device
CN103874985A (en) * 2013-11-01 2014-06-18 华为技术有限公司 Method for presenting terminal device, and terminal device

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020051018A1 (en) * 2000-10-26 2002-05-02 Nan-Ting Yeh Apparatus and method for browser interface operation
US20050107126A1 (en) * 2003-11-18 2005-05-19 Lg Electronics, Inc. Mobile device and method for preventing undesired key depression in the same
US20080165144A1 (en) * 2007-01-07 2008-07-10 Scott Forstall Portrait-Landscape Rotation Heuristics for a Portable Multifunction Device
CN102929522A (en) * 2011-08-05 2013-02-13 诺基亚公司 Controlling responsiveness to user inputs
CN103488960A (en) * 2012-06-14 2014-01-01 华为终端有限公司 Misoperation preventing method and touch screen terminal equipment
CN103874985A (en) * 2013-11-01 2014-06-18 华为技术有限公司 Method for presenting terminal device, and terminal device
CN103793177A (en) * 2014-02-28 2014-05-14 广州视源电子科技股份有限公司 Interface processing method for touch screen device and touch screen device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105892846A (en) * 2015-12-21 2016-08-24 乐视致新电子科技(天津)有限公司 Method and device for preventing wrong touch control in full screen playing mode
CN106527899A (en) * 2016-11-24 2017-03-22 珠海市魅族科技有限公司 Full-screen playing mode control method and mobile terminal
CN107562355A (en) * 2017-08-15 2018-01-09 深圳市沃特沃德股份有限公司 The control method and device of touch screen terminal
CN111078111A (en) * 2019-12-27 2020-04-28 联想(北京)有限公司 Information processing method, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US20210096651A1 (en) Vehicle systems and methods for interaction detection
KR101302910B1 (en) Gesture recognition device, gesture recognition method, computer readable recording medium recording control program
EP2891950B1 (en) Human-to-computer natural three-dimensional hand gesture based navigation method
CN104679401B (en) The touch control method and terminal of a kind of terminal
TWI543018B (en) An input device, an input method, and storage medium
US8373654B2 (en) Image based motion gesture recognition method and system thereof
US20130215027A1 (en) Evaluating an Input Relative to a Display
JP5604279B2 (en) Gesture recognition apparatus, method, program, and computer-readable medium storing the program
US20140071042A1 (en) Computer vision based control of a device using machine learning
WO2012081012A1 (en) Computer vision based hand identification
CN103576976A (en) Information processing apparatus and control method thereof
CN104216642A (en) Terminal control method
CN104714736A (en) Control method and terminal for quitting full screen lock-out state
CN103608761A (en) Input device, input method and recording medium
CN104571521A (en) Device and method of handwritten record
US20160085408A1 (en) Information processing method and electronic device thereof
CN111145891A (en) Information processing method and device and electronic equipment
CN106845190B (en) Display control system and method
US9235338B1 (en) Pan and zoom gesture detection in a multiple touch display
CN111803938B (en) Game interface processing method, terminal device, electronic device and storage medium
CN104951211B (en) A kind of information processing method and electronic equipment
WO2017004998A1 (en) System for directing action of self-propelled physical object and method thereof
CN110007748B (en) Terminal control method, processing device, storage medium and terminal
US20150117712A1 (en) Computer vision based control of a device using machine learning
JP5495657B2 (en) Character recognition device, character recognition program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20150617

RJ01 Rejection of invention patent application after publication