CN104076907A - Control method, control device and wearable electronic equipment - Google Patents


Info

Publication number
CN104076907A
CN104076907A (application CN201310097595.2A)
Authority
CN
China
Prior art keywords
viewing area
input operation
navigation interface
graphical interfaces
electronic equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310097595.2A
Other languages
Chinese (zh)
Inventor
侯欣如
彭世峰
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201310097595.2A
Publication of CN104076907A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a control method, a control device, and an electronic device. The method is applied to a wearable electronic device provided with a sensing unit and a display unit. The sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region includes the display region, and the sensing region includes at least the display region. The method comprises the following steps: an input operation in the display region is obtained through the sensing unit; when the input operation is judged to meet a first preset condition, a navigation interface is presented through the display unit on the basis of the first graphical interface currently displayed by the display unit, wherein the navigation interface contains operation marks used for controlling changes to the display content of the first graphical interface. The method lowers the operational complexity of the wearable electronic device.

Description

Control method, apparatus, and wearable electronic device
Technical field
The present invention relates to the field of electronic device control technology, and in particular to a control method, an apparatus, and a wearable electronic device.
Background
With the quickening pace of modern life, people's requirements for electronic devices have also risen, and wearable electronic devices that can process data efficiently and are easy to carry have therefore emerged. Take wearable smart glasses as an example: like an ordinary smart terminal, such glasses have an independent operating system and can be used for video calls, map navigation, image capture, and so on.
When a wearable electronic device is operated, the user must press a specific physical button arranged on the device to make it display a control menu on its display interface, so that the user can then press buttons or tap within the display interface to select the corresponding option from the control menu and complete the corresponding control operation. However, when the wearable electronic device is worn on the user's body, the user may be unable to press this specific button accurately, so the control menu cannot be opened smoothly, which in turn affects normal control operations. For example, existing smart glasses generally have a button area mounted on the temple on one side; after putting the glasses on, a user who is unfamiliar with the button layout, or who cannot see the operating area, may be unable to press the button accurately to bring up the control menu.
Summary of the invention
In view of this, the present invention provides a control method, an apparatus, and a wearable electronic device, so as to improve the convenience of operating wearable electronic devices.
To achieve the above object, the present invention provides the following technical solution: a control method applied to a wearable electronic device, where the wearable electronic device has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; the visible region includes the display region, and the sensing region includes at least the display region. The method comprises:
obtaining, through the sensing unit, an input operation in the display region;
judging whether the input operation meets a first preset condition, to obtain a first judgment result;
when the first judgment result shows that the input operation meets the first preset condition, presenting a navigation interface through the display unit based on the first graphical interface currently presented by the display unit, where the navigation interface contains operation marks for controlling changes to the display content of the first graphical interface.
Preferably, presenting the navigation interface through the display unit based on the first graphical interface currently presented by the display unit comprises:
presenting, based on the first graphical interface currently presented by the display unit, the navigation interface on the first graphical interface through the display unit.
Preferably, judging whether the input operation meets the first preset condition comprises:
judging whether the duration of the input operation in the display region reaches a preset duration;
accordingly, when the duration of the input operation in the display region is judged to reach the preset duration, the input operation is considered to meet the first preset condition.
Preferably, judging whether the duration of the input operation in the display region reaches the preset duration comprises:
judging whether the dwell duration of the input operation at a first position in the display region reaches the preset duration, where the first position is an arbitrary position in the display region.
Preferably, the method further comprises:
obtaining a first input operation that selects an operation mark from the navigation interface, and determining the first operation mark selected by the first input operation;
changing the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
Preferably, obtaining the first input operation that selects an operation mark from the navigation interface and determining the first operation mark selected by the first input operation comprises any one of the following:
receiving a voice input operation that selects the first operation mark from the navigation interface;
obtaining eye movement information of the user wearing the wearable electronic device, determining the user's browsing track on the navigation interface based on the eye movement information, and determining the selected first operation mark according to the browsing track;
obtaining a gesture input operation in the sub-display region corresponding to the navigation interface, and determining, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface.
Preferably, obtaining the gesture input operation in the sub-display region corresponding to the navigation interface and determining, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface comprises:
obtaining the gesture input operation in the sub-display region corresponding to the navigation interface;
determining, in the navigation interface, the first position corresponding to the gesture input operation, and, when the dwell duration of the gesture input operation at the first position reaches the preset duration, selecting the operation mark at the first position as the first operation mark.
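The dwell-based selection just described — mapping the gesture's position to a mark in the navigation interface and confirming that mark once the gesture has held it for the preset duration — can be sketched as follows. This is only an illustrative sketch, not the patent's implementation: the function name `pick_mark_by_dwell`, the sample format, and the rectangular hit regions for the marks are all assumptions introduced for the example.

```python
from typing import Dict, Optional, Tuple

Rect = Tuple[float, float, float, float]   # (x, y, width, height) — hypothetical

def pick_mark_by_dwell(samples, mark_regions: Dict[str, Rect],
                       preset_duration: float) -> Optional[str]:
    """`samples` is a sequence of (timestamp, (x, y)) gesture positions in the
    navigation interface. Returns the mark whose region the gesture stays in
    for at least `preset_duration` seconds, or None if no mark is held long
    enough. Rectangular hit regions are an illustrative assumption."""
    def hit(pos):
        for mark, (x, y, w, h) in mark_regions.items():
            if x <= pos[0] < x + w and y <= pos[1] < y + h:
                return mark
        return None

    current, since = None, None
    for t, pos in samples:
        mark = hit(pos)
        if mark != current:                  # moved to a different mark (or off all marks)
            current, since = mark, t
        if current is not None and t - since >= preset_duration:
            return current                   # dwell reached: this is the first operation mark
    return None

# Hypothetical mark layout and a gesture that lingers over "pause" for 2 s.
regions = {"pause": (0, 0, 50, 20), "rewind": (60, 0, 50, 20)}
samples = [(0.0, (10, 10)), (1.0, (12, 10)), (2.0, (11, 9))]
print(pick_mark_by_dwell(samples, regions, preset_duration=2.0))
```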
The present invention also provides a control apparatus applied to a wearable electronic device, where the wearable electronic device has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; the visible region includes the display region, and the sensing region includes at least the display region. The apparatus comprises:
an input acquiring unit, configured to obtain an input operation in the display region through the sensing unit;
a judging unit, configured to judge whether the input operation meets a first preset condition, to obtain a first judgment result;
an interface control unit, configured to present, when the first judgment result shows that the input operation meets the first preset condition, a navigation interface through the display unit based on the first graphical interface currently presented by the display unit, where the navigation interface contains operation marks for controlling changes to the display content of the first graphical interface.
Preferably, the interface control unit is specifically configured to: when the first judgment result shows that the input operation meets the first preset condition, present the navigation interface on the first graphical interface through the display unit, based on the first graphical interface currently presented by the display unit.
Preferably, the judging unit comprises:
a first judging unit, configured to judge whether the duration of the input operation in the display region reaches a preset duration;
accordingly, the interface control unit is configured to, when the first judgment result shows that the duration of the input operation in the display region reaches the preset duration, present the navigation interface through the display unit based on the first graphical interface currently presented by the display unit.
Preferably, the first judging unit comprises:
a first judging subunit, configured to judge whether the dwell duration of the input operation at a first position in the display region reaches the preset duration, where the first position is an arbitrary position in the display region.
Preferably, the apparatus further comprises:
a mark determining unit, configured to obtain a first input operation that selects an operation mark from the navigation interface, and determine the first operation mark selected by the first input operation;
a content changing unit, configured to change the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
Preferably, the mark determining unit comprises any one of the following units:
a first mark determining unit, configured to receive a voice input operation that selects the first operation mark from the navigation interface;
a second mark determining unit, configured to obtain eye movement information of the user wearing the wearable electronic device, determine the user's browsing track on the navigation interface based on the eye movement information, and determine the selected first operation mark according to the browsing track;
a third mark determining unit, configured to obtain a gesture input operation in the sub-display region corresponding to the navigation interface, and determine, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface.
Preferably, the third mark determining unit comprises:
a gesture acquiring unit, configured to obtain the gesture input operation in the sub-display region corresponding to the navigation interface;
a third mark determining subunit, configured to determine, in the navigation interface, the first position corresponding to the gesture input operation, and, when the dwell duration of the gesture input operation at the first position reaches the preset duration, select the operation mark at the first position as the first operation mark.
On the other hand, the present invention also provides a wearable electronic device that has a processor, as well as a sensing unit and a display unit both connected to the processor; the sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; the visible region includes the display region, and the sensing region includes at least the display region; and the processor incorporates the control apparatus described in any one of the above.
As can be seen from the above technical solution, compared with the prior art, the present invention discloses a control method, an apparatus, and a wearable electronic device. The method obtains, through the sensing unit, the input operation of the wearable electronic device in the display region; if the input operation is judged to meet the first preset condition, a navigation interface is presented based on the currently presented first graphical interface, and the navigation interface contains operation marks for controlling changes to the display content of the first graphical interface. In this way, when the user wants to change the display content of the first graphical interface, there is no need to take off the glasses or perform complicated operations such as groping for a physical button to invoke the navigation interface; an input operation in the display region that meets the preset condition is enough to trigger the display of the navigation interface corresponding to the current first graphical interface, improving the convenience of operation.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of the present invention or the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present invention; those of ordinary skill in the art can derive other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of an embodiment of a control method of the present invention;
Fig. 2 is a schematic top view of the visible region and the display region when a user wears smart glasses;
Fig. 3 is a schematic longitudinal sectional view of the visible region and the display region when a user wears smart glasses;
Fig. 4 is a schematic flowchart of another embodiment of a control method of the present invention;
Fig. 5 is a schematic diagram of the operation that triggers the display of the navigation interface in the smart glasses of Fig. 2 or 3;
Fig. 6 is a schematic flowchart of another embodiment of a control method of the present invention;
Fig. 7 is a schematic structural diagram of an embodiment of a control apparatus of the present invention;
Fig. 8 is a schematic structural diagram of another embodiment of a control apparatus of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
The embodiments of the invention disclose a control method applied to a wearable electronic device; the control method can improve the convenience of operating the wearable electronic device.
Referring to Fig. 1, which shows a schematic flowchart of an embodiment of a control method of the present invention, the method of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; the visible region includes the display region, and the sensing region includes at least the display region. The method of this embodiment comprises:
Step 101: obtain an input operation in the display region through the sensing unit.
When the display unit of the wearable electronic device presents a graphical interface, the graphical interface corresponds to a display region, which can be understood as the region in which input operations can be performed to control the graphical interface. For example, the display region corresponding to the graphical interface can be the region covered by the user's line of sight when watching the graphical interface. Since the sensing region of the sensing unit includes the display region, the sensing unit can capture any input operation in the display region. The input operation can be a gesture input operation in the space contained in the display region, or an input operation of another operating body.
The sensing unit can be a thermal infrared sensing unit, an image acquisition unit, or the like, as long as it can capture input operations in the display region.
There are multiple ways for the wearable electronic device to present a graphical interface. One way is: the display unit outputs the graphical interface on the display interface of the wearable electronic device, so that the user can see the graphical interface output on the display interface. Accordingly, the whole line-of-sight region in which the user's gaze can rest on the graphical interface can be understood as the display region. For example, the lens of wearable smart glasses carries a display interface capable of displaying graphics; the region formed by the user's sight extending outward through this display interface is the display region, and the user can perform input operations within it to carry out certain operations on the graphical interface.
Another way of presenting a graphical interface is: the display unit of the wearable electronic device projects an image into the user's eyes, so that the user sees a graphical interface. Accordingly, an input operation within the region where the user can see the graphical interface can trigger the corresponding operation on the graphical interface, and this region is the display region.
Step 102: judge whether the input operation meets a first preset condition, to obtain a first judgment result.
Step 103: when the first judgment result shows that the input operation meets the first preset condition, present a navigation interface through the display unit based on the first graphical interface currently presented by the display unit, where the navigation interface contains operation marks for controlling changes to the display content of the first graphical interface.
When an input operation captured in the display region meets the first preset condition, it indicates that the user's current input operation in the display region is intended to make the wearable electronic device present the navigation interface related to the currently displayed first graphical interface, so that the display content of the first graphical interface can subsequently be changed.
The system determines whether to open the navigation interface of the first graphical interface by judging whether the input operation in the display region meets the first preset condition. When the input operation meets the first preset condition, the system can present the navigation interface related to the first graphical interface, based on the first graphical interface, so as to display the operation marks related to it. In this way, while wearing the wearable electronic device, if the user wishes to bring up some common operations of the current graphical interface, an input operation in the space of the display region that meets the first preset condition is enough to trigger the display of the corresponding navigation interface, avoiding the misoperations that occur when the user cannot easily locate the corresponding button.
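The trigger logic of steps 101 to 103 — capture an input operation in the display region, test it against the first preset condition, and only then present the navigation interface for the currently displayed graphical interface — can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the names (`NavigationController`, `meets_first_condition`) and the modeling of the condition as a simple callable are assumptions introduced for the example.

```python
from typing import Callable, Dict, List, Optional

class NavigationController:
    """Illustrative sketch of steps 101-103: obtain an input operation,
    test it against a first preset condition, and present the navigation
    interface for the currently displayed graphical interface."""

    def __init__(self, meets_first_condition: Callable[[dict], bool],
                 nav_interfaces: Dict[str, List[str]]):
        self.meets_first_condition = meets_first_condition
        self.nav_interfaces = nav_interfaces      # interface name -> operation marks
        self.current_interface: Optional[str] = None
        self.navigation_visible = False

    def on_input_operation(self, operation: dict) -> Optional[List[str]]:
        """Called for each input operation the sensing unit detects in the
        display region. Returns the operation marks to present, or None
        when the first preset condition is not met."""
        if self.current_interface is None:
            return None
        if self.meets_first_condition(operation):
            self.navigation_visible = True
            return self.nav_interfaces.get(self.current_interface, [])
        return None

# Example: a player interface whose navigation menu appears after a 2 s dwell.
controller = NavigationController(
    meets_first_condition=lambda op: op.get("dwell_seconds", 0) >= 2.0,
    nav_interfaces={"player": ["pause", "fast_forward", "rewind", "exit_fullscreen"]},
)
controller.current_interface = "player"
print(controller.on_input_operation({"dwell_seconds": 2.5}))
```

A brief brush through the region (short dwell) returns `None`, so the navigation interface only appears on a deliberate operation.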
The term "first graphical interface" is only used to distinguish it from graphical interfaces displayed at other moments. When the input operation meets the first preset condition, different first graphical interfaces trigger the display of different navigation interfaces. The operation marks contained in the navigation interface are related to the current display content of the first graphical interface, and operating the marks in the navigation interface can control changes to the content of the currently displayed graphical interface.
Taking the case where the currently presented first graphical interface is the playback interface of a player as an example, when the input operation meets the first preset condition, the navigation interface triggered can be the control menu interface related to the player. The navigation interface can also contain some other common operation marks, such as pause, fast forward, rewind, and exit full screen, as well as operation marks for showing the desktop, entering other pages, and so on.
As another example, if the currently presented first graphical interface is a picture browsing interface, the navigation interface triggered when the input operation meets the first preset condition can contain operation marks such as picture rotation, zooming, page turning, editing, and exiting the picture browser.
Of course, according to the differences between graphical interfaces, the navigation interfaces corresponding to different graphical interfaces, and the operation marks in each navigation interface, can be preset.
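The per-interface presets just mentioned can be modeled as a lookup table from the currently displayed graphical interface to the operation marks of its navigation interface. The table below is hypothetical; the mark names merely mirror the player and picture-browser examples above.

```python
from typing import Dict, List

# Hypothetical preset mapping: graphical interface -> operation marks
# shown on its navigation interface (mirroring the examples in the text).
NAVIGATION_PRESETS: Dict[str, List[str]] = {
    "player": ["pause", "fast_forward", "rewind", "exit_fullscreen",
               "show_desktop", "open_page"],
    "picture_browser": ["rotate", "zoom", "turn_page", "edit", "exit_browser"],
}

def navigation_marks(interface: str) -> List[str]:
    """Return the preset operation marks for the given interface,
    or an empty list if no navigation interface is preset for it."""
    return NAVIGATION_PRESETS.get(interface, [])

print(navigation_marks("picture_browser"))
```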
There are multiple concrete ways to present the navigation interface through the display unit. For example, based on the first graphical interface currently presented by the display unit, the corresponding navigation interface can be presented on the first graphical interface through the display unit. Presenting the navigation interface on the upper layer of the first graphical interface, so that it partially covers the graphical interface, makes the concrete effect of each operation mark of the navigation interface easier to recognize intuitively. Of course, the navigation interface can also be presented directly according to actual needs, without presenting the first graphical interface at the same time.
In this embodiment, the input operation of the wearable electronic device in the display region is obtained through the sensing unit; if the input operation is judged to meet the first preset condition, a navigation interface is presented based on the currently presented first graphical interface, and the navigation interface contains operation marks for controlling changes to the display content of the first graphical interface. In this way, when the user wants to change the display content of the first graphical interface, there is no need to take off the glasses or perform complicated operations such as groping for a physical button to invoke the navigation interface; an input operation in the display region that meets the preset condition is enough to trigger the display of the navigation interface corresponding to the current first graphical interface, improving the convenience of operation.
In addition, since the display region of the wearable electronic device can cover a fairly large spatial region, the operating area in which the user can perform input operations is larger, which is convenient for the user and further reduces misoperations caused by, for example, carelessly pressing the wrong button.
The wearable electronic device in the embodiments of the present application can be smart glasses, a head-mounted electronic device, or a watch-style computer worn on the wrist.
To facilitate understanding of the solution of the embodiments of the present invention, the visible region the user has after putting on the wearable electronic device, and the relationship between the visible region and the display region, are introduced below, taking smart glasses as the wearable electronic device as an example. It is also assumed that the display unit of the smart glasses presents the image on the lens rather than projecting graphics directly into the user's eyes; that is, a display unit is arranged on the lens, and the user can then see the graphical interface presented on that display unit.
Fig. 2 is a schematic top view of a user wearing the smart glasses. The smart glasses 20 have two temples (also called ear frames or ear holders) connected to a frame 201; a lens is arranged on the frame, and a display unit 202 is provided on the lens, to which a graphical interface can be output. The graphical interface corresponds to a display region 211, shown in Fig. 2 as the region between the two thick black dashed lines (the spatial area extending from the graphical interface can also be considered the display region).
Generally, after the graphical interface is displayed on the lens of the smart glasses, the lens area where the graphical interface is located still has a certain transmittance, so the user can still see outside things through the graphical interface. Therefore, the region covered by the line of sight of the user's eyes through the whole lens of the electronic device is the visible region 212; as shown in Fig. 2, the visible region 212 is the region between the two solid lines. The visible region 212 includes the display region 211, the dashed region in the figure. Of course, in practical applications, when the entire lens of the smart glasses serves as the display unit, the visible region and the display region are identical.
A sensing unit is arranged on the smart glasses; the sensing unit can sense operations in the display region and thereby capture input operations, to determine whether an input operation meets the first preset condition. The input operation performed by the user in the display region can be a gesture input in the space of the display region, or a touch operation on the display unit contained in the display region.
Of course, Fig. 2 is only a schematic top view of the regions when a user wears the smart glasses. To show the positional relationship between the visible region and the display region in Fig. 2 more clearly, refer to Fig. 3, a schematic longitudinal sectional view of the regions when the user wears the smart glasses. Fig. 3 is a vertical sectional view taken at an arbitrary position along the line-of-sight direction in Fig. 2, and any such section is a rectangle. The interior region enclosed by the solid thick line is the visible region, which contains the display region 211 enclosed by the dashed line. When the user performs an input operation in space, the user's eyes can see the hand moving in the visible region; if the user needs to trigger the display of the navigation interface, the hand can be moved into the display region, and if the user's gesture input action in the display region meets the first preset condition, the smart glasses are triggered to present the navigation interface related to the current graphical interface.
Of course, while the display region in Fig. 2 is a roughly conical region, in practice the display region can also be a rectangular region formed by extending the two sides of the display unit outward: starting from the spatial area outside the display unit of the smart glasses, the outer edges of the whole region are several mutually parallel straight lines.
It should be noted that the above introduction only takes smart glasses as an example; for head-mounted electronic devices and watch-style electronic devices, the relationship between the display region and the visible region is similar and is not repeated here.
Referring to Fig. 4, which shows a schematic flowchart of another embodiment of a control method of the present invention, the method of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region; when the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; the visible region includes the display region, and the sensing region includes at least the display region. The method of this embodiment comprises:
Step 401: obtain an input operation in the display region through the sensing unit.
The operation process of this step is similar to that of step 101 in the embodiment shown in Fig. 1 and is not repeated here.
Step 402: judge whether the duration of this input operation in viewing area reaches default duration, obtain the first judged result.
The present embodiment first is pre-conditioned, and for input operation, in this viewing area, whether lasting duration reaches default duration.If there is input operation in viewing area, but the lasting duration of this input operation in this viewing area do not reach default duration, can not trigger the operation of carrying out subsequent step 403 yet.
Wherein, this default duration can be set as required, generally, be as the criterion can distinguish hand or this viewing area of the unconscious process of operating body of user.For example, when user dresses intelligent glasses, and the display unit by this intelligent glasses presents in the process of a graphical interfaces, user needs possibly to lift hand and adjusts picture frame and wear position, or arrangement hair etc., in this kind of situation, user's hand also can enter into viewing area, thereby make sensing unit input operation be detected, in order to distinguish user's unconscious input operation, also can this default duration of proper extension, generally, user's hand can not stop for a long time at area of space, thereby can avoid erroneous judgement to occur.
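As an illustrative sketch only (not part of the disclosed embodiments), the duration check of step 402 — triggering only when the operating body stays in the display region for the preset duration, so that a brief unconscious pass is filtered out — might look as follows; the function name and the threshold value are assumptions:

```python
# Hypothetical sketch of the preset-duration check from step 402.
# Timestamps are in seconds; PRESET_DURATION is an assumed value,
# since the text leaves the threshold configurable.
PRESET_DURATION = 1.5

def meets_first_condition(enter_ts, leave_ts, preset=PRESET_DURATION):
    """True when the input operation's dwell in the display region
    reaches the preset duration (the first preset condition)."""
    return (leave_ts - enter_ts) >= preset

# A brief pass (e.g. adjusting the frames) does not trigger step 403,
# while a deliberate dwell does:
assert not meets_first_condition(10.0, 10.4)
assert meets_first_condition(10.0, 12.0)
```

Extending `preset` makes the filter stricter, matching the text's suggestion of lengthening the preset duration to avoid misjudging unconscious hand movements.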
Step 403: when the first judgment result shows that the duration of the input operation in the display region reaches the preset duration, presenting, by the display unit, a navigation interface based on the first graphical interface currently presented by the display unit, wherein the navigation interface includes operation marks for controlling changes to the display content of the first graphical interface.
When it is determined that the duration of the input operation in the display region reaches the preset duration, the navigation interface obtained based on the first graphical interface is presented by the display unit, so that the operation marks in the navigation interface can subsequently be operated to change the display content of the first graphical interface.
Presenting the navigation interface when the duration of the input operation in the display region reaches the preset duration can be understood in two ways: the display of the navigation interface corresponding to the graphical interface may be triggered as soon as the operating body's input operation in the display region has lasted for the preset duration; or, when an input operation completes, the navigation interface is displayed if the duration of that input operation reached the preset duration. Completing an input operation may mean that the operating body finishes a series of continuous actions in the display region, or that the input operation is considered complete from the moment the operating body enters the display region until it leaves it.
For example, with the display unit presenting a graphical interface, the user's hand shakes irregularly in the display region, and the navigation interface is presented as soon as the duration of the shaking in the display region reaches the preset duration. As another example, the user's hand shakes irregularly in the display region, and when the hand moves out of the display region, the corresponding navigation interface is displayed if the duration of the hand's stay in the display region is determined to have reached the preset duration.
To reduce operation complexity and reduce misjudgment, judging whether the duration of the input operation in the display region reaches the preset duration may be: judging whether the duration of the input operation at a first position in the display region reaches the preset duration, wherein the first position may be any position in the display region. In this way, when the user needs to start the navigation interface of the first graphical interface, keeping the input operation at a certain fixed position in the display region for the preset duration triggers presentation of the navigation interface, which is simple to operate. Moreover, when the system obtains an input operation in the display region, it only needs to judge whether the operation stays at a certain fixed position for the preset duration, which also avoids the difficulty of accurately judging whether the input operation meets the predetermined condition when its position changes continuously during input.
For example, still taking the smart glasses of Fig. 2 and Fig. 3 as an example, after the display unit in the lens of the smart glasses outputs a graphical interface, if the user needs the display unit to present the navigation interface in order to change the display content of the graphical interface through it, the hand can be extended in front of the display unit so that it stays at one position in the display region for the preset duration; for convenience this position is called the first position. As shown in Fig. 5, which is based on the sectional view of Fig. 3, the user's hand stays at the illustrated position in the display region for the preset duration. The system determines that the duration of the input operation at the first position reaches the preset duration, and can then present the navigation interface based on the graphical interface.
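The fixed-first-position variant above can be sketched as follows; this is only an illustrative approximation (all names, the tolerance and the threshold are assumptions), and a real implementation would restart the timer when the hand moves to a new position:

```python
def first_position_dwell(samples, preset=1.5, tol=0.05):
    """samples: time-ordered (timestamp, (x, y)) readings of the operating
    body inside the display region.  Returns True once the body has stayed
    within `tol` of the earliest sampled position (the 'first position')
    for at least `preset` seconds."""
    if not samples:
        return False
    t0, (x0, y0) = samples[0]
    for t, (x, y) in samples:
        if abs(x - x0) > tol or abs(y - y0) > tol:
            # position changed; restarting at the new position is
            # omitted in this sketch for brevity
            return False
        if t - t0 >= preset:
            return True
    return False

# A hand held still long enough triggers; a moving hand does not:
assert first_position_dwell([(0.0, (0.5, 0.5)), (1.6, (0.51, 0.5))])
assert not first_position_dwell([(0.0, (0.1, 0.1)), (0.5, (0.4, 0.4))])
```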
In practical applications, the first preset condition may also take other forms; for example, it may be set that the input operation meets the first preset condition when the input operation moves along a preset first track in the display region.
Referring to Fig. 6, a schematic flow chart of yet another embodiment of a control method of the present invention is shown. The method of this embodiment is applied to a wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region. The method of this embodiment comprises:
Step 601: obtaining, by the sensing unit, an input operation in the display region.
Step 602: judging whether the input operation meets a first preset condition, to obtain a first judgment result.
Step 603: when the first judgment result shows that the input operation meets the first preset condition, presenting, by the display unit, a navigation interface based on the first graphical interface currently presented by the display unit.
The navigation interface includes operation marks for controlling changes to the display content of the first graphical interface.
The way the input operation meets the first preset condition in this embodiment may be any of the situations in any of the above embodiments and is not repeated here.
Step 604: obtaining a first input operation of selecting an operation mark from the navigation interface, and determining the first operation mark selected by the first input operation.
To distinguish it from the input operation that triggers presentation of the navigation interface above, the input operation of selecting a certain operation mark from the navigation interface is called the first input operation. According to the obtained first input operation, the operation mark currently to be selected is determined. The term "first operation mark" is used only to distinguish it from the non-selected operation marks of the navigation interface.
Step 605: changing the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
Each operation mark in the navigation interface corresponds to an operational instruction. The first instruction corresponding to the first operation mark is determined, and the display content of the first graphical interface is then changed accordingly according to the first instruction. For example, the graphical interface is a picture-browsing interface, the first operation mark is a rotation mark, and the first instruction is to rotate the currently presented picture.
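The mark-to-instruction correspondence of step 605 can be sketched as a simple dispatch table; this is an illustration only, and the mark names, state keys, and instruction behaviors are assumptions, not taken from the disclosure:

```python
# Hypothetical dispatch from operation marks on the navigation interface
# to first instructions that change the first graphical interface's
# display content (e.g. rotating the currently presented picture).
def rotate(state):
    state["rotation_deg"] = (state.get("rotation_deg", 0) + 90) % 360
    return state

def zoom_in(state):
    state["zoom"] = round(state.get("zoom", 1.0) * 1.25, 3)
    return state

INSTRUCTIONS = {"rotate_mark": rotate, "zoom_mark": zoom_in}

def apply_operation_mark(state, mark):
    """Execute the first instruction corresponding to the selected mark."""
    return INSTRUCTIONS[mark](state)

picture = {"rotation_deg": 0, "zoom": 1.0}
picture = apply_operation_mark(picture, "rotate_mark")
assert picture["rotation_deg"] == 90
```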
Obtaining the first input operation of selecting an operation mark from the navigation interface can be realized in multiple ways. One way is: receiving a voice input operation of selecting the first operation mark from the navigation interface. The user can select an operation mark in the navigation interface by voice control; the system recognizes the user's voice signal, analyzes the voice signal to determine the selected operation mark, and then executes the instruction corresponding to that operation mark.
As another example, obtaining the first input operation may also be: obtaining eye movement information of the user wearing the wearable electronic device, determining the user's browsing track on the navigation interface based on the eye movement information, and determining the selected first operation mark according to the browsing track. By obtaining the user's eye movement information, it can be determined how the user's line of sight gazes at each operation mark in the navigation interface, and the first operation mark to be selected is then determined according to the user's eye movement. For example, when it is determined from the user's eyeball information that the duration for which the user gazes at a certain operation mark reaches a specified duration, that operation mark is taken as the first operation mark to be selected.
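The gaze-dwell selection just described might be approximated as below; this is a sketch under the assumption that the eye tracker yields 2-D gaze points on the navigation interface and that each operation mark has a known bounding box (all names are illustrative):

```python
def select_mark_by_gaze(gaze_samples, marks, dwell=1.0):
    """gaze_samples: time-ordered (timestamp, (x, y)) gaze points on the
    navigation interface; marks: mark name -> (x0, y0, x1, y1) bounding
    box.  Returns the first mark gazed at continuously for `dwell`
    seconds (the specified duration), or None."""
    gaze_start = None  # (mark_name, timestamp when gaze entered it)
    for t, (x, y) in gaze_samples:
        hit = None
        for name, (x0, y0, x1, y1) in marks.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hit = name
                break
        if hit is None:
            gaze_start = None          # gaze left all marks; reset
            continue
        if gaze_start is None or gaze_start[0] != hit:
            gaze_start = (hit, t)      # gaze entered a (new) mark
        elif t - gaze_start[1] >= dwell:
            return hit                 # gazed long enough: select it
    return None
```

Looking away from a mark resets its timer, so only a sustained gaze selects it.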
So that operation marks can be selected continuously in the navigation interface after it is presented, obtaining the first input operation may also be: obtaining a gesture input operation in a sub-display region corresponding to the navigation interface, and determining, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface. After the navigation interface is presented, it likewise corresponds to a sub-display region, in which input operations on objects in the navigation interface can be detected. The sub-display region is part or all of the display region corresponding to the graphical interface: when the navigation interface completely covers the graphical interface, the sub-display region is identical to the display region; when the navigation interface is positioned over the graphical interface with part of the graphical interface still visible, the sub-display region is a part of the display region.
Determining the first operation mark from the operation marks of the navigation interface according to the gesture input operation may be taking the operation mark pointed at by the user's finger as the first operation mark. Optionally, it may be: determining, in the navigation interface, the first position corresponding to the gesture input operation; and when the duration of the gesture input operation at that first position reaches a preset duration, selecting the operation mark at the first position as the first operation mark. The first position corresponding to the user's gesture input operation may be the position of the projection point of the user's hand on the navigation interface, or a position determined according to another correspondence. To let the user see intuitively the first position determined in the navigation interface, a cursor point or other indicator corresponding to the gesture input operation may also be displayed at the first position in the navigation interface; this helps the user move the hand according to the cursor point and thus move the hand to the position corresponding to the first operation mark to be selected.
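The projection-point correspondence and the mark lookup at the first position might be sketched as follows. This is purely illustrative: a real device would use the sensing unit's calibration rather than the naive orthographic projection assumed here, and all names and coordinates are invented:

```python
def project_to_interface(hand_xyz):
    """Hypothetical correspondence between the hand and the navigation
    interface: simply drop depth to get the projection point (x, y)."""
    x, y, _depth = hand_xyz
    return (x, y)

def mark_at(position, marks):
    """Return the operation mark whose bounding box contains the first
    position (where a cursor point could also be drawn), or None."""
    x, y = position
    for name, (x0, y0, x1, y1) in marks.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

marks = {"rotate_mark": (0.0, 0.0, 0.5, 0.5),
         "zoom_mark": (0.5, 0.0, 1.0, 0.5)}
cursor = project_to_interface((0.7, 0.2, 0.3))
assert mark_at(cursor, marks) == "zoom_mark"
```

Combined with a dwell check such as the one sketched earlier in this description, holding the cursor over a mark for the preset duration would select that mark as the first operation mark.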
Corresponding to the control method of the present invention, the present invention also provides a control device. Referring to Fig. 7, a schematic structural diagram of an embodiment of a control device of the present invention is shown. The device of this embodiment is applied to a wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region. The device comprises an input acquiring unit 701, a judging unit 702 and an interface control unit 703.
The input acquiring unit 701 is configured to obtain, by the sensing unit, an input operation in the display region.
The judging unit 702 is configured to judge whether the input operation meets a first preset condition, to obtain a first judgment result.
The interface control unit 703 is configured to present, by the display unit, a navigation interface based on the first graphical interface currently presented by the display unit when the first judgment result shows that the input operation meets the first preset condition, wherein the navigation interface includes operation marks for controlling changes to the display content of the first graphical interface.
When the interface control unit controls presentation of the navigation interface, it may control the presented navigation interface to cover the graphical interface completely, in which case the user cannot see the graphical interface; or it may display the navigation interface over the graphical interface so that the user can still see a subregion of the graphical interface while seeing the navigation interface. Optionally, the interface control unit is specifically configured to present, by the display unit, the navigation interface on the first graphical interface, based on the first graphical interface currently presented by the display unit, when the first judgment result shows that the input operation meets the first preset condition.
With the control device of this embodiment, when the judging unit judges that the input operation obtained by the input acquiring unit in the display region meets the first preset condition, the interface control unit can be triggered to present a navigation interface based on the first graphical interface. In this way, the user only needs to perform, in the space of the display region, an input operation that meets the first preset condition to trigger display of the navigation interface; the operating process is simple, and misoperation caused by pressing a wrong button is also avoided.
In the embodiment of the present application, the first preset condition in the judging unit can take multiple forms, and accordingly there are multiple ways of determining whether the input operation meets the first preset condition. Corresponding to one of these, the judging unit may comprise:
a first judging unit, configured to judge whether the duration, in the display region, of the input operation obtained by the input acquiring unit reaches a preset duration.
Accordingly, the interface control unit presents, by the display unit, the navigation interface based on the first graphical interface currently presented by the display unit when the first judgment result shows that the duration of the input operation in the display region reaches the preset duration.
To further simplify the input operation process, the first judging unit may comprise:
a first judgment sub-unit, configured to judge whether the duration of the input operation at a first position in the display region reaches the preset duration, wherein the first position is any position in the display region.
Referring to Fig. 8, a schematic structural diagram of another embodiment of a control device of the present invention is shown. The device of this embodiment is applied to a wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing region. When the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region. The device of this embodiment differs from the preceding control device embodiment in that it further comprises:
a sign determining unit 704, configured to obtain a first input operation of selecting an operation mark from the navigation interface, and to determine the first operation mark selected by the first input operation; and
a content changing unit 705, configured to change the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
Specifically, the sign determining unit 704 may comprise any one of the following units:
a first sign determining unit, configured to receive a voice input operation of selecting the first operation mark from the navigation interface;
a second sign determining unit, configured to obtain eye movement information of the user wearing the wearable electronic device, determine the user's browsing track on the navigation interface based on the eye movement information, and determine the selected first operation mark according to the browsing track; and
a third sign determining unit, configured to obtain a gesture input operation in the sub-display region corresponding to the navigation interface, and determine, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface.
Further, the third sign determining unit may comprise:
a gesture acquiring unit, configured to obtain the gesture input operation in the sub-display region corresponding to the navigation interface; and
a third sign determining sub-unit, configured to determine, in the navigation interface, the first position corresponding to the gesture input operation, and to select the operation mark at the first position as the first operation mark when the duration of the gesture input operation at the first position reaches a preset duration.
In addition, the present invention also provides a wearable electronic device. The wearable electronic device has a processor, and a sensing unit and a display unit both connected with the processor, the sensing unit corresponding to a sensing region. When the wearable electronic device is worn on the user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region. The processor is provided with the control device described in any of the above embodiments.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may refer to one another. Since the devices disclosed in the embodiments correspond to the methods disclosed in the embodiments, their description is relatively simple, and the relevant parts may refer to the description of the methods.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (15)

1. A control method, characterized in being applied to a wearable electronic device, the wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing region, wherein when the wearable electronic device is worn on a user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region; the method comprising:
obtaining, by the sensing unit, an input operation in the display region;
judging whether the input operation meets a first preset condition, to obtain a first judgment result; and
when the first judgment result shows that the input operation meets the first preset condition, presenting, by the display unit, a navigation interface based on a first graphical interface currently presented by the display unit, wherein the navigation interface includes an operation mark for controlling the display content of the first graphical interface to change.
2. The method according to claim 1, characterized in that presenting, by the display unit, the navigation interface based on the first graphical interface currently presented by the display unit comprises:
presenting, by the display unit, the navigation interface on the first graphical interface based on the first graphical interface currently presented by the display unit.
3. The method according to claim 1, characterized in that judging whether the input operation meets the first preset condition comprises:
judging whether the duration of the input operation in the display region reaches a preset duration;
accordingly, when it is judged that the duration of the input operation in the display region reaches the preset duration, the input operation is considered to meet the first preset condition.
4. The method according to claim 3, characterized in that judging whether the duration of the input operation in the display region reaches the preset duration comprises:
judging whether the duration of the input operation at a first position in the display region reaches the preset duration, wherein the first position is any position in the display region.
5. The method according to claim 1 or 2, characterized by further comprising:
obtaining a first input operation of selecting an operation mark from the navigation interface, and determining a first operation mark selected by the first input operation; and
changing the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
6. The method according to claim 5, characterized in that obtaining the first input operation of selecting an operation mark from the navigation interface and determining the first operation mark selected by the first input operation comprises any one of:
receiving a voice input operation of selecting the first operation mark from the navigation interface;
obtaining eye movement information of the user wearing the wearable electronic device, determining the user's browsing track on the navigation interface based on the eye movement information, and determining the selected first operation mark according to the browsing track; and
obtaining a gesture input operation in a sub-display region corresponding to the navigation interface, and determining, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface.
7. The method according to claim 6, characterized in that obtaining the gesture input operation in the sub-display region corresponding to the navigation interface and determining, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface comprises:
obtaining the gesture input operation in the sub-display region corresponding to the navigation interface; and
determining, in the navigation interface, a first position corresponding to the gesture input operation, and when the duration of the gesture input operation at the first position reaches a preset duration, selecting the operation mark at the first position as the first operation mark.
8. A control device, characterized in being applied to a wearable electronic device, the wearable electronic device having a sensing unit and a display unit, the sensing unit corresponding to a sensing region, wherein when the wearable electronic device is worn on a user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region; the device comprising:
an input acquiring unit, configured to obtain, by the sensing unit, an input operation in the display region;
a judging unit, configured to judge whether the input operation meets a first preset condition, to obtain a first judgment result; and
an interface control unit, configured to present, by the display unit, a navigation interface based on a first graphical interface currently presented by the display unit when the first judgment result shows that the input operation meets the first preset condition, wherein the navigation interface includes an operation mark for controlling the display content of the first graphical interface to change.
9. The device according to claim 8, characterized in that the interface control unit is specifically configured to present, by the display unit, the navigation interface on the first graphical interface, based on the first graphical interface currently presented by the display unit, when the first judgment result shows that the input operation meets the first preset condition.
10. The device according to claim 8, characterized in that the judging unit comprises:
a first judging unit, configured to judge whether the duration of the input operation in the display region reaches a preset duration;
accordingly, the interface control unit is configured to present, by the display unit, the navigation interface based on the first graphical interface currently presented by the display unit when the first judgment result shows that the duration of the input operation in the display region reaches the preset duration.
11. The device according to claim 10, characterized in that the first judging unit comprises:
a first judgment sub-unit, configured to judge whether the duration of the input operation at a first position in the display region reaches the preset duration, wherein the first position is any position in the display region.
12. The device according to claim 8 or 9, characterized by further comprising:
a sign determining unit, configured to obtain a first input operation of selecting an operation mark from the navigation interface, and to determine a first operation mark selected by the first input operation; and
a content changing unit, configured to change the display content of the first graphical interface according to a first instruction corresponding to the first operation mark.
13. The device according to claim 12, characterized in that the sign determining unit comprises any one of the following units:
a first sign determining unit, configured to receive a voice input operation of selecting the first operation mark from the navigation interface;
a second sign determining unit, configured to obtain eye movement information of the user wearing the wearable electronic device, determine the user's browsing track on the navigation interface based on the eye movement information, and determine the selected first operation mark according to the browsing track; and
a third sign determining unit, configured to obtain a gesture input operation in a sub-display region corresponding to the navigation interface, and determine, according to the gesture input operation, the first operation mark selected from the operation marks of the navigation interface.
14. The device according to claim 13, characterized in that the third sign determining unit comprises:
a gesture acquiring unit, configured to obtain the gesture input operation in the sub-display region corresponding to the navigation interface; and
a third sign determining sub-unit, configured to determine, in the navigation interface, a first position corresponding to the gesture input operation, and to select the operation mark at the first position as the first operation mark when the duration of the gesture input operation at the first position reaches a preset duration.
15. A wearable electronic device, characterized in that the wearable electronic device has a processor, and a sensing unit and a display unit both connected with the processor, the sensing unit corresponding to a sensing region; when the wearable electronic device is worn on a user's body, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponding to the graphical interface exists; the visible region comprises the display region, and the sensing region at least comprises the display region; and the processor is provided with the control device according to any one of claims 8 to 14.
CN201310097595.2A 2013-03-25 2013-03-25 Control method, control device and wearable electronic equipment Pending CN104076907A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310097595.2A CN104076907A (en) 2013-03-25 2013-03-25 Control method, control device and wearable electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310097595.2A CN104076907A (en) 2013-03-25 2013-03-25 Control method, control device and wearable electronic equipment

Publications (1)

Publication Number Publication Date
CN104076907A true CN104076907A (en) 2014-10-01

Family

ID=51598222

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310097595.2A Pending CN104076907A (en) 2013-03-25 2013-03-25 Control method, control device and wearable electronic equipment

Country Status (1)

Country Link
CN (1) CN104076907A (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016058449A1 (en) * 2014-10-15 2016-04-21 成都理想境界科技有限公司 Smart glasses and control method for smart glasses
CN105572869B (en) * 2014-10-15 2018-06-15 成都理想境界科技有限公司 Intelligent glasses and intelligent glasses control method
CN105572869A (en) * 2014-10-15 2016-05-11 成都理想境界科技有限公司 Intelligent glasses and control method of the intelligent glasses
CN104536654B (en) * 2014-12-25 2018-02-02 小米科技有限责任公司 Menu choosing method, device and Intelligent worn device in Intelligent worn device
CN104536654A (en) * 2014-12-25 2015-04-22 小米科技有限责任公司 Menu selecting method and device on intelligent wearable device and intelligent wearable device
CN104639965A (en) * 2015-02-10 2015-05-20 云晖软件(成都)有限公司 Aircraft entertainment visualization system
CN104657063A (en) * 2015-02-11 2015-05-27 青岛歌尔声学科技有限公司 Method for displaying contents on user interface and display equipment
CN104657063B (en) * 2015-02-11 2018-07-06 青岛歌尔声学科技有限公司 The method and display equipment that a kind of user interface content is shown
CN105867599A (en) * 2015-08-17 2016-08-17 乐视致新电子科技(天津)有限公司 Gesture control method and device
CN108351698A (en) * 2015-10-07 2018-07-31 三星电子株式会社 The wearable electronic and method of application for being executed in control electronics
CN108351698B (en) * 2015-10-07 2022-04-12 三星电子株式会社 Wearable electronic device and method for controlling applications executed in electronic device
CN105892636A (en) * 2015-11-20 2016-08-24 乐视致新电子科技(天津)有限公司 Control method applied to head-mounted device and head-mounted device
CN107466396A (en) * 2016-03-22 2017-12-12 深圳市柔宇科技有限公司 Head-mounted display apparatus and its control method
WO2018112688A1 (en) * 2016-12-19 2018-06-28 深圳前海达闼云端智能科技有限公司 Method and device for amblyopia assistance
CN108008873A (en) * 2017-11-10 2018-05-08 亮风台(上海)信息科技有限公司 A kind of operation method of user interface of head-mounted display apparatus
CN110007819A (en) * 2019-03-12 2019-07-12 中国平安财产保险股份有限公司 The operation indicating method, apparatus and computer readable storage medium of system
CN110018871A (en) * 2019-03-12 2019-07-16 中国平安财产保险股份有限公司 The operation indicating method, apparatus and computer readable storage medium of system
CN110060537A (en) * 2019-03-22 2019-07-26 珠海超凡视界科技有限公司 A kind of virtual reality drives training device and its man-machine interaction method

Similar Documents

Publication Publication Date Title
CN104076907A (en) Control method, control device and wearable electronic equipment
US10133407B2 (en) Display apparatus, display system, method for controlling display apparatus, and program
CN109791437B (en) Display device and control method thereof
CN101379455B (en) Input device and its method
US20130335573A1 (en) Input method designed for augmented reality goggles
US20140285520A1 (en) Wearable display device using augmented reality
KR101812227B1 (en) Smart glass based on gesture recognition
CN104145232A (en) System for gaze interaction
US20170038838A1 (en) Information processing system and information processing method
US9823779B2 (en) Method and device for controlling a head-mounted display by a terminal device
US10467949B2 (en) Display apparatus, driving method thereof, and computer readable recording medium
US20150199111A1 (en) Gui system, display processing device, and input processing device
KR20150110257A (en) Method and wearable device for providing a virtual input interface
KR102422793B1 (en) Device and method for receiving character input through the same
EP3039520B1 (en) Display apparatus and ui providing method thereof
CN102981748A (en) Information processing terminal and method, program, and recording medium
EP3187977A1 (en) System for gaze interaction
CN106599738A (en) Display control method and apparatus for terminal device
CN104076930A (en) Blind operation control method, device and system
CN105334718B (en) Display changeover method and electronic equipment
US20220113813A1 (en) Glasses-Type Terminal
CN106325490A (en) Method for controlling electronic device and associated electronic device
US20150277742A1 (en) Wearable electronic device
EP4224300A1 (en) Screen capture method and apparatus, and electronic device
CN104951140A (en) Touch screen menu displaying method and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20141001