CN104063037B - Operation instruction recognition method and device, and wearable electronic device - Google Patents

Operation instruction recognition method and device, and wearable electronic device

Info

Publication number
CN104063037B
CN104063037B (application CN201310085514.7A)
Authority
CN
China
Prior art keywords
display region
contact surface
detection region
input operation
operation instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310085514.7A
Other languages
Chinese (zh)
Other versions
CN104063037A (en)
Inventor
高歌
张超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201310085514.7A
Publication of CN104063037A
Application granted
Publication of CN104063037B


Landscapes

  • Position Input By Displaying (AREA)

Abstract

The invention discloses an operation instruction recognition method and device, and a wearable electronic device. The method is applied to a wearable electronic device that has a sensing unit and a display unit; the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The method obtains an input operation in the detection region through the sensing unit; when the input operation is judged to cross a contact surface between the detection region and the display region, an operation instruction corresponding to the input operation is determined according to the contact surface crossed; the operation instruction is then responded to, and the graphical interface is controlled to transform accordingly. The method can reduce the complexity of user input operations.

Description

Operation instruction recognition method and device, and wearable electronic device
Technical field
The present invention relates to the field of electronic device input control technology, and in particular to an operation instruction recognition method, an operation instruction recognition device, and a wearable electronic device.
Background art
With the wide application of electronic devices, the ways of performing input operations on electronic devices have become increasingly diverse. For example, an electronic device can receive a corresponding operation instruction through a key press, or a corresponding operation can be triggered by a touch operation on the display interface of the electronic device.
Gesture-based input makes flexible input operation of an electronic device possible. In current gesture input, the electronic device captures images in a designated region and analyses the images containing the user's gesture to determine the input command. In practice, certain specific operations are frequently needed while gesture input is being performed, such as returning to a menu or returning to the previous interface. If the user needs to perform such a specific operation during gesture-based input, the current input has to be interrupted and a physical button at a specific location has to be pressed before the corresponding operation can be triggered, so the operation is highly complex.
Summary of the invention
In view of this, the present invention provides an operation instruction recognition method and device, and a wearable electronic device, which can reduce the complexity of user input operations.
To achieve the above object, the present invention provides the following technical solution: an operation instruction recognition method, applied to a wearable electronic device. The wearable electronic device has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The method includes:
obtaining, through the sensing unit, an input operation in the detection region;
judging whether the input operation crosses a contact surface between the detection region and the display region;
when the input operation crosses a contact surface between the detection region and the display region, determining, according to the contact surface crossed by the input operation, an operation instruction corresponding to the input operation;
responding to the operation instruction, and controlling the graphical interface to transform accordingly.
Preferably, judging whether the input operation crosses a contact surface between the detection region and the display region includes:
judging whether the input operation crosses the contact surface from the detection region into the display region;
and/or, judging whether the input operation crosses the contact surface from the display region into the detection region;
and/or, judging whether the input operation moves into the detection region across one contact surface between the detection region and the display region, and moves out of the detection region across another contact surface between the detection region and the display region.
Preferably, the contact surfaces between the detection region and the display region include at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface;
determining, when the input operation crosses a contact surface between the detection region and the display region, the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation includes:
when the input operation crosses the first contact surface between the detection region and the display region, determining, according to the first contact surface crossed, a first operation instruction corresponding to the first contact surface crossed;
when the input operation crosses the second contact surface between the detection region and the display region, determining, according to the second contact surface crossed, a second operation instruction corresponding to the second contact surface crossed, wherein the second operation instruction is different from the first operation instruction.
Preferably, determining, when the input operation crosses a contact surface between the detection region and the display region, the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation includes:
when the input operation crosses a contact surface between the detection region and the display region, determining the current contact surface between the detection region and the display region crossed by the input operation, and recognizing the crossing mode across the current contact surface, wherein the detection region and the display region have at least two contact surfaces;
determining, according to the current contact surface crossed by the input operation and the crossing mode across the current contact surface, the operation instruction corresponding to the input operation;
wherein the crossing modes of the input operation include: a first crossing mode of crossing from the detection region into the display region; and/or a second crossing mode of crossing from the display region into the detection region; and/or a third crossing mode of crossing into the detection region across one contact surface between the detection region and the display region, and crossing out of the detection region across another contact surface between the detection region and the display region.
Preferably, determining, according to the contact surface crossed by the input operation, the operation instruction corresponding to the input operation includes:
determining, according to the contact surface crossed by the input operation, the operation instruction corresponding to the current input operation from preset operation instructions corresponding to different contact surfaces, wherein the preset operation instructions at least include returning to a previous graphical interface and/or displaying a main menu.
Preferably, the sensing region may further contain the display region;
the method further includes: obtaining, through the sensing unit, an input operation in the graphical interface.
In another aspect, the present invention further provides an operation instruction recognition device, applied to a wearable electronic device. The wearable electronic device has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The device includes:
a first operation acquiring unit, configured to obtain, through the sensing unit, an input operation in the detection region;
a judging unit, configured to judge whether the input operation crosses a contact surface between the detection region and the display region;
an instruction determining unit, configured to determine, when the input operation crosses a contact surface between the detection region and the display region, an operation instruction corresponding to the input operation according to the contact surface crossed by the input operation;
an instruction response unit, configured to respond to the operation instruction and control the graphical interface to transform accordingly.
Preferably, the judging unit is specifically configured to: judge whether the input operation crosses the contact surface from the detection region into the display region; and/or judge whether the input operation crosses the contact surface from the display region into the detection region; and/or judge whether the input operation moves into the detection region across one contact surface between the detection region and the display region, and moves out of the detection region across another contact surface between the detection region and the display region.
Preferably, the contact surfaces between the detection region and the display region include at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface;
the instruction determining unit includes: a first instruction determining unit, configured to determine, when the input operation crosses the first contact surface between the detection region and the display region, a first operation instruction corresponding to the first contact surface crossed according to the first contact surface crossed by the input operation;
a second instruction determining unit, configured to determine, when the input operation crosses the second contact surface between the detection region and the display region, a second operation instruction corresponding to the second contact surface crossed according to the second contact surface crossed by the input operation, wherein the second operation instruction is different from the first operation instruction.
Preferably, the instruction determining unit includes:
a recognition unit, configured to determine, when the input operation crosses a contact surface between the detection region and the display region, the current contact surface between the detection region and the display region crossed by the input operation, and to recognize the crossing mode across the current contact surface, wherein the detection region and the display region have at least two contact surfaces, and the crossing modes of the input operation include: a first crossing mode of crossing from the detection region into the display region; and/or a second crossing mode of crossing from the display region into the detection region; and/or a third crossing mode of crossing into the detection region across one contact surface between the detection region and the display region, and crossing out of the detection region across another contact surface;
an instruction determining subunit, configured to determine, according to the current contact surface crossed by the input operation and the crossing mode across the current contact surface, the operation instruction corresponding to the input operation.
Preferably, the instruction determining unit is specifically configured to: determine, when the input operation crosses a contact surface between the detection region and the display region, the operation instruction corresponding to the current input operation from preset operation instructions corresponding to different contact surfaces according to the contact surface crossed by the input operation, wherein the preset operation instructions at least include returning to a previous graphical interface and/or displaying a main menu.
Preferably, the sensing region may further contain the display region;
the device further includes: a second operation acquiring unit, configured to obtain, through the sensing unit, an input operation in the graphical interface.
In another aspect, the present invention further provides a wearable electronic device. The wearable electronic device has a processor, and a sensing unit and a display unit connected to the processor; the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The processor has built in an operation instruction recognition device as described in any of the above.
It can be seen from the above technical solution that, compared with the prior art, the present disclosure provides an operation instruction recognition method and device, and an electronic device. The method judges the input operation detected by the sensing unit; when the input operation is judged to cross a contact surface between the detection region and the display region, the operation instruction corresponding to the input operation is determined according to the contact surface crossed by the input operation, and the operation instruction is then responded to so that the graphical interface presented by the wearable electronic device is controlled to transform accordingly. Because the display region and the detection region have large surface areas and are easy to locate, triggering the operation instruction corresponding to the input operation by making the input operation cross a contact surface between the display region and the detection region, and determining the instruction according to the contact surface crossed, reduces erroneous operations during input, improves the accuracy of input operations, and also improves input speed and the convenience of input operations.
Description of the drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description are only embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of one embodiment of an operation instruction recognition method according to the present invention;
Fig. 2 is a schematic top view of the detection region, the display region and the visible region when a user wears smart glasses;
Fig. 3 is a schematic longitudinal sectional view of the detection region, the display region and the visible region when a user wears smart glasses;
Fig. 4 is a schematic flowchart of another embodiment of the operation instruction recognition method according to the present invention;
Fig. 5 is a schematic flowchart of yet another embodiment of the operation instruction recognition method according to the present invention;
Fig. 6 is a schematic structural diagram of one embodiment of an operation instruction recognition device according to the present invention;
Fig. 7 is a schematic structural diagram of the instruction determining unit in another embodiment of the operation instruction recognition device according to the present invention.
Specific embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. It is apparent that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The embodiments of the invention disclose an operation instruction recognition method, which can improve the convenience of input operations and thus allow quick switching to the interface needed for a specific operation.
Referring to Fig. 1, a schematic flowchart of one embodiment of an operation instruction recognition method according to the present invention is shown. The method of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit; the sensing unit has a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, and the region of the visible region not belonging to the display region is a detection region, so that the display region and the detection region have contact surfaces where they meet. The sensing region contains the detection region. The method of this embodiment includes:
Step 101: obtaining, through the sensing unit, an input operation in the detection region.
The sensing unit can obtain input operations in the sensing region, and the sensing region contains the detection region; therefore, the sensing unit can obtain input operations in the detection region.
The input operation may be a gesture operation performed by the user in space, or a touch operation on the electronic device. Correspondingly, the sensing unit may be any existing device capable of capturing the user's input behaviour, such as a camera or an infrared sensing unit.
Step 102: judging whether the input operation crosses a contact surface between the detection region and the display region.
When the user wears the wearable electronic device, the user has a visible region through the wearable electronic device, so that the user can see any object or action in the visible region.
The display unit of the electronic device displays a graphical interface, and a display region corresponds to the graphical interface. The visible region that the user has while wearing the wearable electronic device contains the display region, so the user can see the graphical interface.
There are several ways in which the user sees the graphical interface. In one of them, the display unit outputs the graphical interface on a display interface of the electronic device (for example, a subregion of a lens of wearable glasses is the display interface); the display interface lies in the visible region, and the user can see the graphical interface. In another, the display unit projects an image into the user's eyes so that the user can see the graphical interface; in this case the region in which the user can see the graphical interface is the display region.
Since the visible region contains the display region, and the part of the visible region not belonging to the display region is the detection region, the display region and the detection region have an interface in space, that is, a contact surface between the display region and the detection region.
From the input operation obtained by the sensing unit, it can be analysed whether the current input operation crosses a contact surface between the detection region and the display region. An input operation that crosses the contact surface may be an operating body crossing the contact surface from the detection region into the display region, or an operating body crossing the contact surface from the display region into the detection region, and so on, as long as the operating body crosses a contact surface between the detection region and the display region at least once.
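As a purely illustrative sketch of how such a judgment could be implemented (this is not part of the original disclosure; the rectangle model of the regions, the Rect class and the sampling of the operating body's position are all assumptions), each sampled position can be classified as lying in the display region or the detection region, and a crossing is reported as soon as consecutive samples fall in different regions:

```python
from dataclasses import dataclass
from typing import Iterable, Optional, Tuple

@dataclass
class Rect:
    """Axis-aligned rectangle modelling the 2D projection of a region (an assumption)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y = point
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def detect_crossing(samples: Iterable[Tuple[float, float]],
                    display_region: Rect,
                    visible_region: Rect) -> Optional[Tuple[str, str]]:
    """Return (from_region, to_region) for the first crossing of a contact surface,
    or None if the sampled trajectory never crosses one.

    The detection region is modelled as the part of the visible region that does not
    belong to the display region, matching the definition used in the description.
    """
    def classify(point):
        if display_region.contains(point):
            return "display"
        if visible_region.contains(point):
            return "detection"
        return "outside"

    previous = None
    for point in samples:
        current = classify(point)
        if previous in ("display", "detection") and current in ("display", "detection") \
                and current != previous:
            return previous, current  # the operating body crossed a contact surface
        previous = current
    return None
```

A trajectory that starts in the detection region and ends inside the display region yields ("detection", "display"), corresponding to the first way of crossing described above; the reverse order corresponds to the second.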
Step 103: when the input operation crosses a contact surface between the detection region and the display region, determining, according to the contact surface crossed by the input operation, an operation instruction corresponding to the input operation.
If it is judged that the current input operation crosses a contact surface between the detection region and the display region, the current input operation is intended to trigger an instruction that transforms the graphical interface; accordingly, the operation instruction corresponding to the current input operation needs to be determined according to the contact surface crossed.
As described above, the visible region contains the display region and the detection region, and the user can clearly see the region in which the current input operation is located. Therefore, when a certain operation instruction needs to be input, the user can make the input operation move between the display region and the detection region so as to cross a contact surface between the display region and the detection region, thereby triggering the corresponding operation instruction.
The areas of the display region and the detection region are large, significantly larger than the area of a button arranged on the wearable electronic device or the area covered by a displayed button, and the corresponding contact surfaces between the display region and the detection region are also large. According to Fitts's law, the larger the target, the easier it is to locate and the shorter the time needed to point at it; that is, the time needed to locate a target depends on the distance between the target and the current position and on the size of the target. Therefore, triggering the corresponding operation instruction by making the input operation cross a contact surface between the detection region and the display region improves the convenience and efficiency of the input operation.
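Fitts's law is commonly written as a movement-time model. The short sketch below (illustrative only; the constants a and b are device-dependent values assumed here, not values from this disclosure) uses the standard Shannon formulation to show that a larger target width lowers the index of difficulty and therefore the predicted pointing time:

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predicted pointing time T = a + b * log2(D / W + 1) (Shannon formulation).

    distance: distance D from the current position to the target centre
    width:    size W of the target along the direction of movement
    a, b:     empirically fitted constants (the values here are only illustrative)
    """
    index_of_difficulty = math.log2(distance / width + 1.0)
    return a + b * index_of_difficulty

# A wide contact surface (large W) is predicted to be reached faster than a
# small physical button at the same distance.
print(fitts_movement_time(distance=30.0, width=20.0))  # large target, shorter time
print(fitts_movement_time(distance=30.0, width=2.0))   # small button, longer time
```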
Step 104: responding to the operation instruction, and controlling the graphical interface to transform accordingly.
After the operation instruction is determined, the graphical interface displayed by the display unit is controlled to transform accordingly according to the determined operation instruction.
The way in which the graphical interface transforms may also differ: the whole graphical interface may be switched, only part of the content of the currently displayed graphical interface may be changed, or a sub-interface may be overlaid on the currently displayed graphical interface.
Of course, depending on the operation instruction determined, the transformation applied to the graphical interface will also differ. In practical applications, some frequently used input instructions usually need to be set up in a convenient way, so the determined operation instruction may include returning to the previous graphical interface, displaying the main menu, executing the function of a home button in the current interface, and so on.
Correspondingly, when determining the operation instruction, the operation instruction corresponding to the current input operation may also be determined, according to the contact surface crossed by the input operation, from preset operation instructions corresponding to different contact surfaces. The preset operation instructions may include the more commonly used instructions, for example at least returning to the previous graphical interface and/or displaying the main menu.
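Such a preset correspondence can be as simple as a lookup table. The following minimal sketch is an assumed illustration (the surface identifiers are invented; the two commands are the ones named in the description):

```python
from typing import Optional

# Preset operation instructions keyed by an assumed identifier of the crossed contact
# surface; the description requires at least these two kinds of instruction.
PRESET_INSTRUCTIONS = {
    "surface_a": "RETURN_PREVIOUS_INTERFACE",
    "surface_b": "SHOW_MAIN_MENU",
}

def instruction_for_surface(crossed_surface: str) -> Optional[str]:
    """Look up the preset operation instruction for the contact surface that was crossed."""
    return PRESET_INSTRUCTIONS.get(crossed_surface)
```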
In this embodiment, the input operation detected by the sensing unit is judged; when the input operation is judged to cross a contact surface between the detection region and the display region, the operation instruction corresponding to the input operation is determined according to the contact surface crossed by the input operation, and the operation instruction is then responded to so that the graphical interface presented by the wearable electronic device is controlled to transform accordingly. Because the display region and the detection region have large surface areas and are easy to locate, triggering the determination of the operation instruction corresponding to the input operation by making the input operation cross a contact surface between the display region and the detection region, according to the contact surface crossed, reduces erroneous operations during input, improves the accuracy of input operations, and also improves input speed and the convenience of input operations.
The wearable electronic device in the embodiments of the present application may be smart glasses, an arm-worn electronic device, or a watch-style computer worn on the wrist. Such wearable electronic devices are generally easy to carry, but with existing approaches, control operations that transform the interface usually have to be performed through specific keys, which is inconvenient.
To facilitate understanding of the solution of the embodiments of the present invention, the visible region the user has after wearing the wearable electronic device, and the relationship between the visible region and the detection region, are introduced below, taking smart glasses as an example of the wearable electronic device. It is assumed that the display unit of the smart glasses projects the graphical interface onto a lens rather than directly into the user's eyes; that is, the display unit is arranged on the lens, and the user can then see the graphical interface presented on the display unit.
As shown in Fig. 2, which is a schematic top view of a user wearing the smart glasses, the smart glasses 20 have two temples (also called ear mounts or earpieces) connected to a frame 201. Lenses are arranged on the frame, and a display unit 202 is on the lenses; a graphical interface can be output to the display unit. The graphical interface corresponds to a display region 211, such as the region between the two thick black dashed lines in Fig. 2 (in Fig. 2 the spatial regions of the two graphical interfaces are both regarded as the display region). When the user operates on an object of the graphical interface, gesture input can be performed in the display region, or a touch operation can be performed on the display unit contained in the display region.
In general, after the graphical interface is displayed on a lens of the smart glasses, the lens region where the graphical interface is located still has a certain light transmittance, and the user can still see things outside through the graphical interface. Therefore, the region through which the user's line of sight passes through the whole lens of the electronic device is the visible region 212, shown in Fig. 2 as the region between the two solid lines. The visible region 212 contains the display region 211, and it also contains the detection region 213. Since Fig. 2 is a top view, only the detection regions on the two sides of the display region are shown, and in the top view the contact surfaces between the display region and the detection region appear as straight lines, such as the thick dashed lines in the figure.
A sensing unit is arranged on the smart glasses and can sense operations in the detection region, so when the user's input action crosses any contact surface between the detection region and the display region, the sensing unit can detect the corresponding input operation.
Of course, Fig. 2 is only a schematic top view of the regions when a user wears the smart glasses. In real space the graphical interface may be in the middle region of the lens, in which case the surrounding volume around the display region is the detection region. To show the positional relationship of the detection region, the display region and the visible region of Fig. 2 more clearly, refer to Fig. 3, which is a schematic longitudinal sectional view of the regions when the user wears the smart glasses. Fig. 3 is a vertical section made at an arbitrary position along the line-of-sight direction of Fig. 2; any such section is a rectangle. The inner region enclosed by the solid thick line is the visible region, which contains the display region 211 enclosed by the dashed line, and the region between the solid line and the dashed line is the detection region 213. When the user performs an input operation in space, the user's eyes can see the hand moving in the visible region. If the user needs to trigger a corresponding operation instruction, the hand can be moved across a contact surface between the display region and the detection region, which is the dashed portion in the sectional view; the smart glasses then determine the operation instruction corresponding to the current input operation according to the contact surface crossed by the user's current input operation.
Of course, Fig. 3 can also be regarded as a front view of the smart glasses shown in Fig. 2, in which the visible region is the whole lens, the display region is the graphical interface (the region between the two lenses, i.e. the space at the nose bridge, is not considered in this figure, or rather that subregion is also regarded as part of the display region), and the detection region is the region of the lens other than the display region. Thus, if the user performs a sliding operation on the plane in which the lens lies, it can also be determined from the moving position of the finger whether the boundary line between the display region and the detection region has been crossed; if the input operation crosses the boundary line, the input operation in space crosses the contact surface between the display region and the detection region.
In practical applications there are various ways in which an input operation crosses a contact surface; correspondingly, judging whether the input operation crosses a contact surface between the detection region and the display region also covers various situations. Specifically, this may include judging whether the input operation crosses the contact surface between the detection region and the display region from the detection region into the display region. If the sensing unit senses the input operation moving from the detection region toward the contact surface between the detection region and the display region until it moves out across that contact surface and thus enters the display region, it can be judged that the input operation crosses the contact surface. Taking Fig. 3 as an example, this way of crossing a contact surface is shown by the direction of arrow 301.
Judging whether the input operation crosses a contact surface can also be done by judging whether the input operation crosses the contact surface between the display region and the detection region from the display region into the detection region. The sensing unit can sense the operation in which the operating body in the display region crosses out of the display region across a contact surface and then enters the detection region, and it is thereby determined that the input operation crosses the contact surface between the display region and the detection region from the display region into the detection region.
In addition, judging whether the input operation crosses a contact surface can also be: judging whether the input operation moves into the detection region across one contact surface between the detection region and the display region, and moves out of the detection region across another contact surface between the detection region and the display region. In practical applications, one continuous gesture action of the user may involve multiple contact surfaces; correspondingly, the input operation detected by the sensing unit may be one continuous action that, after crossing one contact surface between the detection region and the display region, goes on to cross another contact surface between the detection region and the display region. Taking Fig. 3 as an example, this way of crossing contact surfaces is shown by the continuous action in the direction of arrow 302. Another possibility is that a detection region exists between two display regions. For example, in the smart glasses of Fig. 2, when the region between the two graphical interfaces is regarded as a detection region, a detection region can exist in the middle of the display region 211 of Fig. 3; this detection region has two parallel contact surfaces, and an input operation can then cross the two parallel contact surfaces in succession from the display region on one side and enter the display region on the other side.
Of course, judging whether the input operation crosses a contact surface between the detection region and the display region can also combine any one or more of the above ways of crossing a contact surface at the same time.
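For illustration only (again an assumption about how the regions visited by a trajectory might be represented, not the implementation of this disclosure), the three ways of crossing described above can be distinguished from the ordered sequence of regions that the operating body passes through:

```python
from typing import List, Optional

def classify_crossing_mode(region_sequence: List[str]) -> Optional[str]:
    """Classify a trajectory given as the ordered regions it visits,
    e.g. ["detection", "display"] or ["display", "detection", "display"].

    Returns one of:
      "enter_display"   - crossed from the detection region into the display region
      "enter_detection" - crossed from the display region into the detection region
      "pass_through"    - moved into the detection region across one contact surface
                          and out of it across another contact surface
      None              - no contact surface was crossed
    """
    # Collapse consecutive duplicates so that only region changes remain.
    changes = [r for i, r in enumerate(region_sequence)
               if i == 0 or r != region_sequence[i - 1]]
    if changes[-3:] == ["display", "detection", "display"]:
        return "pass_through"
    if changes[-2:] == ["detection", "display"]:
        return "enter_display"
    if changes[-2:] == ["display", "detection"]:
        return "enter_detection"
    return None
```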
Referring to Fig. 4, a schematic flowchart of another embodiment of the operation instruction recognition method according to the present invention is shown. The method of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit; the sensing unit has a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, and the region of the visible region not belonging to the display region is a detection region, so that the display region and the detection region have contact surfaces where they meet. The sensing region contains the detection region. The method of this embodiment includes:
Step 401: obtaining, through the sensing unit, an input operation in the detection region.
Step 402: judging whether the input operation crosses a contact surface between the detection region and the display region.
The above two steps are similar to the corresponding steps of the instruction recognition method in the above embodiment and are not repeated here.
Step 403: when the input operation crosses a contact surface between the detection region and the display region, determining, according to the contact surface currently crossed by the input operation, the operation instruction corresponding to the contact surface currently crossed.
Since the detection region and the display region have at least two contact surfaces, correspondences between different contact surfaces and operation instructions can be set in advance, so that the operation instruction can be determined according to the contact surface crossed; when the contact surface crossed differs, the determined operation instruction also differs. Specifically, the operation instruction corresponding to the contact surface currently crossed can be determined from the preset correspondences between contact surfaces and operation instructions.
Assume that the contact surfaces between the detection region and the display region include at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface. When the input operation crosses the first contact surface between the detection region and the display region, a first operation instruction corresponding to the first contact surface crossed is determined according to the first contact surface crossed; when the input operation crosses the second contact surface between the detection region and the display region, a second operation instruction corresponding to the second contact surface crossed is determined according to the second contact surface crossed, wherein the second operation instruction is different from the first operation instruction.
It should be noted that the first contact surface and the second contact surface only serve to distinguish the contact surfaces crossed by different input operations, and do not limit the relationship or number of the contact surfaces. They only mean that the operation instruction determined when an input operation crosses one of the contact surfaces is different from the operation instruction determined when an input operation crosses the other contact surface.
For ease of understanding, the schematic structural diagrams of the regions when the user wears the smart glasses shown in Fig. 2 and Fig. 3 are still used as an example. The detection region and the display region in Fig. 3 have four boundary lines, namely the upper boundary line, the lower boundary line, the left boundary line and the right boundary line in the figure, and these four boundary lines correspond to four contact surfaces in space. Assume that the operation instruction corresponding to the contact surface at the upper boundary line is returning to the previous interface, the operation instruction corresponding to the contact surface at the lower boundary line is displaying the main menu, the operation instruction corresponding to the contact surface at the left boundary line is displaying the desktop, and the operation instruction corresponding to the contact surface at the right boundary line is exiting the current interface. When the user needs to display the main menu, a gesture operation can be performed in space so that the gesture input crosses the contact surface at the lower boundary line between the display region and the detection region, for example by moving the hand from the detection region across the contact surface at the lower boundary line into the display region. Of course, the user can also cross the contact surface at the lower boundary line by a touch operation on the lens, which is not limited here. When the smart glasses detect that the user's input operation crosses the contact surface at the lower boundary line, the operation instruction corresponding to the input operation is determined to be displaying the main menu.
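As a concrete sketch of this example (the command names and the rectangle geometry are assumptions used only for illustration, reusing the Rect class from the earlier sketch), the side of the display region at which the crossing happened can be derived from the last sample taken in the detection region and then looked up in a table:

```python
from typing import Optional, Tuple

# Assumed mapping for the smart-glasses example described above: upper boundary ->
# return to the previous interface, lower -> show the main menu, left -> show the
# desktop, right -> exit the current interface.
BOUNDARY_COMMANDS = {
    "upper": "RETURN_PREVIOUS_INTERFACE",
    "lower": "SHOW_MAIN_MENU",
    "left": "SHOW_DESKTOP",
    "right": "EXIT_CURRENT_INTERFACE",
}

def crossed_boundary(last_detection_sample: Tuple[float, float],
                     display_region: "Rect") -> Optional[str]:
    """Name the display-region boundary nearest to the last sample taken in the
    detection region, i.e. the contact surface that the input operation crossed."""
    x, y = last_detection_sample
    if y > display_region.y_max:
        return "upper"
    if y < display_region.y_min:
        return "lower"
    if x < display_region.x_min:
        return "left"
    if x > display_region.x_max:
        return "right"
    return None  # the sample already lies inside the display region

# A hand moving upward from below the graphical interface last appears below the lower
# boundary, so crossed_boundary(...) returns "lower" and BOUNDARY_COMMANDS["lower"]
# gives "SHOW_MAIN_MENU", matching the example above.
```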
Step 404: responding to the operation instruction, and controlling the displayed graphical interface to transform accordingly.
The operation instruction is responded to, and the transformation to be applied to the current graphical interface is determined according to the operation instruction. For example, when the operation instruction is to display the main menu, the main menu can be added to the current graphical interface so that the graphical interface contains the main menu; when the operation instruction is to return to the previous interface, the whole currently displayed graphical interface may need to be switched to the graphical interface presented before the current graphical interface.
Referring to Fig. 5, a schematic flowchart of yet another embodiment of the operation instruction recognition method according to the present invention is shown. The method of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit; the sensing unit has a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, and the region of the visible region not belonging to the display region is a detection region, so that the display region and the detection region have contact surfaces where they meet. The sensing region contains the detection region. The method of this embodiment includes:
Step 501: obtaining, through the sensing unit, an input operation in the detection region.
Step 502: judging whether the input operation crosses a contact surface between the detection region and the display region.
The operations of step 501 and step 502 are the same as those of the corresponding steps in the embodiment shown in Fig. 1 and are not repeated here.
Step 503: when the input operation crosses a contact surface between the detection region and the display region, determining the current contact surface between the detection region and the display region crossed by the input operation, and recognizing the crossing mode across the current contact surface.
In this embodiment, the detection region and the display region have at least two contact surfaces. When it is detected that the input operation crosses a contact surface between the display region and the detection region, this embodiment needs not only to determine the contact surface crossed by the current input operation but also to recognize the crossing mode across the current contact surface.
The crossing modes of the input operation include: when a contact surface between the detection region and the display region is crossed, a first crossing mode of crossing from the detection region into the display region; and/or a second crossing mode of crossing from the display region into the detection region; and/or a third crossing mode of crossing into the detection region across one contact surface between the detection region and the display region, and crossing out of the detection region across another contact surface between the detection region and the display region.
Step 504: determining, according to the current contact surface crossed by the input operation and the crossing mode across the current contact surface, the operation instruction corresponding to the input operation.
In this embodiment the operation instruction is determined according to both the contact surface crossed by the input operation and the crossing mode across that contact surface. When the contact surfaces crossed by different input operations are the same but the modes of crossing the contact surface differ, the operation instructions corresponding to the input operations also differ.
Still taking the smart glasses of Fig. 3 as an example, and taking crossing the contact surface at the upper boundary line as an example, assume that when the contact surface at the upper boundary line is crossed from the detection region into the display region, the triggered operation instruction is to display the main menu, and that when the contact surface at the upper boundary line is crossed from the display region into the detection region, the corresponding operation instruction is to exit the main menu. Then, if the user needs to display the main menu, the user can cross the contact surface at the upper boundary line from the detection region into the display region. When the user no longer needs the main menu, the contact surface at the upper boundary line can be crossed in the opposite direction, that is, from the display region across the contact surface at the upper boundary line, thereby triggering the operation instruction to exit the main menu.
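A direction-sensitive mapping of this kind can key the lookup on both the contact surface and the crossing mode; the sketch below is only an assumed illustration, with command names invented and only the upper-boundary behaviour taken from the example above:

```python
from typing import Optional

# (contact surface, crossing mode) -> operation instruction. Crossing the upper
# contact surface from the detection region into the display region shows the main
# menu; crossing it back in the opposite direction exits the main menu.
DIRECTIONAL_COMMANDS = {
    ("upper", "enter_display"): "SHOW_MAIN_MENU",
    ("upper", "enter_detection"): "EXIT_MAIN_MENU",
}

def instruction_for_crossing(surface: str, mode: str) -> Optional[str]:
    """Determine the operation instruction from the crossed contact surface and the
    recognized crossing mode (corresponding to steps 503 and 504)."""
    return DIRECTIONAL_COMMANDS.get((surface, mode))

# instruction_for_crossing("upper", "enter_display")   -> "SHOW_MAIN_MENU"
# instruction_for_crossing("upper", "enter_detection") -> "EXIT_MAIN_MENU"
```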
Step 505: responding to the operation instruction, and controlling the displayed graphical interface to transform accordingly.
In any of the above embodiments, the sensing region corresponding to the sensing unit may also contain the display region. Correspondingly, an input operation in the graphical interface can also be obtained through the sensing unit, so as to trigger a corresponding operation on the graphical interface.
Meanwhile, since the sensing region may contain the display region, sensing the input operation in the detection region through the sensing unit in order to judge whether the input operation crosses a contact surface between the detection region and the display region can also be understood as sensing the input operation on the graphical interface through the sensing unit and then judging whether the input operation crosses a contact surface between the detection region and the display region. Of course, since the detection region and the display region together constitute the visible region, the sensing unit can likewise sense the input operation in the whole visible region and analyse the input operation to judge whether it crosses a contact surface between the detection region and the display region.
Corresponding to the operation instruction recognition method of the present invention, the present invention further provides an operation instruction recognition device. Referring to Fig. 6, a schematic structural diagram of one embodiment of an operation instruction recognition device according to the present invention is shown. The device of this embodiment is applied to a wearable electronic device that has a sensing unit and a display unit; the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The device of this embodiment includes a first operation acquiring unit 601, a judging unit 602, an instruction determining unit 603 and an instruction response unit 604.
The first operation acquiring unit 601 is configured to obtain, through the sensing unit, an input operation in the detection region.
The judging unit 602 is configured to judge whether the input operation crosses a contact surface between the detection region and the display region.
The instruction determining unit 603 is configured to determine, when the input operation crosses a contact surface between the detection region and the display region, an operation instruction corresponding to the input operation according to the contact surface crossed by the input operation.
The instruction response unit 604 is configured to respond to the operation instruction and control the graphical interface to transform accordingly.
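Mapped to code, the four units of Fig. 6 could be arranged as methods of a single controller class. The sketch below is purely illustrative (the sensing-unit and display interfaces it calls are assumptions, as is the dictionary of preset instructions):

```python
from typing import Dict, Iterable, Optional, Tuple

class OperationInstructionRecognizer:
    """Illustrative counterpart of units 601-604 in Fig. 6 (an assumed arrangement)."""

    def __init__(self, sensing_unit, display, preset_instructions: Dict[str, str]):
        self.sensing_unit = sensing_unit    # assumed to provide sampled positions
        self.display = display              # assumed to control the graphical interface
        self.preset_instructions = preset_instructions

    def acquire_input(self) -> Iterable[Tuple[float, float]]:
        # First operation acquiring unit 601: input operation in the detection region.
        return self.sensing_unit.sample_positions()

    def judge_crossing(self, trajectory) -> Optional[str]:
        # Judging unit 602: name of the crossed contact surface, or None if none was crossed.
        return self.sensing_unit.find_crossed_surface(trajectory)

    def determine_instruction(self, surface: str) -> Optional[str]:
        # Instruction determining unit 603: instruction preset for the crossed surface.
        return self.preset_instructions.get(surface)

    def respond(self, instruction: str) -> None:
        # Instruction response unit 604: transform the graphical interface accordingly.
        self.display.apply(instruction)

    def run_once(self) -> None:
        trajectory = list(self.acquire_input())
        surface = self.judge_crossing(trajectory)
        if surface is not None:
            instruction = self.determine_instruction(surface)
            if instruction is not None:
                self.respond(instruction)
```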
In this embodiment, the input operation detected by the sensing unit is judged; when the input operation is judged to cross a contact surface between the detection region and the display region, the operation instruction corresponding to the input operation is determined according to the contact surface crossed by the input operation, and the operation instruction is then responded to so that the graphical interface presented by the wearable electronic device is controlled to transform accordingly. Because the display region and the detection region have large surface areas and are easy to locate, triggering the determination of the operation instruction corresponding to the input operation by making the input operation cross a contact surface between the display region and the detection region, according to the contact surface crossed, reduces erroneous operations during input, improves the accuracy of input operations, and also improves input speed and the convenience of input operations.
When the judging unit judges whether the input operation crosses a contact surface between the display region and the detection region, this can be done in several ways. Correspondingly, the judging unit is specifically configured to: judge whether the input operation crosses the contact surface between the detection region and the display region from the detection region into the display region; and/or judge whether the input operation crosses the contact surface between the display region and the detection region from the display region into the detection region; and/or judge whether the input operation moves into the detection region across one contact surface between the detection region and the display region, and moves out of the detection region across another contact surface between the detection region and the display region.
In practice, the display region and the detection region have at least two contact surfaces. In order to determine the operation instruction corresponding to the input operation, the operation instruction corresponding to a contact surface can be determined according to which contact surface is crossed by the input operation.
Specifically, the contact surfaces between the detection region and the display region include at least a first contact surface and a second contact surface, the first contact surface being different from the second contact surface;
correspondingly, the instruction determining unit at least includes: a first instruction determining unit, configured to determine, when the input operation crosses the first contact surface between the detection region and the display region, a first operation instruction corresponding to the first contact surface crossed according to the first contact surface crossed by the input operation;
a second instruction determining unit, configured to determine, when the input operation crosses the second contact surface between the detection region and the display region, a second operation instruction corresponding to the second contact surface crossed according to the second contact surface crossed by the input operation, wherein the second operation instruction is different from the first operation instruction.
The first contact surface and the second contact surface only serve to distinguish the different contact surfaces between the display region and the detection region crossed by different input operations.
In practical applications, the operation instruction corresponding to the input operation can also be determined in other ways. Referring to Fig. 7, a schematic structural diagram of the instruction determining unit in another embodiment of the operation instruction recognition device according to the present invention is shown. The difference between this embodiment and the operation instruction recognition device of the previous embodiment is that the instruction determining unit 603 in this embodiment includes:
a recognition unit 6031, configured to determine, when the input operation crosses a contact surface between the detection region and the display region, the current contact surface between the detection region and the display region crossed by the input operation, and to recognize the crossing mode across the current contact surface, wherein the detection region and the display region have at least two contact surfaces, and the crossing modes of the input operation include: a first crossing mode of crossing from the detection region into the display region; and/or a second crossing mode of crossing from the display region into the detection region; and/or a third crossing mode of crossing into the detection region across one contact surface between the detection region and the display region, and crossing out of the detection region across another contact surface between the detection region and the display region;
an instruction determining subunit 6032, configured to determine, according to the current contact surface crossed by the input operation and the crossing mode across the current contact surface, the operation instruction corresponding to the input operation.
In any of the above embodiments, the instruction determining unit may preset operation instructions so that the operation instruction matched by the current input operation can be determined from the preset operation instructions. Correspondingly, the instruction determining unit is specifically configured to: determine, when the input operation crosses a contact surface between the detection region and the display region, the operation instruction corresponding to the current input operation from preset operation instructions corresponding to different contact surfaces according to the contact surface crossed by the input operation, wherein the preset operation instructions at least include returning to a previous graphical interface and/or displaying a main menu.
Further, in any of the above embodiments, the sensing region may also contain the display region.
Correspondingly, the device further includes: a second operation acquiring unit, configured to obtain, through the sensing unit, an input operation in the graphical interface.
In another aspect, the present invention further provides a wearable electronic device. The wearable electronic device has a processor, and a sensing unit and a display unit connected to the processor; the sensing unit corresponds to a sensing region. When the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface. The visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region. The processor has built in an operation instruction recognition device as described in any of the above embodiments.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts of the embodiments reference can be made to one another. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively simple, and the relevant parts can be found in the description of the method.
The foregoing description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. An operation instruction recognition method, characterized in that the method is applied to a wearable electronic device, the wearable electronic device has a sensing unit and a display unit, and the sensing unit corresponds to a sensing region; when the wearable electronic device is worn by a user, the user's eyes have a visible region; when the display unit of the wearable electronic device displays a graphical interface, a display region corresponds to the graphical interface; wherein the visible region contains the display region, the part of the visible region not belonging to the display region is a detection region, and the sensing region contains the detection region; the method comprises:
obtaining, through the sensing unit, an input operation in the detection region;
judging whether the input operation crosses a contact surface between the detection region and the display region;
when the input operation crosses a contact surface between the detection region and the display region, determining, according to the contact surface crossed by the input operation, an operation instruction corresponding to the input operation;
responding to the operation instruction, and controlling the graphical interface to transform accordingly.
2. The method according to claim 1, characterized in that the judging whether the input operation crosses the contact surface between the detection zone and the display area comprises:
judging whether the input operation crosses the contact surface from the detection zone into the display area;
and/or, judging whether the input operation crosses the contact surface from the display area into the detection zone;
and/or, judging whether the input operation moves into the detection zone through one contact surface between the detection zone and the display area and moves out of the detection zone through another contact surface between the detection zone and the display area.
3. The method according to claim 1 or 2, characterized in that the contact surfaces between the detection zone and the display area at least include a first contact surface and a second contact surface, and the first contact surface is different from the second contact surface;
the determining, when the input operation crosses the contact surface between the detection zone and the display area, the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation comprises:
when the input operation crosses the first contact surface between the detection zone and the display area, determining a first operation instruction corresponding to the crossed first contact surface according to the crossed first contact surface;
when the input operation crosses the second contact surface between the detection zone and the display area, determining a second operation instruction corresponding to the crossed second contact surface according to the crossed second contact surface, wherein the second operation instruction is different from the first operation instruction.
4. The method according to claim 1 or 2, characterized in that the determining, when the input operation crosses the contact surface between the detection zone and the display area, the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation comprises:
when the input operation crosses the contact surface between the detection zone and the display area, determining the current contact surface between the detection zone and the display area crossed by the input operation, and identifying the crossing mode over the current contact surface, wherein the detection zone and the display area have at least two contact surfaces;
determining the operation instruction corresponding to the input operation according to the current contact surface crossed by the input operation and the crossing mode over the current contact surface;
wherein the crossing modes of the input operation include: a first crossing mode of crossing from the detection zone into the display area; and/or, a second crossing mode of crossing from the display area into the detection zone; and/or, a third crossing mode of crossing into the detection zone through one contact surface between the detection zone and the display area and crossing out of the detection zone through another contact surface between the detection zone and the display area.
5. The method according to claim 1 or 2, characterized in that the determining the operation instruction corresponding to the input operation according to the contact surface crossed by the input operation comprises:
determining the operation instruction corresponding to the current input operation according to the contact surface crossed by the input operation and from the preset operation instructions corresponding to different contact surfaces, wherein the preset operation instructions at least include returning to a previous graphical interface and/or displaying a main menu.
6. The method according to claim 1, characterized in that the sensing region may also include the display area;
the method further comprises: obtaining an input operation within the graphical interface by means of the sensing unit.
7. An operation instruction identifying device, characterized in that the device is applied to a wearable electronic device, the wearable electronic device has a sensing unit and a display unit, the sensing unit corresponds to a sensing region; when the wearable electronic device is worn by a user, the eyes of the user have a visible area; when the display unit of the wearable electronic device displays a graphical interface, a display area corresponds to the graphical interface; wherein the visible area includes the display area, the subregion of the visible area that does not belong to the display area is a detection zone, and the sensing region includes the detection zone; the device comprises:
a first operation acquiring unit, configured to obtain an input operation within the detection zone by means of the sensing unit;
a judging unit, configured to judge whether the input operation crosses a contact surface between the detection zone and the display area;
an instruction determining unit, configured to determine, when the input operation crosses the contact surface between the detection zone and the display area, an operation instruction corresponding to the input operation according to the contact surface crossed by the input operation;
an instruction responding unit, configured to respond to the operation instruction and control the graphical interface to undergo a corresponding transformation.
8. The device according to claim 7, characterized in that the judging unit is specifically configured to: judge whether the input operation crosses the contact surface from the detection zone into the display area; and/or, judge whether the input operation crosses the contact surface from the display area into the detection zone; and/or, judge whether the input operation moves into the detection zone through one contact surface between the detection zone and the display area and moves out of the detection zone through another contact surface between the detection zone and the display area.
9. The device according to claim 7 or 8, characterized in that the contact surfaces between the detection zone and the display area at least include a first contact surface and a second contact surface, and the first contact surface is different from the second contact surface;
the instruction determining unit comprises:
a first instruction determining unit, configured to determine, when the input operation crosses the first contact surface between the detection zone and the display area, a first operation instruction corresponding to the crossed first contact surface according to the crossed first contact surface;
a second instruction determining unit, configured to determine, when the input operation crosses the second contact surface between the detection zone and the display area, a second operation instruction corresponding to the crossed second contact surface according to the crossed second contact surface, wherein the second operation instruction is different from the first operation instruction.
10. The device according to claim 7 or 8, characterized in that the instruction determining unit comprises:
an identifying unit, configured to determine, when the input operation crosses the contact surface between the detection zone and the display area, the current contact surface between the detection zone and the display area crossed by the input operation, and to identify the crossing mode over the current contact surface, wherein the detection zone and the display area have at least two contact surfaces, and the crossing modes of the input operation include: a first crossing mode of crossing from the detection zone into the display area; and/or, a second crossing mode of crossing from the display area into the detection zone; and/or, a third crossing mode of crossing into the detection zone through one contact surface between the detection zone and the display area and crossing out of the detection zone through another contact surface between the detection zone and the display area;
an instruction determining subunit, configured to determine the operation instruction corresponding to the input operation according to the current contact surface crossed by the input operation and the crossing mode over the current contact surface.
11. The device according to claim 7 or 8, characterized in that the instruction determining unit is specifically configured to: when the input operation crosses the contact surface between the detection zone and the display area, determine the operation instruction corresponding to the current input operation according to the contact surface crossed by the input operation and from the preset operation instructions corresponding to different contact surfaces, wherein the preset operation instructions at least include returning to a previous graphical interface and/or displaying a main menu.
12. The device according to claim 7, characterized in that the sensing region may also include the display area;
the device further comprises: a second operation acquiring unit, configured to obtain an input operation within the graphical interface by means of the sensing unit.
13. A wearable electronic device, characterized in that the wearable electronic device has a processor, and a sensing unit and a display unit connected to the processor, the sensing unit corresponds to a sensing region; when the wearable electronic device is worn by a user, the eyes of the user have a visible area; when the display unit of the wearable electronic device displays a graphical interface, a display area corresponds to the graphical interface; wherein the visible area includes the display area, the subregion of the visible area that does not belong to the display area is a detection zone, and the sensing region includes the detection zone; the processor incorporates the operation instruction identifying device according to any one of claims 7 to 12.
CN201310085514.7A 2013-03-18 2013-03-18 A kind of operational order recognition methods, device and Wearable electronic equipment Active CN104063037B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310085514.7A CN104063037B (en) 2013-03-18 2013-03-18 A kind of operational order recognition methods, device and Wearable electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310085514.7A CN104063037B (en) 2013-03-18 2013-03-18 A kind of operational order recognition methods, device and Wearable electronic equipment

Publications (2)

Publication Number Publication Date
CN104063037A CN104063037A (en) 2014-09-24
CN104063037B (en) 2017-03-29

Family

ID=51550791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310085514.7A Active CN104063037B (en) 2013-03-18 2013-03-18 A kind of operational order recognition methods, device and Wearable electronic equipment

Country Status (1)

Country Link
CN (1) CN104063037B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104750253B * 2015-03-11 2018-10-12 Qisda (Suzhou) Co., Ltd. A kind of electronic device carrying out body-sensing input for user
CN107466396A * 2016-03-22 2017-12-12 Shenzhen Royole Technologies Co., Ltd. Head-mounted display apparatus and its control method
CN107728923B * 2017-10-20 2020-11-03 Vivo Mobile Communication Co., Ltd. Operation processing method and mobile terminal
CN108008873A * 2017-11-10 2018-05-08 HiScene Information Technology (Shanghai) Co., Ltd. A kind of operation method of user interface of head-mounted display apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122229A * 2010-02-19 2011-07-13 Microsoft Corp. Use of bezel as an input mechanism
US8316319B1 * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface
CN102884498A * 2010-02-19 2013-01-16 Microsoft Corp. Off-screen gestures to create on-screen input

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013521576A * 2010-02-28 2013-06-10 Osterhout Group, Inc. Local advertising content on interactive head-mounted eyepieces
US20120050140A1 (en) * 2010-08-25 2012-03-01 Border John N Head-mounted display control
US20130002724A1 (en) * 2011-06-30 2013-01-03 Google Inc. Wearable computer with curved display and navigation tool

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122229A * 2010-02-19 2011-07-13 Microsoft Corp. Use of bezel as an input mechanism
CN102884498A * 2010-02-19 2013-01-16 Microsoft Corp. Off-screen gestures to create on-screen input
US8316319B1 * 2011-05-16 2012-11-20 Google Inc. Efficient selection of characters and commands based on movement-inputs at a user-interface

Also Published As

Publication number Publication date
CN104063037A (en) 2014-09-24

Similar Documents

Publication Publication Date Title
US9529442B2 (en) Head mounted display providing eye gaze calibration and control method thereof
KR101812227B1 (en) Smart glass based on gesture recognition
CN105511846B (en) Electronic equipment and display control method
CN108595002B (en) Information processing method and electronic equipment
CN104063037B (en) A kind of operational order recognition methods, device and Wearable electronic equipment
US20150002394A1 (en) Head mounted display providing eye gaze calibration and control method thereof
CN104199547B (en) Virtual touch screen operation device, system and method
CN102880304A (en) Character inputting method and device for portable device
KR20150032019A (en) Method and apparatus for providing user interface by using eye tracking
WO2021244145A1 (en) Head-mounted display device interaction method, terminal device, and storage medium
CN106125307A (en) Outer hanging panel Formula V R glasses and adjust the method that shows of screen and screen display device
KR20120045667A (en) Apparatus and method for generating screen for transmitting call using collage
KR20210094247A (en) Display apparatus and method for controlling the same
CN105739707A (en) Electronic equipment, face identifying and tracking method and three-dimensional display method
CN103530060B (en) Display device and control method, gesture identification method
CN104076907A (en) Control method, control device and wearable electronic equipment
WO2016064073A1 (en) Smart glasses on which display and camera are mounted, and a space touch inputting and correction method using same
US20160041616A1 (en) Display device and control method thereof, and gesture recognition method
CN104199548B (en) A kind of three-dimensional man-machine interactive operation device, system and method
CN105450838A (en) Information processing method and electronic device
CN105739700A (en) Notice opening method and apparatus
CN104866786B (en) A kind of display methods and electronic equipment
CN104866103A (en) Relative position determining method, wearable electronic equipment and terminal equipment
CN105045395A (en) Display device and image display method
WO2014084634A1 (en) Mouse apparatus for eye-glass type display device, and method for driving same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant