CN106940477A - A kind of control method and electronic equipment - Google Patents

A kind of control method and electronic equipment Download PDF

Info

Publication number
CN106940477A
CN106940477A CN201710151032.5A CN201710151032A
Authority
CN
China
Prior art keywords
user
electronic device
operation menu
wear
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710151032.5A
Other languages
Chinese (zh)
Inventor
周运
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201710151032.5A priority Critical patent/CN106940477A/en
Publication of CN106940477A publication Critical patent/CN106940477A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a control method and an electronic device. The control method is applied to a head-mounted electronic device and includes: receiving an instruction to display an operation menu; and, based on the instruction, displaying the operation menu of the head-mounted electronic device at a first position, so that the operation menu perceived through the head-mounted electronic device by the user wearing it is located in a lower region of the user's field of view. Unlike the prior art, in which the operation menu is projected directly in front of the user's line of sight and interferes with the user's view, the present application displays the operation menu at a first position such that the perceived menu lies in the lower region of the field of view and does not occlude objects in front of the user. Moreover, when operating the menu, the user does not need to raise an arm to the front; raising the arm by only a small amplitude is enough to operate the menu in the lower region of the field of view. This reduces the complexity of user operation and further improves the user experience.

Description

A kind of control method and electronic equipment
Technical field
The present application relates to the technical field of data processing, and in particular to a control method and an electronic device.
Background technology
Existing augmented reality (AR) glasses project the system desktop to a position directly in front of and parallel to the line of sight after the system desktop is activated. The projected desktop occludes objects in front of the user, which is inconvenient. Moreover, the user has to raise an arm forward to perform gesture operations to control the system desktop, which further degrades the user experience.
The content of the invention
In view of this, the purpose of the present application is to provide a control method and an electronic device, to solve the technical problem in the prior art that the system desktop of AR glasses is projected in front of and parallel to the user's line of sight, which obstructs the user's view, increases the complexity of user operation, and degrades the user experience.
The present application provides a control method, applied to a head-mounted electronic device, including:
receiving an instruction to display an operation menu;
based on the instruction, displaying the operation menu of the head-mounted electronic device at a first position, so that the operation menu perceived through the head-mounted electronic device by the user wearing it is located in a lower region of the user's field of view.
In the above method, preferably, the lower region is specifically: a position horizontally level with a target body part of the user wearing the head-mounted electronic device.
In the above method, preferably, displaying the operation menu of the head-mounted electronic device at the first position includes:
obtaining the distance from the eyes of the user wearing the head-mounted electronic device to the ground;
based on the distance, obtaining the height of the target body part above the ground;
determining the position on the head-mounted electronic device corresponding to that height as the first position;
displaying the operation menu at the first position.
In the above method, preferably, the operation menu includes a plurality of operation objects, and the operation objects are distributed in a fan shape centred on the user.
In the above method, preferably, the method further includes:
acquiring an operating gesture of an operating body;
if the operating gesture is the operating body dragging a target operation object towards the central region of the user's field of view, generating a trigger instruction;
based on the trigger instruction, displaying the interface corresponding to the target operation object in the central region of the field of view.
The present application also provides an electronic device, including:
a control module, configured to receive an instruction to display an operation menu and, based on the instruction, transmit the operation menu of the electronic device;
a display module, configured to receive the operation menu transmitted by the control module and display it at a first position, so that the operation menu perceived through the electronic device by the user wearing it is located in a lower region of the user's field of view.
Preferably, the above electronic device further includes:
an identification module, configured to acquire an operating gesture of an operating body and, if the operating gesture is the operating body dragging a target operation object towards the central region of the user's field of view, generate a trigger instruction to the control module, so that the control module, based on the trigger instruction, displays the interface corresponding to the target operation object in the central region of the field of view.
Preferably, in the above electronic device, the identification module is an image recognition module or a touch recognition module.
Preferably, the above electronic device further includes:
an input interface, configured to collect input data of the user, the input data including voice input data or image input data;
a trigger module, configured to generate the instruction to display the operation menu based on the voice input data or the image input data.
Preferably, the above electronic device further includes:
a fixing component, configured to wear the electronic device on the user's head.
As can be seen from the above solutions, in the control method and electronic device provided by the present application, after an instruction to display the operation menu is received, the operation menu of the head-mounted electronic device is displayed at a first position, so that the operation menu perceived through the head-mounted electronic device by the user wearing it is located in a lower region of the user's field of view. Unlike the prior art, in which the operation menu is projected directly in front of the user's line of sight and interferes with the user's view, the present application displays the operation menu at a first position so that the perceived menu lies in the lower region of the field of view and does not occlude objects in front of the user; when operating the menu, the user does not need to raise an arm to the front, and raising the arm by only a small amplitude is enough to operate the menu. This reduces the complexity of user operation and further improves the user experience.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of a control method provided by Embodiment 1 of the present application;
Figs. 2-5 are application example diagrams of Embodiment 1 of the present application;
Fig. 6 is a partial flowchart of Embodiment 1 of the present application;
Figs. 7-10 are further application example diagrams of Embodiment 1 of the present application;
Fig. 11 is another flowchart of Embodiment 1 of the present application;
Figs. 12-13 are further application example diagrams of Embodiment 1 of the present application;
Fig. 14 is a structural diagram of an electronic device provided by Embodiment 2 of the present application;
Fig. 15 is a structural diagram of an electronic device provided by Embodiment 3 of the present application;
Fig. 16 is a structural diagram of an electronic device provided by Embodiment 3 of the present application;
Fig. 17 is a structural diagram of an electronic device provided by Embodiment 4 of the present application;
Fig. 18 is another structural diagram of Embodiment 4 of the present application;
Figs. 19-21 are application example diagrams of Embodiment 4 of the present application.
Embodiment
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings in the embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the present application without creative effort fall within the scope of protection of the present application.
Referring to Fig. 1, which is a flowchart of an implementation of the control method provided by Embodiment 1 of the present application, the method is applied to a head-mounted electronic device, such as AR glasses. The head-mounted electronic device can output an operation menu, and the user can operate the operation menu to realize corresponding functions.
In this embodiment, the method may include the following steps:
Step 101: receiving an instruction to display the operation menu.
The instruction may be generated by a voice acquisition module that captures the voice input of the user of the head-mounted electronic device; the device then identifies whether the voice input is one that wakes up the operation menu and, if so, generates the display instruction, realizing voice wake-up of the menu.
Alternatively, the instruction may be generated by an image acquisition module that captures the gesture input of the user of the head-mounted electronic device; the device then identifies whether the gesture input is one that wakes up the operation menu and, if so, generates the display instruction, realizing gesture wake-up of the menu.
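As a rough illustration of how such a wake-up instruction might be generated, the Python sketch below dispatches on the two input channels described above. The wake phrases, gesture names, and instruction format are assumptions for illustration only; the patent does not specify them.

```python
# Hypothetical sketch of Step 101: generating a "display operation menu"
# instruction from either a voice or a gesture wake-up input.
# The phrases, gesture names, and dict format are illustrative assumptions.

WAKE_PHRASES = {"open menu", "show menu"}
WAKE_GESTURES = {"palm_up", "double_tap"}

def recognize_wake_input(input_kind: str, payload: str) -> bool:
    """Return True if the captured input matches a menu wake-up pattern."""
    if input_kind == "voice":
        return payload.lower().strip() in WAKE_PHRASES
    if input_kind == "gesture":
        return payload in WAKE_GESTURES
    return False

def generate_display_menu_instruction(input_kind: str, payload: str):
    """Emit a display-menu instruction when the input is a wake-up; else None."""
    if recognize_wake_input(input_kind, payload):
        return {"instruction": "display_operation_menu", "source": input_kind}
    return None

print(generate_display_menu_instruction("voice", "Open menu"))
```

A real device would feed this dispatcher from its voice and image acquisition modules; here the recognized input is simply passed in as a string.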
Step 102: based on the instruction, displaying the operation menu of the head-mounted electronic device at a first position, so that the operation menu perceived through the head-mounted electronic device by the user wearing it is located in a lower region of the user's field of view.
The first position is a position on the display screen of the head-mounted electronic device. Taking the head-mounted electronic device being AR glasses as an example:
the first position is position A on the lens of the AR glasses, as shown in Fig. 2, so that the operation menu perceived through the AR glasses by the user wearing them is located in a lower region of the user's field of view, such as region B. Region B is different from region C, which is directly in front of the user and parallel to the line of sight; thus the display of the operation menu does not affect the user's view of objects in front, improving the user's experience with the AR glasses.
Meanwhile, when the user needs to operate the operation menu, there is no need to raise the arm to the front, e.g. by 90 degrees or more as in Fig. 3; raising the arm by an angle much smaller than 90 degrees, such as 30 degrees in Fig. 3, is enough to operate the menu. This reduces the complexity of user operation and further improves the user's experience with the AR glasses.
As can be seen from the above solution, in the control method provided by Embodiment 1 of the present application, after an instruction to display the operation menu is received, the operation menu of the head-mounted electronic device is displayed at a first position, so that the operation menu perceived through the head-mounted electronic device by the user wearing it is located in a lower region of the user's field of view. Unlike the prior art, in which the operation menu is projected directly in front of the user's line of sight and interferes with the user's view, this embodiment displays the operation menu at a first position so that the perceived menu lies in the lower region of the field of view and does not occlude objects in front of the user; when operating the menu, the user only needs to raise the arm by a small amplitude. This reduces the complexity of user operation and further improves the user experience.
In one implementation, the lower region of the user's field of view specifically refers to: a position horizontally level with a target body part of the user wearing the head-mounted electronic device.
The position horizontally level with the target body part may be the waist, the hip, or the position level with the wrist when the arm hangs down naturally.
As shown in Fig. 4, the operation menu is located in the lower region of the user's field of view, at a position horizontally level with the user's waist. Thus the operation menu does not interfere with the user's view of objects in front, and during operation the user only needs to raise the arm to waist level, as shown in Fig. 5, further improving the user experience.
In one implementation, as shown in Fig. 6, step 102 can be realized by the following steps:
Step 601: obtaining the distance from the eyes of the user wearing the head-mounted electronic device to the ground.
As shown in Fig. 7, the distance from the user's eyes to the ground is L1. In this embodiment, L1 may be measured by a sensor provided on the head-mounted electronic device, such as a range sensor, a laser rangefinder, a gyroscope, or a level; alternatively, L1 may be obtained according to typical human body proportions.
Step 602: based on the distance, obtaining the height of the target body part above the ground.
As shown in Fig. 7, taking the target body part being the waist as an example, the height L2 of the waist above the ground is obtained from the distance L1.
In this embodiment, the eye-to-waist distance L3 may be determined from L1 according to typical human body proportions, and L2 obtained by subtracting L3 from L1; alternatively, L2 may be determined directly from L1 according to typical human body proportions.
Step 603: determining the position on the head-mounted electronic device corresponding to the height as the first position.
Different heights of the target body part above the ground correspond to different positions on the head-mounted electronic device. As shown in Fig. 8, the position X on the head-mounted electronic device corresponding to the height L2 is the first position.
Step 604: displaying the operation menu at the first position.
As shown in Fig. 9, the operation menu is displayed at position X. As a result, the operation menu perceived through the head-mounted electronic device by the user wearing it is, in the user's field of view, at a position horizontally level with the target body part.
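The position-determination flow of steps 601-604 can be sketched as follows. This is a hypothetical illustration: the body-proportion constant, the linear screen mapping, and all function names are assumptions, since the patent describes the ratios only as typical human body proportions.

```python
# Illustrative sketch of Steps 601-604: deriving the first position on the
# display from the user's eye-to-ground distance L1. The proportion constant
# and the screen mapping are assumptions, not values from the patent.

EYE_TO_WAIST_RATIO = 0.35  # assumed fraction of L1 from eyes down to waist

def waist_height_from_eye_height(eye_to_ground_l1: float) -> float:
    """Step 602: estimate waist height L2 = L1 - L3 via a body-ratio model."""
    l3 = eye_to_ground_l1 * EYE_TO_WAIST_RATIO  # eye-to-waist distance L3
    return eye_to_ground_l1 - l3

def first_position_on_display(waist_height_l2: float,
                              eye_height_l1: float,
                              screen_height_px: int = 1080) -> int:
    """Step 603: map the body-part height to a vertical pixel row.
    A lower body part maps to a lower row (simple linear mapping that
    keeps the menu in the lower half of the screen)."""
    frac = waist_height_l2 / eye_height_l1            # 0..1, smaller = lower
    return int(round(screen_height_px * (1.0 - frac * 0.5)))

l1 = 1.55  # metres; e.g. measured by a range sensor on the headset
l2 = waist_height_from_eye_height(l1)
print(l2, first_position_on_display(l2, l1))
```

With these assumed constants, a 1.55 m eye height yields a waist height of about 1.01 m and a menu row in the lower half of a 1080-pixel-tall display; an actual device would calibrate both the ratio and the mapping.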
In one implementation, the operation menu of the head-mounted electronic device may include a plurality of operation objects. An operation object may be a three-dimensional (3D) stereoscopic image, such as an icon or an indication character, e.g. "photo", "music", or "video". The operation objects are distributed in a fan shape centred on the user wearing the head-mounted electronic device. As shown in Fig. 10, the operation objects in the operation menu surround the user, distributed over a 120-degree sector; specifically, they surround the target body part of the user, such as the waist, to facilitate the user's operation of the operation objects.
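A fan-shaped, user-centred layout over a 120-degree sector, as described above, might be computed along these lines. The radius, coordinate convention, and even spacing are illustrative assumptions.

```python
# Hypothetical sketch of the fan-shaped layout: N operation objects spread
# evenly over a 120-degree sector centred in front of the user.
# The user sits at the origin; +y points straight ahead.
import math

def fan_layout(objects, radius=0.5, sector_deg=120.0):
    """Return (name, x, y) positions for objects arranged in the sector."""
    n = len(objects)
    positions = []
    for i, name in enumerate(objects):
        # even angular spread; a single object sits on the centre line
        offset = (i - (n - 1) / 2.0) * (sector_deg / max(n - 1, 1))
        angle = math.radians(90.0 - offset)  # 90 degrees = straight ahead
        positions.append((name, radius * math.cos(angle),
                          radius * math.sin(angle)))
    return positions

for name, x, y in fan_layout(["photo", "music", "video"]):
    print(name, round(x, 3), round(y, 3))
```

With three objects the middle one lands directly ahead and the outer two sit 60 degrees to either side, symmetric about the centre line; a fan like this shows more objects at arm's reach than a straight row of the same length.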
In one implementation, as shown in Fig. 11, the control method in this embodiment may further include the following steps:
Step 103: acquiring an operating gesture of an operating body.
In this embodiment, the operating gesture of the operating body, such as an upward or downward sliding gesture, may be identified from images of the operating body collected by an image acquisition module.
Step 104: if the operating gesture is the operating body dragging a target operation object towards the central region of the user's field of view, generating a trigger instruction.
In this embodiment, the operating gesture is identified to determine whether it is a gesture in which the operating body drags one or more target operation objects to the central region of the user's field of view. As shown in Fig. 12, the operating body drags the target operation object upward to the central region directly in front of the user, indicating that the user needs to open the interface of the target operation object, for example to watch a video or browse pictures; a trigger instruction is then generated.
Step 105: based on the trigger instruction, displaying the interface corresponding to the target operation object in the central region of the field of view.
As shown in Fig. 13, based on the trigger instruction indicating that the user needs to open the interface of the target operation object, the corresponding interface, such as a video playback interface or a picture browsing interface, is displayed in the central region of the user's field of view.
In this embodiment, if the operating gesture is the operating body clicking a target operation object, the display brightness of the target operation object is increased to indicate that the operating body has selected the target operation object.
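The gesture handling of steps 103-105, together with the click-to-highlight behaviour, can be sketched as a simple dispatcher. The normalised central-region bounds and the gesture/event representations are assumptions, not taken from the patent.

```python
# Illustrative sketch of Steps 103-105: a drag that ends inside the central
# field-of-view region generates a trigger instruction to open the dragged
# object's interface; a click only highlights (brightens) the object.

CENTRAL_REGION = (0.3, 0.3, 0.7, 0.7)  # assumed normalised (x0, y0, x1, y1)

def in_central_region(x: float, y: float) -> bool:
    x0, y0, x1, y1 = CENTRAL_REGION
    return x0 <= x <= x1 and y0 <= y <= y1

def handle_gesture(gesture: dict) -> dict:
    """Return the action produced by an operating-body gesture."""
    if gesture["kind"] == "drag" and in_central_region(*gesture["end"]):
        # Step 104: generate a trigger instruction for the dragged object
        return {"action": "open_interface", "target": gesture["target"]}
    if gesture["kind"] == "click":
        # raise the display brightness to show the object is selected
        return {"action": "highlight", "target": gesture["target"]}
    return {"action": "none"}

print(handle_gesture({"kind": "drag", "target": "video", "end": (0.5, 0.5)}))
```

A drag ending outside the central region produces no trigger, matching the description that only dragging an object into the centre opens its interface.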
Referring to Fig. 14, which is a structural diagram of an electronic device provided by Embodiment 2 of the present application, the electronic device may be a head-mounted electronic device, such as AR glasses.
In this embodiment, the electronic device may include the following structure:
a control module 1401, configured to receive an instruction to display the operation menu and, based on the instruction, transmit the operation menu of the electronic device;
a display module 1402, configured to receive the operation menu transmitted by the control module 1401 and display it at a first position, so that the operation menu perceived through the electronic device by the user wearing it is located in a lower region of the user's field of view.
As can be seen from the above solution, in the electronic device provided by Embodiment 2 of the present application, after an instruction to display the operation menu is received, the operation menu is displayed at a first position, so that the operation menu perceived through the device by the user wearing it is located in a lower region of the user's field of view. Unlike the prior art, in which the operation menu is projected directly in front of the user's line of sight and interferes with the user's view, this embodiment displays the operation menu so that it does not occlude objects in front of the user, and the user only needs to raise the arm by a small amplitude to operate it. This reduces the complexity of user operation and further improves the user experience.
Referring to Fig. 15, which is a structural diagram of an electronic device provided by Embodiment 3 of the present application, the electronic device may further include the following structure:
an identification module 1403, configured to acquire an operating gesture of an operating body and, if the operating gesture is the operating body dragging a target operation object towards the central region of the user's field of view, generate a trigger instruction to the control module 1401, so that the control module 1401, based on the trigger instruction, displays the interface corresponding to the target operation object in the central region of the field of view.
The identification module 1403 may be an image recognition module, for example a module composed of a camera and an image recognition chip, which identifies, by collecting one or more images containing the operating body, whether the operating gesture of the operating body is a gesture of dragging the target operation object towards the central region of the user's field of view.
Alternatively, the identification module 1403 may be a touch recognition module, such as a touch screen, which identifies, from the operation trace of the operating body on the touch region, whether the operating gesture of the operating body is a gesture of dragging the target operation object towards the central region of the user's field of view.
Referring to Fig. 16, which is a structural diagram of an electronic device provided by Embodiment 3 of the present application, the electronic device may further include the following structure:
an input interface 1404, configured to collect input data of the user.
The input data may be voice input data, such as the user's voice signal collected by a sound collection device like a microphone; or image input data, such as the user's image data collected by an image acquisition device like a camera.
A trigger module 1405 is configured to generate the instruction to display the operation menu based on the voice input data or the image input data.
For example, the trigger module 1405 generates the instruction by recognizing the voice signal in the voice input data, thereby waking up the operation menu of the electronic device; alternatively, the trigger module 1405 generates the instruction by recognizing the gesture in the image input data, thereby waking up the operation menu of the electronic device.
That is, the user can wake up the operation menu through the microphone, or wake up the operation menu through a gesture.
Referring to Fig. 17, which is a structural diagram of an electronic device provided by Embodiment 4 of the present application, the electronic device may further include:
a fixing component 1406, configured to fix the other structures of the electronic device on the user's head.
For example, the fixing component 1406 may be a ring structure, as shown in Fig. 18, by which the control module 1401 and the display module 1402 are worn on the user's head. The user's eyes can perceive, through the display module 1402, the operation menu in the lower region of the field of view, and the operation menu does not affect the user's ability to view objects in front. When operating the operation menu, the user only needs to raise the arm by a small angle, such as 30 degrees, further improving the user experience.
Taking the electronic device being AR glasses as an example, the AR glasses are fixed on the user's head by a strap or temples, the lens display of the AR glasses faces the user's eyes, and the user's eyes can perceive, through the lenses, the content output by the lens display, such as the operation menu.
In this embodiment, the user wearing the AR glasses can trigger the glasses to generate the instruction to display the operation menu through an action, voice, or gesture, thereby waking up the operation menu. The operation menu of the AR glasses is displayed at the first position on the lens display, so that the user wearing the AR glasses perceives, through the lens display, the operation menu in the lower region of the field of view, for example at a position horizontally level with the user's waist. The operation menu includes a plurality of operation objects, such as 3D stereoscopic images like icons or indication characters, and the operation objects may be distributed over a 120-degree sector, as shown in Fig. 19. The plane of the operation menu forms a 30-degree angle with the ground, as shown in Fig. 20, so that the user can operate the menu by raising the arm slightly, by about 30 degrees.
When operating the operation menu, the user can operate the operation objects in the menu by moving an operating body such as a hand, for example selecting an operation object, whose display brightness, e.g. of its icon, is then increased, as shown in Fig. 21, to indicate the selected state. When the user drags an operation object to the central region of the field of view with the operating body, the AR glasses display the interface of the dragged operation object in the central region, as shown in Fig. 13. During this process, the interface displayed in the central region and the operation menu displayed in the lower region of the field of view do not conflict.
As can be seen from the above implementation, the operation menu is displayed in front of the user's waist in real time and follows the user as the user moves, so the user can find it quickly; displayed at waist height, it does not block the user's view of objects in front, does not affect the user's observation of the real world, and does not interfere with the user's viewing of the displayed interface; and when operating it, the user does not need to raise the arm to a high position, so the posture is natural and fatigue is low.
Further, the operation objects in the operation menu are arranged in a fan shape centred on the user, which is more convenient for the user to operate; and compared with a linear arrangement of the same length, the fan-shaped arrangement can display more content.
It should be noted that the embodiments in this specification are described in a progressive manner: each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may refer to each other.
Finally, it should also be noted that, herein, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relation or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or device that includes the element.
The control method and electronic device provided by the present invention are described in detail above. The description of the disclosed embodiments enables those skilled in the art to realize or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be realized in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. a kind of control method, applied to wear-type electronic equipment, including:
Receive and show actions menu instruction;
Based on the display actions menu instruction, the actions menu of the wear-type electronic equipment is shown in first position so that The actions menu that the user for wearing the wear-type electronic equipment is perceived by the wear-type electronic equipment is located at institute State region on the lower in the user visual field.
2. The method according to claim 1, wherein the lower region is specifically: a position horizontally level with a target body site of the user wearing the head-mounted electronic device.
3. The method according to claim 2, wherein displaying the operation menu of the head-mounted electronic device at the first position comprises:
acquiring the distance from the eyes of the user wearing the head-mounted electronic device to the ground;
based on the distance, obtaining the height of the target body site above the ground;
determining the position in the head-mounted electronic device corresponding to the height as the first position;
displaying the operation menu at the first position.
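The steps of claim 3 can be sketched as follows. This is a minimal illustration only; the body-site ratio and viewing distance are assumed values for the sketch, not parameters stated in the patent:

```python
import math

def menu_first_position(eye_to_ground_m, target_ratio=0.58, view_distance_m=0.6):
    """Estimate the first position for the menu from the user's eye height.

    target_ratio: assumed fraction of eye height at which the target body
    site (e.g. the waist) sits; view_distance_m: assumed distance at which
    the menu plane is rendered in front of the user.
    """
    # Step 1-2: from the eye-to-ground distance, obtain the target body
    # site's height above the ground.
    target_height_m = eye_to_ground_m * target_ratio
    # Step 3: the first position is characterized here by the downward
    # angle from the horizontal gaze line to the menu plane.
    drop_m = eye_to_ground_m - target_height_m
    angle_down_deg = math.degrees(math.atan2(drop_m, view_distance_m))
    return target_height_m, angle_down_deg
```

For a user whose eyes are 1.6 m above the ground, the sketch places the menu roughly waist-high, several tens of degrees below the gaze line, i.e. in the lower region of the field of view as claim 1 requires.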
4. The method according to claim 1, wherein the operation menu comprises a plurality of operation objects, and the operation objects are distributed in a fan shape centered on the user.
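The fan-shaped distribution of claim 4 can be sketched by spacing the operation objects evenly along an arc centered on the user. The radius and angular span are assumed values for illustration:

```python
import math

def fan_positions(n, radius=0.5, span_deg=90.0):
    """Place n operation objects along an arc centered on the user.

    Returns (x, y) offsets in the user's horizontal plane:
    x is left-right, y is forward from the user.
    """
    if n == 1:
        return [(0.0, radius)]
    start = -span_deg / 2.0
    step = span_deg / (n - 1)
    positions = []
    for i in range(n):
        a = math.radians(start + i * step)
        positions.append((radius * math.sin(a), radius * math.cos(a)))
    return positions
```

With five objects and a 90° span, the objects sit at -45°, -22.5°, 0°, 22.5°, and 45° around the user, symmetric about the forward direction.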
5. The method according to claim 1, further comprising:
acquiring an operating gesture of an operating body;
if the operating gesture is the operating body dragging a target operation object toward the central region of the user's field of view, generating a trigger instruction;
based on the trigger instruction, displaying an interface corresponding to the target operation object in the central region of the field of view.
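The drag-to-center interaction of claim 5 can be sketched as a simple check of whether the drag ends inside the central field-of-view region. The rectangular region model and the command dictionary are assumptions for the sketch:

```python
def handle_drag_end(dragged_object, drop_point, center_region):
    """Generate a trigger instruction if the drag ended in the central region.

    drop_point: (x, y) where the operating body released the object.
    center_region: (x0, y0, x1, y1) bounds of the central field-of-view region.
    Returns a trigger instruction dict, or None if no trigger is generated.
    """
    x, y = drop_point
    x0, y0, x1, y1 = center_region
    if x0 <= x <= x1 and y0 <= y <= y1:
        # Trigger: display the object's corresponding interface centrally.
        return {"command": "show_interface", "object": dragged_object}
    return None
```

A drag released inside the bounds yields a trigger instruction; a release elsewhere (e.g. back in the lower menu region) yields none.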
6. An electronic device, comprising:
a control module, configured to receive a display-operation-menu instruction and, based on the display-operation-menu instruction, transmit an operation menu of the electronic device;
a display module, configured to receive the operation menu transmitted by the control module and display the operation menu at a first position, so that the operation menu perceived through the electronic device by a user wearing the electronic device is located in a lower region of the user's field of view.
7. The electronic device according to claim 6, further comprising:
an identification module, configured to acquire an operating gesture of an operating body and, if the operating gesture is the operating body dragging a target operation object toward the central region of the user's field of view, generate a trigger instruction to the control module, the control module then, based on the trigger instruction, displaying an interface corresponding to the target operation object in the central region of the field of view.
8. The electronic device according to claim 7, wherein the identification module is an image recognition module or a touch recognition module.
9. The electronic device according to claim 6, further comprising:
an input interface, configured to collect input data of the user, the input data comprising voice input data or image input data;
a trigger module, configured to generate the display-operation-menu instruction based on the voice input data or image input data.
10. The electronic device according to claim 6, further comprising:
a fixing assembly, configured to wear the electronic device on the user's head.
CN201710151032.5A 2017-03-14 2017-03-14 A kind of control method and electronic equipment Pending CN106940477A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710151032.5A CN106940477A (en) 2017-03-14 2017-03-14 A kind of control method and electronic equipment


Publications (1)

Publication Number Publication Date
CN106940477A true CN106940477A (en) 2017-07-11

Family

ID=59469163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710151032.5A Pending CN106940477A (en) 2017-03-14 2017-03-14 A kind of control method and electronic equipment

Country Status (1)

Country Link
CN (1) CN106940477A (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140184496A1 (en) * 2013-01-03 2014-07-03 Meta Company Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities
US20150301592A1 (en) * 2014-04-18 2015-10-22 Magic Leap, Inc. Utilizing totems for augmented or virtual reality systems
CN105867599A (en) * 2015-08-17 2016-08-17 乐视致新电子科技(天津)有限公司 Gesture control method and device
CN106575043A (en) * 2014-09-26 2017-04-19 英特尔公司 Systems, apparatuses, and methods for gesture recognition and interaction


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109933195A (en) * 2019-03-06 2019-06-25 广州世峰数字科技有限公司 The three-dimensional methods of exhibiting in interface and interactive system based on MR mixed reality technology
CN109933195B (en) * 2019-03-06 2022-04-22 广州世峰数字科技有限公司 Interface three-dimensional display method and interaction system based on MR mixed reality technology

Similar Documents

Publication Publication Date Title
US10121063B2 (en) Wink gesture based control system
KR20230164185A (en) Bimanual interactions between mapped hand regions for controlling virtual and graphical elements
KR20230170086A (en) Hand gestures for animating and controlling virtual and graphical elements
CN116724285A (en) Micro-gestures for controlling virtual and graphical elements
Barfield et al. Basic concepts in wearable computers and augmented reality
KR20230074780A (en) Touchless photo capture in response to detected hand gestures
AU2014204252B2 (en) Extramissive spatial imaging digital eye glass for virtual or augmediated vision
EP4196866A2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
CN103649874B (en) Use the interface of eye tracking contact lens
CN103631364B (en) A kind of control method and electronic equipment
CN106258004B (en) Virtual live-action device and operation mode
CN105393192A (en) Web-like hierarchical menu display configuration for a near-eye display
DE102013207528A1 (en) A method for interacting with an object displayed on a data goggle
CN108027655A (en) Information processing system, information processing equipment, control method and program
CN109145802A (en) More manpower gesture man-machine interaction methods and device based on Kinect
CN115735150A (en) Augmented reality eye wears ware and 3D clothing
CN107589628A (en) A kind of holographic projector and its method of work based on gesture identification
CN103984101B (en) Display contents controlling method and device
CN106940477A (en) A kind of control method and electronic equipment
WO2024049576A1 (en) Real-world responsiveness of a collaborative object
WO2024049578A1 (en) Scissor hand gesture for a collaborative object
WO2024049585A1 (en) Timelapse of generating a collaborative object
WO2024049580A1 (en) Authenticating a selective collaborative object
CN114296543A (en) Fingertip force detection and gesture recognition intelligent interaction system and intelligent ring
CN106095088B (en) A kind of electronic equipment and its image processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20170711