CN111948807A - Control method, control device, wearable device and storage medium

Control method, control device, wearable device and storage medium

Info

Publication number
CN111948807A
CN111948807A (application CN201910398809.7A)
Authority
CN
China
Prior art keywords
voice command
wearable device
issuer
authority
command
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910398809.7A
Other languages
Chinese (zh)
Other versions
CN111948807B (en)
Inventor
杨鑫 (Yang Xin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910398809.7A
Publication of CN111948807A
Application granted
Publication of CN111948807B
Legal status: Active

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223Execution procedure of a spoken command
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/225Feedback of the input speech


Abstract

The application discloses a control method for a wearable device. The wearable device comprises a housing, a display housed in the housing, and a plurality of acoustic-electric elements arranged on the housing, the acoustic-electric elements being used for collecting voice commands. The control method comprises the following steps: determining, according to the current working mode of the wearable device, whether the issuer of a voice command has the authority to execute the voice command; in the case that the issuer has the execution authority of the voice command, controlling the wearable device to execute the voice command in the current working mode and causing the display to display the content corresponding to the voice command; and in the case that the issuer does not have the execution authority of the voice command, controlling the wearable device to ignore the voice command in the current working mode. In this way, the wearable device is prevented from being falsely triggered by a person who does not have the execution authority of the voice command in the current working mode, and both the security and the user experience of the wearable device are improved. The application also discloses a control device, a wearable device, and a storage medium.

Description

Control method, control device, wearable device and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a control method, a control device, a wearable device, and a storage medium.
Background
In the related art, a user may interact with a wearable device through speech. However, in a complex acoustic environment, such interaction easily triggers the wearable device by mistake. In addition, because the related art generally identifies the speaker through voiceprint technology, the speaker must enroll a voiceprint in advance, and no other person whose voiceprint has been enrolled may be near the speaker during use; otherwise false triggering, or even a failure to recognize the command, easily occurs. The operation is therefore complicated and does not help improve the user experience.
Disclosure of Invention
The application provides a control method, a control device, wearable equipment and a storage medium.
The embodiment of the application provides a control method for a wearable device. The wearable device comprises a housing, a display housed in the housing, and a plurality of acoustic-electric elements arranged on the housing, the acoustic-electric elements being used for collecting voice commands. The control method comprises the following steps:
determining, according to the current working mode of the wearable device, whether the issuer of the voice command has the execution authority of the voice command;
in the case that the issuer has the execution authority of the voice command, controlling the wearable device to execute the voice command in the current working mode and causing the display to display the content corresponding to the voice command; and
in the case that the issuer does not have the execution authority of the voice command, controlling the wearable device to ignore the voice command in the current working mode.
The embodiment of the application provides a control device for a wearable device. The wearable device comprises a housing, a display housed in the housing, and a plurality of acoustic-electric elements arranged on the housing, the acoustic-electric elements being used for collecting voice commands. The control device comprises a determining module, a first control module, and a second control module. The determining module is used for determining, according to the current working mode of the wearable device, whether the issuer of a voice command has the execution authority of the voice command; the first control module is used for controlling the wearable device to execute the voice command in the current working mode and causing the display to display the content corresponding to the voice command in the case that the issuer has the execution authority of the voice command; and the second control module is used for controlling the wearable device to ignore the voice command in the current working mode in the case that the issuer does not have the execution authority of the voice command.
The embodiment of the application provides a wearable device. The wearable device comprises a processor, a housing, a display housed in the housing, and a plurality of acoustic-electric elements arranged on the housing, the acoustic-electric elements being used for collecting voice commands, and the processor being configured to execute the control method of the wearable device in any one of the above embodiments.
One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the above-described method of controlling a wearable device.
According to the control method, the control device, the wearable device, and the storage medium of the embodiments of the application, whether the issuer of a voice command has the execution authority of the voice command is determined according to the current working mode of the wearable device, and the wearable device is controlled to execute or ignore the voice command accordingly. A person who does not have the execution authority of the voice command in the current working mode is thus prevented from falsely triggering the wearable device, and both the security and the user experience of the wearable device are improved.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a perspective view of a wearable device according to an embodiment of the present application;
FIG. 2 is a schematic plan view of a wearable device according to another embodiment of the present application;
FIG. 3 is a schematic plan view of part of the structure of a wearable device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an adjustment process of a wearable device according to an embodiment of the present application;
FIG. 5 is another schematic diagram of an adjustment process of a wearable device according to an embodiment of the present application;
FIG. 6 is a schematic plan view of part of the structure of a wearable device according to another embodiment of the present application;
FIG. 7 is a schematic plan view of part of the structure of a wearable device according to yet another embodiment of the present application;
FIG. 8 is a flowchart illustrating a control method of a wearable device according to an embodiment of the present application;
FIG. 9 is a block diagram of a control device of a wearable device according to an embodiment of the present application;
FIG. 10 is a schematic diagram of internal modules of a wearable device according to an embodiment of the present application;
FIG. 11 is a flowchart illustrating a control method of a wearable device according to still another embodiment of the present application;
FIG. 12 is a flowchart illustrating a control method of a wearable device according to another embodiment of the present application;
FIG. 13 is a flowchart illustrating a control method of a wearable device according to another embodiment of the present application;
FIG. 14 is a scene schematic diagram of a control method of a wearable device according to an embodiment of the present application;
FIG. 15 is another scene schematic diagram of a control method of a wearable device according to an embodiment of the present application;
FIG. 16 is a flowchart illustrating a control method of a wearable device according to another embodiment of the present application;
FIG. 17 is a flowchart illustrating a control method of a wearable device according to another embodiment of the present application;
FIG. 18 is a flowchart illustrating a control method of a wearable device according to still another embodiment of the present application;
FIG. 19 is a flowchart illustrating a control method of a wearable device according to another embodiment of the present application;
FIG. 20 is another scene schematic diagram of a control method of a wearable device according to an embodiment of the present application;
FIG. 21 is a flowchart illustrating a control method of a wearable device according to still another embodiment of the present application.
Description of the reference symbols:
the housing 20, the receiving slot 22, the housing top wall 24, the housing bottom wall 26, the notch 262, the housing side wall 28, the supporting member 30, the first bracket 32, the first bending portion 322, the second bracket 34, the second bending portion 342, the elastic band 36, the display 40, the diopter member 50, the diopter cavity 52, the transparent liquid 54, the first film layer 56, the second film layer 58, the side wall 59, the adjusting mechanism 60, the cavity 62, the sliding slot 622, the sliding member 64, the driving member 66, the knob 662, the screw shaft 664, the gear 666, the rack 668, the driving motor 669, the motor shaft 6691, the input device 6692, and the adjusting cavity 68;
wearable apparatus 100, control device 10, determination module 12, first control module 14 and second control module 16, processor 101, memory 102, internal memory 103, display 104, input device 105, sound source 200, audience 300.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary and intended to be used for explaining the present application and should not be construed as limiting the present application.
Referring to fig. 1 and 2, a wearable device 100 according to an embodiment of the present application includes a housing 20, a support member 30, a display 40, a diopter member 50, and an adjustment mechanism 60.
The housing 20 is an external component of the wearable device 100 and serves to protect and secure the internal components of the wearable device 100. By enclosing the internal components with the housing 20, direct damage to the internal components from external factors can be avoided.
Specifically, in this embodiment, the housing 20 may be used to house and secure at least one of the display 40, the diopter member 50, and the adjustment mechanism 60. In the example of fig. 2, the housing 20 is formed with a receiving slot 22, and the display 40 and the diopter member 50 are received in the receiving slot 22. The adjustment mechanism 60 is partially exposed from the housing 20.
The housing 20 also includes a housing top wall 24, a housing bottom wall 26, and housing side walls 28. The middle of the housing bottom wall 26 forms a notch 262 toward the housing top wall 24. In other words, the housing 20 is generally "B" shaped. When the user wears the wearable device 100, the wearable device 100 can rest on the bridge of the user's nose through the notch 262, which ensures both the stability of the wearable device 100 and the wearing comfort of the user. The adjustment mechanism 60 may be partially exposed from the housing side wall 28 so that the user can adjust the diopter member 50.
In addition, the housing 20 may be machined from an aluminum alloy by a computer numerical control (CNC) machine tool, or may be injection molded from polycarbonate (PC) or from PC and acrylonitrile butadiene styrene (ABS). The specific manufacturing method and the specific materials of the housing 20 are not limited here.
The support member 30 is used to support the wearable device 100. The wearable device 100 may be fixed on the head of the user by the support member 30 when the user wears the wearable device 100. In the example of fig. 2, the support member 30 includes a first bracket 32, a second bracket 34, and an elastic band 36.
The first bracket 32 and the second bracket 34 are symmetrically disposed about the notch 262. Specifically, the first bracket 32 and the second bracket 34 are rotatably provided at the edge of the housing 20 and can be folded flat against the housing 20 for storage when the user does not need to use the wearable device 100. When the user needs to use the wearable device 100, the first bracket 32 and the second bracket 34 can be unfolded to support it.
The first bracket 32 has a first bent portion 322 formed at the end away from the housing 20, and the first bent portion 322 is bent toward the housing bottom wall 26. In this way, when the user wears the wearable device 100, the first bent portion 322 can rest on the user's ear, so that the wearable device 100 does not easily slip off.
Similarly, the end of the second bracket 34 away from the housing 20 is formed with a second bent portion 342. The explanation and description of the second bending portion 342 can refer to the first bending portion 322, and are not repeated herein for avoiding redundancy.
The elastic band 36 detachably connects the first bracket 32 and the second bracket 34. In this way, when the user wears the wearable device 100 to perform strenuous activities, the wearable device 100 can be further fixed by the elastic band 36, and the wearable device 100 is prevented from loosening or even falling off during strenuous activities. It is understood that in other examples, the elastic band 36 may be omitted.
In this embodiment, the display 40 includes an OLED display screen. The OLED display screen does not require a backlight, which helps keep the wearable device 100 light and thin. Moreover, the OLED screen has a wide viewing angle and low power consumption, which helps save power.
Of course, the display 40 may also be an LED display or a Micro LED display. These displays are merely examples and embodiments of the present application are not limited thereto.
Referring also to fig. 3, the diopter member 50 is disposed on one side of the display 40. The diopter member 50 includes a refractive cavity 52, a light-transmissive liquid 54, a first film layer 56, a second film layer 58, and a side wall 59.
The light-transmissive liquid 54 is disposed within the refractive cavity 52. The adjustment mechanism 60 is used to adjust the amount of the light-transmissive liquid 54 to adjust the form of the diopter member 50. Specifically, the second film layer 58 is disposed opposite the first film layer 56, the side wall 59 connects the first film layer 56 and the second film layer 58, the first film layer 56, the second film layer 58, and the side wall 59 enclose the refractive cavity 52, and the adjustment mechanism 60 is used to adjust the amount of the light-transmissive liquid 54 to change the shape of the first film layer 56 and/or the second film layer 58.
In this way, the implementation of the dioptric function of the dioptric member 50 is achieved. Specifically, "changing the shape of the first film layer 56 and/or the second film layer 58" includes three cases: in the first case: changing the shape of the first film layer 56 and not changing the shape of the second film layer 58; in the second case: not changing the shape of the first film layer 56 and changing the shape of the second film layer 58; in the third case: the shape of the first film layer 56 is changed and the shape of the second film layer 58 is changed. Note that, for convenience of explanation, in the present embodiment, the first case is explained as an example.
The first film layer 56 may be elastic. It will be appreciated that as the amount of the optically transparent liquid 54 in the refractive cavity 52 changes, the pressure within the refractive cavity 52 changes, thereby causing a change in the configuration of the refractive member 50.
In one example, the adjustment mechanism 60 decreases the amount of the optically transparent liquid 54 in the refractive chamber 52, decreases the pressure within the refractive chamber 52, increases the pressure differential between the pressure outside the refractive chamber 52 and the pressure within the refractive chamber 52, and causes the refractive chamber 52 to be more concave.
In another example, the adjustment mechanism 60 increases the amount of the optically transparent liquid 54 in the refractive chamber 52, increases the pressure within the refractive chamber 52, decreases the pressure differential between the pressure outside the refractive chamber 52 and the pressure within the refractive chamber 52, and increases the convexity of the refractive chamber 52.
In this way, it is achieved that the form of the refractive member 50 is adjusted by adjusting the amount of the light-transmissive liquid 54.
An adjustment mechanism 60 is coupled to the diopter member 50. The adjustment mechanism 60 is used to adjust the configuration of the diopter member 50 to adjust the diopter of the diopter member 50. Specifically, adjustment mechanism 60 includes a cavity 62, a slide 64, a drive member 66, an adjustment cavity 68, and a switch 61.
The sliding member 64 is slidably disposed in the cavity 62, the driving member 66 is connected to the sliding member 64, the cavity 62 and the sliding member 64 jointly define a regulation cavity 68, the regulation cavity 68 is communicated with the refractive cavity 52 through the side wall 59, and the driving member 66 is used for driving the sliding member 64 to slide relative to the cavity 62 to adjust the volume of the regulation cavity 68 so as to regulate the amount of the transparent liquid 54 in the refractive cavity 52.
In this way, the adjustment of the volume of the adjustment chamber 68 by the slider 64 is achieved to adjust the amount of the light-transmissive liquid 54 in the refractive chamber 52. In one example, referring to FIG. 4, as the slide member 64 slides away from the sidewall 59, the volume of the adjustment chamber 68 increases, the pressure within the adjustment chamber 68 decreases, the optically transparent liquid 54 within the refractive chamber 52 enters the adjustment chamber 68, and the first membrane layer 56 increasingly recedes inwardly.
In another example, referring to fig. 5, when the sliding member 64 slides toward the side wall 59, the volume of the adjusting cavity 68 decreases, the pressure inside the adjusting cavity 68 increases, the transparent liquid 54 inside the adjusting cavity 68 enters the refractive cavity 52, and the first film 56 protrudes outward.
The side wall 59 defines a flow passage 591 that communicates the adjustment chamber 68 with the refractive chamber 52. The adjustment mechanism 60 includes a switch 61 provided in the flow passage 591, and the switch 61 is used to control the open or closed state of the flow passage 591.
In this embodiment, the number of switches 61 is two, and both switches 61 are one-way switches, wherein one switch 61 is used for controlling the flow of the transparent liquid 54 from the adjustment chamber 68 to the refraction chamber 52, and the other switch 61 is used for controlling the flow of the transparent liquid 54 from the refraction chamber 52 to the adjustment chamber 68.
In this manner, the switch 61 controls the flow of the light-transmissive liquid 54 between the adjustment chamber 68 and the refractive chamber 52 so as to maintain pressure equilibrium across the side wall 59. As previously described, a change in the volume of the adjustment chamber 68 causes a change in the pressure in the adjustment chamber 68, thereby causing the light-transmissive liquid 54 to flow between the adjustment chamber 68 and the refractive chamber 52. By controlling the open or closed state of the flow passage 591, the switch 61 controls this flow and thereby controls the adjustment of the form of the diopter member 50.
In one example, referring to FIG. 4, the switch 61 that controls the flow of the optically transparent liquid 54 from the diopter chamber 52 to the adjustment chamber 68 is opened, the slide 64 slides away from the side wall 59, the volume of the adjustment chamber 68 increases, the pressure within the adjustment chamber 68 decreases, the optically transparent liquid 54 within the diopter chamber 52 passes through the switch 61 into the adjustment chamber 68, and the first film layer 56 increasingly recedes inwardly.
In another example, the switch 61 controlling the flow of the optically transparent liquid 54 from the diopter chamber 52 to the adjustment chamber 68 is closed, and even if the slide member 64 slides away from the side wall 59, the volume of the adjustment chamber 68 increases, the pressure within the adjustment chamber 68 decreases, the optically transparent liquid 54 within the diopter chamber 52 cannot enter the adjustment chamber 68, and the configuration of the first film layer 56 does not change.
In yet another example, referring to FIG. 5, the switch 61 controlling the flow of the transparent liquid 54 from the adjustment chamber 68 to the refraction chamber 52 is opened, the sliding member 64 slides toward the side wall 59, the volume of the adjustment chamber 68 decreases, the pressure in the adjustment chamber 68 increases, the transparent liquid 54 in the adjustment chamber 68 enters the refraction chamber 52 through the switch 61, and the first film 56 bulges outward.
In yet another example, the switch 61 controlling the flow of the transparent liquid 54 from the adjustment chamber 68 to the refraction chamber 52 is closed, and even if the sliding member 64 slides toward the side wall 59, the volume of the adjustment chamber 68 decreases, the pressure in the adjustment chamber 68 increases, the transparent liquid 54 in the adjustment chamber 68 cannot enter the refraction chamber 52, and the configuration of the first film layer 56 is not changed.
The driving member 66 may perform its function of driving the sliding member 64 to slide based on various structures and principles.
In the example of fig. 1, 2, 3, 4, and 5, the driving member 66 includes a knob 662 and a lead screw 664, the lead screw 664 connects the knob 662 and the slider 64, and the knob 662 is used to drive the lead screw 664 to rotate so as to slide the slider 64 relative to the cavity 62.
In this manner, the slider 64 is driven by the knob 662 and the lead screw 664. Because the screw 664 and the knob 662 are matched to convert the rotary motion of the knob 662 into the linear motion of the screw 664, when the knob 662 is rotated by a user, the screw 664 drives the sliding member 64 to slide relative to the cavity 62, so as to cause the volume of the adjusting cavity 68 to change, and further adjust the amount of the transparent liquid 54 in the refractive cavity 52. The knob 662 may be exposed from the housing 20 for easy rotation by a user.
Specifically, a threaded portion is formed on the knob 662, a threaded portion engaged with the knob 662 is formed on the lead screw 664, and the knob 662 and the lead screw 664 are threadedly coupled.
While the knob 662 is rotated, the switch 61 may be correspondingly turned on. In this way, the transparent liquid 54 can flow, and the pressure balance between the two sides of the sidewall 59 is ensured.
In one example, the knob 662 is rotated clockwise and the slide 64 is slid away from the sidewall 59, opening the switch 61 that controls the flow of the optically transparent liquid 54 from the refractive chamber 52 to the adjustment chamber 68. In another example, the knob 662 is rotated counterclockwise and the slide 64 is slid in a direction toward the sidewall 59, which opens the switch 61 that controls the flow of the optically transparent liquid 54 from the adjustment chamber 68 to the refractive chamber 52.
Note that in the present embodiment, the rotation angle of the knob 662 and the dioptric power of the dioptric member 50 are not related, and the user may rotate the knob 662 to a position where the visual experience is optimal. Of course, in other embodiments, the angle of rotation of the knob 662 may be correlated to the diopter number of the diopter member 50. Here, whether or not the rotation angle of the knob 662 is related to the dioptric power of the dioptric member 50 is not limited.
Referring to fig. 6, the driving member 66 includes a gear 666 and a rack 668 engaged with the gear 666, the rack 668 is connected to the gear 666 and the sliding member 64, and the gear 666 is used to drive the rack 668 to move so as to slide the sliding member 64 relative to the cavity 62.
In this way, the slide 64 is driven by the gear 666 and the rack 668. Since the cooperation of the gear 666 and the rack 668 can convert the rotation of the gear 666 into the linear movement of the rack 668, when the user rotates the gear 666, the rack 668 can drive the sliding member 64 to slide relative to the cavity 62, so as to cause the volume of the adjusting cavity 68 to change, thereby adjusting the amount of the transparent liquid 54 in the refractive cavity 52. Gear 666 may be exposed from housing 20 for convenient rotation by a user.
Similarly, switch 61 may be correspondingly opened while gear 666 is rotating. In this way, the transparent liquid 54 can flow, and the pressure balance between the two sides of the sidewall 59 is ensured.
In one example, when the gear 666 is rotated clockwise, it draws in the rack 668 meshed with it, shortening the extended length of the rack 668 and pulling the slide 64 away from the side wall 59; at this time the switch 61 that controls the flow of the light-transmissive liquid 54 from the refractive chamber 52 to the adjustment chamber 68 is opened.
In another example, when the gear 666 is rotated counterclockwise, it pays out the rack 668 meshed with it, increasing the extended length of the rack 668 and pushing the slide 64 toward the side wall 59; at this time the switch 61 that controls the flow of the light-transmissive liquid 54 from the adjustment chamber 68 to the refractive chamber 52 is opened.
Similarly, in this embodiment, the rotation angle of the gear 666 and the dioptric power of the diopter member 50 are not related, and the user may rotate the gear 666 to the position where the visual experience is optimal. Of course, in other embodiments, the rotation angle of the gear 666 may be correlated with the dioptric power of the diopter member 50. Whether or not the rotation angle of the gear 666 and the dioptric power of the diopter member 50 are related is not limited here.
Referring to fig. 7, the driving part 66 includes a driving motor 669, a motor shaft 6691 of the driving motor 669 is connected to the sliding member 64, and the driving motor 669 is used for driving the sliding member 64 to slide relative to the cavity 62.
In this manner, the slide 64 is driven by the drive motor 669. Specifically, the drive motor 669 may be a linear motor. A linear motor has a simple structure, generates linear motion directly without an intermediate conversion mechanism, reduces motion inertia, and improves dynamic response and positioning accuracy. Driving the slide 64 with the drive motor 669 also makes the driving programmable. For example, the drive motor 669 can be calibrated in advance so that its travel corresponds to dioptric power. The user can then directly input the dioptric power, and the drive motor 669 automatically operates to drive the slide 64 to the corresponding position.
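As a minimal illustration of such a pre-calibration (a sketch only; the calibration points, units, and function names are assumptions and not part of the patent), the entered diopter value could be mapped to a target position of the slide 64 by interpolating a calibration table:

```python
# Illustrative calibration table: diopter value -> slide 64 position (mm); assumed data.
CALIBRATION = [(-6.0, 0.0), (-3.0, 1.5), (0.0, 3.0), (3.0, 4.5)]

def slide_position_for_diopter(diopter: float) -> float:
    """Linearly interpolate the target slide position for a requested diopter value."""
    points = sorted(CALIBRATION)
    low_d, _ = points[0]
    high_d, high_p = points[-1]
    diopter = max(low_d, min(high_d, diopter))  # clamp to the calibrated range
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if d0 <= diopter <= d1:
            return p0 + (p1 - p0) * (diopter - d0) / (d1 - d0)
    return high_p

print(slide_position_for_diopter(-4.5))  # 0.75, halfway between the first two calibration points
```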
Further, the driving member 66 may also include an input device 6692, which includes, but is not limited to, a key, a knob, or a touch screen. In the example of fig. 7, the input device 6692 is a pair of keys respectively disposed on opposite sides of the cavity 62. The keys may be exposed from the housing 20 so that the user can easily press them. The keys control the working time of the drive motor 669 according to the number of presses or the duration of each press, thereby controlling the sliding distance of the slide 64.
Similarly, while the drive motor 669 is operating, the switch 61 may be correspondingly opened. In this way, the transparent liquid 54 can flow, and the pressure balance between the two sides of the sidewall 59 is ensured.
In one example, a user presses one of the two buttons to extend the motor shaft 6691, and the motor shaft 6691 pushes the slider 64 to move toward the side wall 59, which opens the switch 61 that controls the flow of the optically transparent liquid 54 from the adjustment chamber 68 to the refraction chamber 52.
In another example, the user presses the other of the two buttons to cause the motor shaft 6691 to contract, and the motor shaft 6691 pulls the slider 64 away from the side wall 59, which opens the switch 61 that controls the flow of the optically transparent liquid 54 from the diopter chamber 52 to the adjustment chamber 68.
It should be noted that the structure of the diopter member 50 is not limited to the refractive cavity 52, the light-transmissive liquid 54, the first film layer 56, the second film layer 58, and the side wall 59 described above; any structure that allows the diopter member 50 to change its diopter may be used. For example, in other implementations, the diopter member 50 includes a plurality of lenses and driving members for driving each lens from a storage position to a dioptric position. In this way, the diopter of the diopter member 50 can be changed by combining the plurality of lenses. Of course, the driving members can also drive each lens that has been moved to the dioptric position to move along the optical axis, thereby changing the diopter of the diopter member 50.
Thus, the form of the diopter member 50 includes both its shape and its state: the structure with the refractive cavity 52, the light-transmissive liquid 54, the first film layer 56, the second film layer 58, and the side wall 59 changes the diopter by changing the shape of the first film layer 56 and/or the second film layer 58, while the structure with a plurality of lenses and driving members changes the diopter by changing the state of the lenses.
In summary, the present embodiment provides a wearable device 100 that includes a display 40, a diopter member 50, and an adjustment mechanism 60. The diopter member 50 is disposed on the side of the display 40. An adjustment mechanism 60 is coupled to the diopter members 50, the adjustment mechanism 60 being operable to adjust the configuration of the diopter members 50 to adjust the diopter power of the diopter members 50.
The wearable device 100 of the embodiment of the application adjusts the form of the diopter member 50 through the adjustment mechanism 60 to adjust the diopter of the diopter member 50, so that a user with refractive error can clearly see the image displayed by the display 40, and the user experience is improved.
Furthermore, in the wearable device 100 of the embodiment of the application, the diopter member 50 and the adjustment mechanism 60 allow the dioptric power to be corrected continuously, so that users with different degrees of refractive error can all wear the device. Meanwhile, the diopter member 50 and the adjustment mechanism 60 occupy little space and do not affect the wearing experience of the wearable device 100. The user also does not need to purchase multiple lenses, which reduces cost.
Referring to fig. 8, an embodiment of the present application provides a control method of the wearable device 100. The wearable device 100 comprises a housing 20, a display 40 housed in the housing 20, and a plurality of acoustic-electric elements 110 arranged on the housing 20. The acoustic-electric elements 110 are used for collecting voice commands. The control method comprises the following steps:
step S12: determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device 100;
step S14: in the case where the issuer has the execution authority of the voice command, controlling the wearable device 100 to execute the voice command in the current operation mode and causing the display 40 to display the content corresponding to the voice command;
step S16: in a case where the issuer does not have the execution authority of the voice command, the wearable device 100 is controlled to ignore the voice command in the current operation mode.
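To make these three steps concrete, the following is an illustrative outline of the dispatch logic (a sketch only; the object, attribute, and method names are assumptions and not part of the patent, and the authority check itself is detailed in the following embodiments):

```python
def handle_voice_command(device, voice_command, issuer):
    """Dispatch a collected voice command according to steps S12 to S16 (illustrative sketch)."""
    mode = device.current_working_mode  # e.g. a conference mode or a game mode
    # Step S12: decide, from the current working mode, whether the issuer
    # has the authority to execute this voice command.
    if device.has_execution_authority(issuer, voice_command, mode):
        # Step S14: execute the command in the current working mode and
        # show the corresponding content on the display 40.
        device.execute(voice_command, mode)
        device.display.show(device.content_for(voice_command))
    else:
        # Step S16: ignore the command, so that an issuer without authority
        # cannot falsely trigger the wearable device.
        pass
```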
Referring to fig. 9, the present embodiment provides a control device 10 of the wearable device 100. The wearable device 100 includes a housing 20, a display 40 housed in the housing 20, and a plurality of acoustic-electric elements 110 provided on the housing 20. The acoustic-electric elements 110 are used to collect voice commands. The control device 10 comprises a determining module 12, a first control module 14, and a second control module 16. The determining module 12 is used for determining whether the issuer of a voice command has the execution authority of the voice command according to the current working mode of the wearable device 100; the first control module 14 is used for controlling the wearable device 100 to execute the voice command in the current working mode and causing the display 40 to display the content corresponding to the voice command in the case that the issuer has the execution authority of the voice command; and the second control module 16 is used for controlling the wearable device 100 to ignore the voice command in the current working mode in the case that the issuer does not have the execution authority of the voice command.
Referring to fig. 10, the present embodiment provides a wearable device 100. The wearable device 100 includes a processor 101, a housing 20, a display 40 housed in the housing 20, and a plurality of acoustic-electric elements 110 provided in the housing 20. A plurality of acousto-electric elements 110 are used to capture voice commands. The steps S12-S16 of the above control method may be implemented by the processor 101. In other words, the processor 101 is configured to determine whether the sender of the voice command has the execution authority of the voice command according to the current working mode of the wearable device 100; for controlling the wearable device 100 to execute the voice command in the current working mode and causing the display 40 to display the content corresponding to the voice command if the issuer has the execution authority of the voice command; and for controlling the wearable device 100 to ignore the voice command in the current operation mode if the issuer does not have the execution authority of the voice command.
According to the control method of the wearable device 100, the control device 10, and the wearable device 100 of the embodiments of the application, whether the issuer of a voice command has the execution authority of the voice command is determined according to the current working mode of the wearable device 100, and the wearable device 100 is controlled to execute or ignore the voice command accordingly. This prevents a person who does not have the execution authority of the voice command in the current working mode from falsely triggering the wearable device 100, and improves both the security and the user experience of the wearable device 100.
Specifically, the wearable device 100 may be an electronic device such as electronic glasses, electronic clothing, an electronic bracelet, an electronic necklace, an electronic tattoo, a watch, an in-ear earphone, a pendant, or a headset. The wearable device 100 may also be a head-mounted display (HMD) or a smart watch. The specific form of the wearable device 100 is not limited here.
Note that, for convenience of description, this embodiment describes the control method of the wearable device 100 by taking electronic glasses as an example of the wearable device 100. This does not limit the specific form of the wearable device 100.
The working mode of the wearable device 100 includes one or more of a personal mode, a conference mode, a game mode, a learning mode, and the like. The specific content of the working mode of the wearable device 100 is not limited here.
Each working mode of the wearable device 100 may have default settings for the execution authority of voice commands; therefore, whether the issuer of a voice command has the execution authority of the voice command can be determined according to the current working mode of the wearable device 100. The default settings may be designed in advance by the manufacturer and stored in the wearable device 100. Of course, the user can also customize the default settings of each working mode of the wearable device 100 according to usage requirements.
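For instance, such per-mode defaults could be represented as a simple permission table consulted in step S12 (an illustrative sketch; the mode names, command categories, and issuer groups are assumptions drawn from the examples below, not a definitive implementation):

```python
# Illustrative default execution-authority settings per working mode.
# Each working mode maps a command category to the issuer groups allowed to trigger it.
DEFAULT_AUTHORITY = {
    "conference": {
        "document": {"wearer", "non_wearer"},  # e.g. open, close, transmit, receive files
        "transfer": {"wearer"},                # payment commands: wearer only
    },
    "game": {
        "game": {"wearer", "non_wearer"},      # e.g. invitations, sharing the field of view
        "document": {"wearer"},
        "transfer": {"wearer"},
    },
}

def issuer_has_authority(mode: str, category: str, issuer_group: str) -> bool:
    """Return True if this issuer group may execute this command category in this mode."""
    return issuer_group in DEFAULT_AUTHORITY.get(mode, {}).get(category, set())

# Usage: a non-wearer may open a document in conference mode but may not transfer money.
print(issuer_has_authority("conference", "document", "non_wearer"))  # True
print(issuer_has_authority("conference", "transfer", "non_wearer"))  # False
```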
Referring to FIG. 11, in some embodiments, the working mode includes a conference mode and a game mode, the issuer includes a non-wearer, the voice command includes a document command, and step S12 includes:
step S121: under the condition that the current working mode is a conference mode, determining that a non-wearer has the execution authority of a document command in the current working mode;
step S123: in the case where the current operation mode is the game mode, it is determined that the non-wearer does not have the execution authority of the document command in the current operation mode.
In this way, it is realized that whether the issuer of the voice command has the execution authority of the voice command is determined according to the current operation mode of the wearable device 100.
Referring to FIG. 12, in some embodiments, the issuer includes a wearer and a non-wearer, the current working mode includes a conference mode, the voice command includes a transfer command, and step S12 includes:
step S125: determining that the issuer has an execution authority of the transfer command in the conference mode in case that the issuer is a wearer;
step S127: in the case where the issuer is a non-wearer, it is determined that the issuer does not have an execution authority of the transfer command in the conference mode.
In this way, it is realized that whether the issuer of the voice command has the execution authority of the voice command is determined according to the current operation mode of the wearable device 100.
Referring to FIG. 13, in some embodiments, the voice command includes a document command and a transfer command, the current working mode includes a conference mode, the issuer includes a non-wearer, and step S12 includes:
step S128: in the case that the voice command is a document command, determining that the non-wearer has an execution authority of the voice command in the conference mode;
step S129: in the case where the voice command is a transfer command, it is determined that the non-wearer does not have the execution authority of the voice command in the conference mode.
In this way, it is realized that whether the issuer of the voice command has the execution authority of the voice command is determined according to the current operation mode of the wearable device 100.
Specifically, in the example shown in fig. 14, the wearable device 100 is electronic glasses, six persons wearing the wearable device 100 are participating in a conference, and the current working mode of each wearable device 100 is the conference mode. It is understood that the issuer of the voice command is the sound source 200, and the persons other than the issuer are the audience 300.
In the example shown in fig. 14, the default settings for the conference mode are: a non-wearer has only the authority to execute commands related to the conference, for example document commands such as opening a file, closing a file, transmitting a file, and receiving a file; and the wearer has all authorities.
In one example, the issuer 200 issues a voice command: the document command "open document A". For the electronic glasses worn by the audience 300, the issuer 200 is a non-wearer. After receiving the document command "open document A", the electronic glasses worn by the audience 300 determine, according to the settings of the conference mode, that the issuer 200 is a non-wearer who has the execution authority of the document command "open document A" in the conference mode. The electronic glasses worn by the audience 300 therefore execute the document command "open document A", and the display 40 of the electronic glasses worn by the audience 300 displays the content of "document A".
For the electronic glasses worn by the issuer 200, the issuer 200 is the wearer. After receiving the document command "open document A", the electronic glasses worn by the issuer 200 determine, according to the settings of the conference mode, that the issuer 200 is the wearer and has the execution authority of the document command "open document A" in the conference mode. The electronic glasses worn by the issuer 200 therefore also execute the document command "open document A", and the display 40 of the electronic glasses worn by the issuer 200 displays the content of "document A".
In another example, the issuer 200 issues a voice command: the transfer command "transfer 1000 yuan to B". For the electronic glasses worn by the audience 300, the issuer 200 is a non-wearer. After receiving the transfer command "transfer 1000 yuan to B", the electronic glasses worn by the audience 300 determine, according to the settings of the conference mode, that the issuer 200 is a non-wearer who does not have the execution authority of the transfer command "transfer 1000 yuan to B" in the conference mode. The electronic glasses worn by the audience 300 therefore ignore the transfer command "transfer 1000 yuan to B".
For the electronic glasses worn by the issuer 200, the issuer 200 is the wearer. After receiving the transfer command "transfer 1000 yuan to B", the electronic glasses worn by the issuer 200 determine, according to the settings of the conference mode, that the issuer 200 is the wearer and has the execution authority of the transfer command "transfer 1000 yuan to B" in the conference mode. The electronic glasses worn by the issuer 200 therefore execute the transfer command "transfer 1000 yuan to B", and the display 40 of the electronic glasses worn by the issuer 200 displays a message box for "transfer 1000 yuan to B".
Referring to fig. 15, in the example shown in fig. 15, the wearable device 100 is electronic glasses, six persons wear the wearable device 100 to play a game, and the current working mode of each wearable device 100 is a game mode.
In the example shown in fig. 15, the default settings for the game mode are: a non-wearer has only the authority to execute game-related commands, such as commands for inviting others to join the game or sharing the field of view; and the wearer has all authorities. In this case, whether the issuer of a voice command has the execution authority of the voice command is determined according to the game mode.
Referring to fig. 16, in some embodiments, step S12 includes:
step S122: determining the authority level of the issuer;
step S124: determining the authority information of the voice command in the current working mode;
step S126: identifying whether the issuer has the execution authority of the voice command according to the authority information and the authority level.
In this way, it is realized that whether the issuer of the voice command has the execution authority of the voice command is determined according to the current operation mode of the wearable device 100. Specifically, here, the "authority level" refers to a level of authority that the issuer of the voice command has.
Further, the number of authority levels may be 2, 3, 4, or another number. In one example, the authority levels include two levels: a wearer level and a non-wearer level; in another example, the authority levels include three levels: a first level, a second level, and a third level. The number and specific form of the authority levels are not limited here.
Here, "authority information" refers to information as to whether or not the issuer of each authority level has an execution authority for a voice command in the current operation mode. Alternatively, the "authority information" herein refers to information of which level of the issuer the voice command is controlled in the current operation mode.
For example, suppose the authority levels of issuers include a first level and a second level, and the authority information of the voice command in the current working mode is one of the following: the first level is authorized and the second level is not; the second level is authorized and the first level is not; neither the first level nor the second level is authorized; or both the first level and the second level are authorized.
In one example, if the authority level of the issuer is the first level, and according to the default settings of the current working mode the authority information of the voice command is that the first level is authorized and the second level is not, then it can be determined that the issuer has the execution authority of the voice command in the current working mode.
In another example, if the authority level of the issuer is the first level, and the authority information of the voice command in the current working mode is that the second level is authorized and the first level is not, then it can be determined that the issuer does not have the execution authority of the voice command in the current working mode.
In another example, if the authority level of the issuer is the first level, and the authority information of the voice command in the current working mode is that neither the first level nor the second level is authorized, then it can be determined that the issuer does not have the execution authority of the voice command in the current working mode.
In another example, if the authority level of the issuer is the first level, and the authority information of the voice command in the current working mode is that both the first level and the second level are authorized, then it can be determined that the issuer has the execution authority of the voice command in the current working mode.
Note that, in the case where the authority level of the issuer is the second level, the manner of determining whether the issuer has the execution authority of the voice command according to the authority information and the authority level is similar to the above example, and details are not described here to avoid redundancy.
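A minimal sketch of this check (illustrative only; the level names and data representation are assumptions, not part of the patent):

```python
from enum import Enum

class Level(Enum):
    FIRST = 1
    SECOND = 2

def has_execution_authority(issuer_level: Level, authorized_levels: set) -> bool:
    """Steps S122 to S126: the issuer has the execution authority of the voice command
    if and only if the issuer's authority level appears in the authority information,
    i.e. the set of levels authorized in the current working mode."""
    return issuer_level in authorized_levels

# Usage with the authority information "first level is authorized, second level is not".
print(has_execution_authority(Level.FIRST, {Level.FIRST}))   # True
print(has_execution_authority(Level.SECOND, {Level.FIRST}))  # False
```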
Referring to fig. 17, in some embodiments, step S122 includes:
step S1222: determining identity information of an issuer;
step S1224: and determining the authority level of the issuer according to the identity information.
In this manner, the authority level of the issuer is determined. Specifically, the "identity information" here may refer to an inherent identity of the issuer (e.g., an identity uniquely determined by an identification number), or to an identity that the issuer has due to factors such as position, behavior, or status (e.g., being a wearer or a non-wearer of the electronic glasses).
In one example, the voice command is the document command "open document A", and the authority information of the document command is: the leader level has the authority to execute document commands in the conference mode. The identity information of the issuer is: identification number 00001. According to this identity information, the authority level of the issuer can be determined to be the leader level. Therefore, in the conference mode, when the issuer issues the document command "open document A", each pair of electronic glasses that receives the document command executes it, and the display 40 of each pair of electronic glasses that receives the document command displays the content of "document A".
In another example, the voice command is the transfer command "transfer 1000 yuan to B", and the authority information of the transfer command is: the wearer level has the authority to execute transfer commands in the conference mode. The identity information of the issuer is: wearer. According to this identity information, the authority level of the issuer can be determined to be the wearer level. In this case, when the issuer issues the transfer command "transfer 1000 yuan to B" in the conference mode, only the issuer's own electronic glasses execute the transfer command, and only the display 40 of the issuer's electronic glasses displays a message box for "transfer 1000 yuan to B".
In this embodiment, the authority level of the issuer is determined based on the identity information of the issuer. It will be appreciated that in other embodiments, the authority level of the issuer may be determined based on other information, for example the gender information, age information, credit information, account balance, or other information of the issuer. The specific manner of determining the authority level of the issuer is not limited here.
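As an illustration of such a mapping (a sketch under assumed rules; the level names and the example identification number come from the examples above and are not a definitive implementation):

```python
# Illustrative mapping from identity information to an authority level.
LEADER_IDS = {"00001"}  # e.g. pre-registered identification numbers of leaders (assumed)

def authority_level_from_identity(identity: dict) -> str:
    if identity.get("id_number") in LEADER_IDS:
        return "leader"
    if identity.get("is_wearer"):
        return "wearer"
    return "non_wearer"

# Usage
print(authority_level_from_identity({"id_number": "00001"}))  # leader
print(authority_level_from_identity({"is_wearer": True}))     # wearer
print(authority_level_from_identity({"is_wearer": False}))    # non_wearer
```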
Referring to fig. 18, in some embodiments, step S1222 includes:
step S1226: determining an issuing position of a voice command;
step S1229: determining the identity information of the issuer according to the issuing position.
Thus, the identity information of the issuer is determined. It is to be appreciated that in other embodiments, the identity information of the issuer may also be determined by voiceprint or other means. The specific manner in which the identity information of the issuer is determined is not limited here.
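One simple way to derive a wearer/non-wearer identity from the issuing position is a distance threshold around the device itself (an assumption for illustration only; the patent does not prescribe a specific rule):

```python
import numpy as np

WEARER_RADIUS_M = 0.3  # assumed threshold: a source this close to the glasses is the wearer

def identity_from_position(issuing_pos, device_pos) -> str:
    """Classify the issuer as wearer or non-wearer from the issuing position (illustrative)."""
    distance = np.linalg.norm(np.asarray(issuing_pos) - np.asarray(device_pos))
    return "wearer" if distance <= WEARER_RADIUS_M else "non_wearer"

print(identity_from_position([0.05, 0.00], [0.0, 0.0]))  # wearer
print(identity_from_position([1.20, 0.80], [0.0, 0.0]))  # non_wearer
```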
Further, referring to fig. 19 and 20, in some embodiments, step S1226 includes:
step S1227: determining the distance from the issuing position of the voice command to each acoustic-electric element 110;
step S1228: determining the issuing position of the voice command according to the position of each acoustic-electric element 110 and the corresponding distance.
In this way, the issuing position of the voice command is determined. In the example of fig. 20, the acoustic-electric elements 110 are disposed on the housing 20. The acoustic-electric elements 110 of the wearable device 100 are microphones; there are three microphones, whose position coordinates are denoted o1, o2, and o3, respectively. The issuer is the sound source 200, and the wearable device 100 of the audience 300 receives a voice command from the sound source 200.
Since the three microphones are located at different positions, the sound waves emitted from the sound source 200 take different times to reach each microphone. Suppose the sound waves emitted from the sound source 200 take times t1, t2, and t3 to reach the three microphones, respectively. The distances from the sound source 200 to the three microphones are then vt1, vt2, and vt3, respectively, where v is the speed of sound propagation in air.
A spherical surface can then be drawn around each microphone, with the microphone as its center and the distance from the sound source 200 to that microphone as its radius. That is, a first spherical surface is drawn with o1 as the center and vt1 as the radius; a second spherical surface is drawn with o2 as the center and vt2 as the radius; and a third spherical surface is drawn with o3 as the center and vt3 as the radius.
Finally, the intersection point of the three spherical surfaces is calculated; this intersection point is the position of the sound source 200. This method may be implemented by an algorithm.
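As a sketch of how this intersection can be computed (illustrative only; it assumes the propagation times t1, t2, and t3 are known, which in practice requires a common time reference, and it solves the problem in the plane of the three microphones, since with only three microphones the full three-dimensional solution is mirror-ambiguous about that plane):

```python
import numpy as np

V_SOUND = 343.0  # approximate speed of sound in air, in m/s

def locate_sound_source(mics, times):
    """Intersect the three circles of radius v*t_i centered on microphones o1, o2, o3.

    mics  : (3, 2) array of microphone coordinates in the microphone plane
    times : (3,)   propagation times of the same wavefront to each microphone
    Subtracting the first circle's equation from the other two yields a small
    linear system for the source position."""
    d = V_SOUND * np.asarray(times)            # distances vt1, vt2, vt3
    o1, o2, o3 = np.asarray(mics, dtype=float)
    a = 2.0 * np.array([o2 - o1, o3 - o1])
    b = np.array([
        o2 @ o2 - o1 @ o1 - d[1] ** 2 + d[0] ** 2,
        o3 @ o3 - o1 @ o1 - d[2] ** 2 + d[0] ** 2,
    ])
    return np.linalg.solve(a, b)

# Usage with made-up coordinates (in meters): recover a known source position.
mics = np.array([[0.00, 0.00], [0.14, 0.00], [0.07, 0.05]])
source = np.array([1.0, 2.0])
times = np.linalg.norm(mics - source, axis=1) / V_SOUND
print(locate_sound_source(mics, times))  # approximately [1.0, 2.0]
```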
Referring to fig. 21, in some embodiments, step S126 includes:
step S1262: determining that the issuer has an execution authority of the voice command in a case where the authority level matches the authority information;
step S1264: in the case where the authority level does not match the authority information, it is determined that the issuer does not have the execution authority of the voice command.
In this way, whether the issuer has the execution authority of the voice command is determined based on the authority information and the authority level. Specifically, "the authority level matches the authority information" means that the authority level of the issuer, determined based on the identity information of the issuer, is among the authority levels that the authority information of the voice command authorizes to execute the voice command.
Correspondingly, "the authority level does not match the authority information" means that the authority level of the issuer, determined based on the identity information of the issuer, is not among the authority levels that the authority information of the voice command authorizes to execute the voice command.
An embodiment of the present application further provides a computer-readable storage medium: one or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors 101, cause the processors 101 to perform the control method of any of the embodiments described above.
Fig. 10 is a schematic view of internal modules of the wearable device 100 in one embodiment. The wearable device 100 includes a processor 101, a memory 102 (e.g., a non-volatile storage medium), an internal memory 103, a display 104, and an input device 105 connected by a system bus 109.
The processor 101 may be used to provide computing and control capabilities and supports the operation of the entire wearable device 100. The internal memory 103 of the wearable device 100 provides an environment for running the computer-readable instructions stored in the memory 102. The display 104 of the wearable device 100 may be an OLED display or a Micro LED display, and the input device 105 may be the acoustoelectric element 110 disposed on the wearable device 100, a key, a trackball or a touch pad disposed on the wearable device 100, or an external keyboard, touch pad or mouse. The wearable device 100 may be a smart bracelet, a smart watch, a smart helmet, electronic glasses, or the like. It will be understood by those skilled in the art that the structures shown in the figures are only schematic illustrations of some of the structures relevant to the present application and do not limit the wearable device 100 to which the present application is applied; a particular wearable device 100 may include more or fewer components than shown in the figures, combine certain components, or have a different arrangement of components.
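Tying the pieces together on such a device, the following sketch shows the overall control flow: a collected voice command is either executed and displayed, or ignored, depending on the issuer's execution authority in the current working mode. It reuses the hypothetical has_execution_authority helper from the earlier sketch; the VoiceCommand and WearableDevice classes are likewise illustrative assumptions rather than part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class VoiceCommand:
    kind: str      # e.g. "document" or "transfer"
    content: str   # content shown on the display when the command is executed

@dataclass
class WearableDevice:
    working_mode: str  # current working mode, e.g. "conference" or "game"

    def execute_and_display(self, command: VoiceCommand):
        # Stand-in for executing the command and driving the display.
        print(f"executing {command.kind} command; display shows: {command.content}")

def handle_voice_command(device: WearableDevice, command: VoiceCommand, issuer_level: str):
    """Execute the command and display its content when the issuer has execution
    authority in the current working mode; otherwise ignore the command."""
    if has_execution_authority(device.working_mode, command.kind, issuer_level):
        device.execute_and_display(command)
    # else: the voice command is ignored in the current working mode

# A non-wearer's document command is executed in conference mode.
handle_voice_command(WearableDevice(working_mode="conference"),
                     VoiceCommand(kind="document", content="open the quarterly report"),
                     issuer_level="non-wearer")
```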
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware. The program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
The above examples express only several embodiments of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the scope of protection of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (13)

1. A control method of a wearable device, characterized in that the wearable device comprises a housing, a display accommodated in the housing and a plurality of acousto-electric elements arranged on the housing, wherein the acousto-electric elements are used for collecting voice commands, and the control method comprises the following steps:
determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device;
if the issuer has the execution authority of the voice command, controlling the wearable device to execute the voice command in the current working mode and enabling the display to display the content corresponding to the voice command;
and controlling the wearable device to ignore the voice command in the current working mode under the condition that the issuer does not have the execution authority of the voice command.
2. The method for controlling the wearable device according to claim 1, wherein determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device comprises:
determining a permission level of the issuer;
determining authority information of the voice command in the current working mode;
and identifying whether the issuer has the execution authority of the voice command according to the authority information and the authority level.
3. The method of controlling a wearable device according to claim 2, wherein determining the authority level of the issuer comprises:
determining identity information of the issuer;
and determining the authority level of the issuer according to the identity information.
4. The method for controlling a wearable device according to claim 3, wherein determining the identity information of the issuer comprises:
determining an issuing position of the voice command;
and determining the identity information of the issuer according to the issuing position.
5. The method of controlling a wearable device according to claim 4, wherein determining the location of issuance of the voice command comprises:
determining a distance from an issuing location of the voice command to each of the acousto-electric elements;
and determining the issuing position of the voice command according to the position of each acousto-electric element and the distance.
6. The method for controlling the wearable device according to claim 2, wherein identifying whether the issuer has the execution authority of the voice command according to the authority information and the authority level comprises:
determining that the issuer has an execution authority of the voice command in a case where the authority level matches the authority information;
in a case where the authority level does not match the authority information, it is determined that the issuer does not have the execution authority of the voice command.
7. The method for controlling the wearable device according to claim 1, wherein the working mode includes a conference mode and a game mode, the issuer includes a non-wearer, the voice command includes a document command, and determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device comprises:
determining that the non-wearer has the execution authority of the document command in the current working mode under the condition that the current working mode is the conference mode;
determining that the non-wearer does not have the execution authority of the document command in the current working mode, in a case where the current working mode is the game mode.
8. The method for controlling the wearable device according to claim 1, wherein the issuer comprises a wearer and a non-wearer, the current working mode comprises a conference mode, the voice command comprises a transfer command, and determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device comprises:
determining that the issuer has an execution authority of the transfer command in the conference mode in case that the issuer is the wearer;
in a case where the issuer is the non-wearer, it is determined that the issuer does not have an execution authority of the transfer command in the conference mode.
9. The method for controlling the wearable device according to claim 1, wherein the voice command includes a document command and a transfer command, the current working mode includes a conference mode, the issuer includes a non-wearer, and determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device comprises:
determining that the non-wearer has the execution authority of the voice command in the conference mode in a case where the voice command is the document command;
determining that the non-wearer does not have execution authority of the voice command in the conference mode in a case where the voice command is the transfer command.
10. A control device of a wearable device, characterized in that the wearable device comprises a housing, a display accommodated in the housing and a plurality of acousto-electric elements arranged on the housing, wherein the acousto-electric elements are used for collecting voice commands; the control device comprises a determining module, a first control module and a second control module; the determining module is used for determining whether the issuer of the voice command has the execution authority of the voice command according to the current working mode of the wearable device; the first control module is used for controlling the wearable device to execute the voice command in the current working mode and enabling the display to display the content corresponding to the voice command under the condition that the issuer has the execution authority of the voice command; and the second control module is used for controlling the wearable device to ignore the voice command in the current working mode under the condition that the issuer does not have the execution authority of the voice command.
11. A wearable device, comprising a processor, a housing, a display accommodated in the housing, and a plurality of acousto-electric elements arranged on the housing, wherein the acousto-electric elements are used for collecting voice commands, and the processor is used for executing the control method of the wearable device according to any one of claims 1 to 9.
12. The wearable device according to claim 11, wherein the wearable device comprises:
a display;
a diopter member disposed at one side of the display; and
an adjustment mechanism coupled to the diopter member for adjusting the configuration of the diopter member to adjust the diopter of the diopter member.
13. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the method of controlling a wearable device of any of claims 1-9.
CN201910398809.7A 2019-05-14 2019-05-14 Control method, control device, wearable device and storage medium Active CN111948807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910398809.7A CN111948807B (en) 2019-05-14 2019-05-14 Control method, control device, wearable device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910398809.7A CN111948807B (en) 2019-05-14 2019-05-14 Control method, control device, wearable device and storage medium

Publications (2)

Publication Number Publication Date
CN111948807A true CN111948807A (en) 2020-11-17
CN111948807B CN111948807B (en) 2022-10-25

Family

ID=73335638

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910398809.7A Active CN111948807B (en) 2019-05-14 2019-05-14 Control method, control device, wearable device and storage medium

Country Status (1)

Country Link
CN (1) CN111948807B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006238218A (en) * 2005-02-25 2006-09-07 Fujitsu Ltd Output method, output device and communication system
US20110001695A1 (en) * 2009-07-06 2011-01-06 Toshiba Tec Kabushiki Kaisha Wearable terminal device and method of controlling the wearable terminal device
JP2013046265A (en) * 2011-08-25 2013-03-04 Onkyo Corp Sound processor and program therefor
TW201349004A (en) * 2012-05-23 2013-12-01 Transcend Information Inc Voice control method and computer-implemented system for data management and protection
CN104503086A (en) * 2014-12-31 2015-04-08 青岛歌尔声学科技有限公司 Adjustable head-mounted display
JP2015194930A (en) * 2014-03-31 2015-11-05 セコム株式会社 Information processing device and information processing system
CN105579917A (en) * 2013-09-04 2016-05-11 依视路国际集团(光学总公司) Methods and systems for augmented reality
US20170072312A1 (en) * 2015-09-10 2017-03-16 Sap Se Instructions on a wearable device
CN206282045U (en) * 2016-09-28 2017-06-27 芦溪县上埠镇中学 A kind of intelligent glasses
CN107767875A (en) * 2017-10-17 2018-03-06 深圳市沃特沃德股份有限公司 Sound control method, device and terminal device
US20180308477A1 (en) * 2016-01-07 2018-10-25 Sony Corporation Control device, display device, method, and program
CN108986806A (en) * 2018-06-30 2018-12-11 上海爱优威软件开发有限公司 Sound control method and system based on Sounnd source direction
CN109087643A (en) * 2018-09-28 2018-12-25 联想(北京)有限公司 Sound control method, device and electronic equipment
CN109448716A (en) * 2018-12-06 2019-03-08 珠海格力电器股份有限公司 A kind of sound control method, device, storage medium and air-conditioning

Also Published As

Publication number Publication date
CN111948807B (en) 2022-10-25

Similar Documents

Publication Publication Date Title
CN112071311A (en) Control method, control device, wearable device and storage medium
US20220155910A1 (en) Method for displaying user interface and electronic device therefor
CN110398840B (en) Method for adjusting optical center distance, head-mounted device and storage medium
KR20150093054A (en) Electronic device including flexible display and operation method thereof
KR102579034B1 (en) An electronic device including a semi-transparent member disposed at an angle specified with respect to a direction in which a video is outputbelow the video outputmodule
CN105227703B (en) Glasses type terminal
WO2021036591A1 (en) Control method, control device, electronic device and storage medium
CN110189665A (en) Control method, wearable device and storage medium
US20220174764A1 (en) Interactive method, head-mounted device, interactive system and storage medium
KR102614047B1 (en) Electronic apparatus comprising flexible display
JP2002162597A (en) Wearable display device
EP4258826A1 (en) Electronic apparatus comprising flexible printed circuit board
CN111948807B (en) Control method, control device, wearable device and storage medium
CN110412766A (en) Control method, helmet and storage medium
CN112083796A (en) Control method, head-mounted device, mobile terminal and control system
CN110794587A (en) Wearing equipment and wearing subassembly
CN210690952U (en) Wearing equipment and wearing subassembly
US20240046578A1 (en) Wearable electronic device displaying virtual object and method for controlling the same
EP4137871A1 (en) Wearable electronic device comprising display
US11867910B2 (en) Wearable electronic device including sensor module
EP4350420A1 (en) Lens assembly including light-emitting element disposed on first lens, and wearable electronic device including same
US20240073508A1 (en) Wearable electronic device for controlling camera module and method for operating thereof
KR20230134961A (en) Electronic device and operating method thereof
KR20220126074A (en) Wearable electronic device comprising display
KR20240019668A (en) Wearable electronic device for display virtual object and method for controlling the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant