WO2021036591A1 - Control method, control device, electronic device and storage medium - Google Patents

Control method, control device, electronic device and storage medium

Info

Publication number
WO2021036591A1
WO2021036591A1 (PCT/CN2020/103341, CN2020103341W)
Authority
WO
WIPO (PCT)
Prior art keywords
external object
light
camera
electronic device
image
Prior art date
Application number
PCT/CN2020/103341
Other languages
English (en)
French (fr)
Inventor
林贻鸿
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to EP20859159.4A (published as EP3979046A4)
Publication of WO2021036591A1
Priority to US17/565,373 (published as US20220121292A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/57 Control of contrast or brightness
    • H04N5/58 Control of contrast or brightness in dependence upon ambient light
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • This application relates to the field of display technology, in particular to a control method, a control device, an electronic device, and a storage medium.
  • Electronic devices such as mobile phones and wearable devices can acquire gesture actions through a camera and control the electronic device to perform related operations according to the gesture actions.
  • For example, the electronic device can lock the screen according to the gesture action.
  • This application provides a control method, control device, electronic device, and storage medium.
  • The electronic device includes a camera, and the control method includes: detecting whether there is an external object within a preset range of the camera; turning on the camera to obtain an image of the external object when there is an external object within the preset range of the camera; and performing motion gesture recognition on the external object according to the image of the external object.
  • The control device of the embodiment of the present application is used in an electronic device, the electronic device includes a camera, and the control device includes:
  • a detection module, used to detect whether there is an external object within the preset range of the camera; and
  • a control module, used to turn on the camera to obtain the image of the external object when there is an external object within the preset range of the camera, and to perform motion gesture recognition on the external object according to the image of the external object.
  • The electronic device of the embodiment of the present application includes a camera and a processor. The processor is used to detect whether there is an external object within a preset range of the camera; to turn on the camera to obtain an image of the external object when there is an external object within the preset range; and to perform motion gesture recognition on the external object according to the image of the external object.
  • The non-volatile computer-readable storage medium of the embodiment of the present application contains computer-executable instructions that, when executed by one or more processors, cause the processors to execute the control method described above.
  • FIG. 1 is a three-dimensional schematic diagram of an electronic device according to an embodiment of the present application.
  • FIG. 2 is a schematic cross-sectional view of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic cross-sectional view of an electronic device according to an embodiment of the present application.
  • FIG. 4 is a schematic cross-sectional view of an electronic device according to an embodiment of the present application.
  • FIG. 5 is a schematic cross-sectional view of a light emitting component of an electronic device according to an embodiment of the present application.
  • FIG. 6 is a schematic cross-sectional view of a light emitting component of an electronic device according to another embodiment of the present application.
  • FIG. 7 is a three-dimensional schematic diagram of an electronic device according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of the principle structure of an electronic device according to an embodiment of the present application.
  • FIG. 9 is a schematic plan view of an electronic device according to another embodiment of the present application.
  • FIG. 10 is a schematic plan view of a partial structure of an electronic device according to an embodiment of the present application.
  • FIG. 11 is a schematic diagram of an adjustment process of an electronic device according to an embodiment of the present application.
  • FIG. 12 is another schematic diagram of the adjustment process of the electronic device according to the embodiment of the present application.
  • FIG. 13 is a schematic plan view of a partial structure of an electronic device according to another embodiment of the present application.
  • FIG. 14 is a schematic plan view of a light quantity adjusting component according to an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the relationship between the environmental brightness and the light transmittance of the light quantity adjustment component according to an embodiment of the present application.
  • FIG. 16 is a schematic diagram of modules of an electronic device according to an embodiment of the present application.
  • FIG. 17 is a schematic diagram of modules of an electronic device according to another embodiment of the present application.
  • FIG. 18 is a schematic diagram of internal modules of an electronic device according to an embodiment of the present application.
  • FIG. 19 is a schematic diagram of a scene of an electronic device according to an embodiment of the present application.
  • FIG. 20 is a schematic flowchart of a control method according to an embodiment of the present application.
  • FIG. 21 is a schematic diagram of modules of a control device according to an embodiment of the present application.
  • FIG. 22 is a schematic flowchart of a control method according to an embodiment of the present application.
  • FIG. 23 is a schematic flowchart of a control method according to an embodiment of the present application.
  • FIG. 24 is a schematic flowchart of a control method according to an embodiment of the present application.
  • Electronic device 100: sensor assembly 10, light emitting component 11, packaging shell 111, first light emitting source 112, second light emitting source 113, substrate 114, diffuser 115, depth camera 12, environmental camera 13, light sensor 14, electrochromic device 120, antireflection film 130, housing 20, inner surface 201, outer surface 202, light-passing hole 203, light-transmitting portion 204, receiving chamber 22, housing top wall 24, housing bottom wall 26, gap 262, housing side wall 28, support member 30, first bracket 32, first bending portion 322, second bracket 34, second bending portion 342, elastic band 36, display 40, refractive member 50, refractive cavity 52, light-transmitting liquid 54, first film layer 56, second film layer 58, side wall 59, adjusting mechanism 60, cavity 62, sliding groove 622, sliding member 64, driving component 66, knob 662, screw 664, gear 666, rack 668, adjustment cavity 68, light guide member 70, first side 71, second side 72, light quantity adjustment member 80, first conductive layer 81, second conductive layer 82, electrochromic layer 83, electrolyte layer 84, ion storage layer 85, processor 90, collimation component 92, driving chip 94;
  • Control device 200: detection module 210, control module 220.
  • the embodiment of the present application provides an electronic device 100.
  • the electronic device 100 is, for example, a mobile terminal such as a mobile phone, a tablet computer, and a wearable device.
  • the wearable device is, for example, a head-mounted display device (Head Mount Display, HMD).
  • The head-mounted display device cooperates with a computing system and an optical system. After the user wears the head-mounted display device, it can send optical signals to the user's eyes to achieve different effects such as virtual reality (VR), augmented reality (AR), and mixed reality (MR).
  • the electronic device 100 of the embodiment of the present application is described in detail by taking a head-mounted display device as an example.
  • the electronic device 100 of the embodiment of the present application includes a sensor assembly 10, a housing 20 and an electrochromic device 120.
  • the sensor assembly 10 is provided in the housing 20.
  • the electrochromic device 120 is arranged in the housing 20 and corresponding to the sensor assembly 10.
  • the electrochromic device 120 covers the sensor assembly 10.
  • the electrochromic device 120 can change its own light transmittance according to the state of the electronic device 100, thereby shielding or exposing the sensor assembly 10, and improving the appearance effect of the electronic device 100.
  • the state of the electronic device 100 is, for example, a working state and a non-working state.
  • the electronic device 100 can display images for the user, play video, audio, and other information, and can perform user operations.
  • the electronic device 100 may switch the display screen according to a user operation.
  • When the sensor assembly 10 is working, the light transmittance of the electrochromic device 120 can be controlled to increase so that the sensor assembly 10 is exposed, in order to obtain information from outside the electronic device 100 or to transmit information to the outside of the electronic device 100. If the sensor assembly 10 is closed, the light transmittance of the electrochromic device 120 can be controlled to decrease to shield the sensor assembly 10, thereby improving the appearance of the electronic device 100. A minimal sketch of this switching logic is given below.
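  • The following sketch illustrates the shielding behavior described above. It assumes a hypothetical voltage driver exposing a set_transmittance method; the class name, method names, and transmittance values are illustrative, not taken from the patent.

```python
# Minimal sketch of electrochromic shielding for the sensor assembly 10.
# `driver` is a hypothetical interface to the voltage applied across the
# electrochromic device 120; the 0.9/0.05 transmittance values are assumptions.

class ElectrochromicShield:
    def __init__(self, driver, clear: float = 0.9, opaque: float = 0.05):
        self.driver = driver
        self.clear = clear    # high transmittance: sensor assembly exposed
        self.opaque = opaque  # low transmittance: sensor assembly shielded

    def on_sensor_state_change(self, sensor_active: bool) -> None:
        # Working state: raise transmittance so the sensors can send and
        # receive signals; otherwise lower it to hide them behind the housing.
        target = self.clear if sensor_active else self.opaque
        self.driver.set_transmittance(target)
```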
  • the sensor assembly 10 includes at least one of a light emitting component 11, a depth camera 12, an environment camera 13, a light sensor 14 and a proximity sensor 15.
  • the sensor assembly 10 includes a depth camera 12, a proximity sensor 15 or a light sensor 14.
  • the sensor assembly 10 includes a depth camera 12 and a proximity sensor 15.
  • the sensor assembly 10 includes a light emitting component 11, a depth camera 12, an environment camera 13, and a proximity sensor 15. Therefore, the light emitting component 11, the depth camera 12, the environment camera 13, and the proximity sensor are all provided in the housing 20.
  • the electrochromic device 120 covers the light emitting part 11, the depth camera 12 and the environment camera 13 and is used to change its own light transmittance to shield or expose at least one of the light emitting part 11, the depth camera 12 and the environment camera 13.
  • the light emitting part 11 is used to emit light.
  • the light emitting component 11 may emit visible light or invisible light such as infrared light.
  • the environmental camera 13 includes, but is not limited to, a color camera, an infrared camera, and a black and white camera.
  • the electronic device 100 may use the environmental camera 13 to capture an image of an object.
  • the environment camera 13 is used to obtain spatial environment information.
  • The electronic device 100 can recognize the type of an object according to the image captured by the environmental camera 13; for example, it can recognize that the object is a human hand, a table, or the like.
  • the electronic device 100 may obtain spatial environment information from the environment camera 13 to form a spatial environment map.
  • The depth camera 12 includes, but is not limited to, a TOF (Time of Flight) camera or a structured light camera.
  • The depth camera 12 can acquire a depth image of an object; after processing, the depth image can be used for three-dimensional modeling of the object, motion recognition, and so on.
  • the proximity sensor includes an infrared transmitter and an infrared receiver, and the infrared transmitter and the infrared receiver can cooperate to detect the distance between the external object and the electronic device 100.
  • the light sensor 14 can be used to detect the brightness of the environment, and the electronic device 100 can display images of appropriate brightness according to the brightness of the environment to improve user experience.
  • the sensor assembly 10 can be directly arranged on the housing 20 or indirectly on the housing 20.
  • the sensor assembly 10 is arranged on the housing 20 through a bracket, or in other words, the sensor assembly 10 is fixed on the bracket, and the bracket is fixed on the housing 20.
  • The number of sensor assemblies 10 can be one or more. When there are multiple sensor assemblies 10, they can be arranged at different positions of the housing 20, as long as the sensor assemblies 10 do not interfere with the normal use of the user.
  • the electrochromic device 120 may have different light transmittances according to different applied voltages.
  • the electrochromic device 120 can filter light of a predetermined color.
  • the electrochromic device 120 can filter colored light such as blue light.
  • the electrochromic device 120 has a sheet shape.
  • the electrochromic device 120 may be arranged on the housing 20 or on the sensor assembly 10 or between the housing 20 and the sensor assembly 10.
  • For example, the electrochromic device 120 is pasted on the housing 20 or the sensor assembly 10 with optical glue; in another example, the electrochromic device 120 is arranged between the housing 20 and the sensor assembly 10 through a transparent frame, with gaps between the electrochromic device 120 and both the sensor assembly 10 and the housing 20.
  • the electrochromic device 120 covering the sensor assembly 10 means that the orthographic projection of the sensor assembly 10 on the electrochromic device 120 is located in the electrochromic device 120.
  • the orthographic projection of at least one of the light emitting component 11, the depth camera 12, the environment camera 13, and the proximity sensor is located in the electrochromic device 120.
  • each electrochromic device 120 corresponds to one of the light emitting component 11, the depth camera 12, the environment camera 13, and the proximity sensor.
  • the housing 20 includes an inner surface 201 and an outer surface 202.
  • The housing 20 is formed with a light-passing hole 203 penetrating the inner surface 201 and the outer surface 202, and the sensor assembly 10 is arranged corresponding to the light-passing hole 203.
  • the electrochromic device 120 is attached to the outer surface 202 of the housing 20.
  • at least one of the light emitting component 11, the depth camera 12, the environment camera 13, and the proximity sensor is arranged corresponding to the light-passing hole 203.
  • the sensor assembly 10 can transmit signals to the outside and/or receive signals from the outside through the light-through hole 203.
  • the electrochromic device 120 may cover the light-passing hole 203 and cover the sensor assembly 10. It can be understood that when the sensor assembly 10 transmits a signal to the outside, the signal passes through the light-passing hole 203 and the electrochromic device 120.
  • the light-passing hole 203 may be a round hole, an elliptical hole, or a square-shaped hole, and the shape of the light-passing hole 203 is not limited herein.
  • the number of light-passing holes 203 may be one or multiple.
  • the number of the light-passing hole 203 is one.
  • the number of the light-passing holes 203 is multiple.
  • the light emitting component 11, the depth camera 12, the environment camera 13 and the proximity sensor may be arranged corresponding to one light-passing hole 203.
  • the housing 20 is formed with a receiving chamber 22, and the inner surface 201 of the housing 20 is a surface surrounding the receiving chamber.
  • the outer surface 202 of the casing 20 is a surface opposite to the inner surface 201 of the casing 20.
  • the sensor assembly 10 is contained in the receiving chamber 22.
  • the sensor assembly 10 is at least partially located in the light-passing hole 203.
  • the sensor assembly 10 may be partially located in the light-passing hole 203, or all of it can be located in the light-passing hole 203. In this way, the structure between the sensor assembly 10 and the housing 20 is relatively compact, and the volume of the electronic device 100 can be reduced.
  • the housing 20 includes a light-transmitting portion 204 corresponding to the sensor assembly 10, and an electrochromic device 120 is attached to the inner surface 201 of the light-transmitting portion 204.
  • the housing 20 is at least partially light-transmissive, so that the sensor assembly 10 can transmit signals to the outside and receive signals from the outside.
  • the light emitting part 11 may emit light through the light transmitting part 204.
  • the depth camera 12 can obtain the depth information of the target object through the light transmitting part 204.
  • the light-transmitting portion 204 may be made of a light-transmitting material.
  • the material of the light-transmitting portion 204 is a light-transmitting material such as acrylic.
  • the cross section of the light-transmitting portion 204 may be square, round, or irregular. It should be noted that the light-transmitting portion 204 can transmit visible light or non-visible light.
  • the other parts of the housing 20 except for the light-transmitting portion 204 may be light-transmissive or non-light-transmissive.
  • the housing 20 is a light-transmitting housing, and the electrochromic device 120 is attached to and covered on the outer surface 202.
  • The electrochromic device 120 covers the outer surface 202 of the housing 20. In this way, the electrochromic device 120 can not only cover the sensor assembly 10, but also improve the appearance of the electronic device 100.
  • The electrochromic device 120 can be controlled to present different colors according to different requirements to change the overall appearance of the electronic device 100. It can be understood that after the voltage of the electrochromic device 120 is changed, it can show different colors. For example, the electrochromic device 120 can be green, red, blue, or a gradient color, so that the entire electronic device 100 appears green, red, blue, or in a gradient color.
  • the electrochromic device 120 is only shown attached to a part of the outer surface 202 of the housing 20.
  • the electronic device 100 includes an antireflection film 130 laid on the electrochromic device 120, and the electrochromic device 120 is sandwiched between the outer surface 202 and the antireflection film 130.
  • the antireflection film 130 can not only protect the electrochromic device 120, but also improve the overall appearance of the electronic device 100.
  • the material of the antireflection film 130 can be calcium fluoride, etc., and its function is to reduce reflection and increase light transmittance.
  • the light emitting component 11 includes a packaging shell 111, a first light emitting source 112, a second light emitting source 113, a substrate 114 and a diffuser 115.
  • the first light-emitting source 112 and the second light-emitting source 113 are both disposed on the substrate 114 and located in the packaging shell 111.
  • the substrate 114 is fixedly connected to the packaging shell 111.
  • the substrate 114 is fixedly connected to the package shell 111 by bonding or welding.
  • the packaging shell 111 may be made of materials such as plastic or metal.
  • the material of the packaging shell 111 may be stainless steel.
  • the cross section of the packaging shell 111 may be square, circular, oval, or the like.
  • An opening 1110 is formed at one end of the packaging shell 111 away from the substrate 114.
  • the first light source 112 is used to emit the first light to the outside of the electronic device 100.
  • the second light source 113 is used to emit a second light to the outside of the electronic device 100 and used to fill the environment camera 13 with light.
  • The depth camera 12 is used to receive the first light reflected by the target object to obtain the depth information of the target object. Further, both the first light and the second light exit through the diffuser 115.
  • the first light and the second light are both infrared light, and the wavelength of the first light is different from the wavelength of the second light.
  • the wavelength of the first light is 940 nm.
  • the wavelength of the second light is 850 nm.
  • the first light and/or the second light may be visible light. It can be understood that when the first light is infrared light, the depth camera 12 is an infrared camera.
  • the number of the second light-emitting source 113 is multiple, and the multiple second light-emitting sources 113 are arranged at intervals around the first light-emitting source 112.
  • the number of the second light-emitting sources 113 is 4, and the 4 second light-emitting sources are distributed at equal angular intervals around the first light-emitting source.
  • the first light-emitting source 112 and/or the second light-emitting source 113 includes a Vertical Cavity Surface Emitting Laser (VCSEL) chip, and the VCSEL chip includes a plurality of VCSEL light sources arranged in an array.
  • The substrate 114 may be a flexible circuit board, a rigid circuit board, or a combination of a flexible circuit board and a rigid circuit board.
  • the diffusion sheet 115 is provided at the opening 1110.
  • the diffusion sheet 115 is used to disperse the first light and the second light so that the first light and the second light can be uniformly projected on the target object.
  • the first light-emitting source 112 and the second light-emitting source 113 are both arranged in the same packaging shell 111, which can make the structure of the light emitting component 11 more compact and reduce the volume of the electronic device 100 .
  • the electronic device 100 of the embodiment of the present application includes a display 40, a light guide member 70, and a light quantity adjustment member 80.
  • the light emitting component 11, the depth camera 12 and the environment camera 13 are all staggered from the display 40.
  • the light emitting part 11, the depth camera 12 and the environment camera 13 are all staggered from the light guide part 70.
  • the light guide member 70 is provided separately from the display 40.
  • the light guide member 70 includes a first side 71 and a second side 72 opposite to each other.
  • the light guide member 70 is used to guide the light generated by the display 40 and emit it from the first side 71.
  • the light quantity adjusting component 80 is disposed on the second side 72, and the light quantity adjusting component 80 is used to adjust the amount of ambient light incident on the second side 72.
  • With augmented reality devices, the user can see the content displayed by the device in a real scene through the device. It can be understood that the ambient light and the light formed by the augmented reality device enter the human eye at the same time. If the ambient brightness is high, the contrast between the display brightness of the augmented reality device and the ambient brightness is too low, and it is difficult for the human eye to see the displayed content. If the ambient brightness is low, the contrast between the display brightness of the augmented reality device and the ambient brightness is too high, and the displayed content is likely to irritate the eyes and cause eye fatigue.
  • In the related art, in order to solve the problem that the contrast between the display brightness of the augmented reality device and the ambient brightness is too high or too low, the display brightness of the augmented reality device is generally adjusted.
  • However, when the ambient brightness is high, increasing the display brightness of the augmented reality device to improve the image clarity observed by the human eye makes the power consumption of the device greater, and the large amount of heat generated affects the user experience.
  • In the embodiment of the present application, the light quantity adjusting component 80 can adjust the amount of ambient light incident from the second side 72 and emitted from the first side 71, thereby reducing the influence of the ambient light on the light that is generated by the display 40 and emitted from the first side 71. This is beneficial for the user to watch the content displayed on the display 40 and improves the user experience.
  • When the user wears the electronic device 100, the human eye is located outside the first side 71. Therefore, the light generated by the display 40 can enter the human eye after exiting the first side 71, so that the user can observe the image displayed on the display 40.
  • the ambient light passes through the light quantity adjusting component 80, the second side 72, and the first side 71 in sequence, and then enters the human eye, so that the user can see the environment. Therefore, the light quantity adjusting component 80 of the present application can adjust the ambient light entering the human eye, thereby reducing the influence of the ambient light on the image observed by the human eye.
  • the electronic device 100 of the embodiment of the present application further includes a supporting member 30, a refractive member 50, an adjusting mechanism 60, a processor 90, a light sensor 14 and a collimating member 92.
  • the housing 20 is an external component of the electronic device 100 and plays a role of protecting and fixing the internal components of the electronic device 100.
  • the internal components can be enclosed by the housing 20, and direct damage to these internal components caused by external factors can be avoided.
  • the housing 20 may be used to fix at least one of the display 40, the refractive member 50, the adjustment mechanism 60, the light guide member 70, and the light quantity adjustment member 80.
  • the housing 20 is formed with a receiving chamber 22, and the display 40 and the refractive member 50 are received in the receiving chamber 22.
  • the adjustment mechanism 60 is partially exposed from the housing 20.
  • the housing 20 further includes a housing top wall 24, a housing bottom wall 26 and a housing side wall 28.
  • a gap 262 is formed in the middle of the bottom wall 26 of the housing facing the top wall 24 of the housing.
  • the housing 20 has a substantially "B" shape.
  • The housing 20 can be formed by machining an aluminum alloy with a Computerized Numerical Control (CNC) machine tool, or it can be injection-molded from polycarbonate (PC) or from a blend of PC and acrylonitrile butadiene styrene (ABS).
  • the specific manufacturing method and specific materials of the housing 20 are not limited here.
  • the supporting member 30 is used to support the electronic device 100.
  • the electronic device 100 may be fixed on the user's head through the supporting member 30.
  • the supporting member 30 includes a first bracket 32, a second bracket 34 and an elastic band 36.
  • The first bracket 32 and the second bracket 34 are symmetrically arranged about the gap 262. Specifically, the first bracket 32 and the second bracket 34 are rotatably arranged on the edge of the housing 20, and when the user does not need to use the electronic device 100, the first bracket 32 and the second bracket 34 can be folded close to the housing 20 to facilitate storage. When the user needs to use the electronic device 100, the first bracket 32 and the second bracket 34 can be unfolded to perform their supporting function.
  • A first bending portion 322 is formed at the end of the first bracket 32 away from the housing 20, and the first bending portion 322 is bent toward the bottom wall 26 of the housing. In this way, when the user wears the electronic device 100, the first bending portion 322 can rest on the user's ear, so that the electronic device 100 does not easily slip off.
  • a second bent portion 342 is formed at one end of the second bracket 34 away from the housing 20.
  • the explanation and description of the second bending portion 342 may refer to the first bending portion 322, and in order to avoid redundancy, it will not be repeated here.
  • the elastic band 36 detachably connects the first bracket 32 and the second bracket 34. In this way, when the user wears the electronic device 100 for strenuous activities, the elastic band 36 can be used to further fix the electronic device 100 to prevent the electronic device 100 from loosening or even falling during strenuous activities. It can be understood that in other examples, the elastic band 36 may also be omitted.
  • the display 40 includes an OLED display screen.
  • the OLED display screen does not require a backlight, which is beneficial to the thinning of the electronic device 100.
  • The OLED display screen also has a large viewing angle and consumes little power, which is conducive to saving power.
  • the display 40 may also be an LED display or a Micro LED display. These displays are merely examples and the embodiments of the present application are not limited thereto.
  • the refractive component 50 is arranged on the side of the display 40.
  • the refractive member is located on the first side 71 of the light guide member 70.
  • the refractive component 50 includes a refractive cavity 52, a light-transmitting liquid 54, a first film layer 56, a second film layer 58 and a side wall 59.
  • the light-transmitting liquid 54 is disposed in the refractive cavity 52.
  • the adjustment mechanism 60 is used to adjust the amount of the light-transmitting liquid 54 to adjust the shape of the refractive member 50.
  • The second film layer 58 is disposed opposite to the first film layer 56, and the side wall 59 connects the first film layer 56 and the second film layer 58. The first film layer 56, the second film layer 58, and the side wall 59 enclose the refractive cavity 52, and the adjusting mechanism 60 is used to adjust the amount of the light-transmitting liquid 54 to change the shape of the first film layer 56 and/or the second film layer 58.
  • "changing the shape of the first film layer 56 and/or the second film layer 58" includes three cases: the first case: changing the shape of the first film layer 56 without changing the shape of the second film layer 58; The second case: the shape of the first film layer 56 is not changed and the shape of the second film layer 58 is changed; the third case: the shape of the first film layer 56 is changed and the shape of the second film layer 58 is changed.
  • the first case is taken as an example for description.
  • the first film layer 56 may have elasticity. It can be understood that when the amount of the light-transmitting liquid 54 in the refractive cavity 52 changes, the pressure in the refractive cavity 52 also changes, so that the shape of the refractive component 50 changes.
  • When the adjustment mechanism 60 reduces the amount of the light-transmitting liquid 54 in the refractive cavity 52, the pressure in the refractive cavity 52 decreases, the difference between the pressure outside the refractive cavity 52 and the pressure inside it increases, and the refractive cavity 52 becomes more concave.
  • When the adjusting mechanism 60 increases the amount of the light-transmitting liquid 54 in the refractive cavity 52, the pressure in the refractive cavity 52 increases, the difference between the pressure outside the refractive cavity 52 and the pressure inside it decreases, and the refractive cavity 52 becomes more convex.
  • the shape of the refractive member 50 can be adjusted by adjusting the amount of the light-transmitting liquid 54.
  • the adjustment mechanism 60 is connected to the refractive member 50.
  • the adjustment mechanism 60 is used to adjust the form of the refractive member 50 to adjust the refractive power of the refractive member 50.
  • the adjustment mechanism 60 includes a cavity 62, a sliding member 64, a driving part 66, an adjustment cavity 68 and a switch 61.
  • the sliding member 64 is slidably disposed in the cavity 62, the driving member 66 is connected to the sliding member 64, the cavity 62 and the sliding member 64 jointly define an adjustment cavity 68, the adjustment cavity 68 communicates with the refractive cavity 52 through the side wall 59, and the driving member 66 is used to drive the sliding member 64 to slide relative to the cavity 62 to adjust the volume of the adjustment cavity 68 to adjust the amount of the light-transmitting liquid 54 in the refractive cavity 52.
  • the volume of the adjustment cavity 68 is adjusted by the sliding member 64 to adjust the amount of the light-transmitting liquid 54 in the refractive cavity 52.
  • When the sliding member 64 slides away from the side wall 59, the volume of the adjusting cavity 68 increases, the pressure in the adjusting cavity 68 decreases, the light-transmitting liquid 54 in the refractive cavity 52 enters the adjusting cavity 68, and the first film layer 56 becomes increasingly recessed inward.
  • When the sliding member 64 slides toward the side wall 59, the volume of the adjusting cavity 68 decreases, the pressure in the adjusting cavity 68 increases, the light-transmitting liquid 54 in the adjusting cavity 68 enters the refractive cavity 52, and the first film layer 56 protrudes increasingly outward.
  • the side wall 59 is formed with a flow channel 5, and the flow channel 5 communicates with the adjusting cavity 68 and the refractive cavity 52.
  • the adjustment mechanism 60 includes a switch 61 provided in the flow channel 5, and the switch 61 is used to control the open and close state of the flow channel 5.
  • the number of switches 61 is two, and both switches 61 are one-way switches.
  • One switch 61 is used to control the flow of the light-transmitting liquid 54 from the adjusting cavity 68 to the refractive cavity 52, and the other switch 61 is used to control the flow of the light-transmitting liquid 54 from the refractive cavity 52 to the adjusting cavity 68.
  • the flow of the transparent liquid 54 between the adjusting cavity 68 and the refractive cavity 52 is realized through the switch 61 to maintain the pressure balance on both sides of the side wall 59.
  • the change in the volume of the adjustment cavity 68 will cause the pressure in the adjustment cavity 68 to change, thereby causing the flow of the transparent liquid 54 between the adjustment cavity 68 and the refractive cavity 52.
  • the switch 61 controls whether the flow of the light-transmitting liquid 54 between the adjusting cavity 68 and the refractive cavity 52 can be realized by controlling the open and closed state of the flow channel 5, thereby controlling the adjustment of the shape of the refractive component 50.
  • When the switch 61 that controls the flow of the light-transmitting liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is turned on and the sliding member 64 slides away from the side wall 59, the volume of the adjustment cavity 68 increases, the pressure in the adjustment cavity 68 decreases, the light-transmitting liquid 54 in the refractive cavity 52 enters the adjustment cavity 68 through the switch 61, and the first film layer 56 becomes increasingly recessed inward.
  • When that switch 61 is closed, even if the sliding member 64 slides away from the side wall 59 so that the volume of the adjustment cavity 68 increases and the pressure inside it decreases, the light-transmitting liquid 54 in the refractive cavity 52 cannot enter the adjustment cavity 68, and the shape of the first film layer 56 does not change.
  • When the switch 61 that controls the flow of the light-transmitting liquid 54 from the adjustment cavity 68 to the refractive cavity 52 is turned on and the sliding member 64 slides toward the side wall 59, the volume of the adjustment cavity 68 decreases, the pressure in the adjustment cavity 68 increases, the light-transmitting liquid 54 in the adjustment cavity 68 enters the refractive cavity 52 through the switch 61, and the first film layer 56 protrudes increasingly outward.
  • When that switch 61 is closed, even if the sliding member 64 slides toward the side wall 59 so that the volume of the adjustment cavity 68 decreases and the pressure inside it increases, the light-transmitting liquid 54 in the adjustment cavity 68 cannot enter the refractive cavity 52, and the shape of the first film layer 56 does not change.
  • the driving member 66 can realize its function of driving the sliding member 64 to slide based on various structures and principles.
  • the driving component 66 includes a knob 662 and a screw 664.
  • the screw 664 is connected to the knob 662 and the sliding member 64.
  • The knob 662 is used to drive the screw 664 to rotate, so as to drive the sliding member 64 to slide relative to the cavity 62.
  • The sliding member 64 can be driven by the knob 662 and the screw 664. Since the screw 664 and the knob 662 cooperate to convert the rotary motion of the knob 662 into the linear motion of the screw 664, when the user rotates the knob 662, the screw 664 can drive the sliding member 64 to slide relative to the cavity 62, thereby changing the volume of the adjustment cavity 68 and further adjusting the amount of the light-transmitting liquid 54 in the refractive cavity 52.
  • the knob 662 can be exposed from the housing 20 to facilitate the rotation of the user.
  • A threaded portion is formed on the knob 662, a matching threaded portion is formed on the screw 664, and the knob 662 and the screw 664 are threadedly connected.
  • When the knob 662 is rotated, the corresponding switch 61 can be opened. In this way, the light-transmitting liquid 54 can flow, and the pressure balance on both sides of the side wall 59 is ensured.
  • For example, when the knob 662 rotates clockwise, the sliding member 64 slides in a direction away from the side wall 59, and the switch 61 that controls the flow of the light-transmitting liquid 54 from the refractive cavity 52 to the adjustment cavity 68 is turned on; when the knob 662 rotates counterclockwise, the switch 61 that controls the flow of the light-transmitting liquid 54 from the adjusting cavity 68 to the refractive cavity 52 is turned on.
  • In one example, the rotation angle of the knob 662 and the refractive power of the refractive member 50 are not associated, and the user only needs to rotate the knob 662 to the position with the best visual experience.
  • In another example, the rotation angle of the knob 662 and the refractive power of the refractive member 50 may also be correlated.
  • the driving component 66 includes a gear 666 and a rack 668 meshing with the gear 666.
  • the rack 668 connects the gear 666 and the sliding member 64.
  • The gear 666 is used to drive the rack 668 to move, so as to drive the sliding member 64 to slide relative to the cavity 62.
  • The sliding member 64 is driven by the gear 666 and the rack 668. Since the gear 666 and the rack 668 cooperate to convert the rotary motion of the gear 666 into the linear motion of the rack 668, when the user rotates the gear 666, the rack 668 can drive the sliding member 64 to slide relative to the cavity 62, thereby changing the volume of the adjustment cavity 68 and further adjusting the amount of the light-transmitting liquid 54 in the refractive cavity 52.
  • the gear 666 can be exposed from the housing 20 to facilitate the rotation of the user.
  • When the gear 666 is rotated, the corresponding switch 61 can be opened. In this way, the light-transmitting liquid 54 can flow, and the pressure balance on both sides of the side wall 59 is ensured.
  • For example, when the gear 666 rotates clockwise, the meshing rack 668 is drawn in so that its exposed length shortens and the sliding member 64 is pulled away from the side wall 59, and the switch 61 that controls the flow of the light-transmitting liquid 54 from the refractive cavity 52 to the adjusting cavity 68 is opened.
  • When the gear 666 rotates counterclockwise, the meshing rack 668 is pushed out so that its exposed length increases and the sliding member 64 is pushed toward the side wall 59, and the switch 61 that controls the flow of the light-transmitting liquid 54 from the adjusting cavity 68 to the refractive cavity 52 is turned on.
  • In one example, the rotation angle of the gear 666 and the refractive power of the refractive member 50 are not associated, and the user only needs to rotate the gear 666 to the position with the best visual experience.
  • In another example, the rotation angle of the gear 666 and the refractive power of the refractive member 50 may also be correlated.
  • The structure of the refractive component 50 is not limited to the above refractive cavity 52, light-transmitting liquid 54, first film layer 56, second film layer 58, and side wall 59; any structure that allows the refractive component 50 to change its refractive power is possible.
  • the refractive component 50 includes a plurality of lenses and a driving member, and the driving member is used to drive each lens to move from the storage position to the refractive position.
  • the refractive power of the refractive component 50 can be changed by the combination of a plurality of lenses.
  • the driving member can also drive each lens moved to the refractive position to move on the refractive axis, thereby changing the refractive power of the refractive component 50.
  • the shape of the above refractive component includes the shape and state of the refractive component.
  • The structure with the refractive cavity 52, the light-transmitting liquid 54, the first film layer 56, the second film layer 58, and the side wall 59 changes the refractive power by changing the shape of the first film layer 56 and/or the second film layer 58; the structure with multiple lenses and a driving member changes the refractive power by changing the state of the lenses.
  • the light guide member 70 is located between the refractive member 50 and the light quantity adjusting member 80.
  • the light guide member 70 may be a plate-shaped light guide element, and the light guide member 70 may be made of a light-transmitting material such as resin.
  • Light rays of different propagation directions are totally reflected and propagate in the light guide member 70, and finally exit from the first side 71 to the outside of the light guide member 70, so that the human eye can observe the content displayed on the display 40.
  • the light quantity adjusting member 80 may be fixed to the light guide member 70 by optical glue.
  • the light quantity adjusting member 80 includes an electrochromic element, and the light transmittance of the electrochromic element changes after a voltage is applied to the electrochromic element. In this way, the amount of light passing through the electrochromic element can be adjusted by changing the light transmittance of the electrochromic element, so that the amount of ambient light passing through the second side 72 and the first side 71 can be adjusted.
  • Electrochromism is the phenomenon in which the electrochromic element undergoes a stable and reversible color change under the action of an external electric field, manifested in appearance as a reversible change in color and transparency. In this way, the electrochromic element can realize the change of light transmittance.
  • The electrochromic element may include a first conductive layer 81, a second conductive layer 82, an electrochromic layer 83, an electrolyte layer 84, and an ion storage layer 85, which are stacked; the electrochromic layer 83 is provided between the first conductive layer 81 and the second conductive layer 82.
  • the first conductive layer 81 and the second conductive layer 82 are used to cooperate to apply a voltage to the electrochromic layer 83.
  • the electrolyte layer 84 and the ion storage layer 85 are sequentially stacked and disposed between the electrochromic layer 83 and the second conductive layer 82.
  • The first conductive layer 81 and the second conductive layer 82 provide a voltage for the electrochromic layer 83 so that the light transmittance of the electrochromic layer 83 can be changed, and the electrolyte layer 84 and the ion storage layer 85 ensure that the electrochromic layer 83 can change its light transmittance normally.
  • The structure of the electrochromic device 120 mentioned above is similar to the structure of the electrochromic element; therefore, for the structure of the electrochromic device 120 of the present application, please refer to the structure of the electrochromic element, which will not be repeated here.
  • the processor 90 is connected to the light quantity adjustment component 80.
  • the processor 90 is used to control the light transmittance of the light quantity adjusting component 80 so that the light quantity adjusting component 80 adjusts the amount of ambient light incident on the second side 72. In this way, the processor 90 can accurately adjust the light transmittance of the light quantity adjusting component 80.
  • the processor 90 can control the voltage applied to the electrochromic element, thereby controlling the light transmittance of the electrochromic element.
  • the light transmittance of the light quantity adjusting member 80 is controlled by adjusting the applied voltage of the electrochromic element.
  • the processor 90 may include components such as a circuit board and a processing chip arranged on the circuit board.
  • the light sensor 14 is connected to the processor 90.
  • the light sensor 14 is used to detect the environmental brightness
  • the processor 90 is used to adjust the light transmittance of the light quantity adjusting component 80 according to the environmental brightness, wherein the environmental brightness and the light transmittance of the light quantity adjusting component 80 are in an inverse relationship.
  • the light transmittance of the light quantity adjusting member 80 can be automatically adjusted so that the user can clearly observe the content displayed on the display 40, and the user is not easily fatigued.
  • When the environmental brightness increases, the light transmittance of the light quantity adjusting member 80 decreases; when the environmental brightness decreases, the light transmittance of the light quantity adjusting member 80 increases. In this way, the contrast of the display screen of the display 40 stays in a zone comfortable for human eyes, and the user experience is improved. A rough sketch of this relationship follows.
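  • As an illustration of the inverse relationship of FIG. 15, the sketch below maps ambient brightness to a target transmittance. The linear shape and the breakpoint values are assumptions; the patent only states that the transmittance falls as the ambient brightness rises.

```python
# Hypothetical monotonically decreasing brightness-to-transmittance curve.
# All numeric constants are illustrative, not specified by the patent.

def transmittance_for_brightness(lux: float, lux_min: float = 50.0,
                                 lux_max: float = 5000.0,
                                 t_max: float = 0.9, t_min: float = 0.1) -> float:
    if lux <= lux_min:
        return t_max  # dim environment: let most ambient light through
    if lux >= lux_max:
        return t_min  # bright environment: block most ambient light
    # Linear interpolation between the endpoints keeps the contrast of the
    # display 40 in a zone comfortable for the human eye.
    frac = (lux - lux_min) / (lux_max - lux_min)
    return t_max - frac * (t_max - t_min)
```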
  • the collimating part 92 is disposed between the display 40 and the light guide part 70, and the collimating part 92 is used to collimate the light generated by the display 40 and then emit the light to the light guide part 70. In this way, the collimating component 92 can convert the light generated by the display 40 into parallel light and then enter the light guiding component 70, thereby reducing the loss of light.
  • the collimating component 92 may include a plurality of lenses, and the plurality of lenses can be superimposed together to collimate light.
  • the light generated by the display 40 enters the light guide member 70 after passing through the collimating member 92, and the light is totally reflected or diffracted in the light guide member 70 and then exits from the first side 71 of the light guide member 70.
  • The processor 90 is used to turn on the first light-emitting source 112, the depth camera 12, and the environment camera 13 when the current environment brightness is less than the preset brightness, so that the depth camera 12 can obtain the depth information of the target object, and to turn on the second light-emitting source 113 to supplement light for the environment camera 13, so that the environment camera 13 can obtain spatial environment information.
  • In this way, the second light-emitting source 113 can be turned on when the current environment brightness is less than the preset brightness to fill the environment camera 13 with light, so that the environment camera 13 can capture better-quality images and the electronic device 100 can still obtain environmental information in dim light.
  • the second light emitted by the second light source 113 can be emitted to the target object, so as to supplement the light intensity in the environment when the ambient light is weak.
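  • A minimal sketch of this low-light behavior follows. The preset brightness value and the device objects (with on/off methods) are hypothetical stand-ins; the patent does not fix a threshold.

```python
# When the current ambient brightness is below a preset brightness, the
# processor 90 turns on both light-emitting sources and both cameras.

PRESET_BRIGHTNESS_LUX = 100.0  # assumed threshold value

def update_low_light_mode(ambient_lux, source_112, source_113,
                          depth_camera_12, environment_camera_13):
    if ambient_lux < PRESET_BRIGHTNESS_LUX:
        source_112.on()             # first light: reflected back to the depth camera
        source_113.on()             # second light: fill light for the env. camera
        depth_camera_12.on()        # obtains depth information of the target object
        environment_camera_13.on()  # obtains spatial environment information
    else:
        source_113.off()            # fill light is unnecessary in bright scenes
```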
  • In one example, the electronic device 100 includes a driving chip 94 connected to the processor 90, the first light-emitting source 112, and the second light-emitting source 113, and the processor 90 is used to control the driving chip 94 to output a first driving signal and a second driving signal when the current environment brightness is less than the preset brightness.
  • the first driving signal is used to drive the first light-emitting source 112
  • the second driving signal is used to drive the second light-emitting source 113.
  • one driving chip 94 can drive two light-emitting sources, which can reduce the amount of hardware of the electronic device 100, thereby reducing the cost of the electronic device 100.
  • In another example, the electronic device 100 includes two driving chips 94, both of which are connected to the processor 90; one driving chip 94 is connected to the first light-emitting source 112, and the other driving chip 94 is connected to the second light-emitting source 113. The processor 90 is used to control one driving chip 94 to output the first driving signal and the other driving chip 94 to output the second driving signal when the current ambient brightness is less than the preset brightness; the first driving signal is used to drive the first light-emitting source 112, and the second driving signal is used to drive the second light-emitting source 113.
  • the two driving chips 94 respectively control the corresponding light-emitting sources, which makes it easier to control the working state of each light-emitting source.
  • the processor 90 is used to obtain the current ambient brightness through the light sensor 14.
  • the current ambient brightness detected by the light sensor 14 may be transmitted to the processor 90. In this way, it is convenient and effective to obtain the brightness of the current environment.
  • the processor 90 is used to obtain the spatial environment image collected by the environment camera 13, to calculate the gray level of the spatial environment image, and to obtain the current ambient brightness according to the gray level.
  • the light sensor 14 can be omitted, so that the cost of the electronic device 100 can be reduced.
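  • As an illustrative sketch of such a gray-level estimate (the luma weights and the example usage are assumptions, not specified by the patent), the mean gray level of the latest environment-camera frame can stand in for the current ambient brightness:

```python
import numpy as np


def estimate_ambient_brightness(rgb_frame: np.ndarray) -> float:
    """Return a brightness estimate in [0, 255] from an H x W x 3 uint8 frame."""
    # ITU-R BT.601 luma weights; any reasonable grayscale conversion works.
    gray = (0.299 * rgb_frame[..., 0]
            + 0.587 * rgb_frame[..., 1]
            + 0.114 * rgb_frame[..., 2])
    return float(gray.mean())


frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
brightness = estimate_ambient_brightness(frame)
print(f"estimated ambient brightness: {brightness:.1f}")
# e.g. compare against a preset threshold to decide whether to enable fill light
```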
  • FIG. 18 is a schematic diagram of internal modules of the electronic device 100 in an embodiment.
  • the electronic device 100 includes a processor 90, a memory 102 (for example, a non-volatile storage medium), an internal memory 103, a display device 104, and an input device 105 connected through a system bus 109.
  • the processor 90 can be used to provide computing and control capabilities, and support the operation of the entire electronic device 100.
  • the internal memory 103 of the electronic device 100 provides an environment for the execution of computer readable instructions in the memory 102.
  • the display device 104 of the electronic device 100 can be the display 40 provided on the electronic device 100; the input device 105 can be an acoustic-electric element and a vibration sensor provided on the electronic device 100, a button, trackball, or trackpad provided on the electronic device 100, or an external keyboard, trackpad, or mouse.
  • the electronic device may be a smart bracelet, smart watch, smart helmet, electronic glasses, etc.
  • in the related art, the image of an object can be acquired through a camera and then, through steps such as image recognition, image segmentation, extraction of hand features, and computation of gesture nodes, the action type of the gesture is determined and the gesture interaction is completed.
  • the disadvantage of this solution is that the camera must always be turned on to obtain images, and the central processing unit or digital signal processor must always run related gesture algorithms to process image information.
  • the resulting problems may include: 1. the overall power consumption is high, heat generation is severe, and the effective battery life is shortened; 2. the occupancy rate of the central processing unit or digital signal processor is high, so other algorithms and processes cannot run quickly and the system stutters.
  • the embodiments of the present application also provide a control method of the electronic device 100.
  • the electronic device 100 includes a camera 110, and the control method includes:
  • 010: Detect whether there is an external object 400 within the preset range of the camera 110;
  • 020: When there is an external object 400 within the preset range of the camera 110, turn on the camera 110 to obtain an image of the external object 400;
  • 030: Perform action gesture recognition on the external object 400 according to the image of the external object 400.
  • the control device 200 of the embodiment of the present application is used in the electronic device 100.
  • the control device 200 includes a detection module 210 and a control module 220.
  • step 010 can be executed by the detection module 210, and steps 020-030 can be executed by the control module 220.
  • the detection module 210 is used to detect whether there is an external object 400 in the preset range of the camera 110;
  • the control module 220 is used to turn on the camera 110 to obtain an image of the external object 400 when there is an external object 400 within the preset range of the camera 110, and to perform action gesture recognition on the external object 400 according to the image of the external object 400.
  • steps 010-030 may be executed by the processor 90; in other words, the processor 90 is used to detect whether there is an external object 400 within the preset range of the camera 110, to turn on the camera 110 to obtain an image of the external object 400 when there is an external object 400 within the preset range, and to perform action gesture recognition on the external object 400 based on the image of the external object 400.
  • the camera 110 is turned on only when there is an external object 400 within the preset range of the camera 110, so as to perform action gesture recognition on the external object 400. This prevents the camera 110 from staying on at all times, which reduces the running time of the camera 110 and the processor 90 and lowers the power consumption and heat generation of the electronic device 100.
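  • Steps 010-030, together with the shutdown step described later, suggest an event loop of the following shape; this is a minimal sketch under assumed stub interfaces (ProximitySensor, Camera, recognize_gesture), not the patent's implementation.

```python
import random
import time


class ProximitySensor:
    """Stub: reports whether an external object is within the preset range."""

    def object_in_range(self) -> bool:
        return random.random() < 0.5  # stand-in for a real distance reading


class Camera:
    def turn_on(self) -> None: print("camera on")
    def turn_off(self) -> None: print("camera off")
    def capture(self) -> bytes: return b"frame"


def recognize_gesture(image: bytes) -> None:
    print(f"gesture recognition on {len(image)}-byte frame")


def control_loop(sensor: ProximitySensor, camera: Camera, cycles: int = 5) -> None:
    for _ in range(cycles):  # bounded so the sketch terminates
        if sensor.object_in_range():                 # step 010
            camera.turn_on()                         # step 020
            while sensor.object_in_range():
                recognize_gesture(camera.capture())  # step 030
            camera.turn_off()  # object moved out of the preset range
        time.sleep(0.01)  # the low-power proximity sensor polls in between


control_loop(ProximitySensor(), Camera())
```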
  • the preset range of the camera 110 refers to a fan-shaped or cone-shaped range within the field of view of the camera 110, centered on the lens center surface of the camera 110 with a predetermined distance as the radius; or a fan-shaped or cone-shaped range centered on the center of the image sensor of the camera 110 with a predetermined distance as the radius.
  • the preset distance can be specifically set according to actual needs.
  • for example, the preset distance is 10 cm, 20 cm, 30 cm, 40 cm, or 60 cm.
  • the external object 400 may be a living object such as a human hand, eye, or head, or a non-living object such as a pen or a book.
  • turning on the camera 110 refers to driving the camera 110 to work so that the camera 110 can sense the light intensity of the external environment and form an image of the external object 400 according to the imaging effect of the lens of the camera 110.
  • the camera 110 is, for example, the environment camera or the depth camera described above.
  • when the camera 110 is the environment camera, it acquires two-dimensional images; when the camera 110 is the depth camera, it acquires three-dimensional images. Therefore, the image of the external object 400 mentioned in the present application may be a two-dimensional image or a three-dimensional image.
  • the image of the external object 400 includes information such as the type, shape, and size of the object.
  • performing action gesture recognition is, for example, a sequence of processes: segmenting the image of the external object 400, extracting features, recognizing the object category, and determining whether an action gesture is satisfied.
  • in this process, the processor 90 cooperates with related hardware to run a corresponding program to execute step 030, so as to achieve action gesture recognition.
  • the action gesture mentioned in this embodiment includes at least one of a hand gesture and an eye movement.
  • the gesture is a user's hand movement.
  • a hand motion can be the user controlling finger movement to form a predetermined motion, for example, the user giving a thumbs-up or spreading five fingers.
  • an eye movement can be a movement of the eyeball, for example, turning the eyes left and right; it can also be a blinking action, for example, characterized by how long the user closes the eyes or how frequently the user blinks.
  • of course, in other embodiments, the action gestures are not limited to the hand gestures and eye movements discussed above.
  • the electronic device 100 includes a proximity sensor 15 provided on one side of the camera 110, and step 010 includes:
  • the proximity sensor 15 is turned on in response to a trigger instruction issued by the user to trigger the proximity sensor 15 to detect whether there is an external object 400 within the preset range of the camera 110.
  • the processor 90 is configured to turn on the proximity sensor 15 in response to a trigger instruction issued by the user, so as to detect whether there is an external object 400 within a preset range of the camera 110 through the proximity sensor 15.
  • the proximity sensor 15 can be arranged on the upper side of the camera 110 or on the lower side of the camera 110, and the orientation of the proximity sensor 15 relative to the camera 110 is not limited here.
  • the proximity sensor 15 may be arranged in contact with the camera 110 or may be arranged spaced apart from the camera 110.
  • the trigger instruction may be formed according to a user operation.
  • for example, the user presses a button of the electronic device 100 or operates an input device such as a touch screen, so that the electronic device 100 starts to run the action gesture program and forms the trigger instruction.
  • orientations such as "up" and "down" referred to herein are defined with respect to the electronic device 100 in its normal use state.
  • the proximity sensor 15 can emit infrared rays and receive the infrared rays reflected by the external object 400 to detect the distance between the external object 400 and the electronic device 100.
  • the proximity sensor 15 can also detect the distance between the external object 400 and the electronic device 100 by means of ultrasonic waves, electromagnetic fields, or millimeter waves.
  • the proximity sensor 15 can accurately detect whether there is an external object 400 within the preset range of the camera 110.
  • in addition, the power consumption of the proximity sensor 15 is low, which further reduces the power consumption of the electronic device 100 during action gesture recognition.
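  • However the distance is measured (infrared, ultrasonic, electromagnetic field, or millimeter wave), the preset-range check itself reduces to a comparison against the preset distance; a sketch, using the 30 cm example distance from above:

```python
PRESET_DISTANCE_M = 0.30  # one of the example preset distances (30 cm)


def object_in_preset_range(measured_distance_m: float) -> bool:
    # The proximity sensor 15 reports a distance to the nearest object;
    # anything at or inside the preset distance counts as "in range".
    return 0.0 <= measured_distance_m <= PRESET_DISTANCE_M


print(object_in_preset_range(0.12))  # True: within the preset range
print(object_in_preset_range(0.55))  # False: outside the preset range
```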
  • step 030 includes:
  • 031: When it is confirmed from the image of the external object 400 that the external object 400 is a predetermined object, control the camera 110 to operate at a first frame rate to obtain images of the external object 400 so as to obtain the action gesture of the external object 400;
  • 032: When it is confirmed from the image of the external object 400 that the external object 400 is not a predetermined object, control the camera 110 to operate at a second frame rate to obtain images of the external object 400 so as to determine whether the external object 400 is a predetermined object, the second frame rate being less than the first frame rate.
  • steps 031-032 may be executed by the processor 90.
  • the processor 90 is used to control the camera 110 to operate at the first frame rate to obtain images of the external object 400 so as to obtain the action gesture of the external object 400 when it is confirmed from the image that the external object 400 is a predetermined object; and to control the camera 110 to operate at the second frame rate to obtain images of the external object 400 so as to determine whether the external object 400 is a predetermined object when it is confirmed from the image that the external object 400 is not a predetermined object, the second frame rate being less than the first frame rate.
  • it can be understood that not every external object 400 can perform an action gesture. Therefore, the type of the external object 400 is determined first and the operating frame rate of the camera 110 is controlled accordingly, which prevents the camera 110 from always running at a higher frame rate and consuming more energy.
  • the predetermined object is an object that can perform an action gesture.
  • for example, the predetermined object is a human hand, head, and/or eyes. It can be understood that when the predetermined object is a human head, the head can perform actions such as nodding and shaking.
  • when the external object 400 is a predetermined object, it can be predicted that the external object 400 will make an action gesture. Therefore, the camera 110 is controlled to obtain images of the external object 400 at a higher frame rate, so that the gesture of the external object 400 can be captured accurately.
  • in step 032, when the external object 400 is not a predetermined object, it can be predicted that the external object 400 will not make an action gesture. Therefore, the camera 110 is controlled to obtain images of the external object 400 at a lower frame rate, which reduces the power consumption of the electronic device 100.
  • determining whether the external object 400 is a predetermined object is based on the electronic device 100's image recognition result.
  • in one example, the external object 400 is a human head: if the camera 110 captures only a partial image of the head, the image cannot be analyzed to conclude that the external object 400 is a human head; if the camera 110 captures a complete image of the head, the image can be analyzed to conclude that the external object 400 is a human head.
  • accordingly, step 032 runs at the lower frame rate, so that images of the external object 400 can still be acquired continuously and whether the external object 400 is a predetermined object can be further identified as the object moves, improving the accuracy of recognizing the external object 400 and its action gestures.
  • the first frame rate is 30 frames/sec or 60 frames/sec; the second frame rate is 5 frames/sec or 10 frames/sec.
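  • The frame-rate policy of steps 031-032 can be sketched as follows, using the example rates above; the object labels and the membership check are hypothetical stand-ins for the image-recognition result.

```python
FIRST_FRAME_RATE_FPS = 30   # example first frame rate from the text
SECOND_FRAME_RATE_FPS = 5   # example second frame rate from the text

PREDETERMINED_OBJECTS = {"hand", "head", "eyes"}  # objects that can gesture


def select_frame_rate(recognized_label: str) -> int:
    # Step 031: a predetermined object is expected to gesture, so sample fast.
    # Step 032: otherwise sample slowly, saving power while continuing to
    # check whether the external object turns out to be a predetermined one.
    if recognized_label in PREDETERMINED_OBJECTS:
        return FIRST_FRAME_RATE_FPS
    return SECOND_FRAME_RATE_FPS


print(select_frame_rate("hand"))  # 30
print(select_frame_rate("book"))  # 5
```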
  • step 031 includes:
  • 0321: Control the camera 110 to obtain continuous frames of images of the external object 400;
  • 0322: Confirm, according to the continuous frames of images, whether the action gesture of the external object 400 is a predetermined gesture;
  • 0323: Generate a corresponding control instruction when the action gesture is a predetermined gesture.
  • steps 0321-0323 may be executed by the processor 90; in other words, the processor 90 is used to control the camera 110 to obtain continuous frames of images of the external object 400, to confirm according to the continuous frames whether the action gesture of the external object 400 is a predetermined gesture, and to generate a corresponding control instruction when the action gesture is a predetermined gesture.
  • it can be understood that an action gesture is generally a dynamic process, so its recognition is a continuous process. From the continuous frames of images of the external object 400, the action gesture of the external object 400 can be obtained accurately, so that the corresponding control instruction is generated more accurately.
  • the predetermined gesture includes at least one of clicking, sliding, and zooming.
  • in one example, the external object 400 is a hand. In this case, the camera 110 can be controlled to obtain 10 consecutive frames of images, and whether the hand has made a "click" gesture is determined based on these 10 frames; if so, the control instruction corresponding to "click" is generated.
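  • A toy version of this windowed confirmation (the per-frame feature and the agreement threshold are invented for illustration) could buffer the 10 consecutive frames and only report "click" when enough of them agree:

```python
from collections import deque

WINDOW_SIZE = 10  # the example number of consecutive frames


def frame_suggests_click(frame: dict) -> bool:
    """Hypothetical per-frame feature: fingertip moved downward."""
    return frame.get("fingertip_dy", 0.0) > 0.0


def detect_click(frames) -> bool:
    # Gesture recognition is a continuous process: the "click" is confirmed
    # (and a control instruction generated) only when most buffered frames
    # agree, not from any single image.
    window = deque(frames, maxlen=WINDOW_SIZE)
    agreeing = sum(frame_suggests_click(f) for f in window)
    return len(window) == WINDOW_SIZE and agreeing >= 8  # assumed threshold


frames = [{"fingertip_dy": 1.0}] * 9 + [{"fingertip_dy": -0.2}]
print(detect_click(frames))  # True: 9 of 10 frames agree
```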
  • the control method further includes the step:
  • 040: Generate a corresponding control instruction when the action gesture is a predetermined gesture, and control the operation of the electronic device 100 according to the control instruction.
  • step 040 may be executed by the processor 90; in other words, the processor 90 is configured to generate a corresponding control instruction when the action gesture is a predetermined gesture, and to control the operation of the electronic device 100 according to the control instruction.
  • in this way, the electronic device 100 can run corresponding functions according to the action gesture of the external object 400.
  • for example, according to the control instruction, the electronic device 100 can be controlled to unlock the screen, take a screenshot, turn off the display, or fast-forward a video.
  • in one example, after the user makes a "click" gesture, the electronic device 100 can play a video in response to the "click" action.
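  • Step 040 then amounts to a dispatch from the confirmed gesture to a control instruction; the table below pairs gestures with actions purely for illustration (the patent lists such actions but does not fix these pairings).

```python
from typing import Optional

# Hypothetical gesture -> control instruction table (pairings are illustrative).
CONTROL_INSTRUCTIONS = {
    "click": "play_video",
    "slide": "fast_forward_video",
    "zoom": "take_screenshot",
}


def handle_gesture(gesture: str) -> Optional[str]:
    instruction = CONTROL_INSTRUCTIONS.get(gesture)
    if instruction is not None:
        # The electronic device 100 then runs according to the instruction.
        print(f"gesture {gesture!r} -> control instruction {instruction!r}")
    return instruction


handle_gesture("click")  # e.g. the device plays a video after a "click"
```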
  • after step 030, the control method further includes the steps:
  • detect whether the external object 400 moves out of the preset range;
  • when the external object 400 moves out of the preset range, turn off the camera 110.
  • the processor 90 is also used to detect whether the external object 400 moves out of the preset range, and to turn off the camera 110 when the external object 400 moves out of the preset range.
  • in this way, once the external object 400 moves out of the preset range of the camera 110, it can be assumed that no further action gesture will be made; turning off the camera 110 reduces the power consumption of the electronic device 100 and extends its battery time.
  • a non-volatile computer-readable storage medium containing computer-executable instructions is also provided; when the computer-executable instructions are executed by one or more processors 90, the processors 90 execute the control method of any of the embodiments.
  • the structure shown in the figures is only a schematic diagram of the part of the structure related to the solution of the present application and does not limit the electronic device to which the solution is applied.
  • a specific electronic device may include more or fewer parts than shown in the figures, combine certain parts, or have a different arrangement of parts.
  • in the description of this specification, reference to the terms "one embodiment", "certain embodiments", "exemplary embodiment", "example", "specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application.
  • in this specification, schematic references to the above terms do not necessarily refer to the same embodiment or example.
  • the described specific features, structures, materials or characteristics can be combined in any one or more embodiments or examples in a suitable manner.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Signal Processing (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Computing Systems (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A control method, a control device, an electronic device (100), and a storage medium. In the control method of the electronic device (100), the electronic device (100) includes a camera, and the control method includes: (010) detecting whether there is an external object within a preset range of the camera; (020) when there is an external object within the preset range of the camera, turning on the camera to obtain an image of the external object; and (030) performing action gesture recognition on the external object according to the image of the external object.

Description

控制方法、控制装置、电子装置和存储介质
优先权信息
本申请请求2019年08月30日向中国国家知识产权局提交的、专利申请号为201910817693.6的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及显示技术领域,特别涉及一种控制方法、控制装置、电子装置和存储介质。
背景技术
在相关的技术中,手机、可穿戴设备等电子装置可以通过摄像头获取手势动作,并根据手势动作控制电子装置执行相关的操作。例如,电子装置可以根据手势动作锁定屏幕。
发明内容
本申请提供了一种控制方法、控制装置、电子装置和存储介质。
本申请实施方式的电子装置的控制方法中,所述电子装置包括摄像头,所述控制方法包括:
检测在所述摄像头的预设范围内是否有外部物体;
在所述摄像头的预设范围内有所述外部物体的情况下,开启所述摄像头以获取所述外部物体的图像;
根据所述外部物体的图像对所述外部物体进行动作姿势识别。
本申请实施方式的控制装置用于电子装置,所述电子装置包括摄像头,所述控制装置包括:
检测模块,用于检测在所述摄像头的预设范围内是否有外部物体;
控制模块,用于在所述摄像头的预设范围内有所述外部物体的情况下,开启所述摄像头以获取所述外部物体的图像;及用于根据所述外部物体的图像对所述外部物体进行动作姿势识别。
本申请实施方式的电子装置包括摄像头和处理器,所述处理器用于检测在所述摄像头的预设范围内是否有外部物体;及用于在所述摄像头的预设范围内有所述外部物体的情况下,开启所述摄像头以获取所述外部物体的图像;以及用于根据所述外部物体的图像对所述外部物体进行动作姿势识别。
一种包含计算机可执行指令的非易失性计算机可读存储介质,当所述计算机可执行指令被一个或多个处理器执行时,使得所述处理器执行以上所述的控制方法。
本申请的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
本申请的上述和/或附加的方面和优点从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:
图1是本申请实施方式的电子装置的立体示意图;
图2是本申请实施方式的电子装置的剖面示意图;
图3是本申请实施方式的电子装置的剖面示意图;
图4是本申请实施方式的电子装置的剖面示意图;
图5是本申请实施方式的电子装置的光线发射部件的剖面示意图;
图6是本申请另一个实施方式的电子装置的光线发射部件的剖面示意图;
图7是本申请实施方式的电子装置的立体示意图;
图8是本申请实施方式的电子装置的原理结构示意图;
图9是本申请另一实施方式的电子装置的平面示意图;
图10是本申请实施方式的电子装置部分结构的平面示意图;
图11是本申请实施方式的电子装置的调节过程的示意图;
图12是本申请实施方式的电子装置的调节过程的另一示意图;
图13是本申请另一实施方式的电子装置部分结构的平面示意图;
图14是本申请实施方式的光量调节部件的平面示意图;
图15是本申请实施方式的环境亮度和光量调节部件的透光率的关系示意图;
图16是本申请实施方式的电子装置的模块示意图;
图17是本申请另一实施方式的电子装置的模块示意图;
图18是本申请实施方式的电子装置的内部模块示意图;
图19是本申请实施方式的电子装置的场景示意图;
图20是本申请实施方式的控制方法的流程示意图;
图21是本申请实施方式的控制装置的模块示意图;
图22是本申请实施方式的控制方法的流程示意图;
图23是本申请实施方式的控制方法的流程示意图;
图24是本申请实施方式的控制方法的流程示意图。
主要元件符号说明:
电子装置100、传感器组件10、光线发射部件11、封装壳111、第一发光源112、第二发光源113、基板114、扩散片115、深度摄像头12、环境摄像头13、光线传感器14、电致变色器件120、增透膜130、壳体20、内表面201、外表面202、通光孔203、透光部204、收容腔室22、壳体顶壁24、壳体底壁26、缺口262、壳体侧壁28、支撑部件30、第一支架32、第一弯折部322、第二支架34、第二弯折部342、弹性带36、显示器40、屈光部件50、屈光腔52、透光液体54、第一膜层56、第二膜层58、侧壁59、调节机构60、腔体62、滑槽622、滑动件64、驱动部件66、旋钮662、丝杠664、齿轮666、齿条668、调节腔68、导光部件70、第一侧71、第二侧72、光量调节部件80、第一导电层81、第二导电层82、电致变色层83、电解质层84、离子存储层85、处理器90、准直部件92、驱动芯片94;
控制装置200、检测模块210、控制模块220。
具体实施方式
下面详细描述本申请的实施方式,所述实施方式的示例在附图中示出,其中自始至终相同或类似的标号表示相同或类似的元件或具有相同或类似功能的元件。下面通过参考附图描述的实施方式是示例性的,仅用于解释本申请,而不能理解为对本申请的限制。
下文的公开提供了许多不同的实施方式或例子用来实现本申请的不同结构。为了简化本申请的公开,下文中对特定例子的部件和设置进行描述。当然,它们仅仅为示例,并且目的不在于限制本申请。此外,本申请可以在不同例子中重复参考数字和/或参考字母,这种重复是为了简化和清楚的目的,其本身不指示所讨论各种实施方式和/或设置之间的关系。此外,本申请提供了的各种特定的工艺和材料的例子,但是本领域普通技术人员可以意识到其他工艺的应用和/或其他材料的使用。
本申请实施方式提供了一种电子装置100,电子装置100例如为手机、平板电脑、可穿戴设备等移动终端。可穿戴设备例如为头戴式显示设备(Head Mount Display,HMD),头戴式显示设备通过计算***与光学***的配合,在用户在佩戴头戴式显示设备后,可向用户的眼睛发送光学信号,从而实现虚拟现实(Virtual Reality,VR)、增强现实(Augmented Reality,AR)和混合现实(Mixed Reality,MR)等不同效果。
为了方便理解,本申请实施方式的电子装置100以头戴式显示设备作为例子进行详细地描述。
请参阅图1,本申请实施方式的电子装置100包括传感器组件10、壳体20和电致变色器件120。传感器组件10设置在壳体20。电致变色器件120设置在壳体20并与传感器组件10对应设置。电致变色器件120覆盖传感器组件10。
本申请实施方式的电子装置100中,电致变色器件120可以根据电子装置100的状态改变自身的透光率,从而遮蔽或露出传感器组件10,提高了电子装置100的外观效果。
具体地,电子装置100的状态例如为工作状态和非工作状态。在电子装置100处于工作状态时,电子装置100可以为用户展现画面,播放视频、音频等信息,可以执行用户的操作。例如,电子装置100可以根据用户操作切换显示画面。在一个例子中,在电子装置100处于工作状态时,如果传感器组件10开启,那么可以控制电致变色器件120的透光率增加以使传感器组件10露出,从而获取电子装置100的外部信息或者向电子装置100的外部发射信息。如果传感器组件10关闭,那么可以控制电致变色器件120 的透光率降低以遮蔽传感器组件10,从而提高电子装置100的外观效果。
传感器组件10包括光线发射部件11、深度摄像头12、环境摄像头13、光线传感器14以及接近传感器15中的至少一种。例如,传感器组件10包括深度摄像头12、接近传感器15或光线传感器14。又如,传感器组件10包括深度摄像头12和接近传感器15。
本实施方式中,传感器组件10包括光线发射部件11、深度摄像头12、环境摄像头13和接近传感器15。因此,光线发射部件11、深度摄像头12、环境摄像头13和接近传感器均设置在壳体20。电致变色器件120覆盖光线发射部件11、深度摄像头12和环境摄像头13并用于改变自身透光率以遮蔽或暴露光线发射部件11、深度摄像头12和环境摄像头13中的至少一个。
具体地,光线发射部件11用于发射光线。光线发射部件11可以发射可见光,也可以发射例如为红外光等不可见光。
环境摄像头13包括但不限于彩色摄像头、红外摄像头以及黑白摄像头。电子装置100可以利用环境摄像头13拍摄物体图像。或者说,环境摄像头13用于获取空间环境信息。电子装置100根据环境摄像头13拍摄的图像可以识别物体的类型。例如,可以根据环境摄像头13拍摄的图像识别出物体为人手或者桌子等物体。另外,电子装置100可以根据环境摄像头13获取空间环境信息形成空间环境地图。
深度摄像头12包括但不限于TOF(Time of Flight,飞行时间)摄像头或结构摄像头。深度摄像头12可以获取物体的深度图像。深度图像经过处理后可以用于获取物体的三维模型、动作识别等。
接近传感器包括红外发射器和红外接收器,红外发射器和红外接收器配合可以检测外界物体与电子装置100之间的距离。
光线传感器14可以用于检测环境亮度,电子装置100可以根据环境亮度显示合适亮度的图像以提高用户体验。
传感器组件10可以直接地设置壳体20上,也可以间接地设置在壳体20上。在一个例子中,传感器组件10通过支架设置在壳体20上,或者说,传感器组件10固定在支架上,支架固定在壳体20上。传感器组件10的数量可以一个,也可以为多个。在传感器组件10的数量为多个时,多个传感器组件10可以分别设置在壳体20的不同位置,只要保证传感器组件10不会干涉用户正常使用即可了,如图1所示。
可以理解,电致变色器件120可以根据施加的不同电压而具有不同的透光率。另外,电致变色器件120可以过滤预定颜色的光线,例如,电致变色器件120可以过滤蓝色光等有色光线。
电致变色器件120呈片状。电致变色器件120可以设置在壳体20上,也可以设置在传感器组件10上,也可以设置在壳体20和传感器组件10之间。例如,电致变色器件120也通过光学胶粘贴在壳体20或传感器组件10上;又如电致变色器件120通过透明框架设置在壳体20和传感器组件10之间,并且电致变色器件120与传感器组件10和壳体20之间均具有间隙。
电致变色器件120覆盖传感器组件10指的是,传感器组件10在电致变色器件120上的正投影位于电致变色器件120内。或者说,光线发射部件11、深度摄像头12、环境摄像头13和接近传感器中的至少一个的正投影位于电致变色器件120内。
可以理解,电致变色器件120的数量可以为多个,每个电致变色器件120对应一个光线发射部件11、深度摄像头12、环境摄像头13和接近传感器中的一个。
请参阅图2,在一些实施方式中,壳体20包括内表面201和外表面202,壳体20形成有贯穿内表面201和外表面202的通光孔203,传感器组件10与通光孔203对应设置,电致变色器件120贴设在壳体20的外表面202。或者说,光线发射部件11、深度摄像头12、环境摄像头13和接近传感器中的至少一个与通光孔203对应设置。
如此,传感器组件10可以通过通光孔203向外界发射信号和/或接收外界的信号。电致变色器件120可以遮盖通光孔203并覆盖传感器组件10。可以理解,传感器组件10向外界发射信号时,信号经过通光孔203和电致变色器件120。
通光孔203可以为圆形孔、椭圆形孔或方形孔等形状的通孔,在此不限制通光孔203的形状。通光孔203的数量可以为一个,也可以为多个。例如,光线发射部件11、深度摄像头12、环境摄像头13和接近传感器靠近设置或者形成一个整体时,通光孔203的数量为一个。光线发射部件11、深度摄像头12、环境摄像头13和接近传感器分离设置时,通光孔203的数量为多个。光线发射部件11、深度摄像头12、环境摄像头13和接近传感器可以对应一个通光孔203设置。
需要指出的是,壳体20形成有收容腔室22,壳体20的内表面201为围成收容腔室的表面。壳体20的外表面202为与壳体20的内表面201相背的表面。传感器组件10收容于收容腔室22内。
进一步地,传感器组件10至少部分位于通光孔203中。也即是说,传感器组件10可以部分地位于通光孔203中,也可以全部位于通光孔203中。如此,传感器组件10与壳体20之间的结构比较紧凑,可以减小电子装置100的体积。
请参阅图3,在一些实施方式中,壳体20包括与传感器组件10对应设置的透光部204,位于透光部204的内表面201贴设有电致变色器件120。或者说,壳体20至少部分是透光的,以使传感器组件10可以向外部发射信号以及接收外部的信号。例如,光线发射部件11可以通过透光部204发射光线。深度摄像头12可以通过透光部204获取目标物体的深度信息。
透光部204可以采用透光材料制成,例如,透光部204的材料为亚克力等透光材料。透光部204的横截面可以为方形、圆形或者不规则等形状。需要指出的是,透光部204可以透过可见光,也可以透光非可见光。壳体20除透光部204外的其他部位可以是透光的,也可以是非透光的。
请参阅图4,在一些实施方式中,壳体20为透光壳体,电致变色器件120贴设并包覆在外表面202。或者说,电致变色器件202布满壳体20的外表面202。如此,电致变色器件120不仅可以覆盖传感器组件10,还可以提高电子装置100的外观效果。
例如,可以根据不同的需求控制电致变色器件120呈现不同的颜色以改变电子装置100的整体外观。可以理解,电致变色器件120的电压改变后,可以呈现出不同的颜色,例如,电致变色器件120可以呈绿色、红色、蓝色或者渐变色等以使电子装置100的整体呈现出绿色、红色、蓝色或者渐变色等颜色。
需要指出的是,在图4中,为了便于理解,电致变色器件120只展示贴合在壳体20的部分外表面202。
进一步地,电子装置100包括铺设在电致变色器件120上的增透膜130,电致变色器件120夹设在外表面202和增透膜130之间。如此,增透膜130不仅可以保护电致变色器件120,还可以提高电子装置100的整体外观。其中,增透膜130的材质可以为氟化钙等,其作用是减少反射进而提高光的透过率。
请参阅图5,本实施方式中,光线发射部件11包括封装壳111、第一发光源112、第二发光源113、基板114和扩散片115(diffuser)。第一发光源112和第二发光源113均设置在基板114上并位于封装壳111内。基板114与封装壳111固定连接。例如,基板114通过粘接或者焊接等方式与封装壳111固定连接。
具体地,封装壳111可以采用塑料、金属等材料制成。例如封装壳111的材料可以为不锈钢。封装壳111的横截面可以呈方形、圆形或者椭圆形等形状。封装壳111远离基板114的一端形成有一开口1110。
第一发光源112用于向电子装置100外发射第一光线。第二发光源113用于向电子装置100外发射第二光线,并用于为环境摄像头13补光。深度摄像头12用于接收经目标物体反射的第一光线以获取目标物体的深度信息。进一步地,第一光线和第二光线均用于透过扩散片115出射。
本实施方式中,第一光线和第二光线均为红外光,第一光线的波长与第二光线的波长不同。例如,第一光线的波长为940nm。第二光线的波长为850nm。另外,当然,在其他实施方式中,第一光线和/或第二光线可以为可见光。可以理解,在第一光线为红外光时,深度摄像头12为红外摄像头。
如图6所示,在一些实施方式中,第二发光源113的数量为多个,多个第二发光源113围绕第一发光源112间隔设置。例如,第二发光源113的数量为4个,4个第二发光源围绕第一发光源等角度间隔分布。第一发光源112和/或第二发光源113包括垂直腔面发射激光器(Vertical Cavity Surface Emitting Laser,VCSEL)芯片,VCSEL芯片包括多个阵列排布的VCSEL光源。
基板114可以柔性电路板,也可以为刚性电路板。或者为柔性电路板和刚性电路板的组合。
扩散片115设置在开口1110处。扩散片115用于分散第一光线和第二光线以使第一光线和第二光线可以均匀地投射到目标物体上。
本申请实施方式的电子装置100中,第一发光源112和第二发光源113均设置在同一封装壳111中,这样可以使得光线发射部件11的结构更加紧凑,以减小电子装置100的体积。
请参阅图7-8,本申请实施方式的电子装置100包括显示器40、导光部件70和光量调节部件80。光线发射部件11、深度摄像头12和环境摄像头13均与显示器40错开设置。光线发射部件11、深度摄像头12和环境摄像头13均与导光部件70错开设置。
导光部件70与显示器40分离设置。导光部件70包括相对的第一侧71和第二侧72。导光部件70 用于导入显示器40产生的光线并从第一侧71出射。光量调节部件80设置在第二侧72,光量调节部件80用于调节入射至第二侧72的环境光量。
在相关的增强现实设备中,用户可以通过增强现实设备在现实场景中看到增强现实设备显示的内容。可以理解,环境光线和增强显示设备形成的光线同时进入人眼,如果环境的光线亮度较高,使得增强现实设备的显示亮度与环境亮度的对比度过低,人眼较难看清增强现实设备的显示内容。如果环境的光线亮度较低,使得增强现实设备的显示亮度与环境亮度的对比度过高,虚拟现实设备的显示内容容易刺激人员,造成人眼疲劳。
为了解决增强现实设备的显示亮度与环境亮度的对比度过高或者高低的问题,相关技术一般通过调节增强现实设备的显示亮度。然而,在环境亮度高时,为了提高人眼观察到的画面清晰度,如果提高增强现实设备的显示亮度,那么则使得增强现实设备的功耗较大,产生的大量的热量而影响用户体验。
而本申请实施方式的电子装置100中,光量调节部件80可以调节从第二侧72入射并从第一侧71出射的环境光量,从而可以减少环境光量对显示器40产生并从第一侧71出射的光线的影响,有利于用户观看显示器40显示的内容,提高用户体验。
可以理解,用户在佩戴电子装置100时,人眼位于第一侧71外,因此,显示器40产生的光线从第一侧71出射后可以进入人眼内,从而使得用户可以观察到显示器40显示的图像。
环境光线依次经过光量调节部件80、第二侧72和第一侧71后进入人眼中,从而使得用户可以看到环境事物。因此,本申请的光量调节部件80可以调节进入人眼的环境光,从而减少环境光对人眼观察到的图像的影响。
请参阅图7-图9,本申请实施方式的电子装置100还包括支撑部件30、屈光部件50、调节机构60、处理器90、光线传感器14和准直部件92。
壳体20为电子装置100的外部零部件,起到了保护和固定电子装置100的内部零部件的作用。通过壳体20可以将内部零部件包围起来,可以避免外界因素对这些内部零部件造成直接的损坏。
具体地,在本实施方式中,壳体20可用于固定显示器40、屈光部件50、调节机构60、导光部件70和光量调节部件80中的至少一个。在图7的示例中,壳体20形成有收容腔室22,显示器40和屈光部件50收容在收容腔室22中。调节机构60部分地从壳体20露出。
壳体20还包括壳体顶壁24、壳体底壁26和壳体侧壁28。壳体底壁26的中部朝向壳体顶壁24形成缺口262。或者说,壳体20大致呈“B”字型。在用户佩戴电子装置100时,电子装置100可通过缺口262架设在用户的鼻梁上,这样既可以保证电子装置100的稳定性,又可以保证用户佩戴的舒适性。调节机构60可部分地从壳体侧壁28露出,以便用户对屈光部件50进行调节。
另外,壳体20可以通过计算机数控(Computerized Numerical Control,CNC)机床加工铝合金形成,也可以采用聚碳酸酯(Polycarbonate,PC)或者PC和丙烯腈-丁二烯-苯乙烯塑料(Acrylonitrile Butadiene Styrene plastic,ABS)注塑成型。在此不对壳体20的具体制造方式和具体材料进行限定。
支撑部件30用于支撑电子装置100。在用户佩戴电子装置100时,电子装置100可通过支撑部件30固定在用户的头部。在图7的示例中,支撑部件30包括第一支架32、第二支架34和弹性带36。
第一支架32和第二支架34关于缺口262对称设置。具体地,第一支架32和第二支架34可转动地设置在壳体20的边缘,在用户不需要使用电子装置100时,可将第一支架32和第二支架34贴近壳体20叠放,以便于收纳。在用户需要使用电子装置100时,可将第一支架32和第二支架34展开,以实现第一支架32和第二支架34支撑的功能。
第一支架32远离壳体20的一端形成有第一弯折部322,第一弯折部322朝向壳体底壁26弯折。这样,用户在佩戴电子装置100时,第一弯折部322可架设在用户的耳朵上,从而使电子装置100不易滑落。
类似地,第二支架34远离壳体20的一端形成有第二弯折部342。第二弯折部342的解释和说明可参照第一弯折部322,为避免冗余,在此不再赘述。
弹性带36可拆卸地连接第一支架32和第二支架34。如此,在用户佩戴电子装置100进行剧烈活动时,可以通过弹性带36进一步固定电子装置100,防止电子装置100在剧烈活动中松动甚至掉落。可以理解,在其他的示例中,弹性带36也可以省略。
在本实施方式中,显示器40包括OLED显示屏。OLED显示屏无需背光灯,有利于电子装置100的轻 薄化。而且,OLED屏幕可视角度大,耗电较低,有利于节省耗电量。
当然,显示器40也可以采用LED显示器或Micro LED显示器。这些显示器仅作为示例而本申请的实施例并不限于此。
请一并参阅图10,屈光部件50设置在显示器40一侧。本实施方式中,屈光部件位于导光部件70的第一侧71。
屈光部件50包括屈光腔52、透光液体54、第一膜层56、第二膜层58和侧壁59。
透光液体54设置在屈光腔52内。调节机构60用于调节透光液体54的量以调节屈光部件50的形态。具体地,第二膜层58相对于第一膜层56设置,侧壁59连接第一膜层56和第二膜层58,第一膜层56、第二膜层58和侧壁59围成屈光腔52,调节机构60用于调节透光液体54的量以改变第一膜层56和/或第二膜层58的形状。
如此,实现屈光部件50屈光功能的实现。具体地,“改变第一膜层56和/或第二膜层58的形状”包括三种情况:第一种情况:改变第一膜层56的形状且不改变第二膜层58的形状;第二种情况:不改变第一膜层56的形状且改变第二膜层58的形状;第三种情况:改变第一膜层56的形状且改变第二膜层58的形状。请注意,为方便解释,在本实施方式中,以第一种情况为例进行说明。
第一膜层56可具有弹性。可以理解,在屈光腔52中的透光液体54的量变化的情况下,屈光腔52内的压强也随之变化,从而使得屈光部件50的形态发生变化。
在一个例子中,调节机构60将屈光腔52中透光液体54的量减少,屈光腔52内的压强减小,屈光腔52外的压强与屈光腔52内的压强的压差增大,屈光腔52更加凹陷。
在另一个例子中,调节机构60将屈光腔52中透光液体54的量增多,屈光腔52内的压强增大,屈光腔52外的压强与屈光腔52内的压强的压差减小,屈光腔52更加凸出。
这样,就实现了通过调节透光液体54的量来调节屈光部件50的形态。
调节机构60连接屈光部件50。调节机构60用于调节屈光部件50的形态以调节屈光部件50的屈光度。具体地,调节机构60包括腔体62、滑动件64、驱动部件66、调节腔68和开关61。
滑动件64滑动地设置在腔体62中,驱动部件66与滑动件64连接,腔体62和滑动件64共同限定出调节腔68,调节腔68通过侧壁59连通屈光腔52,驱动部件66用于驱动滑动件64相对于腔体62滑动以调整调节腔68的容积以调节屈光腔52内的透光液体54的量。
如此,实现通过滑动件64来调整调节腔68的容积,以调节屈光腔52内的透光液体54的量。在一个例子中,请参阅图11,滑动件64往背离侧壁59的方向滑动,调节腔68的容积增大,调节腔68内的压强减小,屈光腔52内的透光液体54进入调节腔68,第一膜层56愈发向内凹陷。
在另一个例子中,请参阅图12,滑动件64往朝向侧壁59的方向滑动,调节腔68的容积减小,调节腔68内的压强增大,调节腔68内的透光液体54进入屈光腔52,第一膜层56愈发向外凸出。
侧壁59形成有流动通道5,流动通道5连通调节腔68和屈光腔52。调节机构60包括设置在流动通道5的开关61,开关61用于控制流动通道5的开闭状态。
在本实施方式中,开关61的数量为两个,两个开关61均为单向开关,其中一个开关61用于控制透光液体54从调节腔68流至屈光腔52,另一个开关61用于控制透光液体54从屈光腔52流至调节腔68。
如此,通过开关61实现透光液体54在调节腔68和屈光腔52之间的流动,以保持侧壁59两侧的压强平衡。如前,调节腔68容积的改变,会引起调节腔68中压强的变化,从而引起现透光液体54在调节腔68和屈光腔52之间的流动。而开关61通过控制流动通道5的开闭状态,来控制透光液体54在调节腔68和屈光腔52之间的流动能否实现,从而控制屈光部件50的形态的调节。
在一个例子中,请参阅图11,控制透光液体54从屈光腔52流至调节腔68的开关61打开,滑动件64往背离侧壁59的方向滑动,调节腔68的容积增大,调节腔68内的压强减小,屈光腔52内的透光液体54通过开关61进入调节腔68,第一膜层56愈发向内凹陷。
在另一个例子中,控制透光液体54从屈光腔52流至调节腔68的开关61关闭,即使滑动件64往背离侧壁59的方向滑动,调节腔68的容积增大,调节腔68内的压强减小,屈光腔52内的透光液体54也无法进入调节腔68,第一膜层56的形态不发生改变。
在又一个例子中,请参阅图12,控制透光液体54从调节腔68流至屈光腔52的开关61打开,滑动件64往朝向侧壁59的方向滑动,调节腔68的容积减小,调节腔68内的压强增大,调节腔68内的透光 液体54通过开关61进入屈光腔52,第一膜层56愈发向外凸出。
在又一个例子中,控制透光液体54从调节腔68流至屈光腔52的开关61关闭,即使滑动件64往朝向侧壁59的方向滑动,调节腔68的容积减小,调节腔68内的压强增大,调节腔68内的透光液体54也无法进入屈光腔52,第一膜层56的形态不发生改变。
驱动部件66可基于多种结构和原理实现其驱动滑动件64滑动的功能。
在图8-图12的示例中,驱动部件66包括旋钮662和丝杠664,丝杠664连接旋钮662和滑动件64,旋钮662用于驱动丝杠664转动以带动滑动件64相对于腔体62滑动。
如此,实现通过旋钮662和丝杠664来驱动滑动件64。由于丝杠664和旋钮662的配合可将旋钮662的回转运动转化为丝杠664直线运动,在用户旋转旋钮662时,丝杠664即可带动滑动件64相对于腔体62滑动,从而引起调节腔68容积的变化,进而调节屈光腔52内的透光液体54的量。旋钮662可自壳体20露出,以方便用户旋转。
具体地,旋钮662上形成有螺纹部,丝杠664上形成有与旋钮662配合的螺纹部,旋钮662和丝杠664螺纹连接。
在旋钮662旋转的同时,开关61可对应地打开。如此,使得透光液体54可以流动,保证侧壁59两侧的压强平衡。
在一个例子中,旋钮662顺时针旋转,滑动件64往背离侧壁59的方向滑动,则将控制透光液体54从屈光腔52流至调节腔68的开关61打开。在另一个例子中,旋钮662逆时针旋转,滑动件64往朝向侧壁59的方向滑动,则将控制透光液体54从调节腔68流至屈光腔52的开关61打开。
请注意,本实施方式中,没有关联旋钮662的旋转角度与屈光部件50的屈光度数,用户将旋钮662旋转到视觉体验最佳的位置即可。当然,在其他的实施方式中,也可以关联旋钮662的旋转角度与屈光部件50的屈光度数。在此,不对旋钮662的旋转角度与屈光部件50的屈光度数是否关联进行限定。
请参阅图13,驱动部件66包括齿轮666和与齿轮666啮合的齿条668,齿条668连接齿轮666和滑动件64,齿轮666用于驱动齿条668移动以带动滑动件64相对于腔体62滑动。
如此,实现通过齿轮666和齿条668来驱动滑动件64。由于齿轮666和齿条668的配合可将齿轮666的回转运动转化为齿条668直线运动,在用户旋转齿轮666时,齿条668即可带动滑动件64相对于腔体62滑动,从而引起调节腔68容积的变化,进而调节屈光腔52内的透光液体54的量。齿轮666可自壳体20露出,以方便用户旋转。
类似地,在齿轮666旋转的同时,开关61可对应地打开。如此,使得透光液体54可以流动,保证侧壁59两侧的压强平衡。
在一个例子中,齿轮666顺时针转动使得齿条668啮合在齿轮666上,齿条668的长度缩短,拉动滑动件64往背离侧壁59的方向移动,则将控制透光液体54从屈光腔52流至调节腔68的开关61打开。
在另一个例子中,齿轮666逆时针转动使得啮合在齿轮666上的齿条668从齿轮666脱离,齿条668的长度增长,推动滑动件64往朝向侧壁59的方向移动,则将控制透光液体54从调节腔68流至屈光腔52的开关61打开。
类似地,本实施方式中,没有关联齿轮666的旋转角度与屈光部件50的屈光度数,用户将齿轮666旋转到视觉体验最佳的位置即可。当然,在其他的实施方式中,也可以关联齿轮666的旋转角度与屈光部件50的屈光度数。在此,不对齿轮666的旋转角度与屈光部件50的屈光度数是否关联进行限定。
需要注意的是,屈光部件50的结构不仅包括以上的屈光腔52、透光液体54、第一膜层56、第二膜层58和侧壁59,只要保证屈光部件50可以实现屈光度的改变的效果即可。例如,在其他方式中,屈光部件50包括多个镜片和驱动件,驱动件用于驱动每个镜片从收容位置移动到屈光位置。这样,即可通过多个镜片的组合,来改变屈光部件50的屈光度。当然,驱动件也可驱动移动到屈光位置上的每个镜片在屈光光轴上移动,从而改变屈光部件50的屈光度。
因此,以上的屈光部件的形态包括屈光部件的形状和状态,以上屈光腔52、透光液体54、第一膜层56、第二膜层58和侧壁59的结构方式通过改变第一膜层56和/或第二膜层58的形状以实现屈光度的改变;以上多个镜片和驱动件的结构方式,通过改变镜片的状态以实现屈光度的改变。
请参阅图8及图9,导光部件70位于屈光部件50和光量调节部件80之间。导光部件70可以为板状的导光元件,导光部件70可以采用树脂等透光材料制成。如图8所示,显示器40产生的光线进入导光 部件70内之后,不同传播方向的光线在导光部件70内产生全反射传播,最终从导光部件70的第一侧71出射至导光部件70外,以使人眼可以观察到显示器40显示的内容。
光量调节部件80可以通过光学胶固定在导光部件70。光量调节部件80包括电致变色元件,电致变色元件的透光率在电致变色元件被施加电压后改变。如此,通过改变电致变色元件的透光率可以调节经过电致变色元件的光量,从而可以调节经过第二侧72和第一侧71的环境光量。
可以理解,电致变色元件在外加电场的作用下发生稳定、可逆的颜色变化的现象,在外观上表现为颜色和透明度的可逆变化。如此使得电变色元件能够实现透光率的改变。
具体地,请参阅图14,电致变色元件可以包括层叠设置的第一导电层81、第二导电层82、电致变色层83、电解质层84和离子存储层85,电致变色层83设置在第一导电层81和第二导电层82之间。第一导电层81和第二导电层82用于配合向电致变色层83施加电压。电解质层84和离子存储层85依次层叠设置在电致变色层83和第二导电层82之间。如此,第一导电层81和第二导电层82可以为电致变色层83提供电压,以使电致变色层83的透光率可以改变,电解质层84和离子存储层85可以保证电致变色层83可以正常地改变透光率。
需要指出的是,以上所说的电致变色器件120的结构与电致变色元件的结构类似,因此,本申请的电致变色器件120的结构请参考电致变色元件的结构,本申请不在赘述。
本申请实施方式中,处理器90与光量调节部件80连接。处理器90用于控制光量调节部件80的透光率以使光量调节部件80调节入射至第二侧72的环境光量。如此,处理器90可以准确地调节光量调节部件80的透光率。
如以上所述,在光量调节部件80为电致变色元件时,处理器90可以控制施加至电致变色元件的电压,从而控制电致变色元件的透光率。或者说,光量调节部件80的透光率通过调节电致变色元件的施加电压控制。处理器90可以包括电路板和设置在电路板上的处理芯片等元气件组件。
光线传感器14与处理器90连接。光线传感器14用于检测环境亮度,处理器90用于根据环境亮度调节光量调节部件80的透光率,其中,环境亮度与光量调节部件80的透光率为反相关关系。
如此可以自动调节光量调节部件80的透光率以使用户可以清楚地观察到显示器40显示的内容,并且用户不易疲劳。
如图15所示,在环境亮度增大时,光量调节部件80的透光率降低;在环境亮度降低时,光量调节部件80的透光率增大。这样使得显示器40的显示画面的对比度在人眼观看的舒适区,提高用户体验。
准直部件92设置在显示器40和导光部件70之间,准直部件92用于将显示器40产生的光线准直后出射至导光部件70。如此,准直部件92可以将显示器40产生的光线变成平行光后进入导光部件70中,从而可以减少光线的损失。
准直部件92可以包括多个透镜,多个透镜叠加一起可以准直光线。显示器40产生的光线经过准直部件92后进入导光部件70中,光线在导光部件70中全反射或者衍射后从导光部件70的第一侧71出射。
在一些实施方式中,处理器90用于当前环境亮度小于预设亮度时,开启第一发光源112、深度摄像头12和环境摄像头13以使深度摄像头12获取目标物体的深度信息,并开启第二发光源113以为环境摄像头13补光和环境摄像头13获取空间环境信息。
本申请实施方式的电子装置100中,第二发光源113在当前环境亮度小于预设亮度时可以开启用以为环境摄像头13补光,如此使得环境摄像头13可以品质较佳的图像,从而使得电子装置100在暗光下依然获得环境信息。
可以理解,第二发光源113发射的第二光线可以发射至目标物体上,从而在环境光线较弱时补充环境中的光线强度。
请参阅图16,在一些实施方式中,电子装置100包括一个驱动芯片94,一个驱动芯片94连接处理器90、第一发光源112和第二发光源113,处理器90用于在当前环境亮度小于预设亮度时控制驱动芯片94输出第一驱动信号和第二驱动信号,第一驱动信号用于驱动第一发光源112,第二驱动信号用于驱动第二发光源113。如此,一个驱动芯片94可以通过驱动两个发光源,这样可以降低电子装置100的硬件数量,从而降低电子装置100的成本。
请参阅图17,在一些实施方式中,电子装置100包括两个驱动芯片94,两个驱动芯片94均与处理器90连接,其中一个驱动芯片94与第一发光源112连接,另一个驱动芯片94与第二发光源113连接, 处理器90用于在当前环境亮度小于预设亮度时控制其中一个驱动芯片94输出第一驱动信号,另一个驱动芯片94输出第二驱动信号,第一驱动信号用于驱动第一发光源112,第二驱动信号用于驱动第二发光源113。如此,两个驱动芯片94分别控制对应的发光源,这样使得每个发光源的工作状态更加容易控制。
在一些实施方式中,处理器90用于通过光线传感器14获取当前环境亮度。或者说,光线传感器14检测到当前环境亮度可以传送至处理器90。如此,当前环境亮度获取方便有效。
在一些实施方式中,处理器90用于获取环境摄像头13采集的空间环境图像,及用于计算空间环境图像的灰度;以及用于根据灰度得到当前环境亮度。在此实施方式中,光线传感器14可以省略,这样可以降低电子装置100的成本。
图18为一个实施例中的电子装置100的内部模块示意图。电子装置100包括通过***总线109连接的处理器90、存储器102(例如为非易失性存储介质)、内存储器103、显示装置104和输入装置105。
处理器90可用于提供计算和控制能力,支撑整个电子装置100的运行。电子装置100的内存储器103为存储器102中的计算机可读指令运行提供环境。电子装置100的显示装置104可以是设置在电子装置100上的显示器40,输入装置105可以是设置在电子装置100上的声电元件和振动传感器,也可以是电子装置100上设置的按键、轨迹球或触控板,也可以是外接的键盘、触控板或鼠标等。该电子装置可以是智能手环、智能手表、智能头盔、电子眼镜等。
在相关技术中,可以通过摄像头获取物体图像,经过图像识别技术,分割图像,提取手部特征和计算手势节点等步骤,判断手势的动作类型,完成手势的交互。然而,这种方案的缺点是必须一直打开摄像头获取图像,且中央处理器或者数字信号处理器必须一直运行相关的手势算法,处理图像信息。导致的问题可有:1、整体功耗较高,发热严重,且缩短了电池的有效使用时间;2、中央处理器或者数字信号处理器占用率高,导致其他的算法和进程无法快速的运行,***出现卡顿等问题。
对此,请参阅19及图20,本申请实施方式还提供了一种电子装置100的控制方法,电子装置100包括摄像头110,控制方法包括:
010,检测在摄像头110的预设范围内是否有外部物体400;
020,在摄像头110的预设范围内有外部物体400的情况下,开启摄像头110以获取外部物体400的图像;
030,根据外部物体400的图像对所述外部物体400进行动作姿势识别。
请参阅图21,本申请实施方式的控制装置200用于电子装置100,控制装置200包括检测模块210和控制模块220,步骤010可以由检测模块210执行,步骤020-030可以由控制模块220执行。或者说,检测模块210用于检测在摄像头110的预设范围内是否有外部物体400;控制模块220用于在摄像头110的预设范围内有外部物体400的情况下,开启摄像头110以获取外部物体400的图像;及用于根据外部物体400的图像对所述外部物体400进行动作姿势识别。
在某些实施方式中,步骤010-030可以由处理器90执行,或者说,处理器90用于检测在摄像头110的预设范围内是否有外部物体400;及用于在摄像头110的预设范围内有外部物体400的情况下,开启摄像头110以获取外部物体400的图像;以及用于根据外部物体400的图像对所述外部物体400进行动作姿势识别。
本申请实施方式的控制方法、控制装置200、电子装置100和存储介质中,在摄像头110的预设范围内有外部物体400的情况下开启摄像头110,以对所述外部物体400进行动作姿势的识别,这样可以避免摄像头110一直处于开启的状态,从而减少了摄像头110及处理器90的运行时间,降低了电子装置100的功耗及发热量。
具体地,在步骤010中,摄像头110的预设范围指的是,在摄像头110的视场范围内,可以以摄像头110的镜头中心表面为中心,以预定距离为半径形成的扇形或者锥形的范围;或者可以以摄像头110的图像传感器的中心为中心,以预定距离为半径形成的扇形或者锥形的范围。
该预设距离可以根据实际需要具体设定。例如,预设距离为10cm、20cm、30cm、40cm或者60cm等尺寸。
外部物体400可以为人体的手部、眼睛、头部等活体物体;也可以为笔、书本等非活体物体。
在步骤020中,开启摄像头110指的是,驱动摄像头110工作,以使摄像头110可以感测外部环境的光强并根据摄像头110的镜头的成像效果形成外部物体400图像。摄像头110例如为以上的环境摄像 头110或者深度摄像头110。摄像头110为环境摄像头110时,摄像头110获取的是二维图像。摄像头110为深度摄像头110时,摄像头110获取的是三维图像。因此,本申请所说的外部物体400的图像可以为二维图像,也可以为三维图像。
在步骤030中,外部物体400的图像包括物体的类别、形状、大小等信息,进行动作姿势识别例如为对外部物体400的图像进行分割、提取特征、识别物体类别、判断是否满足动作姿势等系列过程。在此过程中,处理器90配合相关的硬件运行相应的程序,以执行步骤030,从而实现动作姿势识别的目的。
需要指出的是,本实施方式所说的动作姿势包括手势和眼部动作中的至少一种。可以理解,手势为用户的手部动作。手部动作可以为用户控制手指活动以形成预定的动作。例如,用户竖起大拇指或者五指张开等动作。
眼部动作可以用于的眼球运动的动作,例如,眼球向左右方向转动;眼部动作也可以用户眨眼的动作,例如,用户闭眼的时长或者眨眼的频率等动作。
当然,在其他实施方式中,动作姿势不限于以上讨论的手势和眼部动作。
在某些实施方式中,电子装置100包括设置在摄像头110一侧的接近传感器15,步骤010包括:
响应于用户发出的触发指令开启接近传感器15,以触发接近传感器15检测在摄像头110的预设范围内是否有外部物体400。
在某些实施方式中,处理器90用于响应于用户发出的触发指令开启接近传感器15,以通过接近传感器15检测在摄像头110的预设范围内是否有外部物体400。
具体地,接近传感器15可以设置摄像头110的上侧,也可以设置在摄像头110的下侧,在此不限制接近传感器15相对于摄像头110的方位。另外,接近传感器15可以与摄像头110接触设置,也可以与摄像头110间隔设置。
本申请中,触发指令可以根据用户操作形成,例如,用户按下电子装置100的按键或者触摸屏等输入装置以使电子装置100开始运行动作姿势的程序,并形成触发指令。
需要指出的是,此处所指的“上”、“下”等方位,指的是电子装置100正常使用的状态的下的方位。
在一个例子中,接近传感器15可以发射红外线,并接收外部物体400反射后的红外线以检测得到外部物体400与电子装置100之间的距离。当然,接近传感器15可以通过超声波、电磁场、毫米波的方式检测外部物体400与电子装置100之间的距离。
如此,利用接近传感器15可以准确地检测到摄像头110的预设范围内是否有外部物体400,另外,接近传感器15的功耗较低,这样可以进一步降低电子装置100在执行动作姿势的过程的功耗。
请参阅图22,在某些实施方式中,步骤030包括:
031,根据所述外部物体400的图像,确认外部物体400为预定物体时,控制摄像头110以第一帧率运行获取外部物体400的图像以获取外部物体400的动作姿势;
032,根据所述外部物体400的图像,确认外部物体400不是预定物体时,控制摄像头110以第二帧率以运行获取外部物体400的图像以判断外部物体400是否为预定物体,第二帧率小于第一帧率。
在某些实施方式中,步骤031-032可以由处理器90执行,或者说,处理器90用于根据所述外部物体400的图像,确认外部物体400为预定物体时,控制摄像头110以第一帧率运行获取外部物体400的图像以获取外部物体400的动作姿势;以及用于根据所述外部物体400的图像,确认外部物体400不是预定物体时,控制摄像头110以第二帧率运行获取外部物体400的图像以判断外部物体400是否为预定物体,第二帧率小于第一帧率。
具体地,可以理解,并非所有的外部物体400均可以执行动作姿势。因此,确定外部物体400的类型,在控制摄像头110的运行的帧率,这样可以避免摄像头110一直以较高的帧率运行而消耗较多的能量。
在步骤031中,预定物体为可以执行动作姿势的物体。例如,预定物体为人的手部、头部和/或眼睛。可以理解,在预定物体可以人体的头部时,头部可以执行点头及摇头等动作。在外部物体400为预定物体时,可以预判外部物体400将会做出动作姿势,因此,控制摄像头110以较高的帧率获取外部物体400的图像,这样可以准确地获取到外部物体400的姿势。
在步骤032中,在外部物体400不是预定物体的情况下,此时,可以预判外部物体400不会做出动作姿势,因此,控制摄像头110以较低的帧率获取外部物体400的图像,这样可以降低电子装置100的 功耗。
另外,需要指出的是,判断外部物体400是否为预定物体是根据电子装置100根据图像识别的结果。在一个例子中,外部物体400是人体头部时,假若摄像头110仅拍摄到人体头部的部分图像,那么根据该图像无法分析得到外部物体400是人体头部。若摄像头110仅拍摄到人体头部的完整图像,那么根据该图像则可以分析得到外部物体400是人体头部
因此,步骤031以较低的帧率运行,这样可以持续地获取外部物体400的图像,以在外部运动的过程中进一步识别外部物体400是否为预定物体,从而提高外部物体400识别以及动作姿势识别的准确性。
在一个例子中,第一帧率为30帧/秒或者60帧/秒;第二帧率为5帧/秒或者10帧/秒。
请参阅图23,在某些实施方式中,步骤031包括:
0321,控制摄像头110获取外部物体400的连续帧图像;
0322,根据连续帧图像确认外部物体400的动作姿势。
在某些实施方式中,步骤0321-0323可以由处理器90执行,或者说,处理器90用于控制摄像头110获取外部物体400的连续帧图像;及用于根据连续帧图像确认外部物体400的动作姿势是否为预定姿势;以及用于在动作姿势为预定姿势的情况下,产生相应的控制指令。
可以理解,动作姿势一般是一个动态的过程,因此,动作姿势的识别是一个连续的过程,根据外部物体400的连续帧图像,可以准确地获取外部物体400的动作姿势,以较准确地产生相应的控制指令。
预定姿态包括点击、滑动、缩放中的至少一种。在一个例子中,外部物体400为手部,此时,可以控制摄像头110获取连续10帧图像,以根据这10帧图像判断手部是否做出“点击”的动作姿势,若是,则产生与“点击”相应的控制指令。
请参阅图24,在某些实施方式中,控制方法还包括步骤:
040,在动作姿势为预定姿势的情况下产生相应的控制指令,并根据控制指令控制电子装置100运行。
在某些实施方式中,步骤040可以由处理器90执行,或者说,处理器90用于在动作姿势为预定姿势的情况下产生相应的控制指令,并根据控制指令控制电子装置100运行。
如此,电子装置100可以根据外部物体400的动作姿势运行相应的功能。例如,可以根据该控制指令控制电子装置100解锁屏幕、截图、关闭画面以及快进视频等功能。
在一个例子中,用户的做出“点击”的手势动作后,电子装置100可以根据“点击”的动作播放视频。
在某些实施方式中,在步骤030后,控制方法还包括步骤:
检测外部物体400是否移出预设范围;
在外部物体400移出预设范围的情况下,关闭摄像头110。
在某些实施方式中,处理器90还用于检测外部物体400是否移出预设范围;及用于在外部物体400移出预设范围的情况下,关闭摄像头110。
如此,在外部物体400移出摄像头110的预设范围的情况下,此时,可以认为不会再有动作姿势产生,关闭摄像头110可以降低电子装置100的功耗,延长电子装置100的用电时间。
一种包含计算机可执行指令的非易失性计算机可读存储介质,当计算机可执行指令被一个或多个处理器90执行时,使得处理器90执行任一实施方式中的控制方法。
本领域技术人员可以理解,图中示出的结构,仅仅是与本申请方案相关的部分结构的示意图,并不构成对本申请方案所应用于其上的电子装置的限定,具体的电子装置可以包括比图中所示更多或更少的部件,或者组合某些部件,或者具有不同的部件布置。
在本说明书的描述中,参考术语“一个实施方式”、“某些实施方式”、“示意性实施方式”、“示例”、“具体示例”、或“一些示例”等的描述意指结合所述实施方式或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施方式或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施方式或示例中以合适的方式结合。
尽管已经示出和描述了本申请的实施方式,本领域的普通技术人员可以理解:在不脱离本申请的原理和宗旨的情况下可以对这些实施方式进行多种变化、修改、替换和变型,本申请的范围由权利要求及其等同物限定。

Claims (20)

  1. A control method for an electronic device, characterized in that the electronic device includes a camera, and the control method includes:
    detecting whether there is an external object within a preset range of the camera;
    when there is an external object within the preset range of the camera, turning on the camera to obtain an image of the external object; and
    performing action gesture recognition on the external object according to the image of the external object.
  2. The control method of claim 1, characterized in that the electronic device includes a proximity sensor disposed on one side of the camera, and detecting whether there is an external object within the preset range of the camera includes:
    turning on the proximity sensor in response to a trigger instruction issued by a user, so as to trigger the proximity sensor to detect whether there is an external object within the preset range of the camera.
  3. The control method of claim 1, characterized in that performing action gesture recognition according to the image of the external object includes:
    when it is confirmed from the image of the external object that the external object is a predetermined object, controlling the camera to operate at a first frame rate to obtain images of the external object so as to obtain an action gesture of the external object; and
    when it is confirmed from the image of the external object that the external object is not a predetermined object, controlling the camera to operate at a second frame rate to obtain images of the external object so as to determine whether the external object is a predetermined object, the second frame rate being less than the first frame rate.
  4. The control method of claim 3, characterized in that controlling the camera to operate at the first frame rate to obtain images of the external object so as to obtain the action gesture of the external object includes:
    controlling the camera to obtain continuous frames of images of the external object; and
    confirming the action gesture of the external object according to the continuous frames of images.
  5. The control method of claim 1, characterized in that, after performing action gesture recognition according to the image of the external object, the control method includes:
    detecting whether the external object moves out of the preset range; and
    when the external object moves out of the preset range, turning off the camera.
  6. The control method of claim 1, characterized in that the control method includes:
    generating a corresponding control instruction when the action gesture is a predetermined gesture; and
    controlling the operation of the electronic device according to the control instruction.
  7. The control method of claim 6, characterized in that the predetermined gesture includes at least one of clicking, sliding, and zooming.
  8. A control device for an electronic device, characterized in that the electronic device includes a camera, and the control device includes:
    a detection module configured to detect whether there is an external object within a preset range of the camera; and
    a control module configured to turn on the camera to obtain an image of the external object when there is an external object within the preset range of the camera, and to perform action gesture recognition on the external object according to the image of the external object.
  9. An electronic device, characterized in that it includes a camera and a processor, the processor being configured to detect whether there is an external object within a preset range of the camera; to turn on the camera to obtain an image of the external object when there is an external object within the preset range of the camera; and to perform action gesture recognition on the external object according to the image of the external object.
  10. The electronic device of claim 9, characterized in that the processor is configured to control the camera to operate at a first frame rate to obtain images of the external object so as to obtain an action gesture of the external object when it is confirmed from the image of the external object that the external object is a predetermined object; and to control the camera to operate at a second frame rate to obtain images of the external object so as to determine whether the external object is a predetermined object when it is confirmed from the image of the external object that the external object is not the predetermined object, the second frame rate being less than the first frame rate.
  11. The electronic device of claim 10, characterized in that the processor is configured to control the camera to obtain continuous frames of images of the external object, and to confirm the action gesture of the external object according to the continuous frames of images.
  12. The electronic device of claim 9, characterized in that the electronic device includes a proximity sensor disposed on one side of the camera, and the processor is configured to turn on the proximity sensor in response to a trigger instruction issued by a user, so as to trigger the proximity sensor to detect whether there is an external object within the preset range of the camera.
  13. The electronic device of claim 9, characterized in that the processor is configured to detect whether the external object moves out of the preset range, and to turn off the camera when the external object moves out of the preset range.
  14. The electronic device of claim 9, characterized in that the processor is configured to generate a corresponding control instruction when the action gesture is a predetermined gesture, and to control the operation of the electronic device according to the control instruction.
  15. The electronic device of claim 14, characterized in that the predetermined gesture includes at least one of clicking, sliding, and zooming.
  16. The electronic device of claim 9, characterized in that the electronic device includes a first light-emitting source, a second light-emitting source, and one driver chip; the first light-emitting source is configured to emit first light toward the outside of the electronic device, and the second light-emitting source is configured to emit second light toward the outside of the electronic device; the processor is configured to control the driver chip to output a first driving signal and a second driving signal when the current ambient brightness is less than a preset brightness, the first driving signal being used to drive the first light-emitting source and the second driving signal being used to drive the second light-emitting source.
  17. The electronic device of claim 9, characterized in that the electronic device includes a first light-emitting source, a second light-emitting source, and two driver chips, one of the driver chips being connected to the first light-emitting source and the other driver chip being connected to the second light-emitting source; the first light-emitting source is configured to emit first light toward the outside of the electronic device, and the second light-emitting source is configured to emit second light toward the outside of the electronic device; the processor is configured to control one of the driver chips to output a first driving signal and the other driver chip to output a second driving signal when the current ambient brightness is less than a preset brightness, the first driving signal being used to drive the first light-emitting source and the second driving signal being used to drive the second light-emitting source.
  18. The electronic device of claim 16 or 17, characterized in that the electronic device includes a depth camera configured to receive the first light reflected by a target object so as to obtain depth information of the target object.
  19. The electronic device of claim 16 or 17, characterized in that a wavelength of the first light is different from a wavelength of the second light.
  20. A non-volatile computer-readable storage medium containing computer-executable instructions, characterized in that, when the computer-executable instructions are executed by one or more processors, the processors are caused to execute the control method of any one of claims 1-7.
PCT/CN2020/103341 2019-08-30 2020-07-21 控制方法、控制装置、电子装置和存储介质 WO2021036591A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20859159.4A EP3979046A4 (en) 2019-08-30 2020-07-21 CONTROL METHOD, CONTROL DEVICE, ELECTRONIC DEVICE AND STORAGE MEDIA
US17/565,373 US20220121292A1 (en) 2019-08-30 2021-12-29 Control method, control device, electronic device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910817693.6 2019-08-30
CN201910817693.6A CN110515468A (zh) 2019-08-30 2019-08-30 控制方法、控制装置、电子装置和存储介质

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/565,373 Continuation US20220121292A1 (en) 2019-08-30 2021-12-29 Control method, control device, electronic device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021036591A1 true WO2021036591A1 (zh) 2021-03-04

Family

ID=68628616

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/103341 WO2021036591A1 (zh) 2019-08-30 2020-07-21 控制方法、控制装置、电子装置和存储介质

Country Status (4)

Country Link
US (1) US20220121292A1 (zh)
EP (1) EP3979046A4 (zh)
CN (1) CN110515468A (zh)
WO (1) WO2021036591A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110515468A (zh) * 2019-08-30 2019-11-29 Oppo广东移动通信有限公司 控制方法、控制装置、电子装置和存储介质
US11107280B1 (en) * 2020-02-28 2021-08-31 Facebook Technologies, Llc Occlusion of virtual objects in augmented reality by physical objects
US11514649B2 (en) * 2020-05-29 2022-11-29 Microsoft Technology Licensing, Llc Camera for augmented reality display
US11726339B2 (en) * 2021-11-30 2023-08-15 Samsung Electronics Co., Ltd. System for digital recording protection and electrochromic device frame
WO2023249820A1 (en) * 2022-06-22 2023-12-28 Snap Inc. Hand-tracking pipeline dimming

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2428870A1 (en) * 2010-09-13 2012-03-14 Samsung Electronics Co., Ltd. Device and method for controlling gesture for mobile device
CN104714642A (zh) * 2015-03-02 2015-06-17 惠州Tcl移动通信有限公司 一种移动终端及其手势识别处理方法和***
US9268404B2 (en) * 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
CN107607957A (zh) * 2017-09-27 2018-01-19 维沃移动通信有限公司 一种深度信息获取***及方法、摄像模组和电子设备
CN110121031A (zh) * 2019-06-11 2019-08-13 Oppo广东移动通信有限公司 图像采集方法和装置、电子设备、计算机可读存储介质
CN110515468A (zh) * 2019-08-30 2019-11-29 Oppo广东移动通信有限公司 控制方法、控制装置、电子装置和存储介质

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210249116A1 (en) * 2012-06-14 2021-08-12 Medibotics Llc Smart Glasses and Wearable Systems for Measuring Food Consumption
KR102091028B1 (ko) * 2013-03-14 2020-04-14 삼성전자 주식회사 사용자 기기의 오브젝트 운용 방법 및 장치
US9908048B2 (en) * 2013-06-08 2018-03-06 Sony Interactive Entertainment Inc. Systems and methods for transitioning between transparent mode and non-transparent mode in a head mounted display
JP5802727B2 (ja) * 2013-11-11 2015-10-28 東芝テリー株式会社 同期式カメラ
US10771714B2 (en) * 2014-02-25 2020-09-08 Ams Sensors Singapore Pte. Ltd. Image sensor modules including primary high-resolution imagers and secondary imagers
US9766461B2 (en) * 2015-01-20 2017-09-19 Microsoft Technology Licensing, Llc Head-mounted display device with stress-resistant components
US10063776B2 (en) * 2015-05-01 2018-08-28 Gopro, Inc. Camera mode control
JP6398870B2 (ja) * 2015-05-25 2018-10-03 コニカミノルタ株式会社 ウェアラブル電子機器およびウェアラブル電子機器のジェスチャー検知方法
CN105759935B (zh) * 2016-01-29 2019-01-18 华为技术有限公司 一种终端控制方法及终端
US10560679B2 (en) * 2016-08-30 2020-02-11 Microsoft Technology Licensing, Llc Deformation detection and automatic calibration for a depth imaging system
US10129377B2 (en) * 2017-01-04 2018-11-13 Essential Products, Inc. Integrated structure including image capture and depth sensing components
SE541650C2 (en) * 2017-05-30 2019-11-19 Crunchfish Ab Improved activation of a virtual object
US10789777B1 (en) * 2017-06-29 2020-09-29 Facebook Technologies, Llc Generating content for presentation by a head mounted display based on data captured by a light field camera positioned on the head mounted display
US10466360B1 (en) * 2017-08-31 2019-11-05 Facebook Technologies, Llc Depth measurement using scanning diffractive optical elements
CN108229391B (zh) * 2018-01-02 2021-12-24 京东方科技集团股份有限公司 手势识别装置及其服务器、手势识别***、手势识别方法
CN108989638B (zh) * 2018-08-01 2021-03-05 Oppo(重庆)智能科技有限公司 成像装置及其控制方法、电子装置和计算机可读存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9268404B2 (en) * 2010-01-08 2016-02-23 Microsoft Technology Licensing, Llc Application gesture interpretation
EP2428870A1 (en) * 2010-09-13 2012-03-14 Samsung Electronics Co., Ltd. Device and method for controlling gesture for mobile device
CN104714642A (zh) * 2015-03-02 2015-06-17 惠州Tcl移动通信有限公司 一种移动终端及其手势识别处理方法和***
CN107607957A (zh) * 2017-09-27 2018-01-19 维沃移动通信有限公司 一种深度信息获取***及方法、摄像模组和电子设备
CN110121031A (zh) * 2019-06-11 2019-08-13 Oppo广东移动通信有限公司 图像采集方法和装置、电子设备、计算机可读存储介质
CN110515468A (zh) * 2019-08-30 2019-11-29 Oppo广东移动通信有限公司 控制方法、控制装置、电子装置和存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3979046A4 *

Also Published As

Publication number Publication date
EP3979046A4 (en) 2022-07-27
US20220121292A1 (en) 2022-04-21
CN110515468A (zh) 2019-11-29
EP3979046A1 (en) 2022-04-06

Similar Documents

Publication Publication Date Title
WO2021036591A1 (zh) 控制方法、控制装置、电子装置和存储介质
US11947735B2 (en) Controller movement tracking with light emitters
EP3008567B1 (en) User focus controlled graphical user interface using an head mounted device
WO2021047331A1 (zh) 控制方法、电子装置和存储介质
EP2834723B1 (en) Touch sensitive user interface
US9116666B2 (en) Gesture based region identification for holograms
CN110398839B (zh) 头戴式显示设备和控制方法
CN110290330B (zh) 控制方法、电子装置和存储介质
US20220174764A1 (en) Interactive method, head-mounted device, interactive system and storage medium
JP6776578B2 (ja) 入力装置、入力方法、コンピュータープログラム
CN210015296U (zh) 头戴装置
JPWO2019189868A1 (ja) ヘッドマウントディスプレイ
US20170285765A1 (en) Input apparatus, input method, and computer program
US20210405851A1 (en) Visual interface for a computer system
CN110515461A (zh) 交互方法、头戴设备、交互***和存储介质
CN206906983U (zh) 增强现实设备
JP6790769B2 (ja) 頭部装着型表示装置、プログラム、及び頭部装着型表示装置の制御方法
US20240126376A1 (en) Computing system with head wearable display
US11934586B2 (en) Gesture detection via image capture of subdermal tissue from a wrist-pointing camera system
US20210405852A1 (en) Visual interface for a computer system
KR20240061547A (ko) 프로젝트 장치 및 이를 포함하는 전자 디바이스
KR20240019914A (ko) 광학 장치 및 이를 포함하는 전자 디바이스
KR20240026606A (ko) 광학 장치 및 이를 포함하는 전자 디바이스
KR20240050198A (ko) 사용자의 자세를 가이드하기 위한 웨어러블 장치 및 그 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20859159

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020859159

Country of ref document: EP

Effective date: 20211229

NENP Non-entry into the national phase

Ref country code: DE