WO2021196976A1 - Light emitting device and electronic equipment - Google Patents

Light emitting device and electronic equipment

Info

Publication number
WO2021196976A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
light source
beams
collimating lens
light sources
Prior art date
Application number
PCT/CN2021/079346
Other languages
English (en)
French (fr)
Inventor
魏文雄
王帆
俞锋
余恺
邱晨
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202011511815.8A (CN113534484A)
Application filed by Huawei Technologies Co., Ltd.
Priority to EP21780128.1A (EP4119976A4)
Publication of WO2021196976A1
Priority to US17/955,261 (US20230026858A1)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/1086Beam splitting or combining systems operating by diffraction only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • G01S7/4815Constructional features, e.g. arrangements of optical elements of transmitters alone using multiple transmitters
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4816Constructional features, e.g. arrangements of optical elements of receivers alone
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4817Constructional features, e.g. arrangements of optical elements relating to scanning
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/0816Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • G02B26/0833Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B26/00Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B26/08Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B26/10Scanning systems
    • G02B26/105Scanning systems with one or more pivoting mirrors or galvano-mirrors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/106Beam splitting or combining systems for splitting or combining a plurality of identical beams or images, e.g. image replication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/207Image signal generators using stereoscopic image cameras using a single 2D image sensor

Definitions

  • the embodiments of the present application relate to the technical field of optoelectronic devices, and in particular to a light emitting device and electronic equipment.
  • when a three-dimensional (3D) camera (also called a depth camera) photographs a target object, it can obtain not only a two-dimensional image of the target object (such as a human face) but also depth information of the target object.
  • the depth information of the target object includes the distance between each feature on the target object and the camera, which can characterize the three-dimensional features of the target object.
  • the 3D camera can use the two-dimensional image and depth information of the target object that it captures to achieve functions such as face recognition and three-dimensional map reconstruction.
  • the 3D camera includes a transmitting end and a receiving end.
  • the 3D camera can obtain the depth information of the target object in the following ways.
  • the emitting end is used to emit light, and the light emitted by the emitting end is projected on the target object and is reflected by the target object.
  • the receiving end can receive the light reflected by the target object.
  • the 3D camera can calculate the depth information of the target object according to the time difference between the time when the transmitting end emits light and the time when the receiving end receives the light reflected by the target object.
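  • as a simple illustration of this time-of-flight calculation (the function and timing values below are our own sketch, not part of the patent text), the depth of a point is half the round-trip path of the light:

```python
# Minimal time-of-flight depth sketch (illustrative only).
C = 299_792_458.0  # speed of light, m/s

def tof_depth(t_emit_s: float, t_receive_s: float) -> float:
    """Depth from round-trip time: the beam travels to the target object
    and back, so the one-way distance is half the round-trip path."""
    return C * (t_receive_s - t_emit_s) / 2.0

# A ~6.67 ns round trip corresponds to roughly 1 m of depth.
print(tof_depth(0.0, 6.67e-9))  # ~1.0
```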
  • FIG. 1 is a schematic diagram of the structure of the transmitting end based on the point scanning technology.
  • the transmitting end includes a light source 110 and a scanning mirror 120, and the scanning mirror 120 includes a reflecting mirror 121.
  • the light source can emit light beam 1, and light beam 1 irradiates the mirror 121 on the scanning mirror 120, and is reflected by the mirror 121 as light beam 2, which can irradiate the target object.
  • the propagation direction of the light beam 1 emitted by the light source 110 is constant.
  • the reflecting mirror 121 can be rotated in the horizontal direction and the vertical direction to adjust the angle between the reflecting mirror 121 and the light beam 1 (that is, the incident angle of the light beam 1 on the reflecting mirror 121).
  • when the incident angle of the light beam 1 on the mirror 121 changes, the exit angle of the light beam 2 also changes accordingly. That is to say, the emission end can control the exit angle of the light beam 2 by adjusting the reflector 121, so as to emit light (that is, the reflected light beam 2) in a point scanning manner.
  • the transmitting end that adopts the point scanning mode requires multiple adjustments of the reflector 121 in the horizontal and vertical directions before light can be emitted at different angles and the receiving end can collect an image of the target object. For example, when using a 3D camera to scan a video graphics array (VGA) image (an image with a default resolution of 640*480, that is, 307,200 points per frame) at about 30 frames per second, at least 9 million points need to be collected per second, that is, the reflector 121 needs to be adjusted at least 9 million times per second. Moreover, when adjusting the reflector 121, the angle of the scanning mirror needs to be adjusted in both the horizontal direction and the vertical direction, which places a high requirement on the rotation angle of the scanning mirror and a very high requirement on the modulation speed of the light source 110.
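  • the arithmetic behind that figure can be checked in a few lines (our own sketch; the 30 frames-per-second rate is inferred from the stated 9-million-points-per-second requirement):

```python
# Point-scan budget for a VGA image at video rate.
width, height, fps = 640, 480, 30
points_per_frame = width * height           # 307,200 points per VGA frame
points_per_second = points_per_frame * fps  # one mirror adjustment per point
print(points_per_second)                    # 9216000, i.e. over 9 million per second
```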
  • the present application provides a light emitting device and electronic equipment, which can reduce the rotation angle requirement of the scanning rotating mirror in the light emitting device and increase the utilization rate of the light beam emitted by the light emitting device.
  • the present application provides a light emitting device.
  • the light emitting device may include an array light source, a collimating lens, a scanning mirror, and an optical beam splitter.
  • the array light source may include M × N light sources arranged in M rows and N columns, where M and N are both positive integers.
  • the interval between two adjacent columns of light sources is the first preset distance, the interval between two adjacent rows of light sources is the second preset distance, and the angle between any column of light sources in the N columns and any row of light sources in the M rows is the preset angle.
  • the array light source can emit K light beams, K ≥ 1, and K is a positive integer.
  • the array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens.
  • the collimating lens can convert the K light beams emitted by the array light source into K first collimated light beams.
  • the scanning mirror is located on the second side of the collimating lens, the scanning mirror is used to realize one-dimensional rotation, and the center point of the reflective surface of the scanning mirror is on the optical axis of the collimating lens; the reflective surface is used to reflect the K first collimated beams into K second collimated beams.
  • when the scanning mirror rotates in one dimension, one beam array can be projected at a time, and scanning can be completed through multiple projections to achieve a predetermined resolution.
  • the optical beam splitter is used to receive the K second collimated light beams, and split the K second collimated light beams into i × K third collimated light beams, where i ≥ 2 and i is a positive integer.
  • the light source in the light emitting device is an array light source, including M × N light sources.
  • the array light source may be composed of M rows of light sources and N columns of light sources, or composed of N rows of light sources and M columns of light sources.
  • the array light source can emit K light beams, K ≥ 1. Compared with the point scanning method, when the scanning mirror rotates in one direction, a two-dimensional beam array can be formed, which reduces the requirement on the rotation angle of the scanning mirror.
  • the optical beam splitter in the light emitting device can split 1 light beam into i light beams, so that the number of light beams projected by the light emitting device increases. Each beam projected on the target object corresponds to a point on the target object; that is, after one beam in the light emitting device is split into i beams, each of the i beams corresponds to a point on the target object. Compared with the point scanning mode, the light emitting device in the present application improves the utilization rate of the light source.
  • a light source emits 1 light beam, which can be split into i light beams. Assuming that each light beam corresponds to one pixel, each light source can correspond to i pixels, so the light sources do not need to be closely arranged, and the processing difficulty and installation difficulty of the light source are reduced.
  • in other words, the processing difficulty and installation difficulty of the array light source in the light emitting device provided in the present application are relatively low and easy to implement, and the utilization rate of the light source in the light emitting device is improved.
  • the optical beam splitter may include at least one of a one-dimensional grating, a two-dimensional diffractive optical element, and a prism film.
  • the optical beam splitter can be a one-dimensional grating, a two-dimensional diffractive optical element or a prism film, or it can be composed of the above two elements.
  • the optical beam splitter can be a combination of a one-dimensional grating and a prism film.
  • the preset angle between any one of the N columns of light sources and any one of the M rows of light sources is an acute angle.
  • the optical beam splitter when the optical beam splitter is a prism film, the incident surface of the optical beam splitter is a flat surface, and the exit surface of the optical beam splitter is a prism film structure.
  • the prism film structure includes i beam splitting surfaces, and the i beam splitting surfaces are used to split a beam into i beams, and the i beams have different propagation directions.
  • the prism film structure is an optical film with a micro-prism structure on its surface.
  • the micro prism on the surface of the prism film structure can split a beam into i beams, so that the optical beam splitter can split a beam into i beams.
  • the optical beam splitter is a two-dimensional diffractive optical element
  • the two-dimensional diffractive optical element is used to split a beam into a beam matrix including i beams.
  • the one-dimensional grating element can split a beam into i beams, and when the i beams are projected on the same plane, the line connecting the projected points is a straight line.
  • the light source in the array light source may be composed of an edge-emitting semiconductor laser, a vertical-cavity surface-emitting laser (VCSEL), a fiber laser, or a solid-state laser.
  • the wavelength of the light beam emitted by the light source in the array light source can be any wavelength.
  • the light beam emitted by the array light source may be a light beam emitted by any single light source, or light beams emitted by the light sources in any column or row of the array light source.
  • the light emitting device further includes a controller; the controller is connected to the scanning mirror.
  • the controller is used for receiving the control signal and transmitting the control signal to the scanning rotating mirror, and the control signal is used for instructing the scanning rotating mirror to adjust the angle of the reflecting surface on the scanning rotating mirror.
  • the scanning mirror is used to receive the control signal and adjust the angle of the reflecting surface according to the control signal to adjust the propagation direction of the K second collimated beams.
  • the above-mentioned scanning mirror includes a micro-electro-mechanical system (MEMS) mirror or a digital micromirror device (DMD).
  • the present application also provides an electronic device, including the light emitting device in the first aspect and any one of its possible implementations; wherein, the light beam emitted by the light emitting device irradiates the target object and is reflected by the target object.
  • the electronic device also includes a receiving device for receiving the light beam reflected by the target object.
  • the above-mentioned receiving device includes: a receiving lens and an image sensor; wherein, the optical axis of the receiving lens is perpendicular to the plane where the image sensor is located.
  • the receiving lens is used to receive the light beam reflected by the target object and refract the light beam reflected by the target object into a refracted beam; wherein the refracted beam irradiates the image sensor, so that the target object is imaged on the image sensor.
  • the distance between the image sensor and the receiving lens is less than twice the focal length of the receiving lens.
  • the pixels of the image sensor are E × F, and E and F are both positive integers.
  • the image sensor includes j detectors, where j is less than E × F, and j is a positive integer.
  • the number of detectors in the image sensor is smaller than the number of pixels, and an image with a corresponding resolution can be formed on the image sensor, that is, the number of detectors of the image sensor can be reduced, which reduces the cost.
  • the detector includes at least one single-photon detector.
  • the receiving device further includes a filter; the filter is disposed between the receiving lens and the image sensor, and the filter is parallel to the plane where the image sensor is located. Among them, the filter is used to filter out the ambient light in the refracted beam.
  • the present application also provides a method for emitting light beams.
  • the method can be applied to the light emitting device in the first aspect and any one of its possible implementations.
  • the light emitting device includes an array light source, a collimating lens, a scanning mirror, an optical beam splitter, and a processor.
  • the array light source includes M × N light sources arranged in M rows and N columns, where M and N are both positive integers.
  • the interval between two adjacent columns of light sources is the first preset distance, the interval between two adjacent rows of light sources is the second preset distance, and the angle between any column of the N columns of light sources and any row of the M rows of light sources is the preset angle;
  • the method may include: the processor controls the array light source to emit K light beams, K ≥ 1, and K is a positive integer.
  • the array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens. Then the K light beams emitted by the array light source can propagate to the collimating lens, and the K light beams are converted into K first collimated light beams through the collimating lens.
  • the scanning rotating mirror is located on the second side of the collimating lens, the scanning rotating mirror is used to realize one-dimensional rotation, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror.
  • the K first collimated light beams converted by the collimating lens can propagate to the scanning rotating mirror; the processor controls the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams.
  • the K second collimated light beams are split into i × K third collimated light beams by the optical beam splitter, and the i × K third collimated light beams are emitted, where i ≥ 2 and i is a positive integer.
  • the aforementioned preset angle is an acute angle.
  • an embodiment of the present application also provides a computer-readable storage medium including computer instructions.
  • when the computer instructions are run on an electronic device, the electronic device can implement the method for emitting light beams in the third aspect.
  • FIG. 1 is a schematic diagram of the transmitting end structure of a point scanning method provided by this application;
  • FIG. 2 is a schematic structural diagram of a diffractive optical element provided by an embodiment of the application.
  • FIG. 3 is a schematic structural diagram of a transmitting end in a line scan mode provided by an embodiment of the application.
  • FIG. 4 is a schematic structural diagram of a light emitting device provided by an embodiment of the application.
  • FIG. 5A is a schematic structural diagram of an array light source provided by an embodiment of the application.
  • FIG. 5B is a schematic structural diagram of another array light source provided by an embodiment of the application.
  • FIG. 6 is a schematic structural diagram of a collimating lens provided by an embodiment of the application.
  • FIG. 7 is a schematic structural diagram of an optical beam splitter provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a light exit angle of a light emitting device according to an embodiment of the application.
  • FIG. 9 is a schematic structural diagram of a receiving device according to an embodiment of the application.
  • FIG. 10 is a schematic diagram of collection and scanning of a receiving device according to an embodiment of the application.
  • FIG. 11 is a schematic structural diagram of a detector provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram of a detector collecting and scanning results according to an embodiment of the application.
  • FIG. 13 is a schematic diagram of another detector collecting and scanning results according to an embodiment of the application.
  • FIG. 14 is a schematic diagram of another detector collecting and scanning results according to an embodiment of the application.
  • FIG. 15A is a schematic structural diagram of another receiving device according to an embodiment of the application.
  • FIG. 15B is a schematic diagram of collecting light beams on an image sensor provided by an embodiment of the application.
  • FIG. 16 is a schematic diagram of a light spot received by a detector in a receiving device according to an embodiment of the application.
  • FIG. 17 is a flowchart of a method for emitting a light beam according to an embodiment of the application.
  • FIG. 18 is a schematic structural diagram of a 3D camera provided by an embodiment of this application.
  • FIG. 19 is a schematic structural diagram of an electronic device provided by an embodiment of this application.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, the features defined with “first” and “second” may explicitly or implicitly include one or more of these features. In the description of the present embodiment, unless otherwise specified, “plurality” means two or more.
  • Diffractive optical element (DOE): also known as a binary optical element, a diffractive optical element has a specific surface structure design, so that a light beam propagating to it is diffracted.
  • the DOE may have a one-dimensional lattice beam splitting function, and the DOE may also have a two-dimensional lattice beam splitting function.
  • a DOE with a one-dimensional lattice beam splitting function can divide a beam into multiple beams in one direction. For example, as shown in Fig. 2(a), the beam 1 is divided into 4 beams in the vertical direction by the DOE.
  • a DOE with a two-dimensional lattice beam splitting function can split a beam into a matrix beam. For example, as shown in Figure 2(b), the beam 1 is split by the DOE into an 8*8 beam array.
  • Collimation: generally speaking, light rays are divergent; that is, two adjacent light rays spread farther and farther apart as they propagate. Collimation keeps the light rays in a beam of light parallel.
  • Single-photon avalanche diode (SPAD): once a photon propagates to the surface of the SPAD, the single-photon detector is triggered and detects the light signal. After the SPAD is triggered, it returns to the initial (untriggered) state after a certain period of time. A SPAD can only detect whether photons irradiate its surface; it cannot detect the number of photons irradiating its surface.
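  • the trigger-and-recover behavior can be modeled in a short sketch (our own toy model; the 50 ns dead time is an assumed value, not from the patent):

```python
# Toy SPAD model: binary trigger followed by a fixed dead time.
class Spad:
    DEAD_TIME_NS = 50.0  # recovery time after a trigger; assumed value

    def __init__(self):
        self.ready_at_ns = 0.0  # time at which the SPAD is armed again

    def photon_arrival(self, t_ns: float) -> bool:
        """Return True if this photon triggers the SPAD. A SPAD reports
        only 'photon seen or not', never a photon count."""
        if t_ns < self.ready_at_ns:
            return False  # still recovering: this arrival is missed
        self.ready_at_ns = t_ns + self.DEAD_TIME_NS
        return True

s = Spad()
print([s.photon_arrival(t) for t in (0.0, 10.0, 60.0)])  # [True, False, True]
```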
  • the transmitting end of the 3D camera can also use line scanning to emit light.
  • the line scanning method means that the transmitting end can emit multiple beams in one direction, and the scanning mirror at the transmitting end rotates in one direction so that the receiving end can collect the image of the target object.
  • the transmitting end can emit multiple beams in the vertical direction, and the scanning mirror at the transmitting end rotates in the horizontal direction, so that the receiving end can collect images of the target object. Therefore, the use of line scanning to emit light can reduce the requirement on the rotation angle of the scanning mirror.
  • FIG. 3 is a schematic diagram of the structure of the transmitting end of a line scanning technology.
  • the transmitting end includes a linear light source 210, a lens 220 and a scanning mirror 230.
  • the linear light source 210 is formed by a plurality of light sources arranged in one direction.
  • the linear light source 210 emits a light beam, and the light beam is projected outward through the lens 220 and the scanning mirror 230.
  • the linear light source 210 emits 480 beams in the vertical direction, the scanning mirror rotates in the horizontal direction, and the angle of the scanning mirror is adjusted 640 times so that the receiving end collects a complete VGA image.
  • the use of the line scan method can reduce the requirement for the rotation angle of the scanning mirror in the transmitting end, and reduce the difficulty of implementation of the transmitting end.
  • one light beam emitted by the linear light source enables the receiving end to collect a point of the target object, and a point on the target object corresponds to a pixel on the image.
  • adjacent light sources on the linear light source are closely arranged. This arrangement of light sources increases the processing difficulty and installation difficulty of the linear light source, and is difficult to realize.
  • a light beam emitted by the light source is projected to a point of the target object, and the receiving end collects the light beam reflected by that point of the target object, and forms a pixel in the image of the target object.
  • a light beam emitted by the light source ultimately corresponds to a pixel on the image.
  • the problem of low utilization rate of the light source has not been solved.
  • the embodiment of the present application provides a light emitting device, and the light source adopts an array light source, that is, the light source adopts an arrangement of multiple rows and multiple columns.
  • the light emitting device includes an optical beam splitter, and the optical beam splitter can split one light beam into i light beams, where i is a positive integer and i ⁇ 2.
  • One beam emitted by the light source can be projected to i points of the target object, and the utilization rate of the light source is improved.
  • the light emitting device provided in the embodiment of the present application can be used in a 3D camera, and can also be used in a device using 3D sensing technology, such as a 3D scanner.
  • the 3D camera can be installed on an electronic device.
  • the electronic devices in the embodiments of the present application may be mobile phones, tablet computers, desktop computers, laptop computers, handheld computers, notebook computers, ultra-mobile personal computers (UMPC), netbooks, cellular phones, personal digital assistants (PDA), augmented reality (AR) and virtual reality (VR) devices, etc.
  • FIG. 4 is a schematic structural diagram of a light emitting device provided by an embodiment of this application.
  • the light emitting device includes: an array light source 401, a collimating lens 402, a scanning rotating mirror 403, and an optical beam splitter 404.
  • the array light source 401 is located on the first side S1 of the collimating lens 402, the plane where the array light source 401 is located is perpendicular to the optical axis of the collimating lens 402, and the distance between the plane where the array light source 401 is located and the center point of the collimating lens 402 is the focal length of the collimating lens 402.
  • the scanning mirror 403 is located on the second side S2 of the collimating lens 402, and the center point of the reflective surface of the scanning mirror 403 is on the optical axis of the collimating lens 402.
  • the array light source 401 includes M × N light sources.
  • the array light source 401 can emit K light beams, where K ≥ 1, and K is a positive integer. That is, the array light source 401 can emit multiple light beams.
  • the K light beams may be emitted by K light sources among the M × N light sources.
  • the plane where the array light source 401 is located is perpendicular to the optical axis of the collimating lens 402, that is, the array light source 401 faces the collimating lens 402. Then, for the K light beams emitted by the array light source 401, the K light beams will propagate to the collimating lens 402 along the optical axis.
  • the collimating lens 402 has a function of collimating a light beam. Therefore, the collimating lens 402 can convert K light beams into K collimated light beams (ie, the first collimated light beam).
  • the center point of the reflective surface of the scanning mirror 403 is on the optical axis of the collimating lens 402, so that the K first collimated beams can propagate to the scanning mirror 403, and the reflective surface of the scanning mirror 403 is used to reflect the K first collimated beams into K second collimated beams.
  • the arrangement of the foregoing M × N light sources may be M rows of light sources and N columns of light sources, or N rows of light sources and M columns of light sources.
  • take the array light source 401 with M rows and N columns of light sources as an example.
  • the interval between two adjacent columns among the N columns of light sources is the first preset distance; the interval between two adjacent rows among the M rows of light sources is the second preset distance; and the angle between any column among the N columns of light sources and any row among the M rows of light sources is a preset angle.
  • FIG. 5A is a schematic diagram of the structure of the array light source 401.
  • the light sources on the line L1 form the first row of light sources;
  • the light sources on the line L2 form the second row of light sources;
  • the light sources on the line L3 form the first column of light sources;
  • the light sources on the line L4 form the second column of light sources.
  • the interval between two adjacent columns is the interval between adjacent light sources on the same row
  • the interval between two adjacent rows is the interval between adjacent light sources on the same column.
  • the interval between L1 and L2 is the distance X1 between adjacent light sources on L3, and the interval between L3 and L4 is the distance X2 between adjacent light sources on L1.
  • the angle between the first row of light sources L1 and the first column of light sources L3 is α. If the difference in the vertical direction between two adjacent light sources in the first row of light sources L1 is X3 (X3 can also be understood as the second column of light sources being moved down by X3 relative to the first column of light sources), then α can be expressed as α = 90° − arctan(X3/X2).
  • the angle between any column of light sources in the array light source 401 and any row of light sources is a preset angle, and the preset angle may be an acute angle, that is, less than 90°. In that case, if each column of light sources is arranged in the vertical direction, the light sources in each row are not on the same horizontal line, as shown in FIG. 5A.
  • the array light source 401 is 8*8 light sources, X1 is 40 μm (micrometers), X2 is 40 μm (micrometers), and X3 is 5 μm (micrometers).
  • the position of each column of light sources is moved down by 5 μm compared to the position of the previous column of light sources.
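  • this geometry can be sketched numerically (our own illustration, using the 8*8 array with X1 = X2 = 40 μm and X3 = 5 μm from the example above):

```python
# Coordinates of an 8x8 tilted array light source: columns are 40 um apart,
# rows are 40 um apart, and each column is shifted down by 5 um.
import math

X1, X2, X3 = 40.0, 40.0, 5.0  # micrometers

def source_positions(m=8, n=8):
    """(x, y) of each light source; column c is shifted down by c * X3."""
    return [(c * X2, -(r * X1 + c * X3)) for r in range(m) for c in range(n)]

print(source_positions()[:3])  # [(0.0, -0.0), (40.0, -5.0), (80.0, -10.0)]

alpha = 90.0 - math.degrees(math.atan2(X3, X2))  # angle between row and column
print(round(alpha, 1))  # ~82.9 degrees, an acute preset angle
```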
  • the light source in the array light source 401 may be a laser.
  • the arrangement of the light sources in the array light source 401 may also be as shown in FIG. 5B.
  • taking the array light source 401 with 8*8 light sources as an example, as shown in FIG. 5B(a), the angle between any one of the N columns of light sources and any one of the M rows of light sources may be a preset angle of 90°.
  • the light source in the next row of the array light source 401 moves to the right by a preset distance compared to the position of the light source in the previous row.
  • the angle between the light sources is a preset angle between 0° and 90°.
  • as shown in FIG. 5B(c), in the first column of light sources indicated by L in the array light source 401, the light sources in a column may not all be on the same vertical line.
  • any light source in the array light source may emit a light beam, or any row or column of the array light source may emit a light beam.
  • part of the light source in the array light source emits light beams, for example, the array light source includes 8*8 light sources, and 4*4 light sources in the array light source can be controlled to emit light beams.
  • the light source in the array light source may be composed of at least one of an edge emitting semiconductor laser, a vertical-cavity surface-emitting laser (VCSEL), a fiber laser, or a solid-state laser.
  • the light beam emitted by the light source in the array light source can be a light beam of any wavelength. If the light emitting device is used as the emitting end of the 3D camera, the wavelength of the light beam emitted by the array light source needs to be determined, so that the receiving end of the 3D camera can determine that the received light beam is the light beam emitted by the emitting end.
  • the function of the collimating lens 402 is to convert the light beam emitted by the light source into the first collimated light beam.
  • the collimating lens 402 in the embodiment of the present application may be a single lens, and the single lens refers to a lens formed by a piece of optical glass.
  • the collimating lens 402 may be a convex lens.
  • the collimating lens 402 may also be formed of multiple optical elements, for example, the collimating lens is formed of multiple lenses.
  • the collimating lens 402 can convert the light beam emitted by the light source into a collimated light beam.
  • the distance between the plane where the array light source 401 is located and the center point of the collimating lens 402 is the focal length of the collimating lens 402, that is, the plane where the array light source 401 is located includes the focal point of the collimating lens 402. Therefore, the light beam emitted from the array light source 401 becomes a collimated light beam after passing through the collimating lens 402.
  • since the position of each light source in the array light source 401 is different, the position where the light beam emitted by each light source enters the collimating lens 402 is also different.
  • accordingly, the exit angle of each first collimated beam is also different.
  • the collimating lens 402 is a single lens, that is, the collimating lens 402 is a convex lens.
  • the focal length of the convex lens may be 3.4 mm (millimeters).
  • FIG. 6 is a schematic diagram of the structure of the collimating lens 402.
  • the multiple light beams respectively represent light beams emitted by light sources at different positions on the array light source 401.
  • the light beam emitted by the light source passes through the collimating lens 402 and becomes the first collimated light beam.
  • the first collimated light beam passes through the focal point of the collimating lens 402 on the second side.
  • the light beam 1 is converted by the collimator lens 402 and propagates along the exit direction of the collimator lens 402.
  • the exit angle of the light beam emitted by the light source after passing through the collimating lens 402 can be expressed as θi = arctan(Li / f), where θi represents the exit angle of the light beam emitted by the light source, Li represents the distance between the light beam and the optical axis of the collimating lens 402, and f represents the focal length of the collimating lens 402.
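  • a quick numeric check of this relation (our own sketch; the 3.4 mm focal length comes from the single convex lens example above, and the source offsets follow the 40 μm pitch):

```python
# Exit angle of a collimated beam for a source offset L from the optical
# axis of a collimating lens of focal length f (thin-lens approximation).
import math

f_mm = 3.4  # focal length of the convex lens in the example

def exit_angle_deg(L_mm: float) -> float:
    return math.degrees(math.atan2(L_mm, f_mm))

# Offsets of a few sources in the 8*8 array (40 um pitch = 0.04 mm steps).
for L in (0.0, 0.04, 0.28):
    print(f"L = {L} mm -> theta = {exit_angle_deg(L):.2f} deg")
```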
  • the collimator lens 402 converts the light beam emitted by the array light source 401 into a first collimated light beam, and the first collimated light beam propagates along the exit direction.
  • the array light source 401 is located on the first side of the collimating lens 402.
  • the light beam emitted by the array light source 401 is incident on the collimating lens 402 from the first side.
  • after the collimating lens 402 converts the light beam into a first collimated light beam, the first collimated light beam propagates on the second side of the collimating lens 402 along the exit direction.
  • the scanning rotating mirror 403 is located on the second side of the collimating lens 402, and the center point of the reflecting surface of the scanning rotating mirror 403 is on the optical axis of the collimating lens 402.
  • after the first collimated light beam propagates to the scanning rotating mirror 403, it is reflected by the reflective surface of the scanning rotating mirror 403 as a second collimated light beam.
  • the scanning mirror 403 is used to change the propagation direction of the first collimated light beam.
  • a controller is provided on the scanning mirror 403, and the controller is used to receive a control signal and transmit the control signal to the scanning mirror 403.
  • the control signal is used to instruct the scanning mirror 403 to adjust the angle of the reflecting surface of the scanning mirror 403.
  • the scanning mirror 403 is used to receive the control signal and adjust the angle of the reflecting surface according to the control signal, so as to adjust the propagation direction of the second collimated light beam.
  • the scanning mirror 403 may be a micro-electro-mechanical system (MEMS) mirror, or the scanning mirror may be a digital micromirror device (DMD).
  • MEMS mirrors are driven electrostatically or electromagnetically.
  • the aforementioned control signal may be an electrostatic signal, and an electrostatic drive is generated by the electrostatic signal to control the angle of the MEMS mirror.
  • the above-mentioned control signal is a current signal, and an electromagnetic drive is generated by the change of the current to control the angle of the MEMS mirror. Because the MEMS mirror is small in size, light in weight, and short in response time, the light emission efficiency of the light emitting device can be improved.
  • the DMD includes a plurality of micro mirrors. That is, each DMD is composed of multiple micro mirrors. DMD is driven by digital signals. Specifically, the aforementioned control signal may be a digital signal, and the angle of the DMD is controlled by the digital signal to achieve the purpose of adjusting the propagation direction of the second collimated light beam.
  • the scanning mirror can also be a lens driven by a motor. Moreover, the scanning mirror can realize one-dimensional rotation (that is, rotation in one direction), or two-dimensional rotation (that is, rotation in two directions). When the scanning mirror rotates in one dimension, one array can be projected at a time, and scanning can be completed through multiple projections to achieve a predetermined resolution.
  • the shape of the scanning mirror can be circular, rectangular, polygonal, or the like.
  • the optical beam splitter 404 is used to split a beam into i beams, where i ≥ 2, and i is a positive integer.
  • the optical beam splitter 404 includes an entrance surface and an exit surface. One light beam enters the optical beam splitter 404 from the entrance surface, and i beams are emitted from the exit surface.
  • the optical beam splitter can be a one-dimensional grating, a two-dimensional diffractive optical element or a prism film.
  • the one-dimensional grating can split a beam into i beams, and the specific number of split beams is related to the grating structure of the one-dimensional grating.
  • the beam splitting effect of the one-dimensional grating can refer to the beam splitting diagram of the diffractive optical element shown in Figure 2(a).
  • the two-dimensional diffractive optical element can split a beam into a beam matrix including i beams.
  • the beam splitting effect of the two-dimensional diffractive optical element can refer to the beam splitting diagram of the diffractive optical element shown in Figure 2 (b).
  • the exit surface of the optical beam splitter 404 is a prism film structure.
  • the prismatic film structure refers to a film with a prism-shaped surface.
  • the incident surface 70 is a flat surface
  • the exit surface 71 includes 8 prisms. Each prism makes a different angle with the horizontal plane, so the propagation direction of the light beam emitted from each prism is also different.
  • the angle between prism 1 and the horizontal plane is ⁇ 1, the angle between prism 2 and the horizontal plane is - ⁇ 1; the angle between prism 3 and the horizontal plane is ⁇ 2, and the angle between prism 4 and the horizontal plane is - ⁇ 2; the angle between the prism 5 and the horizontal plane is ⁇ 3, the angle between the prism 6 and the horizontal plane is - ⁇ 3; the angle between the prism 7 and the horizontal plane is ⁇ 4, and the angle between the prism 8 and the horizontal plane is - ⁇ 4.
  • the optical beam splitter 404 can split one incident beam into eight outgoing beams.
  • I1 represents an incident beam
  • O1 represents an output beam 1
  • O2 represents an output beam 2
  • O3 represents an output beam 3
  • O8 represents an output beam 8
  • the arrows in the figure represent the propagation directions of the light beams; the propagation direction of each outgoing beam is different.
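  • the facet-to-direction mapping can be sketched with thin-prism optics (our own illustration: the facet angles ±θ1 to ±θ4 and the refractive index are assumed values, not given in the patent):

```python
# Thin-prism sketch of the 8-facet prism film: a facet tilted by angle A
# deviates the outgoing beam by roughly (n - 1) * A (small-angle formula).
N_INDEX = 1.49  # e.g. PMMA, one of the film materials mentioned below

facet_angles_deg = [5.0, -5.0, 10.0, -10.0, 15.0, -15.0, 20.0, -20.0]

def deviation_deg(facet_angle_deg: float, n: float = N_INDEX) -> float:
    return (n - 1.0) * facet_angle_deg

# One incident beam I1 leaves as eight beams O1..O8 in distinct directions.
for k, a in enumerate(facet_angles_deg, start=1):
    print(f"O{k}: facet {a:+.1f} deg -> deviation {deviation_deg(a):+.2f} deg")
```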
  • the 8*8 first collimated light beams propagate to the reflecting surface of the scanning rotating mirror 403, and the reflecting surface of the scanning rotating mirror 403 reflects the 8*8 first collimated light beams into 8*8 second collimated light beams.
  • the 8*8 second collimated light beams travel along the reflection direction to the optical beam splitter 404, and the optical beam splitter 404 can split the 8*8 second collimated light beams into 8*8*8 (512) light beams.
  • the exit angles of the 512 beams are shown in FIG. 8.
  • the beam emitted by each light source in each column is split into 8 beams.
  • the figure shows 64*8 beams, where each point indicates the exit angle of one beam in the horizontal and vertical directions.
  • when the optical beam splitter is a prism film, the optical beam splitter may be made of optical materials such as glass, polymethyl methacrylate (PMMA), polycarbonate (PC), or polyimide film (PI). Specifically, it can be processed by techniques such as etching, embossing, and micro-replication.
  • the optical beam splitter 404 may be an optical beam splitter 404 with a one-dimensional lattice beam splitting function, or may be an optical beam splitter 404 with a two-dimensional lattice beam splitting function.
  • the array light source 401 includes M × N light sources. When the array light source 401 is in the working state, all the M × N light sources can be controlled to emit light, or some of the M × N light sources can be controlled to emit light. For example, the first column of light sources in the array light source 401 can be controlled to emit light, the first row of light sources in the array light source 401 can be controlled to emit light, or a certain light source in the array light source 401 can be controlled to emit light.
  • the array light source 401 in the light emitting device can emit K light beams, where 2 ≤ K ≤ M × N, and K is a positive integer.
  • the K beams propagate to the collimating lens 402.
  • the collimating lens 402 converts the K beams into K first collimated beams.
  • the K first collimated beams propagate to the reflective surface of the scanning mirror, where they are reflected into K second collimated beams. The K second collimated beams propagate to the optical beam splitter 404, and the optical beam splitter 404 splits the K second collimated beams into i × K third collimated beams.
  • the i × K third collimated light beams propagate to the target object along their propagation directions, and are reflected or refracted by the target object.
  • the emitting end will project the emitted light beam on the target object
  • the light beam projected on the target object is reflected by the target object
  • the receiving device is used to receive the light beam reflected by the target object, and to generate an image of the target object from the received light beam.
  • the light beam is projected on a point of the target object, and according to the light reflected by the target object received by the receiving device, one light beam corresponds to a pixel point on the generated image of the target object.
  • an embodiment of the present application also provides a receiving device.
  • the receiving device includes: a receiving lens 901 and an image sensor 902.
  • the optical axis of the receiving lens 901 is perpendicular to the plane where the image sensor 902 is located.
  • the receiving lens 901 is used to receive the light beam reflected by the target object and refract the light beam reflected by the target object into a refracted beam; wherein the refracted beam irradiates the image sensor 902 so that the target object is imaged on the image sensor 902.
  • the receiving device is used to receive the light reflected by the target object. If the light in the environment irradiates the target object and is reflected by the target object, it can also enter the receiving device, so that the target object is imaged on the image sensor 902. Among them, the light in the environment may affect the image of the target object obtained by the receiving device.
  • the receiving device may further include a filter; the filter is disposed between the receiving lens 901 and the image sensor 902, and the filter is parallel to the plane where the image sensor 902 is located. Among them, the filter is used to filter out the ambient light in the refracted beam.
  • the above-mentioned light emitting device is used to project a light beam to a target object
  • the receiving device provided in an embodiment of the present application is used to receive the light beam reflected by the target object. When the above-mentioned light emitting device and receiving device work together, an image of the target object can be captured.
  • the image sensor 902 on the receiving device includes a detector, and the detector is used to collect photons reflected to itself.
  • the image sensor 902 determines the position of the pixel based on the photons reflected to the detector, and each pixel on the image sensor 902 corresponds to a detector.
  • the detector here is a detector in a functional sense, that is, the detector is used to identify whether the pixel has photons.
  • the detector can include one SPAD or multiple SPADs. If a detector includes a SPAD, the detector only recognizes the photons on the pixel. If the detector includes multiple SPADs, each SPAD detects photons on a pixel.
  • the detector may also include a silicon avalanche photodiode (SiAPD) or an avalanche photodiode (APD).
  • one detector includes one SPAD. Since one detector includes one SPAD, the number of detectors in the image sensor 902 in the receiving device corresponds to the pixels of the generated image. For example, if the number of detectors in the receiving device is 640*480, the resolution of the image generated by the receiving device is also 640*480.
  • the optical beam splitter 404 can split one light beam into two light beams. For example, if the array light source 401 emits 3 × 8 light beams, the light emitting device can emit 3 × 16 light beams.
  • a detector includes one SPAD, and the photons collected by each detector correspond to one pixel on the generated image. FIG. 10 is a schematic diagram of the detector after three acquisitions. Each square in FIG. 10 (only part of the detectors is shown) is equivalent to one pixel on the image.
  • when the scanning mirror 403 is at the first angle, at time T0, the detector collects for the first time the positions of the pixels corresponding to the photons reflected by the target object (the pixels collected the first time are in the first, second, and third columns in FIG. 10). When the scanning mirror 403 rotates by a preset angle, at time T1, the detector collects for the second time the positions of the pixels corresponding to the photons reflected by the target object (the pixels collected the second time are in the second, third, and fourth columns in FIG. 10). When the scanning mirror 403 rotates by the preset angle again, at time T2, the detector collects for the third time the positions of the pixels corresponding to the photons reflected by the target object (the pixels collected the third time are in the third, fourth, and fifth columns in FIG. 10).
  • one detector includes multiple SPADs.
  • the pixels of the image sensor 902 are E × F, and E and F are both positive integers.
  • the image sensor 902 includes j detectors, where j is greater than or equal to 2, j is less than E × F, and j is a positive integer.
  • FIG. 11 is a schematic structural diagram of a detector provided by an embodiment of this application. As shown in FIG. 11, a detector includes 8*8 SPADs (not all SPADs are shown in the figure).
  • SPAD can only recognize the presence of photons but not the number of photons.
  • SPAD is triggered when it recognizes a photon, and it takes a preset time to return to the initial state.
  • when a detector includes multiple SPADs, if one SPAD is triggered, the detector is in the triggered state. Therefore, in the embodiment of the present application, the state of the detector can be controlled in a time-sharing manner.
  • Example 1: take the array light source 401 in the light emitting device as an 8*8 array light source 401, the optical beam splitter 404 splitting 1 beam into 8 beams, and each detector in the receiving device including 8*8 SPADs as shown in FIG. 11. Specifically, the array light source 401 emits 8*8 light beams, and the beam from each light source is split into 8 light beams; take 1 beam divided into 8 beams in the horizontal direction as an example. Since each detector includes 8*8 SPADs, if a VGA image is to be acquired, the image sensor 902 may include 60*80 detectors. FIG. 12 is a schematic diagram of imaging on the detectors of the receiving device; the small black squares indicate the collected pixels (not all pixels are shown in the figure).
  • the detectors can collect 512 pixels at the same time. Since the beams of each column are distributed over different detectors, when the angle of the scanning mirror 403 in the light emitting device changes, the positions of the small black squares in FIG. 12 also change. In this case, each detector can collect 8*8 pixels, and no two pixels are collected on the same detector at the same time. If a receiving device with this structure is used to acquire a VGA image, the image sensor 902 may include 60*80 detectors.
  • the light emitting device needs to scan 696 columns (56 columns of incomplete columns + 640 columns of complete columns).
  • the VGA image formed by the receiving device is shown schematically, in which there are 640 columns and 480 rows of pixels. Because the image in the first 7 columns of pixels is incomplete, it is discarded; in actual use, the pixels formed in the first 7 columns can be ignored (the detectors in the last 7 columns are not shown in the figure).
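  • a quick arithmetic check of the detector count in Example 1 (our own sketch):

```python
# Each detector covers an 8x8 patch of SPADs and each SPAD maps to one
# image pixel, so a VGA image needs 60x80 detectors.
spads_per_detector = (8, 8)
vga = (480, 640)  # rows, columns

detectors = (vga[0] // spads_per_detector[0], vga[1] // spads_per_detector[1])
print(detectors)                      # (60, 80)
print(60 * 80 * 8 * 8 == 480 * 640)  # True: SPAD count equals pixel count
```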
  • Example 2: take the array light source 401 in the light emitting device as an 8*8 array light source 401, the optical beam splitter 404 splitting a beam into a 3*8 two-dimensional array of beams, and each detector in the receiving device including 8*8 SPADs as an example.
  • the number of detectors in the image sensor 902 is less than the number of pixels, and an image with a corresponding resolution can be formed on the image sensor 902, that is, the number of detectors in the image sensor 902 can be reduced, thereby reducing the cost.
  • the array light source 401 includes 8*8 light sources, if each light source is in a working state, the light emitting device can generate 512 light beams each time.
  • taking individual control of each light source in the array light source 401 as an example, the beam from one light source can be split into 8 light beams, and each light source in the array light source 401 works independently so that the 8*8 light sources are lit up in sequence. That is, after each light source completes its scanning task, the next light source is turned on. In this way, setting 8*10 detectors on the image sensor can meet the demand for generating VGA images.
  • the number of detectors is smaller than the number of pixels.
  • the spot formed by the light beam reflected by the target object on the imaging surface of the image sensor 902 will become smaller.
  • in the embodiment of the present application, the image sensor 902 can be placed away from the focal plane of the receiving lens 901, so that a larger spot can be obtained on the image sensor 902.
  • specifically, the value of the distance between the image sensor 902 and the receiving lens 901 is greater than the focal length of the receiving lens 901 and smaller than twice the focal length of the receiving lens 901.
  • as shown in FIG. 15A, the distance between the image sensor 902 and the receiving lens 901 is greater than the focal length of the receiving lens 901, so that the light spot formed on the image sensor 902 becomes larger.
  • FIG. 15B is a schematic diagram of the shape of the light spot on the image sensor 902; the light spot is enlarged in order to form an image of the target object.
  • as shown in FIG. 15A, since the detectors are closely arranged on the image sensor 902, when the theoretical imaging position of a detection point is at the edge of a detector, the enlarged light spot will be received by multiple detectors, causing crosstalk.
  • a dynamically allocated SPAD array can be set on the image sensor 902.
  • the dynamic allocation of the SPAD array means that the position of the detector can move with the position of the light spot.
  • Figure 16 (a) shows the position of the light spot and the position of the detector in state 1.
  • the black circle represents the light spot
  • the rectangular frame outside the black circle represents the position of the detector.
  • Figure 16 (b) shows the position of the detector after the spot position has moved.
  • an interface can be set on the detector, and a controller (such as an MCU) can control the position of the detector through the interface.
  • the controller can send a series of SPAD configuration commands to the detector through the interface, and the detector completes the movement of the SPAD array according to the configuration commands.
  • the SPADs and the detectors are not in one-to-one correspondence; for example, each detector contains 16 SPADs.
  • at the first spot moment, the first 16 SPADs (rows 1 to 4) on the image sensor 902 form a detector.
  • at the second spot moment, the SPADs in rows 2 to 5 on the image sensor 902 form a new detector.
  • the embodiment of the present application also provides a method for emitting a light beam, and the method can be applied to the light emitting device mentioned in the above embodiment. Alternatively, the method can also be applied to the above-mentioned electronic equipment. As shown in FIG. 17, it is a schematic diagram of the implementation process of the method of emitting a light beam.
  • the method may include step 601-step 602.
  • the light emitting device may include an array light source, a collimating lens, a scanning rotating mirror, an optical beam splitter, and a processor.
  • the processor in the light emitting device can execute the above method of emitting light beams.
  • Step 601: The processor controls the array light source to emit K light beams, K≥1, K being a positive integer; the K light beams are converted into K first collimated light beams by the collimating lens.
  • the array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between that plane and the center point of the collimating lens is the focal length of the collimating lens. The K light beams emitted by the array light source can therefore propagate to the collimating lens.
  • the scanning rotating mirror is located on the second side of the collimating lens, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror. The K first collimated beams produced by the collimating lens can propagate to the scanning mirror.
  • Step 602: The processor controls the scanning mirror to reflect the K first collimated beams into K second collimated beams; the K second collimated beams are split by the optical beam splitter into i×K third collimated beams, which are then emitted, where i≥2 and i is a positive integer.
  • An embodiment of the present application also provides a 3D camera.
  • the 3D camera includes the above-mentioned light emitting device 1701 and the above-mentioned receiving device 1702.
  • the 3D camera can also link together various circuits such as peripherals, a processor, and power management circuits, which are all known in the art; therefore, no further description is given herein.
  • FIG. 19 is a schematic structural diagram of an electronic device 500 provided by an embodiment of the present application.
  • the electronic device 500 may include a processor 510, an external memory interface 520, an internal memory 521, a universal serial bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, antenna 1, antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a sensor module 580, a camera 593, a display screen 594, a subscriber identification module (SIM) card interface 595, and the like.
  • the sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, and a touch sensor 580K.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 500.
  • the electronic device 500 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 510 may include one or more processing units.
  • the processor 510 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 500.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • a memory may also be provided in the processor 510 to store instructions and data.
  • the memory in the processor 510 is a cache memory.
  • the memory can store instructions or data that the processor 510 has just used or uses cyclically. If the processor 510 needs the instruction or data again, it can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 510, and improves the efficiency of the system.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is merely a schematic description, and does not constitute a structural limitation of the electronic device 500.
  • the electronic device 500 may also adopt an interface connection mode different from those in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the charging management module 540 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the power management module 541 is used to connect the battery 542, the charging management module 540 and the processor 510.
  • the power management module 541 receives input from the battery 542 and/or the charge management module 540, and supplies power to the processor 510, the internal memory 521, the external memory, the display screen 594, the camera 593, and the wireless communication module 560.
  • the power management module 541 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the wireless communication function of the electronic device 500 can be implemented by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 500 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 550 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 500.
  • the mobile communication module 550 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the wireless communication module 560 can provide wireless communication solutions applied to the electronic device 500, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 560 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 560 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 510.
  • the wireless communication module 560 can also receive the signal to be sent from the processor 510, perform frequency modulation, amplify it, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the electronic device 500 implements a display function through a GPU, a display screen 594, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 594 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 510 may include one or more GPUs that execute program instructions to generate or change display information.
  • the electronic device 500 can implement a shooting function through an ISP, a camera 593, a video codec, a GPU, a display screen 594, and an application processor.
  • the camera 593 may include a light emitting device 1701 and a receiving device 1702.
  • the light emitting device 1701 is used to emit a light beam; the beam is projected onto the target object and reflected by it, and the receiving device 1702 is used to receive the beam reflected by the target object and generate an image of the target object from the collected beams.
  • the ISP is used to process the data fed back by the camera 593. For example, when taking a photo, the shutter opens and light is transmitted through the lens to the camera's photosensitive element; the optical signal is converted into an electrical signal, which the photosensitive element passes to the ISP for processing, converting it into an image visible to the naked eye.
  • the ISP can also run algorithm optimizations on image noise, brightness, and skin tone, and can optimize parameters such as exposure and color temperature of the shooting scene.
  • the ISP may be provided in the camera 593.
  • the camera 593 is used to capture still images or videos.
  • an optical image of the object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transfers the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 500 may include 1 or N cameras 593, and N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 500 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 500 may support one or more video codecs. In this way, the electronic device 500 can play or record videos in multiple encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • the external memory interface 520 may be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 500.
  • the external memory card communicates with the processor 510 through the external memory interface 520 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 521 may be used to store computer executable program code, and the executable program code includes instructions.
  • the processor 510 executes various functional applications and data processing of the electronic device 500 by running instructions stored in the internal memory 521.
  • the internal memory 521 may include a program storage area and a data storage area, where the program storage area can store an operating system and an application required by at least one function (for example, a sound playback function or an image playback function).
  • the electronic device 500 may implement an audio function through the audio module 570.
  • the pressure sensor 580A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 580A may be provided on the display screen 594.
  • the gyro sensor 580B may be used to determine the movement posture of the electronic device 500.
  • in some embodiments, the angular velocities of the electronic device 500 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 580B.
  • the gyroscope sensor 580B can be used for shooting anti-shake.
  • the air pressure sensor 580C is used to measure air pressure.
  • the electronic device 500 calculates the altitude based on the air pressure value measured by the air pressure sensor 580C to assist positioning and navigation.
  • the magnetic sensor 580D includes a Hall sensor.
  • the electronic device 500 can use the magnetic sensor 580D to detect the opening and closing of the flip holster.
  • the acceleration sensor 580E can detect the magnitude of the acceleration of the electronic device 500 in various directions (generally three axes).
  • the distance sensor 580F is used to measure distance. The electronic device 500 can measure distance by infrared or laser.
  • the proximity light sensor 580G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 500 emits infrared light to the outside through the light emitting diode.
  • the electronic device 500 uses a photodiode to detect infrared reflected light from nearby objects.
  • the ambient light sensor 580L is used to sense the brightness of the ambient light.
  • the fingerprint sensor 580H is used to collect fingerprints.
  • the electronic device 500 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 580J is used to detect temperature.
  • the electronic device 500 uses the temperature detected by the temperature sensor 580J to execute a temperature processing strategy.
  • the touch sensor 580K is also called a "touch panel".
  • the touch sensor 580K can be arranged on the display screen 594, and the touch screen is composed of the touch sensor 580K and the display screen 594, which is also called a “touch screen”.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Semiconductor Lasers (AREA)
  • Mechanical Light Control Or Optical Switches (AREA)
  • Mechanical Optical Scanning Systems (AREA)

Abstract

A light emitting device and an electronic device, relating to the technical field of optical electronic components. The light emitting device includes an array light source (401), a collimating lens (402), a scanning rotating mirror (403), and an optical beam splitter (404). The array light source (401) may include M rows and N columns of light sources, where M and N are both positive integers, and the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle. The array light source (401) is located on the first side of the collimating lens (402); the plane where the array light source (401) is located is perpendicular to the optical axis of the collimating lens (402), and the distance between that plane and the center point of the collimating lens (402) is the focal length of the collimating lens (402). The scanning rotating mirror (403) is located on the second side of the collimating lens (402), and the center point of the reflecting surface of the scanning rotating mirror (403) is on the optical axis of the collimating lens (402). This can lower the rotation-angle requirement on the scanning rotating mirror (403) in the light emitting device and increase the utilization of the light beams emitted by the light emitting device.

Description

Light emitting device and electronic device
This application claims priority to the Chinese patent application No. 202010246351.6, filed with the China National Intellectual Property Administration on March 31, 2020 and entitled "Light Emitting Device and Electronic Device", and to the Chinese patent application No. 202011511815.8, filed with the China National Intellectual Property Administration on December 18, 2020 and entitled "Light Emitting Device and Electronic Device", both of which are incorporated herein by reference in their entireties.
Technical Field
Embodiments of this application relate to the technical field of optical electronic components, and in particular, to a light emitting device and an electronic device.
Background
A three-dimensional (3D) camera (also called a depth camera) photographs a target object and can obtain not only a two-dimensional image of the target object (such as a face) but also depth information of the target object. The depth information of the target object includes the distance between each feature on the target object and the camera, and can represent the three-dimensional characteristics of the target object. Based on this, a 3D camera can use the captured two-dimensional image and depth information of the target object to implement functions such as face recognition and three-dimensional map reconstruction.
A 3D camera includes a transmitting end and a receiving end, and may obtain the depth information of the target object in the following way: the transmitting end is used to emit light, the emitted light is projected onto the target object and reflected by the target object, and the receiving end can receive the light reflected by the target object. The 3D camera can calculate the depth information of the target object from the time difference between the moment at which the transmitting end emits light and the moment at which the receiving end receives the light reflected by the target object.
At present, the transmitting end of a 3D camera emits light by point scanning. For example, FIG. 1 is a schematic structural diagram of a transmitting end based on the point-scanning technique. As shown in FIG. 1, the transmitting end includes a light source 110 and a scanning mirror 120, and the scanning mirror 120 includes a reflecting mirror 121. The light source can emit beam 1, which strikes the reflecting mirror 121 on the scanning mirror 120 and is reflected by the reflecting mirror 121 as beam 2; beam 2 can illuminate the target object. In the transmitting end shown in FIG. 1, the propagation direction of beam 1 emitted by the light source 110 is fixed. The reflecting mirror 121 can rotate in the horizontal direction and in the vertical direction to adjust the angle between the reflecting mirror 121 and beam 1 (that is, the incident angle of beam 1 on the reflecting mirror 121). When the incident angle of beam 1 on the reflecting mirror 121 changes, the exit angle of beam 2 changes accordingly. In other words, the transmitting end can control the change of the exit angle of beam 2 by adjusting the reflecting mirror 121, so as to emit light (that is, the reflected beam 2) by point scanning.
A transmitting end that uses point scanning must adjust the reflecting mirror 121 many times in the horizontal and vertical directions before it can emit light at different angles and the receiving end can collect an image of the target object. For example, when a 3D camera is used to scan a video graphics array (VGA) image (an image with a default resolution of 640*480), at least nine million points must be collected per second, that is, the reflecting mirror 121 must be adjusted at least nine million times. Moreover, when adjusting the reflecting mirror 121, the angle of the scanning mirror must be adjusted in both the horizontal and vertical directions, which places high requirements on the rotation angle of the scanning mirror and extremely high requirements on the modulation speed of the light source 110. Therefore, point scanning places high requirements on the rotation angle of the scanning mirror and is difficult to implement. In addition, one point projected by the light source onto the target object corresponds to only one pixel that the receiving end can collect. For example, the beams of the transmitting end must strike the target object 640*480 times before the receiving end can collect a complete VGA image. In the point-scanning approach, the utilization of the light source is low.
Summary
This application provides a light emitting device and an electronic device, which can lower the rotation-angle requirement on the scanning rotating mirror in the light emitting device and increase the utilization of the light beams emitted by the light emitting device.
To achieve the foregoing technical objectives, this application adopts the following technical solutions.
According to a first aspect, this application provides a light emitting device, which may include an array light source, a collimating lens, a scanning rotating mirror, and an optical beam splitter. The array light source may include M×N light sources arranged in M rows and N columns, where M and N are both positive integers. Among the N columns of light sources, the spacing between two adjacent columns of light sources is a first preset distance; among the M rows of light sources, the spacing between two adjacent rows of light sources is a second preset distance; and the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle.
The array light source can emit K light beams, K≥1, K being a positive integer. The array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens. The collimating lens can convert the K light beams emitted by the array light source into K first collimated light beams.
The scanning rotating mirror is located on the second side of the collimating lens; the scanning rotating mirror is used to implement one-dimensional rotation, and the center point of the reflecting surface of the scanning rotating mirror is on the optical axis of the collimating lens. The reflecting surface is used to reflect the K first collimated light beams into K second collimated light beams. During one-dimensional rotation, the scanning rotating mirror can project one array at a time; through multiple projections the scan can be completed, achieving the predetermined resolution.
The optical beam splitter is used to receive the K second collimated light beams and split them into i×K third collimated light beams, where i≥2 and i is a positive integer.
In one respect, the light source in the light emitting device is an array light source including M×N light sources; the array light source may be composed of M rows and N columns of light sources, or of N rows and M columns of light sources, and it can emit K light beams, K≥1. Compared with the point-scanning approach, a two-dimensional beam array can be formed while the scanning rotating mirror rotates in only one direction, which lowers the requirement on the rotation angle of the scanning rotating mirror.
In another respect, the optical beam splitter in the light emitting device can split 1 light beam into i light beams, so that the light emitting device projects more beams. Each beam projected onto the target object corresponds to one point on the target object; that is, after 1 beam in the light emitting device is split into i beams, 1 beam corresponds to i points on the target object. Compared with the point-scanning approach, the light emitting device in this application improves the utilization of the light source. In addition, one light source emits 1 beam, and this 1 beam can be split into i beams; assuming each beam corresponds to one pixel, each light source can correspond to i pixels, so the light sources do not need to be arranged densely, which reduces the manufacturing difficulty and mounting difficulty of the light source.
In summary, the array light source in the light emitting device provided by this application has low manufacturing and mounting difficulty, is easy to implement, and improves the light emitting device's utilization of the light source.
In a possible implementation, the optical beam splitter may include at least one of a one-dimensional grating, a two-dimensional diffractive optical element, and a prism film. That is, the optical beam splitter may be a one-dimensional grating, a two-dimensional diffractive optical element, or a prism film, or may be composed of two of these elements; for example, the optical beam splitter may be a combination of a one-dimensional grating and a prism film.
In another possible implementation, the preset angle between any column of the N columns of light sources and any row of the M rows of light sources is an acute angle.
In another possible implementation, when the optical beam splitter is a prism film, the incident surface of the optical beam splitter is a plane and the exit surface of the optical beam splitter is a prism-film structure. The prism-film structure includes i beam-splitting surfaces, which are used to split 1 beam into i beams whose propagation directions are different.
It can be understood that the prism-film structure is an optical film whose surface has micro-prism structures; that is, the prism-film structure is an optical film whose surface consists of miniature prisms. The miniature prisms on the surface of the prism-film structure can split 1 beam into i beams, so that the optical beam splitter can divide 1 beam into i beams.
In another possible implementation, when the optical beam splitter is a two-dimensional diffractive optical element, the two-dimensional diffractive optical element is used to split one beam into a beam matrix including i beams.
In another possible implementation, when the optical beam splitter is a one-dimensional grating, the one-dimensional grating element can split one beam into i beams, and when the i beams are projected onto the same plane, the line connecting their projection points is a straight line.
In another possible implementation, the light sources in the array light source may be composed of edge-emitting semiconductor lasers, vertical-cavity surface-emitting lasers (VCSELs), fiber lasers, or solid-state lasers.
The light beams emitted by the light sources in the array light source may be of any wavelength. The array light source may emit a beam from any one light source, or the beams of any one column or any one row of light sources.
In another possible implementation, the light emitting device further includes a controller connected to the scanning rotating mirror. The controller is used to receive a control signal and transmit the control signal to the scanning rotating mirror; the control signal is used to instruct the scanning rotating mirror to adjust the angle of the reflecting surface on the scanning rotating mirror. The scanning rotating mirror is used to receive the control signal and adjust the angle of the reflecting surface according to the control signal, so as to adjust the propagation direction of the K second collimated light beams.
In another possible implementation, the scanning rotating mirror includes a micro-electro-mechanical system (MEMS) mirror or a digital micromirror device (DMD).
According to a second aspect, this application further provides an electronic device, including the light emitting device in the first aspect or any of its possible implementations, where a beam emitted by the light emitting device illuminates a target object and is reflected by the target object, and a receiving device used to receive the beam reflected by the target object.
In a possible implementation, the receiving device includes a receiving lens and an image sensor, where the optical axis of the receiving lens is perpendicular to the plane where the image sensor is located. The receiving lens is used to receive the beam reflected by the target object and refract it into a refracted beam; the refracted beam illuminates the image sensor, so that the target object is imaged on the image sensor.
In another possible implementation, the distance between the image sensor and the receiving lens is less than twice the focal length of the receiving lens.
In another possible implementation, the image sensor has E×F pixels, E and F both being positive integers, and the image sensor includes j detectors, where j is less than E×F and j is a positive integer.
Here, the number of detectors in the image sensor is less than the number of pixels, yet an image of the corresponding resolution can be formed on the image sensor; in other words, the number of detectors in the image sensor can be reduced, which lowers cost.
In another possible implementation, the detector includes at least one single-photon detector.
In another possible implementation, the receiving device further includes an optical filter; the optical filter is arranged between the receiving lens and the image sensor, and the optical filter is parallel to the plane where the image sensor is located. The optical filter is used to filter ambient light out of the refracted beam.
According to a third aspect, this application further provides a method for emitting light beams, which can be applied to the light emitting device in the first aspect or any of its possible implementations. The light emitting device includes an array light source, a collimating lens, a scanning rotating mirror, an optical beam splitter, and a processor. The array light source includes M×N light sources arranged in M rows and N columns, M and N both being positive integers; among the N columns of light sources, the spacing between two adjacent columns of light sources is a first preset distance; among the M rows of light sources, the spacing between two adjacent rows of light sources is a second preset distance; and the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle. The method may include: the processor controls the array light source to emit K light beams, K≥1, K being a positive integer. The array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens; the K light beams emitted by the array light source can therefore propagate to the collimating lens and are converted by the collimating lens into K first collimated light beams. The scanning rotating mirror is located on the second side of the collimating lens, the scanning rotating mirror is used to implement one-dimensional rotation, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror; the K collimated beams produced by the collimating lens can propagate to the scanning rotating mirror. The processor controls the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams; the K second collimated light beams are split by the optical beam splitter into i×K third collimated light beams, and the i×K third collimated light beams are emitted, where i≥2 and i is a positive integer.
In a possible implementation, the preset angle is an acute angle.
According to a fourth aspect, an embodiment of this application further provides a computer-readable storage medium including computer instructions. When the computer instructions run on the electronic device in the first aspect or any of its possible implementations, the electronic device can implement the method for emitting light beams in the third aspect.
It can be understood that, for the beneficial effects achievable by the electronic device of the second aspect, the method for emitting light beams of the third aspect, and the computer-readable storage medium of the fourth aspect, reference may be made to the beneficial effects of the first aspect and any of its possible implementations; details are not repeated here.
Brief Description of Drawings
FIG. 1 is a schematic structural diagram of a point-scanning transmitting end provided by this application;
FIG. 2 is a schematic structural diagram of a diffractive optical element according to an embodiment of this application;
FIG. 3 is a schematic structural diagram of a line-scanning transmitting end according to an embodiment of this application;
FIG. 4 is a schematic structural diagram of a light emitting device according to an embodiment of this application;
FIG. 5A is a schematic structural diagram of an array light source according to an embodiment of this application;
FIG. 5B is a schematic structural diagram of another array light source according to an embodiment of this application;
FIG. 6 is a schematic structural diagram of a collimating lens according to an embodiment of this application;
FIG. 7 is a schematic structural diagram of an optical beam splitter according to an embodiment of this application;
FIG. 8 is a schematic diagram of light exit angles of a light emitting device according to an embodiment of this application;
FIG. 9 is a schematic structural diagram of a receiving device according to an embodiment of this application;
FIG. 10 is a schematic diagram of scan collection by a receiving device according to an embodiment of this application;
FIG. 11 is a schematic structural diagram of a detector according to an embodiment of this application;
FIG. 12 is a schematic diagram of scan results collected by a detector according to an embodiment of this application;
FIG. 13 is a schematic diagram of scan results collected by another detector according to an embodiment of this application;
FIG. 14 is a schematic diagram of scan results collected by another detector according to an embodiment of this application;
FIG. 15A is a schematic structural diagram of another receiving device according to an embodiment of this application;
FIG. 15B is a schematic diagram of beam collection on an image sensor according to an embodiment of this application;
FIG. 16 is a schematic diagram of a detector receiving a light spot in a receiving device according to an embodiment of this application;
FIG. 17 is a flowchart of a method for emitting light beams according to an embodiment of this application;
FIG. 18 is a schematic structural diagram of a 3D camera according to an embodiment of this application;
FIG. 19 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Description of Embodiments
In the following, the terms "first" and "second" are used for description only and should not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature limited by "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of this embodiment, unless otherwise stated, "multiple" means two or more.
Terms that may appear in the embodiments of this application are explained below.
Diffractive optical element (DOE): also called a binary optical device. A diffractive optical element has a specifically designed surface structure that causes a beam propagating to it to be diffracted.
Specifically, diffractive optical elements with different surface structures have different functions. For the DOE used in this application, the DOE may have a one-dimensional dot-matrix beam-splitting function, and the DOE may also have a two-dimensional dot-matrix beam-splitting function. A DOE with the one-dimensional dot-matrix beam-splitting function can split 1 beam into multiple beams along one direction; for example, as shown in (a) of FIG. 2, beam 1 is split by the DOE into 4 beams in the vertical direction. A DOE with the two-dimensional dot-matrix beam-splitting function can split 1 beam into a matrix of beams; for example, as shown in (b) of FIG. 2, beam 1 is split by the DOE into an 8*8 beam array.
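For readers who want to see the one-dimensional splitting rule concretely, the sketch below enumerates the propagating diffraction orders of a 1D grating using the standard grating equation; the wavelength and grating period are illustrative assumptions, not values from this application.

```python
import math

def grating_orders(theta_in_deg, wavelength_nm, period_nm, orders):
    """Exit angles (degrees) of the propagating orders of a 1D grating.

    Standard grating equation: sin(theta_m) = sin(theta_in) + m * lambda / d.
    Orders whose sine falls outside [-1, 1] are evanescent and skipped.
    """
    angles = {}
    for m in orders:
        s = math.sin(math.radians(theta_in_deg)) + m * wavelength_nm / period_nm
        if -1.0 <= s <= 1.0:
            angles[m] = round(math.degrees(math.asin(s)), 2)
    return angles

# Illustrative values: normal incidence, 940 nm source, 4 um grating period.
print(grating_orders(0.0, 940.0, 4000.0, range(-4, 5)))
# -> 9 propagating orders, i.e., one beam split into 9 beams along one direction
```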
Collimation: generally speaking, light diverges; that is, two rays that start out adjacent move farther and farther apart as they propagate. Collimation means keeping the rays within a beam parallel to one another.
Single-photon avalanche diode (SPAD): once a photon propagates to the surface of a SPAD, the single-photon detector is triggered; it is used to detect optical signals. After being triggered, a SPAD can return to its initial (untriggered) state after a certain time. A SPAD can only detect whether photons strike its surface; it cannot detect how many photons strike its surface.
In a 3D camera, if the transmitting end emits light by point scanning, the requirements on the rotation angle of the scanning rotating mirror in the transmitting end and on the modulation speed of the light source are very high, making implementation difficult. Alternatively, the transmitting end of a 3D camera may emit light by line scanning, in which the transmitting end emits multiple beams along one direction and the scanning rotating mirror of the transmitting end rotates in one direction, so that the receiving end can collect an image of the target object. For example, in line scanning the transmitting end may emit multiple beams in the vertical direction while the scanning rotating mirror rotates in the horizontal direction, so that the receiving end can collect an image of the target object. Therefore, emitting light by line scanning can lower the requirement on the rotation angle of the scanning rotating mirror.
For example, FIG. 3 is a schematic structural diagram of a line-scanning transmitting end. As shown in FIG. 3, the transmitting end includes a linear light source 210, a lens 220, and a scanning rotating mirror 230, where the linear light source 210 is formed by multiple light sources arranged along one direction. In the line-scanning mode shown in FIG. 3, the linear light source 210 emits beams, which are projected outward through the lens 220 and the scanning rotating mirror 230. Taking scanning a VGA image with the line-scanning technique as an example, the linear light source 210 emits 480 beams in the vertical direction, the scanning rotating mirror rotates in the horizontal direction, and adjusting the angle of the scanning rotating mirror 640 times lets the receiving end collect a complete VGA image.
It can be understood that line scanning can lower the requirement on the rotation angle of the scanning rotating mirror in the transmitting end and reduce the implementation difficulty of the transmitting end. In the above line-scanning mode, 1 beam emitted by the linear light source lets the receiving end collect one point on the target object, and one point on the target object corresponds to one pixel in the image. To meet the requirement on scanning precision, adjacent light sources on the linear light source are densely arranged. This arrangement increases the manufacturing difficulty and mounting difficulty of the linear light source, making it hard to implement.
In the above point-scanning mode, 1 beam emitted by the light source is projected onto one point of the target object; the receiving end collects the beam reflected from that point and forms one pixel in the image of the target object. That is, one beam emitted by the light source ultimately corresponds to one pixel in the image. The line-scanning mode does not solve the problem of low light-source utilization.
An embodiment of this application provides a light emitting device whose light source is an array light source; that is, the light sources are arranged in multiple rows and multiple columns. The array light source does not require adjacent light sources to be densely arranged, which reduces the manufacturing difficulty and mounting difficulty of the light source. Moreover, the light emitting device includes an optical beam splitter, which can split 1 beam into i beams, where i is a positive integer and i≥2, so that 1 beam emitted by a light source can be projected onto i points of the target object, improving the utilization of the light source.
The light emitting device provided by the embodiments of this application can be used in a 3D camera, and can also be used in apparatuses that use 3D sensing technology, such as 3D scanners.
If the light emitting device in the embodiments of this application serves as the transmitting end of a 3D camera, the 3D camera may be arranged on an electronic device. The electronic device in the embodiments of this application may be a mobile phone, a tablet computer, a desktop, laptop, or handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, or the like; the specific form of the electronic device is not specially limited in the embodiments of this application.
Refer to FIG. 4, which is a schematic structural diagram of the light emitting device provided by an embodiment of this application. The light emitting device includes: an array light source 401, a collimating lens 402, a scanning rotating mirror 403, and an optical beam splitter 404.
As shown in FIG. 4, the array light source 401 is located on the first side S1 of the collimating lens 402; the plane where the array light source 401 is located is perpendicular to the optical axis of the collimating lens 402, and the distance between the plane where the array light source 401 is located and the center point of the collimating lens 402 is the focal length of the collimating lens 402. The scanning rotating mirror 403 is located on the second side S2 of the collimating lens 402, and the center point of the reflecting surface of the scanning rotating mirror 403 is on the optical axis of the collimating lens 402.
The array light source 401 includes M×N light sources. When K light sources in the array light source 401 are in the working state, the array light source 401 can emit K light beams, where K≥1 and K is a positive integer; that is, the array light source 401 can emit multiple beams, and the K beams may be emitted by K light sources among the M×N light sources. The plane where the array light source 401 is located is perpendicular to the optical axis of the collimating lens 402; in other words, the array light source 401 faces the collimating lens 402, so the K beams emitted by the array light source 401 propagate along the optical axis to the collimating lens 402. The collimating lens 402 has the function of collimating beams, so the collimating lens 402 can convert the K beams into K collimated beams (that is, the first collimated beams). The center point of the reflecting surface of the scanning rotating mirror 403 is on the optical axis of the collimating lens 402, so the K first collimated beams can propagate to the scanning rotating mirror 403, and the reflecting surface of the scanning rotating mirror 403 is used to reflect the K first collimated beams into K second collimated beams.
For example, the M×N light sources may be arranged as M rows and N columns of light sources, or as N rows and M columns of light sources. Take the array light source 401 as M rows and N columns of light sources as an example: among the N columns of light sources, the spacing between two adjacent columns is a first preset distance; among the M rows of light sources, the spacing between two adjacent rows is a second preset distance; and the angle between any column of the N columns and any row of the M rows is a preset angle.
For example, assume M and N are both 8; FIG. 5A is a schematic structural diagram of the array light source 401. In FIG. 5A, take the light sources on line L1 as the first row of light sources, those on line L2 as the second row, those on line L3 as the first column, and those on line L4 as the second column. The spacing between two adjacent columns is the spacing between adjacent light sources in the same row, and the spacing between two adjacent rows is the spacing between adjacent light sources in the same column. In FIG. 5A, the spacing between L1 and L2 is the distance X1 between adjacent light sources on L3, and the spacing between L3 and L4 is the distance X2 between adjacent light sources on L1. The angle between the first row of light sources L1 and the first column of light sources L3 is θ. If the offset between two adjacent light sources in the first row L1 is X3, where X3 can also be understood as the distance by which the second column of light sources is shifted down relative to the first column, then θ can be expressed as:
θ = arctan(X2 / X3)
In this embodiment of the application, the angle between any column of light sources and any row of light sources in the array light source 401 is a preset angle, and the preset angle may be an acute angle, that is, less than 90°. If every column of light sources is arranged along the vertical direction, the light sources in each row are not on the same horizontal line; as in FIG. 5A, the light sources in each row are not placed on one horizontal line.
In some implementations, the array light source 401 has 8*8 light sources, X1 is 40 μm (micrometers), X2 is 40 μm, and X3 is 5 μm. As shown in FIG. 5A, each column of light sources is shifted down by 5 μm relative to the previous column. Specifically, the light sources in the array light source 401 may be lasers.
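Because the formula above renders only as an image placeholder in this text, the arctangent form is reconstructed from the geometry just described; evaluating it with the example dimensions gives:

$$\theta = \arctan\left(\frac{X_2}{X_3}\right) = \arctan\left(\frac{40\ \mu\mathrm{m}}{5\ \mu\mathrm{m}}\right) \approx 82.9^\circ$$

which is indeed an acute preset angle, consistent with the implementations described here.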
As another example, the light sources in the array light source 401 may also be arranged as shown in FIG. 5B. Taking an 8*8 array light source 401 as an example: as shown in (a) of FIG. 5B, the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle of 90°. As shown in (b) of FIG. 5B, each subsequent row of light sources in the array light source 401 is shifted to the right by a preset distance relative to the previous row; in this case, the angle between any column of the N columns and any row of the M rows is a preset angle between 0° and 90°. As shown in (c) of FIG. 5B, L denotes the first column of light sources in the array light source 401; the light sources within one column need not lie on the same vertical line.
It can be understood that when the array light source emits light beams, any one light source in the array light source may emit a beam, or any one row or any one column of light sources may emit beams; alternatively, part of the light sources in the array light source may emit beams. For example, if the array light source includes 8*8 light sources, 4*4 light sources in the array light source can be controlled to emit beams.
The light sources in the array light source may be composed of at least one of edge-emitting semiconductor lasers, vertical-cavity surface-emitting lasers (VCSELs), fiber lasers, or solid-state lasers. Moreover, the light beams emitted by the light sources in the array light source may be beams of any wavelength. If the light emitting device is used as the transmitting end of a 3D camera, the wavelength of the beams emitted by the array light source needs to be determined, so that the receiving end of the 3D camera can determine that a received beam is one emitted by the transmitting end.
In this embodiment of the application, the function of the collimating lens 402 is to convert the light beams emitted by the light sources into first collimated light beams.
In some implementations, the collimating lens 402 in this embodiment may be a singlet; a singlet is a lens formed from a single piece of optical glass. That is, the collimating lens 402 may be one convex lens. In other implementations, the collimating lens 402 may also be formed from multiple optical elements, for example composed of multiple lens pieces. The collimating lens 402 can convert the beams emitted by the light sources into collimated beams.
The distance between the plane where the array light source 401 is located and the center point of the collimating lens 402 is the focal length of the collimating lens 402; in other words, the plane where the array light source 401 is located contains the focal point of the collimating lens 402. Therefore, a beam emitted from the array light source 401 becomes a collimated beam after passing through the collimating lens 402. However, because each light source in the array light source 401 occupies a different position, the beam emitted by each light source is incident on the collimating lens 402 at a different position, and after the collimating lens 402 converts the beams into first collimated beams, the exit angles of the first collimated beams also differ.
For example, assume the collimating lens 402 is a singlet, that is, the collimating lens 402 is one convex lens, whose focal length may be 3.4 mm (millimeters).
FIG. 6 is a schematic structural diagram of the collimating lens 402. The multiple beams respectively represent beams emitted by light sources at different positions on the array light source 401; a beam emitted by a light source becomes a first collimated beam after passing through the collimating lens 402, and the first collimated beam passes through the focal point of the collimating lens 402 on the second side. In FIG. 6, beam 1 is converted by the collimating lens 402 and propagates along the exit direction of the collimating lens 402. The exit angle of a beam emitted by a light source after passing through the collimating lens 402 is expressed as:
αi = arctan(Li / f)
where αi denotes the exit angle of the beam emitted by the light source, Li denotes the distance between that beam and the optical axis of the collimating lens 402, and f denotes the focal length of the collimating lens 402.
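The arctangent form above is likewise reconstructed from the quantities defined around it, since the original formula renders only as an image placeholder. As a numeric illustration (a sketch using the 40 μm pitch and 3.4 mm focal length given as example values above):

```python
import math

def exit_angle_deg(offset_um, focal_mm=3.4):
    """Exit angle alpha = arctan(L / f) for a source offset L from the optical axis."""
    return math.degrees(math.atan(offset_um * 1e-3 / focal_mm))

# Exit angles for one column of 8 sources on a 40 um pitch, centered on the axis.
for k in range(8):
    offset = (k - 3.5) * 40.0
    print(f"source {k}: offset {offset:+6.1f} um -> alpha {exit_angle_deg(offset):+6.3f} deg")
```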
The collimating lens 402 converts the beams emitted by the array light source 401 into first collimated beams, which propagate along the exit direction. The array light source 401 is located on the first side of the collimating lens 402, and the beams it emits are incident on the collimating lens 402 from the first side; after the collimating lens 402 converts the beams into first collimated beams, the first collimated beams propagate along the exit direction on the second side of the collimating lens 402. The scanning rotating mirror 403 is located on the second side of the collimating lens 402, and the center point of the reflecting surface of the scanning rotating mirror 403 is on the optical axis of the collimating lens 402.
After the first collimated beams propagate to the scanning rotating mirror 403, they are reflected by the reflecting surface on the scanning rotating mirror 403 into second collimated beams. That is, the scanning rotating mirror 403 is used to change the propagation direction of the first collimated beams.
In some implementations, a controller is provided on the scanning rotating mirror 403. The controller is used to receive a control signal and transmit the control signal to the scanning rotating mirror 403; the control signal is used to instruct the scanning rotating mirror 403 to adjust the angle of the reflecting surface on the scanning rotating mirror 403. The scanning rotating mirror 403 is used to receive the control signal and adjust the angle of the reflecting surface according to the control signal, so as to adjust the propagation direction of the second collimated beams.
For example, the scanning rotating mirror 403 may be a micro-electro-mechanical system (MEMS) mirror, or the scanning rotating mirror may be a digital micromirror device (DMD).
For example, a MEMS mirror is driven electrostatically or electromagnetically. Specifically, the above control signal may be an electrostatic signal, with electrostatic drive generated by the electrostatic signal to control the angle of the MEMS mirror; or the above control signal is a current signal, with electromagnetic drive generated by the change of current to control the angle of the MEMS mirror. Because a MEMS mirror is small in volume, light in weight, and short in response time, it can improve the light-emission efficiency of the light emitting device.
As another example, a DMD includes multiple micromirrors; that is, each DMD is composed of multiple micromirrors. A DMD is driven by digital signals. Specifically, the above control signal may be a digital signal, and the angle of the DMD is controlled by the digital signal to achieve the purpose of adjusting the propagation direction of the second collimated beams.
The scanning rotating mirror may also be a motor-driven mirror. Moreover, the scanning rotating mirror can implement one-dimensional rotation (rotation in one direction) or two-dimensional rotation (rotation in two directions). During one-dimensional rotation, the scanning rotating mirror can project one array at a time; through multiple projections the scan can be completed, achieving the predetermined resolution. In addition, the scanning rotating mirror may be circular, rectangular, polygonal, or another shape.
In the above light emitting device, the optical beam splitter 404 is used to split 1 beam into i beams, where i≥2 and i is a positive integer. The optical beam splitter 404 includes an incident surface and an exit surface; 1 beam enters the optical beam splitter 404 through the incident surface, and i beams exit through the exit surface.
The optical beam splitter may be a one-dimensional grating, a two-dimensional diffractive optical element, or a prism film. A one-dimensional grating can split one beam into i beams; the specific number of split beams is related to the line count of the one-dimensional grating. For the splitting effect of a one-dimensional grating, refer to the splitting of the diffractive optical element shown in (a) of FIG. 2. A two-dimensional diffractive optical element can split one beam into a beam matrix including i beams; for its splitting effect, refer to (b) of FIG. 2.
In some implementations, when the optical beam splitter is a prism film, the exit surface of the optical beam splitter 404 is a prism-film structure, that is, a film whose surface has prism shapes. Taking triangular prisms as an example, as shown in (a) of FIG. 7, the incident surface 70 is a plane and the exit surface 71 includes 8 prisms; each prism makes a different angle with the horizontal plane, so the beams exiting from the individual prisms propagate in different directions. As shown in (a) of FIG. 7, prism 1 makes an angle β1 with the horizontal plane and prism 2 makes −β1; prism 3 makes β2 and prism 4 makes −β2; prism 5 makes β3 and prism 6 makes −β3; prism 7 makes β4 and prism 8 makes −β4. As shown in (b) of FIG. 7, when 1 beam propagates to the optical beam splitter 404, the optical beam splitter 404 can split the 1 incident beam into 8 exit beams. In (b) of FIG. 7, I1 denotes the 1 incident beam, O1 denotes exit beam 1, O2 denotes exit beam 2, O3 denotes exit beam 3, ..., and O8 denotes exit beam 8; the arrows in the figure indicate the propagation directions of the light, and each exit beam propagates in a different direction.
For example, among the 8 prisms on the exit surface 71 shown in (a) of FIG. 7, assume β1 is 5.4°, β2 is 15.5°, β3 is 24.2°, and β4 is 31.5°. It can be understood that if the array light source 401 has 8*8 light sources and it is assumed that every light source emits a beam, the beams emitted by the array light source 401 propagate to the collimating lens 402, and the collimating lens 402 converts the 8*8 beams into 8*8 first collimated beams. The 8*8 first collimated beams propagate to the reflecting surface of the scanning rotating mirror 403, which reflects them into 8*8 second collimated beams. The 8*8 second collimated beams propagate along the reflection direction to the optical beam splitter 404, and the optical beam splitter 404 can split the 8*8 second collimated beams into 8*8*8 (512) beams. The exit angles of the 512 beams are shown in FIG. 8: the beam emitted by each light source in a column is split into 8 beams, giving 64*8 beams as shown, and the angle corresponding to each point in the vertical direction represents a beam's exit angle in the vertical direction.
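The count of 8*8*8 = 512 beams can be reproduced with a short sketch. For simplicity it treats the prism film as adding each facet's deflection ±β1...±β4 directly to a beam's vertical exit angle, which is a small-angle approximation rather than an exact refraction model; the pitch and focal length are the example values above.

```python
import math

F_MM = 3.4                              # collimating lens focal length (example above)
PITCH_UM = 40.0                         # source pitch (example above)
BETAS = [5.4, 15.5, 24.2, 31.5]         # prism angles beta1..beta4 from FIG. 7
DEFLECT = [s * b for b in BETAS for s in (1, -1)]   # +/-beta1 ... +/-beta4

def source_angle(row):
    """Vertical exit angle of the collimated beam from one source in a column."""
    return math.degrees(math.atan((row - 3.5) * PITCH_UM * 1e-3 / F_MM))

# Each of the 8 rows in a column yields 8 beams; with 8 columns, 8*8*8 = 512 beams.
fan = [source_angle(r) + d for r in range(8) for d in DEFLECT]
print(len(fan) * 8, "beams in total")               # 512
print(f"vertical exit angles span {min(fan):+.1f} to {max(fan):+.1f} deg")
```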
When the optical beam splitter is a prism film, the optical beam splitter may be made of optical plastics such as glass, polymethyl methacrylate (PMMA), polycarbonate (PC), or polyimide film (PI), and may be processed by techniques such as etching, imprinting, and micro-replication.
In other implementations, the optical beam splitter 404 may be an optical beam splitter 404 with a one-dimensional dot-matrix beam-splitting function, or an optical beam splitter 404 with a two-dimensional dot-matrix beam-splitting function.
It should be noted that the array light source 401 includes M×N light sources. When the array light source 401 is in the working state, all M×N light sources can be controlled to emit light, or some of the M×N light sources can be controlled to emit light. For example, the first column of light sources in the array light source 401 is controlled to emit light, the first row of light sources in the array light source 401 is controlled to emit light, or a certain light source in the array light source 401 is controlled to emit light.
In this embodiment of the application, the array light source 401 in the light emitting device can emit K light beams, where 2≤K≤M×N and K is a positive integer. The K beams propagate to the collimating lens 402, which converts the K beams into K first collimated beams; the K first collimated beams propagate to the reflecting surface of the scanning mirror, which reflects them into K second collimated beams; the K second collimated beams propagate to the optical beam splitter 404, which splits them into i×K third collimated beams. The i×K third collimated beams propagate along the propagation direction to the target object and are reflected or refracted by the target object.
Generally, when a 3D camera photographs a target object, the transmitting end projects the emitted beams onto the target object, the projected beams are reflected by the target object, and the receiving device is used to receive the light reflected by the target object and generate an image of the target object from the received beams. A beam is projected onto one point of the target object, and based on the reflected light the receiving device forms one pixel in the generated image of the target object per beam.
Refer to FIG. 9; an embodiment of this application further provides a receiving device. As shown in FIG. 9, the receiving device includes: a receiving lens 901 and an image sensor 902, where the optical axis of the receiving lens 901 is perpendicular to the plane where the image sensor 902 is located. The receiving lens 901 is used to receive the beam reflected by the target object and refract the beam reflected by the target object into a refracted beam; the refracted beam illuminates the image sensor 902, so that the target object is imaged on the image sensor 902.
The receiving device is used to receive the light reflected by the target object. If ambient light strikes the target object and is reflected by it, that light can also enter the receiving device and contribute to the image on the image sensor 902; such ambient light may affect the image of the target object obtained by the receiving device.
In some implementations, the receiving device may further include an optical filter; the optical filter is arranged between the receiving lens 901 and the image sensor 902 and is parallel to the plane where the image sensor 902 is located. The optical filter is used to filter ambient light out of the refracted beam.
Specifically, the above light emitting device is used to project beams onto the target object, and the receiving device provided in this embodiment of the application is used to receive the beams reflected by the target object; the light emitting device and the receiving device can work together to capture an image of the target object.
The image sensor 902 of the receiving device includes detectors, which are used to collect the photons reflected onto them. The image sensor 902 determines a pixel's position from the photons reflected onto a detector, and each pixel on the image sensor 902 corresponds to one detector. It should be noted that "detector" here is functional: a detector is used to identify whether a pixel has received photons. A detector may include one SPAD or multiple SPADs. If a detector includes one SPAD, it identifies photons only at that one pixel; if it includes multiple SPADs, each SPAD detects the photons of one pixel. In addition, a detector may also include a silicon avalanche photodiode (Si-APD) or an avalanche photodiode (APD).
In a first implementation, one detector includes one SPAD. In that case, the number of detectors in the image sensor 902 of the receiving device corresponds to the pixels of the generated image; for example, if the receiving device has 640*480 detectors, the resolution of the image generated by the receiving device is also 640*480.
For example, assume the array light source 401 in the light emitting device is a 3×8 array light source 401, and the optical beam splitter 404 can split 1 beam into 2 beams, so the light emitting device can emit 3×16 beams. Specifically, with one SPAD per detector, the photons collected by each detector correspond to one pixel of the generated image. FIG. 10 is a schematic diagram after three detector collections; each square in FIG. 10 (only some detectors are shown) corresponds to one pixel of the image. When the scanning rotating mirror 403 is at the first angle, at time T0, the detectors collect for the first time the photons reflected by the target object at the corresponding pixel positions (the pixels collected first in columns 1, 2, and 3 of FIG. 10); after the scanning rotating mirror 403 rotates by a preset angle, at time T1, the detectors collect for the second time (the pixels collected second in columns 2, 3, and 4 of FIG. 10); after the scanning rotating mirror 403 rotates by the preset angle again, at time T2, the detectors collect for the third time (the pixels collected third in columns 3, 4, and 5 of FIG. 10).
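The column bookkeeping in the FIG. 10 example (3 illuminated columns shifting by one pixel column per mirror step) can be written down directly; this is only an index sketch, with hypothetical names:

```python
def pixel_columns(step, source_columns=3):
    """Image columns illuminated at scan step `step`, when each mirror step
    shifts the 3-column pattern right by one pixel column (as in FIG. 10)."""
    return [step + c for c in range(source_columns)]

for step, t in enumerate(("T0", "T1", "T2")):
    print(t, "-> pixel columns", pixel_columns(step))
# T0 -> [0, 1, 2], T1 -> [1, 2, 3], T2 -> [2, 3, 4]
```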
In a second implementation, one detector includes multiple SPADs. For example, the image sensor 902 has E×F pixels, where E and F are both positive integers, and the image sensor 902 includes j detectors, where j is greater than or equal to 2, j is less than E×F, and j is a positive integer.
Refer to FIG. 11, which is a schematic structural diagram of a detector according to an embodiment of this application. As shown in FIG. 11, one detector includes 8*8 SPADs (not every SPAD is shown in the figure).
It should be noted that a SPAD can only identify that photons are present; it cannot identify how many. A SPAD is triggered when it detects a photon and needs a preset time to return to its initial state. When a detector includes multiple SPADs, the detector is in the triggered state as soon as any one of its SPADs is triggered. Therefore, in this embodiment of the application, the detectors can be operated in a time-shared manner to control their states.
Example 1: Take the array light source 401 in the light emitting device as an 8*8 array light source 401, the optical beam splitter 404 as splitting 1 beam into 8 beams, and each detector in the receiving device as including 8*8 SPADs as shown in FIG. 11. Specifically, the array light source 401 emits 8*8 beams, and the beam from each light source is split into 8 beams; take 1 beam split into 8 beams in the horizontal direction as an example. Because each detector includes 8*8 SPADs, to acquire a VGA image the image sensor 902 may include 60*80 detectors. FIG. 12 is a schematic diagram of imaging on the detectors of the receiving device; the small black squares are the collected pixels (not all pixels are shown in the figure).
Specifically, if the 8*8 light sources of the array light source 401 are in the working state at the same time, the detectors can collect 512 pixels simultaneously. Because the beams of each column are distributed over different detectors, when the angle of the scanning rotating mirror 403 in the light emitting device changes, the positions of the small black squares in FIG. 12 change as well. In this case, each detector can collect 8*8 pixels, and no detector collects two pixels at the same time. If a receiving device with this structure is used to acquire a VGA image, the image sensor 902 may include 60*80 detectors.
For example, because the images collected by the first 7 columns of detectors are incomplete, in practice the light emitting device needs to scan 696 columns (56 incomplete columns + 640 complete columns). FIG. 13 schematically shows the VGA image formed by the receiving device, with 640 columns of pixels in the vertical direction and 480 rows in the horizontal direction. Because the image in the first 7 columns of pixels is incomplete, it is discarded; in actual use, the pixels formed in the first 7 columns can be ignored (the detectors of the last 7 columns are not shown in the figure).
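A worked check of the sizing in Example 1 (the arithmetic is implied rather than stated in the original text): each detector of 8*8 SPADs covers an 8×8 block of pixels, and the 8-way split makes the first 7×8 scanned columns incomplete, so

$$\frac{480}{8} \times \frac{640}{8} = 60 \times 80 \ \text{detectors}, \qquad 640 + 7 \times 8 = 696 \ \text{scanned columns}.$$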
Example 2: Take the array light source 401 in the light emitting device as an 8*8 array light source 401, the optical beam splitter 404 as one that can split one beam into a 3*8 two-dimensional array of beams, and one detector in the receiving device as including 8*8 SPADs.
Here, the array light source 401 can emit 64*8*3 = 512*3 dot-matrix beams at the same time, and the array light source 401 can be divided into 3 parts, each part including 8 columns and each column including 64 dots. The 3 parts can be distributed evenly over the detectors of the receiving device, each part occupying 1/3 of the pixels of the whole image. As shown in FIG. 14 (only some of the detectors are shown), taking the acquisition of a VGA image as an example, the light emitting device completes a scan of 696 columns; because the detectors are divided into 3 parts, each part performs a scan of 696/3 = 232 detector columns. Thus, having the scanning rotating mirror 403 in the light emitting device sweep each dot matrix across 232 detector columns lets the receiving device obtain a VGA image. This can greatly reduce the number of detectors.
Here, the number of detectors in the image sensor 902 is less than the number of pixels, yet an image of the corresponding resolution can be formed on the image sensor 902; in other words, the number of detectors of the image sensor 902 can be reduced, thereby lowering cost.
In a possible implementation, because the array light source 401 includes 8*8 light sources, if every light source is in the working state, the light emitting device can generate 512 beams at a time. Taking individual control of each light source in the array light source 401 as an example, the beam from one light source can be split into 8 beams; each light source in the array light source 401 works independently, and the 8*8 light sources are lit in sequence, that is, after one light source completes its scanning task, the next light source is turned on. With this operation, providing 8*10 detectors on the image sensor can meet the demand for generating a VGA image.
It should be noted that in the above implementations the number of detectors is less than the number of pixels, and the spot formed on the imaging surface of the image sensor 902 by the beam reflected from the target object would become small. The distance between the image sensor 902 and the receiving lens 901 can therefore be made less than the focal length of the receiving lens 901, so that a larger spot is obtained on the image sensor 902; alternatively, the distance between the image sensor 902 and the receiving lens 901 is greater than the focal length of the receiving lens 901 and less than twice the focal length of the receiving lens 901. As shown in FIG. 15A, the distance between the image sensor 902 and the receiving lens 901 is greater than the focal length of the receiving lens 901, so that the spot formed on the image sensor 902 becomes larger. FIG. 15B is a schematic diagram of the shape of the spot on the image sensor 902, where the spot is enlarged so that an image of the target object can be formed.
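The effect of moving the image sensor 902 away from the sharp image plane can be estimated with a thin-lens sketch; the focal length, aperture, and object distance below are illustrative assumptions, not values from this application.

```python
def blur_spot_mm(f_mm, aperture_mm, object_mm, sensor_mm):
    """Diameter of the defocus blur spot for a thin lens.

    The sharp image of a point at object_mm forms at v with 1/f = 1/u + 1/v;
    placing the sensor elsewhere spreads the point over a circle whose size
    scales with the relative defocus |sensor - v| / v.
    """
    v = 1.0 / (1.0 / f_mm - 1.0 / object_mm)   # image distance, thin-lens equation
    return aperture_mm * abs(sensor_mm - v) / v

# Illustrative values: f = 4 mm, 2 mm aperture, target 500 mm away.
for sensor in (4.0, 4.2, 4.5):
    print(f"sensor at {sensor:.1f} mm -> spot ~{blur_spot_mm(4.0, 2.0, 500.0, sensor):.3f} mm")
```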
In the structure shown in FIG. 15A, because the detectors are closely arranged on the image sensor 902, when the theoretical imaging position of a detection point lies at the edge of a detector, the enlarged spot is received by multiple detectors, causing crosstalk. Specifically, a dynamically allocated SPAD array can be provided on the image sensor 902; dynamic allocation of the SPAD array means that a detector's position can move as the position of the spot moves. In FIG. 16, (a) shows the position of the spot and the position of the detector in state 1, with the black circle representing the spot and the rectangular frame around the black circle representing the detector's position; (b) of FIG. 16 shows the detector's position after the spot has moved.
An interface can be provided on the detector, and a controller (such as an MCU) can control the detector's position through the interface. For example, the controller can send a series of SPAD configuration commands to the detector through the interface, and the detector completes the movement of the SPAD array according to the configuration commands.
It should be noted that the SPADs and the detectors are not in one-to-one correspondence. For example, if each detector contains 16 SPADs, then at the first spot moment the first 16 SPADs on the image sensor 902 (rows 1 to 4) form one detector, and at the second spot moment the SPADs in rows 2 to 5 on the image sensor 902 form a new detector.
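The row regrouping described here can be sketched as simple bookkeeping. The window size, command tuples, and function names below are hypothetical; only the idea that rows 1 to 4 form one detector at the first spot moment and rows 2 to 5 form a new detector at the next follows the text.

```python
def detector_rows(spot_row, window=4):
    """SPAD rows grouped into the detector that tracks a spot centered near spot_row."""
    first = max(1, spot_row - 1)              # keep a 4-row window around the spot
    return list(range(first, first + window))

def config_commands(spot_row):
    """Configuration 'commands' an MCU might send over the detector interface."""
    return [("MAP_ROW", row, "DET0") for row in detector_rows(spot_row)]

print(config_commands(2))   # first spot moment: rows 1..4 form one detector
print(config_commands(3))   # spot moved:        rows 2..5 form a new detector
```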
An embodiment of this application further provides a method for emitting light beams, which can be applied to the light emitting device mentioned in the above embodiments, or to the above electronic device. FIG. 17 is a schematic flowchart of the method for emitting light beams. The method may include steps 601 and 602.
The light emitting device may include an array light source, a collimating lens, a scanning rotating mirror, an optical beam splitter, and a processor; the processor in the light emitting device can execute the method for emitting light beams.
Step 601: The processor controls the array light source to emit K light beams, K≥1, K being a positive integer; the K light beams are converted into K first collimated light beams by the collimating lens.
The array light source is located on the first side of the collimating lens, the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between that plane and the center point of the collimating lens is the focal length of the collimating lens, so the K light beams emitted by the array light source can propagate to the collimating lens. The scanning rotating mirror is located on the second side of the collimating lens, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror, so the K first collimated beams produced by the collimating lens can propagate to the scanning rotating mirror.
Step 602: The processor controls the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams; the K second collimated light beams are split by the optical beam splitter into i×K third collimated light beams, which are then emitted.
It can be understood that when the scanning rotating mirror rotates, the propagation direction of the K second collimated beams changes as well; even so, the K second collimated beams can still propagate to the optical beam splitter.
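A control-flow sketch of steps 601 and 602 is shown below; all class and method names are hypothetical stand-ins for whatever drivers the processor actually uses, and the splitter itself is passive, so it needs no call.

```python
class ArraySource:
    def enable(self, sources): print(f"lit {len(list(sources))} sources")
    def disable(self): print("sources off")

class ScanMirror:
    def set_angle(self, deg): print(f"mirror at {deg:.2f} deg")

def emit_scan(source, mirror, k_sources, angles_deg, i=8):
    """Step 601: emit K beams (collimated by the lens); step 602: sweep the
    one-dimensional mirror so the splitter outputs i*K beams at each angle."""
    source.enable(k_sources)
    for angle in angles_deg:        # one mirror angle per scan column
        mirror.set_angle(angle)     # K second collimated beams redirected
    source.disable()
    return i * len(k_sources)       # third collimated beams per mirror angle

beams = emit_scan(ArraySource(), ScanMirror(), k_sources=list(range(64)),
                  angles_deg=[a * 0.1 for a in range(3)])
print(beams, "beams emitted at each mirror position")   # 8 * 64 = 512
```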
An embodiment of this application further provides a 3D camera. As shown in FIG. 18, the 3D camera includes the above light emitting device 1701 and the above receiving device 1702.
The 3D camera may also link together various circuits such as peripheral devices, a processor, and power management circuits, all of which are well known in the art; therefore, they are not further described here.
An embodiment of this application further provides an electronic device. Refer to FIG. 19, which is a schematic structural diagram of an electronic device 500 provided by an embodiment of this application. As shown in FIG. 19, the electronic device 500 may include a processor 510, an external memory interface 520, an internal memory 521, a universal serial bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, antenna 1, antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a sensor module 580, a camera 593, a display screen 594, a subscriber identification module (SIM) card interface 595, and the like. The sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the electronic device 500. In other embodiments of this application, the electronic device 500 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 510 may include one or more processing units; for example, the processor 510 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 500. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 510 to store instructions and data. In some embodiments, the memory in the processor 510 is a cache. This memory can store instructions or data that the processor 510 has just used or uses cyclically. If the processor 510 needs the instruction or data again, it can be called directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 510, and thus improves the efficiency of the system.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the present invention are merely schematic and do not constitute a structural limitation of the electronic device 500. In other embodiments of this application, the electronic device 500 may also adopt an interface connection manner different from those in the foregoing embodiment, or a combination of multiple interface connection manners.
The charging management module 540 is used to receive charging input from a charger, which may be a wireless charger or a wired charger.
The power management module 541 is used to connect the battery 542 and the charging management module 540 to the processor 510. The power management module 541 receives input from the battery 542 and/or the charging management module 540 and supplies power to the processor 510, the internal memory 521, the external memory, the display screen 594, the camera 593, the wireless communication module 560, and the like. The power management module 541 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health status (leakage, impedance).
The wireless communication function of the electronic device 500 can be implemented by antenna 1, antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 500 can be used to cover a single communication frequency band or multiple bands, and different antennas can also be multiplexed to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna can be used in combination with a tuning switch.
The mobile communication module 550 can provide wireless communication solutions including 2G/3G/4G/5G applied to the electronic device 500. The mobile communication module 550 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on.
The wireless communication module 560 can provide wireless communication solutions applied to the electronic device 500, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology. The wireless communication module 560 may be one or more devices integrating at least one communication processing module. The wireless communication module 560 receives electromagnetic waves via antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 510. The wireless communication module 560 can also receive a signal to be sent from the processor 510, frequency-modulate and amplify it, and convert it into electromagnetic waves radiated out through antenna 2.
The electronic device 500 implements the display function through the GPU, the display screen 594, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display screen 594 and the application processor; the GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 510 may include one or more GPUs that execute program instructions to generate or change display information.
The electronic device 500 can implement the shooting function through the ISP, the camera 593, the video codec, the GPU, the display screen 594, the application processor, and so on.
The camera 593 may include a light emitting device 1701 and a receiving device 1702. The light emitting device 1701 is used to emit light beams; the light is projected onto the target object and reflected by the target object, and the receiving device 1702 is used to receive the beams reflected by the target object and generate an image of the target object from the collected beams.
The ISP is used to process the data fed back by the camera 593. For example, when taking a photo, the shutter opens and light is transmitted through the lens to the camera's photosensitive element; the optical signal is converted into an electrical signal, which the photosensitive element passes to the ISP for processing, converting it into an image visible to the naked eye. The ISP can also run algorithm optimizations on the image's noise, brightness, and skin tone, and can optimize parameters such as exposure and color temperature of the shooting scene. In some embodiments, the ISP may be provided in the camera 593.
The camera 593 is used to capture still images or videos. An optical image of an object is generated through the lens and projected onto the photosensitive element, which may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal and then passes the electrical signal to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing, and the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 500 may include 1 or N cameras 593, where N is a positive integer greater than 1.
The digital signal processor is used to process digital signals; besides digital image signals, it can also process other digital signals. For example, when the electronic device 500 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy.
Video codecs are used to compress or decompress digital video. The electronic device 500 may support one or more video codecs, so that the electronic device 500 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The external memory interface 520 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 500. The external memory card communicates with the processor 510 through the external memory interface 520 to implement the data storage function, for example, saving music, video, and other files in the external memory card.
The internal memory 521 can be used to store computer-executable program code, which includes instructions. The processor 510 runs the instructions stored in the internal memory 521 to execute various functional applications and data processing of the electronic device 500. The internal memory 521 may include a program storage area and a data storage area, where the program storage area can store the operating system and an application required by at least one function (such as a sound playback function or an image playback function).
The electronic device 500 can implement the audio function through the audio module 570.
The pressure sensor 580A is used to sense pressure signals and can convert the pressure signals into electrical signals. In some embodiments, the pressure sensor 580A may be provided on the display screen 594.
The gyroscope sensor 580B can be used to determine the motion posture of the electronic device 500. In some embodiments, the angular velocities of the electronic device 500 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 580B. The gyroscope sensor 580B can be used for image stabilization during shooting.
The air pressure sensor 580C is used to measure air pressure. In some embodiments, the electronic device 500 calculates the altitude from the air pressure value measured by the air pressure sensor 580C to assist positioning and navigation.
The magnetic sensor 580D includes a Hall sensor. The electronic device 500 can use the magnetic sensor 580D to detect the opening and closing of a flip holster.
The acceleration sensor 580E can detect the magnitude of the acceleration of the electronic device 500 in various directions (generally three axes).
The distance sensor 580F is used to measure distance. The electronic device 500 can measure distance by infrared or laser.
The proximity light sensor 580G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 500 emits infrared light outward through the light-emitting diode, and uses the photodiode to detect infrared light reflected from nearby objects.
The ambient light sensor 580L is used to sense the brightness of the ambient light.
The fingerprint sensor 580H is used to collect fingerprints. The electronic device 500 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, fingerprint photographing, fingerprint call answering, and so on.
The temperature sensor 580J is used to detect temperature. In some embodiments, the electronic device 500 uses the temperature detected by the temperature sensor 580J to execute a temperature processing strategy.
The touch sensor 580K is also called a "touch panel". The touch sensor 580K can be arranged on the display screen 594; the touch sensor 580K and the display screen 594 form a touchscreen, also called a "touch screen".
The foregoing is only the specific implementation of this application, but the protection scope of this application is not limited thereto; any variation or replacement within the technical scope disclosed in this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the above claims.

Claims (16)

  1. A light emitting device, comprising: an array light source, a collimating lens, a scanning rotating mirror, and an optical beam splitter, wherein:
    the array light source comprises M×N light sources arranged in M rows and N columns, M and N both being positive integers, wherein, among the N columns of light sources, the spacing between two adjacent columns of light sources is a first preset distance; among the M rows of light sources, the spacing between two adjacent rows of light sources is a second preset distance; and the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle;
    the array light source is configured to emit K light beams, K≥1, K being a positive integer, wherein the array light source is located on a first side of the collimating lens; the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens;
    the collimating lens is configured to convert the K light beams into K first collimated light beams;
    the scanning rotating mirror is located on a second side of the collimating lens, the scanning rotating mirror is configured to implement one-dimensional rotation, the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror, and the reflecting surface is configured to reflect the K first collimated light beams into K second collimated light beams; and
    the optical beam splitter is configured to receive the K second collimated light beams and split the K second collimated light beams into i×K third collimated light beams, wherein i≥2 and i is a positive integer.
  2. The light emitting device according to claim 1, wherein
    the optical beam splitter comprises at least one of a one-dimensional grating, a two-dimensional diffractive optical element, and a prism film.
  3. The light emitting device according to claim 1 or 2, wherein the preset angle is an acute angle.
  4. The light emitting device according to claim 2, wherein the optical beam splitter is the one-dimensional grating; and
    the one-dimensional grating is configured to split one light beam into i light beams, wherein the propagation directions of the i light beams are different.
  5. The light emitting device according to claim 2, wherein the optical beam splitter is the prism film; the incident surface of the optical beam splitter is a plane, and the exit surface of the optical beam splitter is a prism-film structure; and
    the prism-film structure comprises i beam-splitting surfaces, the i beam-splitting surfaces are configured to split 1 light beam into i light beams, and the propagation directions of the i light beams are different.
  6. The light emitting device according to claim 2, wherein the optical beam splitter is a two-dimensional diffractive optical element; and
    the two-dimensional diffractive optical element is configured to split one light beam into a beam matrix comprising i light beams.
  7. The light emitting device according to any one of claims 1-6, further comprising a controller connected to the scanning rotating mirror, wherein
    the controller is configured to receive a control signal and transmit the control signal to the scanning rotating mirror, the control signal being used to instruct the scanning rotating mirror to adjust the angle of the reflecting surface; and
    the scanning rotating mirror is configured to receive the control signal and adjust the angle of the reflecting surface according to the control signal, so as to adjust the propagation directions of the K second collimated light beams.
  8. The light emitting device according to claim 7, wherein the scanning rotating mirror comprises a micro-electro-mechanical system (MEMS) mirror or a digital micromirror device (DMD).
  9. An electronic device, comprising:
    the light emitting device according to any one of claims 1-8, wherein a light beam emitted by the light emitting device illuminates a target object and is reflected by the target object; and
    a receiving device configured to receive the light beam reflected by the target object.
  10. The electronic device according to claim 9, wherein the receiving device comprises a receiving lens and an image sensor, the optical axis of the receiving lens being perpendicular to the plane where the image sensor is located;
    the receiving lens is configured to receive the light beam reflected by the target object and refract the light beam reflected by the target object into a refracted light beam; and
    the refracted light beam illuminates the image sensor, so that the target object is imaged on the image sensor.
  11. The electronic device according to claim 10, wherein the distance between the image sensor and the receiving lens is less than twice the focal length of the receiving lens.
  12. The electronic device according to claim 10 or 11, wherein the image sensor has E×F pixels, E and F both being positive integers; and
    the image sensor comprises j detectors, wherein j is less than E×F and j is a positive integer.
  13. The electronic device according to claim 12, wherein the detector comprises at least one single-photon detector.
  14. The electronic device according to any one of claims 10-13, wherein the receiving device further comprises an optical filter;
    the optical filter is arranged between the receiving lens and the image sensor, and the optical filter is parallel to the plane where the image sensor is located; and
    the optical filter is configured to filter ambient light out of the refracted light beam.
  15. A method for emitting light beams, applied to a light emitting device, the light emitting device comprising an array light source, a collimating lens, a scanning rotating mirror, an optical beam splitter, and a processor, wherein
    the array light source comprises M×N light sources arranged in M rows and N columns, M and N both being positive integers; among the N columns of light sources, the spacing between two adjacent columns of light sources is a first preset distance; among the M rows of light sources, the spacing between two adjacent rows of light sources is a second preset distance; and the angle between any column of the N columns of light sources and any row of the M rows of light sources is a preset angle;
    the method comprising:
    controlling, by the processor, the array light source to emit K light beams, K≥1, K being a positive integer, wherein the array light source is located on a first side of the collimating lens; the plane where the array light source is located is perpendicular to the optical axis of the collimating lens, and the distance between the plane where the array light source is located and the center point of the collimating lens is the focal length of the collimating lens; the K light beams are converted by the collimating lens into K first collimated light beams, wherein the scanning rotating mirror is located on a second side of the collimating lens, the scanning rotating mirror is configured to implement one-dimensional rotation, and the optical axis of the collimating lens passes through the reflecting surface of the scanning rotating mirror; and
    controlling, by the processor, the scanning rotating mirror to reflect the K first collimated light beams into K second collimated light beams, the K second collimated light beams being split by the optical beam splitter into i×K third collimated light beams, and the i×K third collimated light beams being emitted, wherein i≥2 and i is a positive integer.
  16. The method for emitting light beams according to claim 15, wherein the preset angle is an acute angle.
PCT/CN2021/079346 2020-03-31 2021-03-05 Light emitting device and electronic device WO2021196976A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21780128.1A EP4119976A4 (en) 2020-03-31 2021-03-05 ELECTROLUMINESCENT APPARATUS AND ELECTRONIC DEVICE
US17/955,261 US20230026858A1 (en) 2020-03-31 2022-09-28 Optical transmitting apparatus and electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202010246351 2020-03-31
CN202010246351.6 2020-03-31
CN202011511815.8A CN113534484A (zh) 2020-03-31 2020-12-18 一种光发射装置及电子设备
CN202011511815.8 2020-12-18

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/955,261 Continuation US20230026858A1 (en) 2020-03-31 2022-09-28 Optical transmitting apparatus and electronic device

Publications (1)

Publication Number Publication Date
WO2021196976A1 true WO2021196976A1 (zh) 2021-10-07

Family

ID=77927677

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/079346 WO2021196976A1 (zh) 2020-03-31 2021-03-05 一种光发射装置及电子设备

Country Status (3)

Country Link
US (1) US20230026858A1 (zh)
EP (1) EP4119976A4 (zh)
WO (1) WO2021196976A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2623760A (en) * 2022-10-24 2024-05-01 Newell Neil Display

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160352073A1 (en) * 2015-05-28 2016-12-01 Vixar Vcsels and vcsel arrays designed for improved performance as illumination sources and sensors
CN106972347A * 2017-05-04 2017-07-21 深圳奥比中光科技有限公司 Laser array for 3D imaging
CN107078454A * 2014-12-15 2017-08-18 极光先进雷射株式会社 Laser irradiation device
CN110716190A * 2019-09-27 2020-01-21 深圳奥锐达科技有限公司 Transmitter and distance measurement system
CN110716189A * 2019-09-27 2020-01-21 深圳奥锐达科技有限公司 Transmitter and distance measurement system
WO2020030916A1 (en) * 2018-08-07 2020-02-13 Cambridge Mechatronics Limited Improved 3d sensing

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017208052A1 * 2017-05-12 2018-11-15 Robert Bosch Gmbh Transmitter optics for a LiDAR system, optical arrangement for a LiDAR system, LiDAR system and working device
DE102018109544A1 * 2018-04-20 2019-10-24 Sick Ag Optoelectronic sensor and method for distance determination
DE102018113848A1 * 2018-06-11 2019-12-12 Sick Ag Optoelectronic sensor and method for capturing three-dimensional image data

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107078454A * 2014-12-15 2017-08-18 极光先进雷射株式会社 Laser irradiation device
US20160352073A1 * 2015-05-28 2016-12-01 Vixar Vcsels and vcsel arrays designed for improved performance as illumination sources and sensors
CN106972347A * 2017-05-04 2017-07-21 深圳奥比中光科技有限公司 Laser array for 3D imaging
WO2020030916A1 * 2018-08-07 2020-02-13 Cambridge Mechatronics Limited Improved 3d sensing
CN110716190A * 2019-09-27 2020-01-21 深圳奥锐达科技有限公司 Transmitter and distance measurement system
CN110716189A * 2019-09-27 2020-01-21 深圳奥锐达科技有限公司 Transmitter and distance measurement system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4119976A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2623760A (en) * 2022-10-24 2024-05-01 Newell Neil Display

Also Published As

Publication number Publication date
EP4119976A4 (en) 2023-08-30
US20230026858A1 (en) 2023-01-26
EP4119976A1 (en) 2023-01-18

Similar Documents

Publication Publication Date Title
US11924396B2 (en) Non-mechanical beam steering assembly
US10877281B2 (en) Compact optical system with MEMS scanners for image generation and object tracking
CN107407810B (zh) 具有柱镜状微透镜阵列的光引擎
US20150138325A1 (en) Camera integrated with light source
US20200192206A1 (en) Structured light projector, three-dimensional camera module and terminal device
US11399139B2 (en) High dynamic range camera assembly with augmented pixels
JPWO2006077718A1 (ja) レンズアレイ及びレンズアレイを備えるイメージセンサ
US10481739B2 (en) Optical steering of component wavelengths of a multi-wavelength beam to enable interactivity
CN112505713A (zh) 距离测量装置及方法、计算机可读介质和电子设备
CN113534484A (zh) 一种光发射装置及电子设备
WO2021196976A1 (zh) 一种光发射装置及电子设备
WO2022002162A1 (zh) 一种电子设备和深度图像的拍摄方法
US20240127566A1 (en) Photography apparatus and method, electronic device, and storage medium
US20220050354A1 (en) Optical modulator and electronic apparatus including the same
US10855896B1 (en) Depth determination using time-of-flight and camera assembly with augmented pixels
US10178372B2 (en) Long focal length monocular 3D imager
US20230044716A1 (en) Meta-lens, imaging optics, and electronic device including the same
KR20230020336A (ko) 메타 렌즈, 결상 광학계, 및 이를 포함하는 전자 장치
JP2004170305A (ja) 3次元形状計測方法および3次元形状計測装置
CN109005316B (zh) 影像感测模块
US20220357425A1 (en) LiDAR DEVICE AND ELECTRONIC APPARATUS INCLUDING THE SAME
WO2024041034A1 (zh) 一种显示模组、光学显示***、终端设备及成像方法
US10499036B2 (en) Image sensing module
KR20230061135A (ko) 광 필터와 이를 포함하는 이미지 센서 및 전자 장치
Voskerchyan et al. Flexible and highly scalable concept for an FMCW LiDAR PIC based on Tilted Grating Couplers

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21780128

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021780128

Country of ref document: EP

Effective date: 20221012

NENP Non-entry into the national phase

Ref country code: DE