WO2022089113A1 - Lens assembly and electronic device, depth detection method, and storage medium - Google Patents

Lens assembly and electronic device, depth detection method, and storage medium

Info

Publication number
WO2022089113A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
module
lens
beam splitting
prism
Prior art date
Application number
PCT/CN2021/120705
Other languages
English (en)
French (fr)
Inventor
陈嘉伟
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Publication of WO2022089113A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/0055Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras employing a special optical element
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4814Constructional features, e.g. arrangements of optical elements of transmitters alone
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/106Beam splitting or combining systems for splitting or combining a plurality of identical beams or images, e.g. image replication

Definitions

  • the present disclosure relates to the technical field of electronic devices, and in particular, to a lens assembly and electronic device, a depth detection method, and a storage medium.
  • the purpose of the present disclosure is to provide a lens assembly, an electronic device, a depth detection method, and a storage medium, so as to solve one or more problems caused by the defects of the related art at least to a certain extent.
  • a lens assembly includes a lens module, a beam splitting prism group, and N sensor modules, and the beam splitting prism group is disposed on a light-emitting side of the lens module.
  • the beam splitting prism group is used to divide the light into N light beams with different directions and output them; each sensor module correspondingly receives the light beam output by the beam splitting prism group, and the N sensor modules are configured to have different back focal lengths;
  • N is a positive integer greater than or equal to 2.
  • an electronic device including the above-mentioned lens assembly.
  • a depth detection method which is applied to the above-mentioned electronic device, and the depth detection method includes:
  • controlling a plurality of sensor modules to collect images of real objects in the current environment, where the plurality of sensor modules have different back focal lengths;
  • the depth information of the real object in the image collected by each sensor module is determined according to the back focal lengths of the multiple sensor modules, so as to reconstruct the three-dimensional space object information.
  • a computer-readable storage medium having a computer program stored thereon, the computer program implementing the above method when executed by a processor.
  • FIG. 1 is a schematic diagram of a first lens assembly provided by an exemplary embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of a second lens assembly provided by an exemplary embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a third lens assembly provided by an exemplary embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of a beam splitting prism provided by an exemplary embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of another beam splitting prism provided by an exemplary embodiment of the present disclosure.
  • FIG. 6 is a schematic diagram of a fourth lens assembly provided by an exemplary embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of a first electronic device provided by an exemplary embodiment of the present disclosure.
  • FIG. 8 is a schematic diagram of a second electronic device according to an exemplary embodiment of the present disclosure.
  • FIG. 9 is a flowchart of a first depth detection method provided by an exemplary embodiment of the present disclosure.
  • FIG. 10 is a flowchart of a second depth detection method provided by an exemplary embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram of a computer-readable storage medium provided by an exemplary embodiment of the present disclosure.
  • Example embodiments will now be described more fully with reference to the accompanying drawings.
  • Example embodiments can be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
  • the same reference numerals in the drawings denote the same or similar structures, and thus their detailed descriptions will be omitted.
  • the lens assembly includes: a lens module 100, a beam splitting prism group 200 and N sensor modules 300. The beam splitting prism group 200 is disposed on the light exit side of the lens module 100 and is used to divide the light into N beams with different directions and output them; each sensor module 300 correspondingly receives the beam output by the beam splitting prism group 200, and the N sensor modules 300 are configured to have different back focal lengths.
  • N is a positive integer greater than or equal to 2.
  • the N sensor modules 300 are configured to have different back focal lengths; that is, the optical path lengths between the N sensor modules 300 and the lens module 100 are all different. Since the back focal length of each sensor module 300 is different, the distance between the imaging object plane of each sensor module 300 and the lens assembly is different, and this distance can be determined from the back focal length.
  • the light entering from the lens module 100 is divided into multiple beams by the beam splitting prism group 200 , and the multiple beams are respectively transmitted to the corresponding sensor module 300 .
  • Since the back focal lengths of the sensor modules 300 are different, the object distances of the objects in the images collected by each sensor module 300 are different, and the corresponding object distances can be obtained according to the back focal lengths of the sensor modules 300, so that the depth of the objects in the images obtained by each sensor module 300 can be determined.
  • In this way, the depth of the objects in the real environment can be detected by the lens module 100, and a depth sensor can be omitted from the electronic device, thereby reducing the cost of the electronic device at least to a certain extent.
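  • To make the relationship between back focal length and object distance concrete, the following is a minimal numerical sketch under an idealized thin-lens assumption; the focal length and back-focus values are hypothetical, and the actual lens module 100 is a multi-element design, so in practice a calibrated mapping would be used instead.

```python
def object_distance(focal_length_mm: float, image_distance_mm: float) -> float:
    """Object distance u imaged sharply at image distance v, from the
    thin-lens equation 1/f = 1/u + 1/v, i.e. u = f*v / (v - f).

    Idealized sketch only: the lens module 100 described above is a
    multi-element lens, so a calibrated back-focus-to-object-distance
    mapping would be used in a real device.
    """
    if image_distance_mm <= focal_length_mm:
        raise ValueError("image distance must exceed the focal length")
    return focal_length_mm * image_distance_mm / (image_distance_mm - focal_length_mm)

# Hypothetical example: three sensor modules with slightly different back
# focal distances are sharp at three different depths.
f = 5.0  # mm, assumed effective focal length
for v in (5.05, 5.10, 5.20):  # mm, hypothetical back focal distances
    print(f"back focus {v:.2f} mm -> sharp object plane at {object_distance(f, v):.0f} mm")
```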
  • the lens assembly provided by the embodiment of the present disclosure may further include a package housing 500 and a right angle prism 400. The package housing 500 is provided with a lens hole 510, the lens module 100 is disposed in the lens hole 510, and the beam splitting prism group 200 and the sensor modules 300 are packaged in the package housing 500.
  • the right angle prism 400 is disposed on the light incident side of the lens module 100, and the right angle prism 400 is used to change the direction of the incident light.
  • Changing the direction of the incident light through the right angle prism 400 enables the lens module 100, when installed in an electronic device, to be arranged along the length direction or width direction of the electronic device, thereby reducing the size of the electronic device in the thickness direction, optimizing the internal space of the electronic device, and facilitating thinner and lighter electronic devices.
  • the right angle prism 400 includes two right angle surfaces and one inclined surface, and the inclined surface of the right angle prism 400 is opposite to the light inlet hole.
  • A reflective film (for example, a mirror silver reflective film) may be provided on a right angle surface of the right angle prism 400.
  • The incident light is reflected by the reflective film, which changes the propagation direction of the incident light.
  • the right angle prism may be an isosceles right angle prism.
  • the lens module 100 includes a plurality of optical lenses, and the plurality of optical lenses are sequentially arranged on the light entrance side of the beam splitting prism group 200 .
  • the optical axes of the multiple optical lenses may be coaxially arranged, and the multiple optical lenses may include various lens combinations such as concave lenses, convex lenses, and plane mirrors.
  • the plurality of optical lenses may be plastic lenses or glass lenses; or some of the optical lenses may be plastic lenses while the remaining lenses are glass lenses.
  • the plurality of optical lenses may be spherical lenses or aspherical lenses, or the like.
  • the lens module 100 may include a first lens, a second lens, a third lens and a fourth lens. The first lens has a convex surface, and the convex surface faces the right angle prism 400; the second lens is located on the side of the first lens away from the right angle prism 400, and the side of the second lens close to the first lens has a concave surface; the third lens is located on the side of the second lens away from the first lens, and both sides of the third lens are aspherical; the fourth lens is located on the side of the third lens away from the second lens, and both sides of the fourth lens are aspherical.
  • the side of the third lens close to the second lens has a concave surface at the optical axis
  • the side of the third lens close to the fourth lens has a convex surface at the optical axis
  • the side of the fourth lens close to the third lens has a concave surface at the optical axis
  • the side of the fourth lens away from the third lens has a concave surface at the optical axis.
  • the first lens is convex toward the right angle prism 400 at the optical axis and has positive power.
  • the side of the second lens close to the first lens has a concave surface at the optical axis and has a negative refractive power.
  • the third lens is concave toward the second lens side in the vicinity of the optical axis and has negative refractive power.
  • the fourth lens has a concave surface facing the image side in the vicinity of the optical axis and has negative refractive power, and the image side surface of the fourth lens is formed as an aspheric surface having a pole at positions other than the optical axis.
  • the first lens has a positive refractive power, and is shaped so that a convex surface faces the object side in the vicinity of the optical axis. Therefore, spherical chromatic aberration, field curvature, and distortion can be corrected well.
  • the second lens has a negative refractive power and is shaped so that the concave surface faces the first lens side in the vicinity of the optical axis and the concave surface has a meniscus shape. Therefore, spherical chromatic aberration, field curvature, and distortion can be corrected well.
  • the third lens has a positive refractive power and is shaped so that the concave surface faces the second lens side and the convex surface faces the image side in the vicinity of the optical axis. Therefore, the incidence angle of light on the third lens takes an appropriate value, and chromatic aberration, field curvature, and distortion can be corrected well.
  • the fourth lens has a negative refractive power, and is shaped such that the convex surface faces the third lens side near the optical axis, and the concave surface faces the sensor module 300 side. Therefore, chromatic aberration, astigmatism, field curvature, and distortion can be corrected well.
  • the object-side surface and the image-side surface of the fourth lens are formed as aspherical surfaces having poles at positions other than the optical axis. Therefore, field curvature and distortion are better corrected, and the angle of incidence of light rays to the lens assembly can be appropriately controlled.
  • the combination of multiple optical lenses may be any one of 4P (Plastic Lens, plastic lens), 4G (Glass Lens, glass lens), 3P+1G, 2P+2G, and P+3G.
  • Of course, in practical applications, the number of optical lenses in the lens module 100 provided by the embodiments of the present disclosure may also be other numbers, such as three, five or six, and the embodiments of the present disclosure are not limited thereto.
  • the light beam entering the lens module 100 may be a light beam generated by the convergence of composite light.
  • the composite light refers to light composed of light in different wavelength ranges, and the composite light includes white light, natural light, and the like.
  • the beam splitting prism group 200 is disposed on the light exit side of the lens module 100 , and the beam splitting prism group 200 is used for dividing the light into N beams with different directions and outputting them.
  • the beam splitting prism group 200 may include N prisms, and the N prisms are arranged in sequence.
  • the light entrance surface of the beam splitting prism close to the lens module 100 is perpendicular to the optical axis of the lens module 100 .
  • the light transmitted by the lens module 100 is prevented from being refracted on the light entrance surface of the beam splitting prism close to the lens module 100, which facilitates the design of the beam splitting prism.
  • the light entrance surface of the beam splitting prism close to the lens module 100 may also not be perpendicular to the optical axis of the lens module 100, and the embodiment of the present disclosure is not limited to this.
  • the refractive indices of the N beam splitting prisms are the same. Since the refractive indices of the N beam splitting prisms are the same, the directions of the transmitted light beams of each beam splitting prism are the same, which facilitates the angle design of the beam splitting prisms. Of course, in practical applications, the angle of each beam splitting prism may also be different, and the embodiment of the present disclosure is not limited to this.
  • the N prisms may be arranged in sequence according to the irradiation direction of the light beam, and the N prisms include N-1 first beam splitting prisms 210 and one second beam splitting prism 220.
  • the N-1 first beam splitting prisms 210 are arranged in sequence according to the direction in which the light beam is irradiated, and the second beam splitting prism 220 is arranged at the end of the first beam splitting prisms 210 away from the lens module 100.
  • the first beam splitting prism 210 has a beam splitting surface 211 on which a transflective film 213 is disposed, and the transflective film 213 is used for dividing the light beam irradiated on the beam splitting surface 211 into a reflected light beam and a transmitted light beam.
  • a beam splitting surface 211 can be provided on the side of the first beam splitting prism 210 away from the lens module 100; the light passing through the beam splitting surface 211 enters the next-stage first beam splitting prism 210, and the light reflected by the beam splitting surface 211 exits the prism directly or indirectly and enters the corresponding sensor module 300.
  • the first beam splitting prism 210 also has a light entrance surface 212 and a light exit surface 216.
  • the light entrance surface 212 is the surface of the first dichroic prism 210 close to the lens module 100 , and the light exit surface 216 corresponds to the sensor module 300 .
  • the light reflected by the light splitting surface 211 enters the sensing module 300 through the light exit surface 216 .
  • the N beam splitting prisms are arranged sequentially along the irradiation direction of the light beam, starting from the beginning of the irradiation direction.
  • the photosensitive surface of the i-th image sensor module 300 among the N image sensor modules 300 covers the light exit surface 216 corresponding to the i-th stage beam splitting prism.
  • the sensor module 300 is a CCD image sensor or a CMOS image sensor.
  • the first beam splitting prism 210 may be a transparent prism such as a glass prism or a plastic prism, and the first beam splitting prism 210 may have multiple faces.
  • the first beam splitting prism 210 may be a triangular prism, and in this case, the first beam splitting prism 210 may have three prism faces.
  • a semi-transmissive and semi-reflective film 213 may be provided on the side of the first beam splitting prism 210 away from the lens module 100 .
  • the second beam splitting prism 220 may be a transparent prism.
  • the light beam emitted from the last stage of the first beam splitting prisms 210 enters the second beam splitting prism 220 and is transmitted to the corresponding sensor module 300 through the second beam splitting prism 220.
  • the last-stage first beam splitting prism 210 is the first beam splitting prism 210 farthest from the lens module 100 .
  • the reflectivity of the transflective films 213 on the beam splitting surfaces 211 of the plurality of first beam splitting prisms 210 can be increased as the distance from the lens module 100 increases, so that the light beams received by the sensor modules have uniform illumination intensity. That is, the transmittance of the transflective film 213 of the first-stage first beam splitting prism 210 is greater than that of the second-stage first beam splitting prism 210, the transmittance of the transflective film 213 of the second-stage first beam splitting prism 210 is greater than that of the third-stage first beam splitting prism 210, and so on; the transflective film 213 of the last-stage first beam splitting prism 210 has the smallest transmittance.
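  • For the beams reaching the N sensor modules to carry equal intensity, the fraction of light each splitting surface diverts must grow along the chain, which is why the reflectivity increases with distance from the lens module 100 as described above. A minimal sketch of the required values, assuming lossless ideal films (the patent gives no numerical film specifications):

```python
def split_ratios(n_sensors: int) -> list[tuple[float, float]]:
    """(Reflectance, transmittance) per splitting surface so that each of
    the n_sensors outputs receives 1/n_sensors of the input light.

    Idealized, lossless sketch: surface i (1-indexed) must reflect
    1/(n_sensors - i + 1) of the light that actually reaches it.
    """
    ratios = []
    remaining = 1.0  # fraction of the input light still travelling forward
    for _ in range(n_sensors - 1):  # there are n_sensors - 1 splitting surfaces
        reflect = (1.0 / n_sensors) / remaining  # divert exactly 1/N of the input
        ratios.append((reflect, 1.0 - reflect))
        remaining *= 1.0 - reflect
    return ratios

# For N = 3 sensor modules: surface 1 reflects 1/3 (transmits 2/3) and
# surface 2 reflects 1/2, so all three outputs carry 1/3 of the input.
for i, (r, t) in enumerate(split_ratios(3), start=1):
    print(f"surface {i}: R = {r:.3f}, T = {t:.3f}")
```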
  • the transflective film 213 may be a neutral beam splitting film, which can divide an incident light beam into two beams without changing the spectrum of the beam.
  • the neutral beam-splitting film may be a metal beam-splitting film, a polarized neutral beam-splitting film or a dielectric beam-splitting film.
  • the metal beam splitter film has the advantages of good neutrality, wide spectral range, small polarization effect and simple fabrication.
  • A dielectric beam splitting film has the advantages of low absorption and high beam splitting efficiency.
  • Polarized neutral beam splitting films can be used for neutral beam splitting of natural light.
  • the transflective film provided by the embodiments of the present disclosure may also be other types of film layers, and the embodiments of the present disclosure are not limited thereto.
  • N beam splitting prisms are sequentially arranged on the light emitting side of the lens module.
  • the light enters the first beam splitter prism 210 from the light entrance surface 212, and the reflected light beam reflected by the beam splitter surface 211 is totally reflected on the light entrance surface 212.
  • a dielectric layer is provided on the side of the light-entering surface 212 of the first beam splitting prism 210 away from the beam splitting surface 211 , and the refractive index of the dielectric layer is smaller than that of the first beam splitting prism.
  • the dielectric layer may be an air gap, and the air gap is provided on the side of the light-entering surface 212 of the first beam splitting prism 210 away from the beam splitting surface 211 .
  • An air layer is provided on the light entrance surface side of the first-stage first beam splitting prism. The light enters the first beam splitting prism 210 from the light entrance surface 212 and, at the beam splitting surface 211, is separated into a reflected beam and a transmitted beam.
  • The light entrance surface 212 and the beam splitting surface 211 are configured so that, by using the air gap, the reflected light beam can achieve total reflection on the light entrance surface 212.
  • the light entrance surface 212 of the first beam splitting prism 210 close to the lens module 100 is perpendicular to the optical axis of the lens module 100, and the angle between the beam splitting surface 211 and the optical axis of the lens module 100 is α.
  • The reflected light needs to achieve total reflection on the light entrance surface 212, so according to the law of total reflection, the angle of incidence θ of the reflected beam on the light entrance surface 212 must satisfy sin θ ≥ n2/n1, where n1 is the refractive index of the first beam splitting prism 210 and n2 is the refractive index of the air gap.
  • the angles between the beam splitting surfaces 211 of the remaining first beam splitting prisms 210 and the optical axis of the lens module 100 can also be calculated by the total reflection formula, which will not be repeated in this embodiment of the present disclosure.
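  • The total reflection condition above can be checked numerically. The following is a minimal sketch assuming the geometry just described: with the light entrance surface 212 perpendicular to the optical axis and the beam splitting surface 211 at angle α to the axis, the reflected beam travels at 2α to the axis and therefore meets the entrance surface at an incidence angle of 180° − 2α (this geometric relation is derived here for illustration, not quoted from the patent).

```python
import math

def totally_reflected(alpha_deg: float, n_prism: float, n_gap: float = 1.0) -> bool:
    """Check whether the beam reflected by the splitting surface undergoes
    total reflection at the light entrance surface.

    Assumed geometry (derived for this sketch): entrance surface
    perpendicular to the optical axis, splitting surface at angle alpha
    to the axis with 45 < alpha < 90, so the reflected ray strikes the
    entrance surface at an incidence angle of 180 - 2*alpha degrees.
    """
    incidence_deg = 180.0 - 2.0 * alpha_deg
    critical_deg = math.degrees(math.asin(n_gap / n_prism))  # sin(theta_c) = n2/n1
    return incidence_deg >= critical_deg

# With the 60-degree splitting surface of the embodiment below and a
# hypothetical prism index of 1.5 against an air gap (n2 = 1.0), the
# incidence angle is 60 degrees, above the ~41.8 degree critical angle.
print(totally_reflected(60.0, n_prism=1.5))  # True
```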
  • as an example, the beam splitting prism group 200 includes three beam splitting prisms arranged in sequence, and the irradiation direction of the light beam is perpendicular to the light entrance surface of the first-stage first beam splitting prism 210. The distance between a vertex of the first-stage first beam splitting prism 210 and a vertex of the second-stage first beam splitting prism 210 is 1.5 mm; the angle between the beam splitting surface of the first-stage first beam splitting prism 210 and the irradiation direction of the light beam is 60°; and the side of the light exit surface of the first-stage first beam splitting prism 210 that is perpendicular to the incident direction of the reflected light has a side length of 7.22 mm. The side of the light exit surface of the second-stage first beam splitting prism 210 that is perpendicular to the incident direction of the reflected light has a side length of 6.77 mm, and the angle between the light entrance surface and the light exit surface of the second-stage first beam splitting prism 210 is 40°. The angle between the first surface of the second beam splitting prism 220 and a straight line along the direction of the light beam is 80°; the side of the light exit surface of the second beam splitting prism 220 that is perpendicular to the incident direction of the transmitted light has a side length of 7.95 mm; and the side length of the side of the third surface of the second beam splitting prism 220 is 8.06 mm. Here, the incident direction of the transmitted light is the same as the irradiation direction of the light beam.
  • the beam splitting prism group 200 provided by the embodiment of the present disclosure may include two first beam splitting prisms 210 and one second beam splitting prism 220.
  • the first-stage first beam splitting prism 210 is disposed on the light exit side of the lens module 100,
  • the second-stage first beam splitting prism 210 is disposed on the side of the first-stage first beam splitting prism 210 away from the lens module 100,
  • and the second beam splitting prism 220 is disposed on the side of the second-stage first beam splitting prism 210 away from the lens module 100.
  • An air gap is provided between the light splitting surface of the first-level first beam splitting prism 210 and the light-entering surface of the second-level first beam splitting prism.
  • the light-splitting surface of the second-stage first beam-splitting prism 210 is attached to the side of the second beam-splitting prism 220 close to the lens module 100 .
  • an air gap is provided between adjacent first beam splitting prisms.
  • the lens assembly provided by the embodiment of the present disclosure further includes N ⁇ 1 reflective films 214 .
  • a reflective film 214 is provided on the first beam splitting prism, and the reflective film 214 is disposed on the propagation path of the light reflected by the beam splitting surface 211 of the first beam splitting prism 210 .
  • the reflective film 214 may be disposed on the light entrance surface of the first beam splitter prism 210 .
  • the reflective film 214 is provided with a light-transmitting gap 215, and the light-transmitting gap 215 is opposite to the lens module 100, so that the light passing through the lens module 100 is at least partially transmitted through the gap to the next-stage first beam splitting prism 210 until it enters the second beam splitting prism 220.
  • the reflective film 214 may include a substrate and a reflective layer, the reflective layer is coated on the substrate, and the reflective layer is located on the side of the substrate away from the lens module 100 .
  • the substrate may be a transparent substrate or an opaque substrate, such as a plastic substrate, a glass substrate, a metal substrate, and the like.
  • the substrate may be a flexible substrate, such as a rubber substrate.
  • the reflective layer can also be directly coated on the corresponding surface of the first beam splitter prism 210 .
  • the reflective layer may be a metal layer, such as a mirror silver layer or the like.
  • the beam splitting prism group 200 provided by the embodiment of the present disclosure may include two first beam splitting prisms 210 and one second beam splitting prism 220.
  • the first-stage first beam splitting prism 210 is disposed on the light exit side of the lens module 100,
  • the second-stage first beam splitting prism 210 is disposed on the side of the first-stage first beam splitting prism 210 away from the lens module 100,
  • and the second beam splitting prism 220 is disposed on the side of the second-stage first beam splitting prism 210 away from the lens module 100.
  • the side of the first-stage first beam splitting prism 210 away from the lens module 100 is attached to the side of the second-stage first beam splitting prism 210 close to the lens module 100, and the side of the second-stage first beam splitting prism 210 away from the lens module 100 is attached to the side of the second beam splitting prism 220 close to the lens module 100.
  • when the beam splitting prism group 200 includes more beam splitting prisms arranged in sequence in practical applications, the surfaces of adjacent beam splitting prisms are bonded together.
  • a reflection film 214 is provided on the side of the first-stage first beam splitting prism 210 close to the lens module 100 , and a light transmission gap is formed on the reflection film 214 , and the light transmission gap is opposite to the lens module 100 .
  • An anti-reflection film may be provided on the light transmission gap of the first-stage first beam splitting prism 210 , and the anti-reflection film is used to transmit the light emitted from the lens module 100 to the inside of the first-stage first beam splitting prism 210 .
  • the side of the first-stage first beam splitting prism 210 away from the lens module 100 is provided with a beam splitter film.
  • a reflection film 214 is provided on the side of the second-stage first beam splitting prism 210 close to the lens module 100 , and a light transmission gap is formed on the reflection film 214 , and the light transmission gap is opposite to the lens module 100 .
  • the second-stage first beam splitting prism 210 is provided with a beam splitter film on the side away from the lens module 100; the transmittance of the beam splitter film is equal to its reflectivity, that is, the ratio of the transmittance to the reflectivity of the beam splitter film is 1:1.
  • the package housing 500 is provided with a lens hole 510 and an accommodating cavity 520 , and the lens hole 510 and the accommodating cavity 520 communicate with each other.
  • the lens module 100 is disposed in the lens hole 510 , and the beam splitting prism group 200 , the right angle prism 400 and the sensor module 300 are packaged in the accommodating cavity 520 of the package casing 500 .
  • a light shielding layer may be provided on the lens hole 510 of the package casing 500 and the inner wall of the accommodating cavity 520 to avoid leakage and loss of the light received by the lens module 100 during the transmission process.
  • the light shielding layer can be attached to the lens hole 510 and the inner wall of the accommodating cavity 520.
  • the light-shielding layer may be formed on the lens hole 510 and the inner wall of the accommodating cavity 520 through processes such as electroplating, deposition, or sputtering.
  • the lens assembly may further include: a first driving module and a second driving module (not shown in the figures). The first driving module is connected with the lens module 100, and the first driving module is used to adjust the position of the lens module 100.
  • the second driving module is connected to the sensor module 300 , and the second driving module is used to adjust the position of the sensor module 300 , thereby adjusting the back focus of the sensor module 300 .
  • the first driving module can be disposed in the packaging casing 500 to adjust the position of the lens module 100 relative to the packaging casing 500 .
  • the first driving module drives the lens module 100 to move in the lens hole 510 on the package casing 500 .
  • the first driving module is respectively connected with one or more of the plurality of optical lenses, and the first driving module is used for adjusting the position of the lens module 100 or adjusting the focal length of the lens module 100 .
  • the first driving module may include one or more motors, and the one or more motors are connected with the lens module 100 to drive the lens module 100 to move.
  • the multiple optical lenses can be driven by one motor, or each optical lens in the multiple optical lenses can be connected to a motor to be driven independently.
  • the second driving module can be disposed in the packaging casing 500 to adjust the relative positional relationship between each sensor module 300 and the packaging casing 500 .
  • the second driving module may include motors, and the number of motors in the second driving module may be the same as the number of the sensor modules 300 , and each motor drives one sensor module to move.
  • the package housing 500 may be provided with a plurality of sensing channels, the sensing channels and the light beams received by the sensor module 300 are in the same direction, and the motor can drive the corresponding sensing module to move along the sensing channels.
  • the parameters of the lens module 100 are known, so the object distance of the object sharply captured in the image at the current back focal length can be calculated according to the back focal lengths of the different sensor modules 300; that is, the depth information of objects in the real environment can be obtained.
  • the depth information may be a depth point value or a depth range value.
  • the automatic focusing function of the lens assembly can be combined to capture the depth information of multiple real objects within the field of view of the lens assembly.
  • acquiring images of objects with different depths through multiple sensor modules 300 can improve the efficiency with which the lens assembly captures the depth of real objects; especially in shooting scenes with high-speed motion, the depth information of different objects can be captured quickly.
  • the mapping relationship between the back focal length and the depth may be pre-stored in the electronic device, and the mapping relationship may be a table (depth of field table) or a function (object distance calculation function) or the like.
  • the depth of field information is obtained according to the back focal lengths of different sensor modules 300 relative to the lens module 100 .
  • the depth of field information is the depth information of the real object in the image obtained by the sensor module 300 .
  • the lens assembly may further include a control module. The control module is connected with the sensor modules 300 and stores the mapping relationship between the back focal length of the sensor module 300 and the object distance (such as a depth-of-field table), and the control module is used to determine the depth of the target object according to this mapping relationship.
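  • The stored mapping can be either a function or a discrete depth-of-field table. Below is a minimal sketch of the table variant with linear interpolation; all calibration values are hypothetical, since the patent specifies the mechanism but no numbers.

```python
import bisect

# Hypothetical calibration pairs (back focal length in mm, object distance
# in mm), sorted by back focal length; real values would come from
# calibration of the lens module 100.
DEPTH_TABLE = [
    (5.05, 505.0),
    (5.10, 255.0),
    (5.20, 130.0),
]

def depth_from_back_focus(back_focus_mm: float) -> float:
    """Look up, with linear interpolation, the object distance (the depth
    of the sharply imaged object) for a given back focal length."""
    keys = [bf for bf, _ in DEPTH_TABLE]
    j = bisect.bisect_left(keys, back_focus_mm)
    if j == 0:
        return DEPTH_TABLE[0][1]
    if j == len(DEPTH_TABLE):
        return DEPTH_TABLE[-1][1]
    (bf0, d0), (bf1, d1) = DEPTH_TABLE[j - 1], DEPTH_TABLE[j]
    w = (back_focus_mm - bf0) / (bf1 - bf0)
    return d0 + w * (d1 - d0)

print(depth_from_back_focus(5.15))  # 192.5 mm, interpolated between the rows
```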
  • the sensor module 300 may be a CCD sensor or a CMOS sensor.
  • the sensor module 300 includes photodiodes distributed in an array, an output circuit layer and a substrate, the photodiodes are connected to the output circuits, and the photodiodes and the output circuits are packaged in the substrate.
  • the photodiode is used to convert the optical signal into an electrical signal, and the output circuit is used to output the electrical signal.
  • the substrate of the sensor module 300 can be connected with the second driving module.
  • the light entering from the lens module 100 is divided into multiple beams by the beam splitting prism group 200, and the multiple beams are respectively transmitted to the corresponding sensor modules 300.
  • Since the back focal lengths of the sensor modules 300 are different, the object distances of the objects in the images collected by each sensor module 300 are different, and the corresponding object distances can be obtained according to the back focal lengths of the sensor modules 300, so that the depth of the objects in the images obtained by each sensor module 300 can be determined.
  • In this way, the depth of the objects in the real environment can be detected by the lens module 100, and a depth sensor can be omitted from the electronic device, thereby reducing the cost of the electronic device at least to a certain extent.
  • An exemplary embodiment of the present disclosure further provides an electronic device, as shown in FIG. 7 , the electronic device includes the above-mentioned lens assembly 10 .
  • the lens assembly 10 includes a lens module 100, a beam splitting prism group 200 and N sensor modules 300.
  • the beam splitting prism group 200 is arranged on the light exit side of the lens module 100, and the beam splitting prism group 200 is used for dividing the light into N beams with different directions.
  • Each sensor module 300 correspondingly receives the beam output by the beam splitting prism group 200, and the N sensor modules 300 are configured to have different back focal lengths; wherein, N is a positive integer greater than or equal to 2.
  • the light entering from the lens module 100 is divided into multiple beams by the beam splitting prism group 200, and the multiple beams are respectively transmitted to the corresponding sensor modules 300. Since the back focal lengths of the sensor modules 300 are different, the object distances of the objects in the images collected by each sensor module 300 are different, and the corresponding object distances can be obtained according to the back focal lengths of the sensor modules 300, so that the depth of the objects in the images obtained by each sensor module 300 can be determined.
  • In this way, the depth of the objects in the real environment can be detected by the lens module 100, and a depth sensor can be omitted from the electronic device, thereby reducing the cost of the electronic device at least to a certain extent.
  • the electronic device further includes a control module 11. The control module 11 is connected to the sensor modules 300 and stores the mapping relationship between the back focal length of the sensor module 300 and the object distance, and the control module 11 is used to determine the depth of the target object according to this mapping relationship.
  • the control module 11 can also be used to detect the working mode of the lens assembly. When it is detected that the working mode of the lens assembly is the photographing mode, the control module 11 can control one of the plurality of sensor modules 300 to work to take a photograph. When it is detected that the working mode of the lens assembly is the depth detection mode, the control module 11 controls the plurality of sensor modules 300 to work; at this time, the control module can control the first driving module and the second driving module to adjust the back focus of the sensor modules 300. The control module 11 can determine the working mode of the lens assembly by detecting the instruction input by the user.
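  • A sketch of this mode dispatch is shown below; the class and method names are hypothetical, since the patent describes the behavior of control module 11 but not an API.

```python
from enum import Enum, auto

class WorkingMode(Enum):
    PHOTO = auto()
    DEPTH = auto()

class ControlModule:
    """Hypothetical sketch of control module 11: one sensor module works in
    the photographing mode; all sensor modules work in the depth detection
    mode, after the driving modules have set their distinct back foci."""

    def __init__(self, sensors, sensor_drivers):
        self.sensors = sensors                # the N sensor modules 300
        self.sensor_drivers = sensor_drivers  # second driving module, one motor per sensor

    def handle(self, mode: WorkingMode):
        if mode is WorkingMode.PHOTO:
            # photographing mode: a single sensor module takes the photo
            return [self.sensors[0].capture()]
        # depth detection mode: adjust back foci, then capture with all sensors
        for driver in self.sensor_drivers:
            driver.move_to_assigned_back_focus()
        return [sensor.capture() for sensor in self.sensors]
```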
  • the parameters of the lens module 100 are known, so the object distance of the object sharply captured in the image at the current back focal length can be calculated according to the back focal lengths of the different sensors; that is, the depth information of objects in the real environment can be obtained.
  • the depth information can be a depth point value or a depth range value.
  • the automatic focusing function of the lens assembly can be combined to capture the depth information of multiple real objects within the field of view of the lens assembly.
  • acquiring images of objects with different depths through multiple sensor modules 300 can improve the efficiency with which the lens assembly captures the depth of real objects; especially in shooting scenes with high-speed motion, the depth information of different objects can be captured quickly.
  • the mapping relationship between the back focal length and the depth may be pre-stored in the electronic device, and the mapping relationship may be a table or a function or the like.
  • the depth of field information is obtained according to the back focal lengths of different sensor modules 300 relative to the lens module 100 .
  • the depth of field information is the depth information of the real object in the image obtained by the sensor module 300 .
  • the electronic device in the embodiment of the present disclosure may be an electronic device having a camera component, such as a mobile phone, a tablet computer, a wearable device, a camera, or a video camera.
  • the following takes the electronic device as a mobile phone as an example to illustrate:
  • the electronic device may also include a middle frame 20 , a main board 30 , a display screen 70 , a battery 40 and other devices.
  • the display screen 70, the middle frame 20 and the back cover 50 form an accommodating space for accommodating the components of the electronic device.
  • the display screen 70 forms a display surface of the electronic device, and is used to display information such as images and texts.
  • the display screen 70 may be a liquid crystal display (Liquid Crystal Display, LCD) or an organic light-emitting diode (Organic Light-Emitting Diode, OLED) type display screen.
  • a glass cover plate may be provided on the display screen 70 .
  • the glass cover plate can cover the display screen 70 to protect the display screen 70 and prevent the display screen 70 from being scratched or damaged by water.
  • the display screen 70 may include a display area and a non-display area.
  • the display area performs the display function of the display screen 70 for displaying information such as images and texts. No information is displayed in the non-display area.
  • the non-display area can be used to set functional modules such as cameras, receivers, and proximity sensors.
  • the non-display area may include at least one area located above and below the display area.
  • the display screen 70 may be a full screen. At this time, the display screen 70 can display information in a full screen, so that the electronic device has a larger screen ratio.
  • the display screen 70 includes only a display area and does not include a non-display area.
  • the middle frame 20 may be a hollow frame structure.
  • the material of the middle frame 20 may include metal or plastic.
  • the main board 30 is installed inside the above-mentioned accommodation space.
  • the main board 30 can be installed on the middle frame 20 and accommodated in the above-mentioned receiving space together with the middle frame 20 .
  • the main board 30 is provided with a ground point to realize the grounding of the main board 30 .
  • the main board 30 may be integrated with one or more functional modules such as a motor, a microphone, a speaker, a receiver, an earphone interface, a universal serial bus interface (USB interface), a proximity sensor, an ambient light sensor, a gyroscope, and a processor. Meanwhile, the display screen 70 may be electrically connected to the main board 30 .
  • the sensor module may include depth sensor, pressure sensor, gyroscope sensor, air pressure sensor, magnetic sensor, acceleration sensor, distance sensor, proximity light sensor, fingerprint sensor, temperature sensor, touch sensor, ambient light sensor and bone conduction sensor, etc.
  • the processor may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor and/or a neural network processor (Neural-network Processing Unit, NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • the main board 30 is also provided with a display control circuit.
  • the display control circuit outputs electrical signals to the display screen 70 to control the display screen 70 to display information.
  • the light-emitting control unit and the color-changing control unit can be arranged on the main board.
  • the battery 40 is installed inside the above-mentioned accommodation space.
  • the battery 40 may be installed on the middle frame 20 and housed in the above-mentioned storage space together with the middle frame 20 .
  • the battery 40 may be electrically connected to the main board 30 to enable the battery 40 to supply power to the electronic device.
  • the mainboard 30 may be provided with a power management circuit.
  • the power management circuit is used to distribute the voltage provided by the battery 40 to the various electronic components in the electronic device.
  • the back cover 50 is used to form the outer contour of the electronic device.
  • the rear cover 50 may be integrally formed.
  • structures such as a rear camera hole, a fingerprint identification module mounting hole and the like may be formed on the back cover 50 .
  • the camera assembly can be arranged on the main board and the middle frame, and the camera assembly receives the light from the rear camera hole.
  • the camera assembly may also be a front camera, and the embodiment of the present disclosure is not limited to this.
  • An exemplary embodiment of the present disclosure further provides a depth detection method, which is applied to the above-mentioned electronic device.
  • the depth detection method may include the following steps:
  • Step S910: receiving a depth detection instruction;
  • Step S920: in response to the depth detection instruction, controlling a plurality of sensor modules to collect images of real objects in the current environment, where the back focal lengths of the plurality of sensor modules are different;
  • Step S930: determining the depth information of the real objects in the image collected by each sensor module according to the back focal lengths of the multiple sensor modules, so as to reconstruct the three-dimensional space object information.
  • In the depth detection method provided by the embodiments of the present disclosure, the back focal lengths of the sensor modules are adjusted according to the depth detection instruction so as to obtain images with different object distances, and the depth information of the real objects in the image collected by each sensor module is determined according to the back focal lengths of the multiple sensor modules.
  • When capturing images, the light entering from the lens module is divided into multiple beams by the beam splitting prism group, and the multiple beams of light are respectively transmitted to the corresponding sensor modules. Since the back focal lengths of the sensor modules are different, the object distances of the objects in the images collected by each sensor module are different, and the corresponding object distances can be obtained according to the back focal lengths of the sensor modules, so that the depth of the objects in the images obtained by each sensor module can be determined.
  • In this way, the depth of objects in the real environment can be detected, and a depth sensor can be omitted from the electronic equipment, thereby reducing the cost of the electronic equipment at least to a certain extent.
  • In step S910, a depth detection instruction may be received.
  • the depth detection instruction may be used to indicate whether to perform depth detection and to indicate the back focus of the corresponding sensor module. That is, the depth detection instruction may include the driving signals of the first driving module and the second driving module: the driving signal of the first driving module is used to drive the lens module, and the driving signal of the second driving module is used to drive the sensor modules. Alternatively, the depth detection instruction may include depth detection requirements, which can be converted into driving signals for the first driving module and the second driving module.
  • In step S920, in response to the depth detection instruction, multiple sensor modules may be controlled to collect images of real objects in the current environment, where the multiple sensor modules have different back focal lengths.
  • Specifically, the sensor modules are controlled to be powered on. After a sensor module is powered on, it receives the light transmitted by the beam splitting prism, converts the optical signal into an electrical signal, and collects images of objects in the real environment.
  • In step S930, the depth information of the real objects in the image collected by each sensor module may be determined according to the back focal lengths of the multiple sensor modules, so as to reconstruct the three-dimensional space object information.
  • the electronic device stores the mapping relationship between the back focal length of the sensor module and the object distance, so the object distance can be determined according to the back focal length of the sensor module, and the object distance is the depth information of the real object in the current image.
  • the depth information may be a depth point value or a depth range value.
  • the depth detection method provided by the embodiment of the present disclosure may include:
  • Step S940: in response to the depth detection instruction, adjusting the back focal lengths of the plurality of sensor modules.
  • Step S940 may be performed before step S920.
  • After receiving the depth detection instruction, the electronic device first adjusts the plurality of sensor modules to the initial position, that is, the back focus of each sensor module is initialized.
  • If the depth detection instruction includes a further adjustment instruction, the back focus of the sensor module is adjusted to the target position.
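  • Putting steps S910 to S940 together, the following is a minimal end-to-end sketch under assumed interfaces; the sensor and instruction objects are hypothetical stand-ins for the hardware operations described above.

```python
def depth_detection(instruction: dict, sensors, depth_from_back_focus):
    """Sketch of the method of FIG. 9 and FIG. 10 under assumed interfaces.

    instruction: may carry per-sensor target back focal lengths in mm.
    sensors: objects with .set_back_focus(mm), .back_focus and .capture().
    depth_from_back_focus: maps a back focal length to an object distance.
    """
    # S910: receive the depth detection instruction (passed in here).
    targets = instruction.get("back_focus_targets")

    # S940: adjust the back focal lengths of the sensor modules if the
    # instruction carries target positions.
    if targets:
        for sensor, target_mm in zip(sensors, targets):
            sensor.set_back_focus(target_mm)

    # S920: all sensor modules collect images at their different back foci.
    images = [sensor.capture() for sensor in sensors]

    # S930: attach a depth to each image via the stored back-focus mapping;
    # the (image, depth) pairs can then be used to reconstruct the
    # three-dimensional space object information.
    return [(image, depth_from_back_focus(sensor.back_focus))
            for image, sensor in zip(images, sensors)]
```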
  • In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium on which a program product capable of implementing the above-described method of this specification is stored.
  • various aspects of the present disclosure may also be implemented in the form of a program product including program code; when the program product is run on a terminal device, the program code causes the terminal device to perform the steps according to various exemplary embodiments of the present disclosure described in the above-mentioned "Example Method" section of this specification.
  • a program product 1100 for implementing the above method according to an embodiment of the present disclosure is described, which may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device, for example a personal computer.
  • the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device.
  • the program product may employ any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above. More specific examples (non-exhaustive list) of readable storage media include: electrical connections with one or more wires, portable disks, hard disks, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the foregoing.
  • a computer readable signal medium may include a propagated data signal in baseband or as part of a carrier wave with readable program code embodied thereon. Such propagated data signals may take a variety of forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • a readable signal medium can also be any readable medium, other than a readable storage medium, that can transmit, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
  • Program code embodied on a readable medium may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Program code for performing the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server.
  • the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Optical Elements Other Than Lenses (AREA)

Abstract

A lens assembly and electronic device, a depth detection method, and a storage medium. The lens assembly includes: a lens module (100), a beam splitting prism group (200), and N sensor modules (300). The beam splitting prism group (200) is disposed on the light exit side of the lens module (100) and is used to divide light into N beams with different directions and output them; each sensor module (300) correspondingly receives a beam output by the beam splitting prism group (200), and the N sensor modules (300) are configured to have different back focal lengths, where N is a positive integer greater than or equal to 2.

Description

Lens assembly and electronic device, depth detection method, and storage medium
Cross-reference
The present disclosure claims priority to the Chinese patent application No. 202011204573.8, filed on November 2, 2020 and entitled "Lens Assembly and Electronic Device, Depth Detection Method, and Storage Medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the technical field of electronic devices, and in particular, to a lens assembly, an electronic device, a depth detection method, and a storage medium.
Background
With the development and progress of technology, virtual reality and augmented reality technologies are gradually being applied in various electronic devices. Implementing virtual reality or augmented reality often requires acquiring the depth information of objects in the real environment. At present, this is mainly done by providing a depth sensor (for example, a time-of-flight device) on the electronic device to detect the depth information of objects in the real environment. However, providing a depth sensor in an electronic device increases the cost of the electronic device.
The above information disclosed in this Background section is only for enhancement of understanding of the background of the present disclosure, and therefore may contain information that does not constitute prior art already known to those of ordinary skill in the art.
Summary
The purpose of the present disclosure is to provide a lens assembly, an electronic device, a depth detection method, and a storage medium, so as to solve, at least to a certain extent, one or more problems caused by the defects of the related art.
According to a first aspect of the present disclosure, there is provided a lens assembly, including: a lens module, a beam splitting prism group, and N sensor modules. The beam splitting prism group is disposed on the light exit side of the lens module and is used to divide light into N beams with different directions and output them; each sensor module correspondingly receives a beam output by the beam splitting prism group, and the N sensor modules are configured to have different back focal lengths, where N is a positive integer greater than or equal to 2.
According to a second aspect of the present disclosure, there is provided an electronic device including the above lens assembly.
According to a third aspect of the present disclosure, there is provided a depth detection method applied to the above electronic device, the depth detection method including:
receiving a depth detection instruction;
in response to the depth detection instruction, controlling a plurality of sensor modules to collect images of real objects in the current environment, where the plurality of sensor modules have different back focal lengths;
determining, according to the back focal lengths of the plurality of sensor modules, the depth information of the real objects in the image collected by each sensor module, so as to reconstruct three-dimensional spatial object information.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having a computer program stored thereon, where the computer program, when executed by a processor, implements the above method.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only and do not limit the present application.
Brief Description of the Drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the present disclosure. Obviously, the drawings in the following description are only some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of a first lens assembly provided by an exemplary embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a second lens assembly provided by an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a third lens assembly provided by an exemplary embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a beam splitting prism provided by an exemplary embodiment of the present disclosure;
FIG. 5 is a schematic diagram of another beam splitting prism provided by an exemplary embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a fourth lens assembly provided by an exemplary embodiment of the present disclosure;
FIG. 7 is a schematic diagram of a first electronic device provided by an exemplary embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a second electronic device provided by an exemplary embodiment of the present disclosure;
FIG. 9 is a flowchart of a first depth detection method provided by an exemplary embodiment of the present disclosure;
FIG. 10 is a flowchart of a second depth detection method provided by an exemplary embodiment of the present disclosure;
FIG. 11 is a schematic diagram of a computer-readable storage medium provided by an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be thorough and complete and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and their detailed descriptions will therefore be omitted.
Although relative terms such as "upper" and "lower" are used in this specification to describe the relative relationship of one illustrated component to another component, these terms are used in this specification only for convenience, for example according to the orientation of the examples described in the drawings. It can be understood that if an illustrated device is turned upside down, a component described as being "upper" will become a "lower" component. When a structure is "on" another structure, it may mean that the structure is integrally formed on the other structure, or that the structure is "directly" disposed on the other structure, or that the structure is "indirectly" disposed on the other structure through a further structure.
An exemplary embodiment of the present disclosure first provides a lens assembly. As shown in FIG. 1, the lens assembly includes: a lens module 100, a beam splitting prism group 200, and N sensor modules 300. The beam splitting prism group 200 is disposed on the light exit side of the lens module 100 and is used to split light into N beams with different directions and output them. Each sensor module 300 correspondingly receives a beam output by the beam splitting prism group 200, and the N sensor modules 300 are configured with different back focal distances.
Here, N is a positive integer greater than or equal to 2. The N sensor modules 300 being configured with different back focal distances means that the optical path lengths between the N sensor modules 300 and the lens module 100 are all different. Since the back focal distance of each sensor module 300 differs, the imaged object plane of each sensor module 300 is at a different distance from the lens assembly, and this distance can be determined from the back focal distance.
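To make the relation between back focal distance and object distance concrete, the following is a minimal sketch based on the thin-lens Gaussian imaging equation 1/f = 1/u + 1/v, approximating the image distance v by the back focal distance; the focal length and the numeric values are assumptions for illustration only and are not taken from the present disclosure.

```python
def object_distance(focal_length_mm: float, back_focal_mm: float) -> float:
    """Thin-lens estimate of the object distance u from the image distance v.

    Gaussian lens equation: 1/f = 1/u + 1/v  =>  u = 1 / (1/f - 1/v).
    The image distance v is approximated here by the sensor module's back
    focal distance; a real lens module would use calibrated parameters and
    its principal planes instead.
    """
    inv_u = 1.0 / focal_length_mm - 1.0 / back_focal_mm
    if inv_u <= 0:
        return float("inf")  # focused at or beyond infinity
    return 1.0 / inv_u

# Hypothetical 5 mm lens with three sensor modules at different back focal distances
for v in (5.05, 5.10, 5.20):
    print(f"v = {v} mm -> in-focus object at u = {object_distance(5.0, v):.0f} mm")
```

Under these assumed values, the three sensor modules would image objects in focus at roughly 505 mm, 255 mm, and 130 mm, which illustrates how a set of fixed back focal distances samples several depth planes at once.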
In the lens assembly provided by the embodiments of the present disclosure, the light entering through the lens module 100 is split into multiple beams by the beam splitting prism group 200, and the beams are respectively transmitted to the corresponding sensor modules 300. Since the sensor modules 300 have different back focal distances, the objects in focus in the image captured by each sensor module 300 are at different object distances, and the corresponding object distance can be obtained from the back focal distance of the sensor module 300. The depth of the objects in the image captured by each sensor module 300 can thus be determined, so that the depth of objects in the real environment is detected through the lens module 100. A dedicated depth sensor can therefore be omitted from the electronic device, reducing the cost of the electronic device at least to some extent.
Further, as shown in FIG. 2 and FIG. 3, the lens assembly provided by the embodiments of the present disclosure may further include a package housing 500 and a right-angle prism 400. A lens hole 510 is provided in the package housing 500, the lens module 100 is disposed in the lens hole 510, and the beam splitting prism group 200 and the sensor modules 300 are packaged in the package housing 500. The right-angle prism 400 is disposed on the light entrance side of the lens module 100 and is used to change the direction of incident light.
Changing the direction of incident light with the right-angle prism 400 allows the lens module 100, when installed in an electronic device, to be arranged along the length or width direction of the device, thereby reducing the dimension in the thickness direction, optimizing the internal space of the electronic device, and facilitating a thinner and lighter device.
Each part of the lens assembly provided by the embodiments of the present disclosure is described in detail below:
The right-angle prism 400 includes two right-angle faces and one inclined face, and the inclined face of the right-angle prism 400 faces the light entrance hole. A reflective film (for example, a mirror-silver reflective film) may be provided on a right-angle face of the right-angle prism to reflect the incident light and change its propagation direction. For example, the right-angle prism may be an isosceles right-angle prism.
The lens module 100 includes multiple optical lenses arranged in sequence on the light entrance side of the beam splitting prism group 200. The optical axes of the optical lenses may be coaxial, and the optical lenses may include various combinations of concave lenses, convex lenses, plane mirrors, and the like. The optical lenses may be plastic lenses or glass lenses; alternatively, some of the optical lenses may be plastic lenses and the rest glass lenses. The optical lenses may be spherical lenses, aspherical lenses, or the like.
For example, the lens module 100 may include a first lens, a second lens, a third lens, and a fourth lens. The first lens has a convex surface facing the right-angle prism 400; the second lens is disposed on the side of the first lens away from the right-angle prism 400 and has a concave surface on its side close to the first lens; the third lens is disposed on the side of the second lens away from the first lens, and both of its surfaces are aspherical; the fourth lens is disposed on the side of the third lens away from the second lens, and both of its surfaces are aspherical. At the optical axis, the side of the third lens close to the second lens is concave and the side of the third lens close to the fourth lens is convex; the side of the fourth lens close to the third lens is concave, and the side of the fourth lens away from the third lens is concave.
At the optical axis, the convex surface of the first lens faces the right-angle prism 400, and the first lens has positive refractive power. The surface of the second lens close to the first lens is concave at the optical axis, and the second lens has negative refractive power. Near the optical axis, the concave surface of the third lens faces the second lens, and the third lens has negative refractive power. Near the optical axis, the concave surface of the fourth lens faces the image side, and the fourth lens has negative refractive power; the image-side surface of the fourth lens is formed as an aspherical surface having an inflection point at a position off the optical axis.
The first lens has positive refractive power, and its shape is such that, near the optical axis, its convex surface faces the object side; therefore, spherical and chromatic aberration, field curvature, and distortion can be well corrected. The second lens has negative refractive power, and its shape is such that, near the optical axis, its concave surface faces the first lens in a meniscus shape; therefore, spherical and chromatic aberration, field curvature, and distortion can be well corrected. The third lens has positive refractive power, and its shape is such that, near the optical axis, its concave surface faces the second lens and its convex surface faces the image side; therefore, the incidence angle of light on the third lens takes an appropriate value, and chromatic aberration, field curvature, and distortion can be well corrected. The fourth lens has negative refractive power, and its shape is such that, near the optical axis, its convex surface faces the third lens and its concave surface faces the sensor module 300; therefore, chromatic aberration, astigmatism, field curvature, and distortion can be well corrected. The object-side and image-side surfaces of the fourth lens are formed as aspherical surfaces having inflection points at positions off the optical axis; therefore, field curvature and distortion are corrected even better, and the incidence angle of light on the lens assembly can be appropriately controlled.
On this basis, the combination of the multiple optical lenses may be any one of 4P (four plastic lenses), 4G (four glass lenses), 3P+1G, 2P+2G, and 1P+3G. Of course, in practical applications, the number of optical lenses in the lens module 100 provided by the embodiments of the present disclosure may also be different, for example three, five, or six, and the embodiments of the present disclosure are not limited thereto.
In the exemplary embodiments of the present disclosure, the light beam entering the lens module 100 may be a beam produced by converging composite light, where composite light refers to light composed of light in different wavelength ranges, including white light, natural light, and the like.
The beam splitting prism group 200 is disposed on the light exit side of the lens module 100 and is used to split light into N beams with different directions and output them. The beam splitting prism group 200 may include N prisms arranged in sequence.
Among the N beam splitting prisms, the light entrance face of the beam splitting prism close to the lens module 100 is perpendicular to the optical axis of the lens module 100. This prevents the light transmitted by the lens module 100 from being refracted at the entrance face of the beam splitting prism close to the lens module 100 and simplifies the design of the beam splitting prisms. Of course, in practical applications, the entrance face of the beam splitting prism close to the lens module 100 may also be non-perpendicular to the optical axis of the lens module 100, and the embodiments of the present disclosure are not limited thereto.
The N beam splitting prisms have the same refractive index. Since the refractive indices of the N beam splitting prisms are the same, the direction of the transmitted beam is the same in each beam splitting prism, which simplifies the angular design of the prisms. Of course, in practical applications the angle of each beam splitting prism may also be different, and the embodiments of the present disclosure are not limited thereto.
As shown in FIG. 4, the N prisms may be arranged in sequence along the beam propagation direction, and the N prisms include N−1 first beam splitting prisms 210 and one second beam splitting prism 220. The N−1 first beam splitting prisms 210 are arranged in sequence along the beam propagation direction, and the second beam splitting prism 220 is disposed at the end of the first prisms away from the lens module 100.
The first beam splitting prism 210 has a beam splitting surface 211 on which a transflective (half-transmitting, half-reflecting) film 213 is provided. The transflective film 213 is used to split the beam striking the beam splitting surface 211 into a reflected beam and a transmitted beam. The beam splitting surface 211 may be provided on the side of the first beam splitting prism 210 away from the lens module 100; light transmitted through the beam splitting surface 211 enters the next-stage first beam splitting prism 210, while light reflected by the beam splitting surface 211 exits the prism directly or indirectly and enters the corresponding sensor module 300.
The first beam splitting prism 210 also has a light entrance face 212 and a light exit face 216. The entrance face 212 is the face of the first beam splitting prism 210 on the side close to the lens module 100, and the exit face 216 corresponds to a sensor module 300. Light reflected by the beam splitting surface 211 enters the sensor module 300 through the exit face 216.
The N beam splitting prisms are arranged along the propagation direction starting from the beginning of the beam path. The photosensitive surface of the i-th of the N image sensor modules 300 covers the exit face 216 corresponding to the i-th-stage beam splitting prism. The sensor module 300 is a CCD image sensor or a CMOS image sensor.
The first beam splitting prism 210 may be a transparent prism such as a glass prism or a plastic prism, and may have multiple faces. For example, the first beam splitting prism 210 may be a triangular prism, in which case it has three prism faces. The transflective film 213 may be provided on the face of the first beam splitting prism 210 away from the lens module 100. The second beam splitting prism 220 may be a transparent prism; the beam exiting the last-stage first beam splitting prism 210 enters the second beam splitting prism 220 and is transmitted through it to the corresponding sensor module 300, where the last-stage first beam splitting prism 210 is the first beam splitting prism 210 farthest from the lens module 100.
The reflectance of the transflective films 213 on the beam splitting surfaces 211 of the first beam splitting prisms 210 may increase with increasing distance from the lens module 100, so that the light intensity of the beam received by each sensor module is uniform. That is, the transmittance of the transflective film 213 of the first-stage first beam splitting prism 210 is greater than that of the second-stage first beam splitting prism 210, the transmittance of the second stage is greater than that of the third stage, and so on, with the transflective film 213 of the last-stage first beam splitting prism 210 having the lowest transmittance.
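As an illustration of this cascade, the sketch below computes the reflectance each transflective film 213 would need so that all N sensor modules receive equal intensity, under the simplifying assumption of lossless films; the k-th film reflects 1/(N−k+1) of the light reaching it. For N = 3 this gives transmittance-to-reflectance ratios of 2:1 and 1:1, consistent with the film ratios described later for the two-prism example.

```python
def equal_split_reflectances(n_sensors: int) -> list[float]:
    """Reflectance of each of the n_sensors - 1 transflective films so that
    every sensor module receives 1/n_sensors of the input light, assuming
    lossless (non-absorbing) films."""
    reflectances = []
    remaining = 1.0  # fraction of the input light still propagating
    for _ in range(n_sensors - 1):
        r = (1.0 / n_sensors) / remaining  # peel off one equal share
        reflectances.append(r)
        remaining *= 1.0 - r
    return reflectances

print(equal_split_reflectances(3))  # [0.333..., 0.5] -> T:R ratios of 2:1 and 1:1
```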
The transflective film 213 may be a neutral beam splitting film, which can split one incident beam into two beams without changing the spectrum of the beam. For example, the neutral beam splitting film may be a metallic beam splitting film, a polarization-neutral beam splitting film, or a dielectric beam splitting film. Metallic beam splitting films offer good neutrality, a wide spectral range, small polarization effects, and simple fabrication. Dielectric beam splitting films offer low absorption and high splitting efficiency. Polarization-neutral beam splitting films can be used for neutral splitting of natural light. Of course, in practical applications the transflective film provided by the embodiments of the present disclosure may also be another type of film layer, and the embodiments of the present disclosure are not limited thereto.
In a feasible implementation of the present disclosure, the N beam splitting prisms are disposed in sequence on the light exit side of the lens module. Light enters the first beam splitting prism 210 through the entrance face 212, and the reflected beam produced at the beam splitting surface 211 undergoes total internal reflection at the entrance face 212. A dielectric layer is provided on the side of the entrance face 212 of the first beam splitting prism 210 away from the beam splitting surface 211, and the refractive index of the dielectric layer is smaller than the refractive index of the first beam splitting prism.
For example, the dielectric layer may be an air gap disposed on the side of the entrance face 212 of the first beam splitting prism 210 away from the beam splitting surface 211. An air layer is provided on the entrance-face side of the first-stage first beam splitting prism; light enters the first beam splitting prism 210 through the entrance face 212 and, after reaching the beam splitting surface 211, is split into a reflected beam and a transmitted beam. The entrance face 212 and the beam splitting surface 211 are configured such that the reflected beam, by virtue of the air gap, undergoes total internal reflection at the entrance face 212.
For example, the entrance face 212 of the first beam splitting prism 210 close to the lens module 100 is perpendicular to the optical axis of the lens module 100, and the angle between the beam splitting surface 211 and the optical axis of the lens module 100 is α. Since the light reflected by the beam splitting surface 211 must undergo total internal reflection at the entrance face 212, the law of total internal reflection gives:
sin(180° − 2α) ≥ n2 / n1
where n1 is the refractive index of the first beam splitting prism 210 and n2 is the refractive index of the air gap. In practical applications, the angles between the beam splitting surfaces 211 of the remaining first beam splitting prisms 210 and the optical axis of the lens module 100 can likewise be calculated from the total internal reflection formula, which is not repeated here.
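As a quick numerical check of this condition, the sketch below assumes a typical glass refractive index n1 = 1.5 and air n2 = 1.0; these values are illustrative and are not specified in the present disclosure.

```python
import math

def tir_at_entrance_face(alpha_deg: float, n1: float = 1.5, n2: float = 1.0) -> bool:
    """Check total internal reflection at the entrance face for a beam
    splitting surface inclined at alpha degrees to the optical axis.

    A ray travelling along the axis and reflected by the splitting surface
    strikes the entrance face at an incidence angle of 180 - 2*alpha degrees,
    so TIR requires sin(180 - 2*alpha) >= n2/n1.
    """
    incidence = 180.0 - 2.0 * alpha_deg
    # alpha must exceed 45 degrees for the reflected ray to reach the entrance face
    return 0.0 < incidence < 90.0 and math.sin(math.radians(incidence)) >= n2 / n1

print(tir_at_entrance_face(60.0))  # True: 60 deg exceeds the ~41.8 deg critical angle
```

With these assumed indices, the 60° splitting-surface angle used in the FIG. 6 example satisfies the condition.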
For example, as shown in FIG. 6, the beam splitting prism group 200 includes three beam splitting prisms arranged in sequence, and the beam propagation direction is perpendicular to the entrance face of the first-stage first beam splitting prism 210. The distance between a vertex of the first-stage first beam splitting prism 210 and a vertex of the second-stage first beam splitting prism 210 is 1.5 mm. The angle between the beam splitting surface of the first-stage first beam splitting prism 210 and the beam propagation direction is 60°, and the edge of its exit face perpendicular to the incident direction of the reflected light has a length of 7.22 mm. The edge of the exit face of the second-stage first beam splitting prism 210 perpendicular to the incident direction of the reflected light has a length of 6.77 mm, and the angle between the entrance face and the exit face of the second-stage first beam splitting prism 210 is 40°. The angle between the first face of the second beam splitting prism 220 and a line along the beam propagation direction is 80°, the edge of the exit face of the second beam splitting prism 220 perpendicular to the incident direction of the transmitted light has a length of 7.95 mm, and the edge of the third face of the second beam splitting prism 220 has a length of 8.06 mm, where the incident direction of the transmitted light is the same as the beam propagation direction.
For example, the beam splitting prism group 200 provided by the embodiments of the present disclosure may include two first beam splitting prisms 210 and one second beam splitting prism 220. The first-stage first beam splitting prism 210 is disposed on the light exit side of the lens module 100, the second-stage first beam splitting prism 210 is disposed on the side of the first-stage first beam splitting prism 210 away from the lens module 100, and the second beam splitting prism 220 is disposed on the side of the second-stage first beam splitting prism 210 away from the lens module 100.
An air gap is provided between the beam splitting surface of the first-stage first beam splitting prism 210 and the entrance face of the second-stage first beam splitting prism. The beam splitting surface of the second-stage first beam splitting prism 210 is bonded to the face of the second beam splitting prism 220 close to the lens module 100. Of course, in practical applications, when the beam splitting prism group 200 includes more beam splitting prisms arranged in sequence, an air gap is provided between adjacent first beam splitting prisms.
In another feasible implementation of the present disclosure, as shown in FIG. 5, the lens assembly provided by the embodiments of the present disclosure further includes N−1 reflective films 214. A reflective film 214 is provided on the first beam splitting prism and is disposed in the propagation path of the light reflected by the beam splitting surface 211 of the first beam splitting prism 210. For example, the reflective film 214 may be provided on the entrance face of the first beam splitting prism 210. A light-transmitting notch 215 is provided in the reflective film 214 and faces the lens module 100, so that at least part of the light passing through the lens module 100 is transmitted through the notch to the next-stage first beam splitting prism 210, until it enters the second beam splitting prism 220.
The reflective film 214 may include a substrate and a reflective layer coated on the substrate, with the reflective layer located on the side of the substrate away from the lens module 100. The substrate may be a transparent or opaque substrate, for example a plastic substrate, a glass substrate, or a metal substrate. To ensure that the reflective film 214 conforms to the prism, the substrate may be a flexible substrate, for example a rubber substrate. Of course, in practical applications the reflective layer may also be coated directly on the corresponding surface of the first beam splitting prism 210. The reflective layer may be a metal layer, for example a mirror-silver layer.
For example, the beam splitting prism group 200 provided by the embodiments of the present disclosure may include two first beam splitting prisms 210 and one second beam splitting prism 220. The first-stage first beam splitting prism 210 is disposed on the light exit side of the lens module 100, the second-stage first beam splitting prism 210 is disposed on the side of the first-stage first beam splitting prism 210 away from the lens module 100, and the second beam splitting prism 220 is disposed on the side of the second-stage first beam splitting prism 210 away from the lens module 100.
The face of the first-stage first beam splitting prism 210 away from the lens module 100 is bonded to the face of the second-stage first beam splitting prism 210 close to the lens module 100, and the face of the second-stage first beam splitting prism 210 away from the lens module 100 is bonded to the face of the second beam splitting prism 220 close to the lens module 100. Of course, in practical applications, when the beam splitting prism group 200 includes more beam splitting prisms arranged in sequence, the faces of adjacent beam splitting prisms are bonded.
A reflective film 214 is provided on the side of the first-stage first beam splitting prism 210 close to the lens module 100, and a light-transmitting notch facing the lens module 100 is provided in the reflective film 214. An antireflection film may be provided on the light-transmitting notch of the first-stage first beam splitting prism 210 to transmit the light exiting the lens module 100 into the interior of the first-stage first beam splitting prism 210. A beam splitting film whose transmittance is greater than its reflectance is provided on the face of the first-stage first beam splitting prism 210 away from the lens module 100; for example, the ratio of the transmittance to the reflectance of this beam splitting film is 2:1.
A reflective film 214 is provided on the side of the second-stage first beam splitting prism 210 close to the lens module 100, and a light-transmitting notch facing the lens module 100 is provided in the reflective film 214. A beam splitting film whose transmittance equals its reflectance, i.e., whose transmittance-to-reflectance ratio is 1:1, is provided on the face of the second-stage first beam splitting prism 210 away from the lens module 100.
A lens hole 510 and an accommodating cavity 520 communicating with the lens hole 510 are provided in the package housing 500. The lens module 100 is disposed in the lens hole 510, and the beam splitting prism group 200, the right-angle prism 400, and the sensor modules 300 are packaged in the accommodating cavity 520 of the package housing 500.
A light-shielding layer may be provided on the inner walls of the lens hole 510 and the accommodating cavity 520 of the package housing 500 to prevent the light received by the lens module 100 from leaking and being lost during transmission. The light-shielding layer may be attached to the inner walls of the lens hole 510 and the accommodating cavity 520, or may be formed on the inner walls of the lens hole 510 and the accommodating cavity 520 by processes such as electroplating, deposition, or sputtering.
Further, in order to adjust the back focal distance of each sensor module 300 relative to the lens module 100, so that the sensor modules 300 can capture images of objects at different object distances, the lens assembly provided by the embodiments of the present disclosure may further include a first drive module and a second drive module (not shown in the figures). The first drive module is connected to the lens module 100 and is used to adjust the position of the lens module 100. The second drive module is connected to the sensor modules 300 and is used to adjust the positions of the sensor modules 300, thereby adjusting the back focal distances of the sensor modules 300.
The first drive module may be disposed on the package housing 500 so as to adjust the position of the lens module 100 relative to the package housing 500, driving the lens module 100 to move within the lens hole 510 of the package housing 500. The first drive module is connected to one or more of the multiple optical lenses and is used to adjust the position of the lens module 100 or adjust the focal length of the lens module 100.
The first drive module may include one or more motors connected to the lens module 100 to drive the lens module 100 to move. When the lens module 100 includes multiple optical lenses, the multiple optical lenses may be driven by one motor, or each of the multiple optical lenses may be connected to and driven individually by its own motor.
The second drive module may be disposed on the package housing 500 so as to adjust the relative position between each sensor module 300 and the package housing 500. The second drive module may include motors, and the number of motors in the second drive module may equal the number of sensor modules 300, with each motor driving one sensor module to move. Multiple sensing channels may be provided in the package housing 500, each aligned with the direction of the beam received by the corresponding sensor module 300, and a motor can drive the corresponding sensor module to move along its sensing channel.
For a given lens module 100, the parameters of the lens module 100 are known, so the object distance of the objects captured in the image at the current back focal distance can be calculated from the back focal distances of the different sensor modules 300; that is, the depth information of objects in the real environment can be obtained. In practical applications, the depth information may be a depth point value or a depth range value.
In practical applications, the autofocus function of the lens assembly can be combined to obtain the depth information of multiple real objects that the lens assembly can capture. In moving shooting scenes, capturing images of objects at different depths through multiple sensor modules 300 improves the efficiency with which the lens assembly acquires the depth of real objects; in particular, in high-speed motion scenes, the depth information of different objects can be captured quickly.
A mapping relationship between back focal distance and depth may be pre-stored in the electronic device, and the mapping relationship may be a table (a depth-of-field table), a function (an object-distance calculation function), or the like. During shooting, depth-of-field information is obtained according to the back focal distance of each sensor module 300 relative to the lens module 100, and this depth-of-field information is the depth information of the real objects in the image captured by that sensor module 300.
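A minimal sketch of such a pre-stored mapping is shown below; the table entries and the nearest-neighbour lookup are invented for illustration and would in practice come from calibration of the specific lens module 100.

```python
# Hypothetical depth-of-field table: back focal distance (mm) -> depth range (mm)
DEPTH_TABLE = {
    5.05: (400.0, 650.0),
    5.10: (200.0, 320.0),
    5.20: (100.0, 160.0),
}

def depth_from_back_focal(back_focal_mm: float) -> tuple[float, float]:
    """Return the depth range stored for the nearest calibrated back focal distance."""
    nearest = min(DEPTH_TABLE, key=lambda v: abs(v - back_focal_mm))
    return DEPTH_TABLE[nearest]

print(depth_from_back_focal(5.11))  # (200.0, 320.0)
```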
Further, the lens assembly may also include a control module connected to the sensor modules 300. The control module stores a mapping relationship between the back focal distances of the sensor modules 300 and object distance (for example, a depth-of-field table), and the control module is used to determine the depth of a target object according to the mapping relationship between the back focal distances of the sensor modules 300 and object distance.
The sensor module 300 may be a CCD sensor or a CMOS sensor. The sensor module 300 includes photodiodes distributed in an array, an output circuit layer, and a substrate; the photodiodes are connected to the output circuit, and the photodiodes and the output circuit are packaged on the substrate. The photodiodes are used to convert optical signals into electrical signals, and the output circuit is used to output the electrical signals. The substrate of the sensor module 300 may be connected to the second drive module.
In the lens assembly provided by the embodiments of the present disclosure, the light entering through the lens module 100 is split into multiple beams by the beam splitting prism group 200, and the beams are respectively transmitted to the corresponding sensor modules 300. Since the sensor modules 300 have different back focal distances, the objects in focus in the image captured by each sensor module 300 are at different object distances, and the corresponding object distance can be obtained from the back focal distance of the sensor module 300. The depth of the objects in the image captured by each sensor module 300 can thus be determined, so that the depth of objects in the real environment is detected through the lens module 100. A dedicated depth sensor can therefore be omitted from the electronic device, reducing the cost of the electronic device at least to some extent.
An exemplary embodiment of the present disclosure further provides an electronic device. As shown in FIG. 7, the electronic device includes the above lens assembly 10. The lens assembly 10 includes a lens module 100, a beam splitting prism group 200, and N sensor modules 300. The beam splitting prism group 200 is disposed on the light exit side of the lens module 100 and is used to split light into N beams with different directions and output them. Each sensor module 300 correspondingly receives a beam output by the beam splitting prism group, and the N sensor modules 300 are configured with different back focal distances, where N is a positive integer greater than or equal to 2.
In the electronic device provided by the embodiments of the present disclosure, the light entering through the lens module 100 is split into multiple beams by the beam splitting prism group 200, and the beams are respectively transmitted to the corresponding sensor modules 300. Since the sensor modules 300 have different back focal distances, the objects in focus in the image captured by each sensor module 300 are at different object distances, and the corresponding object distance can be obtained from the back focal distance of the sensor module 300. The depth of the objects in the image captured by each sensor module 300 can thus be determined, so that the depth of objects in the real environment is detected through the lens module 100. A dedicated depth sensor can therefore be omitted from the electronic device, reducing the cost of the electronic device at least to some extent.
Further, the electronic device also includes a control module 11 connected to the sensor modules 300. The control module 11 stores a mapping relationship between the back focal distances of the sensor modules 300 and object distance, and the control module 11 is used to determine the depth of a target object according to the mapping relationship between the back focal distances of the sensor modules 300 and object distance.
The control module 11 may also be used to detect the working mode of the lens assembly. When detecting that the working mode of the lens assembly is the photographing mode, the control module 11 may control one of the multiple sensor modules 300 to work for photographing. When detecting that the working mode of the lens assembly is the depth detection mode, the control module 11 controls the multiple sensor modules 300 to work, and in this case the control module may control the first drive module and the second drive module to adjust the back focal distances of the sensor modules 300. The control module 11 may determine the working mode of the lens assembly by detecting instructions input by the user.
For a given lens module 100 in the electronic device, the parameters of the lens module 100 are known, so the object distance of the objects captured in the image at the current back focal distance can be calculated from the back focal distances of the different sensors; that is, the depth information of objects in the real environment can be obtained. In practical applications, the depth information may be a depth point value or a depth range value.
In practical applications, the autofocus function of the lens assembly can be combined to obtain the depth information of multiple real objects that the lens assembly can capture. In moving shooting scenes, capturing images of objects at different depths through multiple sensor modules 300 improves the efficiency with which the lens assembly acquires the depth of real objects; in particular, in high-speed motion scenes, the depth information of different objects can be captured quickly.
A mapping relationship between back focal distance and depth may be pre-stored in the electronic device, and the mapping relationship may be a table, a function, or the like. During shooting, depth-of-field information is obtained according to the back focal distance of each sensor module 300 relative to the lens module 100, and this depth-of-field information is the depth information of the real objects in the image captured by that sensor module 300.
The electronic device in the embodiments of the present disclosure may be an electronic device having a camera assembly, such as a mobile phone, a tablet computer, a wearable device, a camera, or a video camera. The following description takes a mobile phone as an example:
As shown in FIG. 8, the electronic device may further include components such as a middle frame 20, a mainboard 30, a display screen 70, and a battery 40. The display screen 70, the middle frame 20, and the rear cover 50 form an accommodating space for housing the other electronic components or functional modules of the electronic device. The display screen 70 forms the display surface of the electronic device and is used to display images, text, and other information. The display screen 70 may be a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or another type of display screen.
A glass cover plate may be provided on the display screen 70. The glass cover plate may cover the display screen 70 to protect it from being scratched or damaged by water.
The display screen 70 may include a display area and a non-display area. The display area performs the display function of the display screen 70 and is used to display images, text, and other information. The non-display area displays no information and may be used to accommodate functional modules such as a camera, a receiver, and a proximity sensor. In some embodiments, the non-display area may include at least one area located above and below the display area.
The display screen 70 may be a full screen. In this case, the display screen 70 can display information in full screen, so that the electronic device has a large screen-to-body ratio. The display screen 70 then includes only a display area and no non-display area.
The middle frame 20 may be a hollow frame structure, and the material of the middle frame 20 may include metal or plastic. The mainboard 30 is mounted inside the accommodating space. For example, the mainboard 30 may be mounted on the middle frame 20 and housed in the accommodating space together with the middle frame 20. A grounding point is provided on the mainboard 30 to ground the mainboard 30.
One or more functional modules such as a motor, a microphone, a speaker, a receiver, an earphone jack, a universal serial bus (USB) interface, a proximity sensor, an ambient light sensor, a gyroscope, and a processor may be integrated on the mainboard 30. At the same time, the display screen 70 may be electrically connected to the mainboard 30.
The sensor modules may include a depth sensor, a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like. The processor may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). The different processing units may be independent devices or may be integrated in one or more processors.
A display control circuit is also provided on the mainboard 30. The display control circuit outputs electrical signals to the display screen 70 to control the display screen 70 to display information. A light-emission control unit and a color-change control unit may be disposed on the mainboard.
The battery 40 is mounted inside the accommodating space. For example, the battery 40 may be mounted on the middle frame 20 and housed in the accommodating space together with the middle frame 20. The battery 40 may be electrically connected to the mainboard 30 so that the battery 40 supplies power to the electronic device. A power management circuit may be provided on the mainboard 30 to distribute the voltage supplied by the battery 40 to the various electronic components in the electronic device.
The rear cover 50 is used to form the outer contour of the electronic device and may be integrally formed. During the forming of the rear cover 50, structures such as a rear camera hole and a fingerprint recognition module mounting hole may be formed in the rear cover 50. The camera assembly may be disposed on the mainboard and the middle frame and receives light through the rear camera hole. Of course, in practical applications the camera assembly may also be a front camera, and the embodiments of the present disclosure are not limited thereto.
An exemplary embodiment of the present disclosure further provides a depth detection method applied to the above electronic device. As shown in FIG. 9, the depth detection method may include the following steps:
Step S910: receiving a depth detection instruction;
Step S920: in response to the depth detection instruction, controlling multiple sensor modules to capture images of real objects in the current environment, the multiple sensor modules having different back focal distances;
Step S930: determining, according to the back focal distances of the multiple sensor modules, depth information of the real objects in the image captured by each sensor module, so as to reconstruct three-dimensional object information.
In the depth detection method provided by the embodiments of the present disclosure, the back focal distances of the sensor modules are adjusted according to the depth detection instruction so as to obtain images at different object distances, and the depth information of the real objects in the image captured by each sensor module is determined according to the back focal distances of the multiple sensor modules. When the images are captured, the light entering through the lens module is split into multiple beams by the beam splitting prism group, and the beams are respectively transmitted to the corresponding sensor modules. Since the sensor modules have different back focal distances, the objects in focus in the image captured by each sensor module are at different object distances, and the corresponding object distance can be obtained from the back focal distance of the sensor module. The depth of the objects in the image captured by each sensor module can thus be determined, so that the depth of objects in the real environment is detected through the lens module. A dedicated depth sensor can therefore be omitted from the electronic device, reducing the cost of the electronic device at least to some extent.
In step S910, a depth detection instruction may be received.
The depth detection instruction may be used to indicate whether to perform depth detection and to indicate the back focal distances of the corresponding sensor modules. That is, the depth detection instruction may include drive signals for the first drive module and the second drive module, where the drive signal of the first drive module is used to drive the lens module and the drive signal of the second drive module is used to drive the sensor modules. Alternatively, the depth detection instruction may include a depth detection requirement, and the depth detection requirement can be converted into drive signals for the first drive module and the second drive module.
In step S920, in response to the depth detection instruction, multiple sensor modules may be controlled to capture images of real objects in the current environment, the multiple sensor modules having different back focal distances.
The sensor modules are controlled to power on; after power-on, the sensor modules receive the light transmitted by the beam splitting prisms and convert the optical signals into electrical signals, thereby capturing images of objects in the real environment.
In step S930, depth information of the real objects in the image captured by each sensor module may be determined according to the back focal distances of the multiple sensor modules, so as to reconstruct three-dimensional object information.
The electronic device stores a mapping relationship between the sensor modules and object distance, so the object distance can be determined according to the back focal distance of a sensor module, and this object distance is the depth information of the real objects in the current image. The depth information may be a depth point value or a depth range value.
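The following is a minimal end-to-end sketch of steps S910 to S930, using the thin-lens approximation as the mapping from back focal distance to depth; the class, function names, and numeric values are assumptions for illustration and are not defined by the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class SensorModule:
    back_focal_mm: float  # position set by the second drive module

def object_distance(f_mm: float, v_mm: float) -> float:
    """Thin-lens estimate: 1/f = 1/u + 1/v  =>  u = 1 / (1/f - 1/v)."""
    inv_u = 1.0 / f_mm - 1.0 / v_mm
    return float("inf") if inv_u <= 0 else 1.0 / inv_u

def detect_depths(sensors: list[SensorModule], f_mm: float = 5.0) -> list[float]:
    """S930: map each sensor module's back focal distance to the depth of the
    objects that are in focus in the frame it captured in S920."""
    return [object_distance(f_mm, s.back_focal_mm) for s in sensors]

# S910/S920: on a depth detection instruction, each module captures a frame;
# only the depth computation of S930 is modelled here.
sensors = [SensorModule(5.05), SensorModule(5.10), SensorModule(5.20)]
print(detect_depths(sensors))  # one depth estimate per sensor module
```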
Further, as shown in FIG. 10, the depth detection method provided by the embodiments of the present disclosure may include:
Step S940: in response to the depth detection instruction, adjusting the back focal distances of the multiple sensor modules.
Step S940 may be performed before step S920. After receiving the depth detection instruction, the electronic device first adjusts the multiple sensor modules to their initial positions, that is, initializes the back focal distances of the sensor modules. When the detection instruction includes a further indication, the back focal distances of the sensor modules are adjusted to the target positions.
It should be noted that although the steps of the methods of the present disclosure are described in a particular order in the drawings, this does not require or imply that the steps must be performed in that particular order, or that all of the illustrated steps must be performed to achieve the desired results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution, and so on.
In the exemplary embodiments of the present disclosure, a computer-readable storage medium is also provided, on which a program product capable of implementing the methods described above in this specification is stored. In some possible embodiments, various aspects of the present disclosure may also be implemented in the form of a program product including program code. When the program product runs on a terminal device, the program code causes the terminal device to perform the steps according to the various exemplary embodiments of the present disclosure described in the "Exemplary Methods" section above in this specification.
Referring to FIG. 11, a program product 1100 for implementing the above methods according to an embodiment of the present disclosure is described. It may take the form of a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present disclosure is not limited thereto. In this document, a readable storage medium may be any tangible medium containing or storing a program that can be used by or in combination with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection with one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination thereof. The readable signal medium may also be any readable medium other than a readable storage medium that can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
The program code contained on the readable medium may be transmitted over any suitable medium, including but not limited to wireless, wired, optical cable, RF, or any suitable combination of the above.
Program code for performing the operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In cases involving a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
In addition, the above drawings are merely schematic illustrations of the processing included in the methods according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It is easy to understand that the processing shown in the above drawings does not indicate or limit the chronological order of the processing. It is also easy to understand that the processing may be executed, for example, synchronously or asynchronously in multiple modules.
Other embodiments of the present disclosure will be readily apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the art not disclosed by the present disclosure. The specification and embodiments are to be regarded as exemplary only, and the true scope and spirit of the present disclosure are indicated by the appended claims.

Claims (20)

  1. A lens assembly, comprising:
    a lens module;
    a beam splitting prism group disposed on a light exit side of the lens module and used to split light into N beams with different directions and output them; and
    N sensor modules, each sensor module correspondingly receiving a beam output by the beam splitting prism group, the N sensor modules being configured with different back focal distances;
    wherein N is a positive integer greater than or equal to 2.
  2. The lens assembly according to claim 1, further comprising:
    a right-angle prism disposed on a light entrance side of the lens module and used to change the direction of incident light.
  3. The lens assembly according to claim 1, wherein the beam splitting prism group comprises:
    N beam splitting prisms disposed in sequence on the light exit side of the lens module, wherein among the N beam splitting prisms, a light entrance face of the beam splitting prism close to the lens module is perpendicular to an optical axis of the lens module.
  4. The lens assembly according to claim 3, wherein the N beam splitting prisms have the same refractive index.
  5. The lens assembly according to claim 3, wherein the N beam splitting prisms comprise:
    N−1 first beam splitting prisms arranged in sequence on the light exit side of the lens module, each first beam splitting prism having a beam splitting surface on its side away from the lens module, the beam splitting surface being provided with a transflective film to split a beam striking the beam splitting surface into a reflected beam and a transmitted beam.
  6. The lens assembly according to claim 5, wherein the first beam splitting prism further has a light entrance face, light enters the first beam splitting prism through the light entrance face, and the reflected beam reflected by the beam splitting surface undergoes total internal reflection at the light entrance face.
  7. The lens assembly according to claim 6, wherein a dielectric layer is provided on the light entrance face of the first beam splitting prism, a refractive index of the dielectric layer being smaller than a refractive index of the first beam splitting prism.
  8. The lens assembly according to claim 6, wherein the beam splitting prism group further comprises:
    N−1 reflective films, each reflective film being disposed on the light entrance face of a first beam splitting prism and located in the path of the beam reflected by the corresponding transflective film, each reflective film facing one of the sensor modules and being used to reflect the beam reflected by the corresponding transflective film to the sensor module, wherein a light-transmitting notch facing the lens module is provided in the reflective film.
  9. The lens assembly according to claim 5, wherein the N beam splitting prisms further comprise:
    a second beam splitting prism disposed on the side of the (N−1)-th-stage first beam splitting prism away from the lens module and used to transmit the beam transmitted by the (N−1)-th-stage first beam splitting prism directly to the corresponding sensor module.
  10. The lens assembly according to claim 9, wherein the lens assembly comprises two first beam splitting prisms and one second beam splitting prism, the first-stage first beam splitting prism being disposed on the light exit side of the lens module, the second-stage first beam splitting prism being disposed on the side of the first-stage first beam splitting prism away from the lens module, and the second beam splitting prism being disposed on the side of the second-stage first beam splitting prism away from the lens module.
  11. The lens assembly according to claim 10, wherein the beam propagation direction is perpendicular to the light entrance face of the first-stage first beam splitting prism, the angle between the beam splitting surface of the first-stage first beam splitting prism and the beam propagation direction is 60°, the angle between the light entrance face and the light exit face of the second-stage first beam splitting prism is 40°, and the angle between the light entrance face of the second beam splitting prism and the beam propagation direction is 80°.
  12. The lens assembly according to claim 1, further comprising:
    a first drive module connected to the lens module and used to adjust the position of the lens module or adjust the focal length of the lens module.
  13. The lens assembly according to claim 12, further comprising:
    a second drive module connected to the sensor modules and used to adjust the positions of the sensor modules so as to adjust the back focal distances of the sensor modules.
  14. The lens assembly according to claim 1, further comprising:
    a package housing provided with a lens hole and an accommodating cavity, the lens module being disposed in the lens hole, and the beam splitting prism group and the sensor modules being packaged in the accommodating cavity.
  15. The lens assembly according to claim 14, further comprising:
    a light-shielding layer disposed on an inner wall of the accommodating cavity to prevent light received by the lens module from leaking and being lost during transmission.
  16. An electronic device, comprising the lens assembly according to any one of claims 1 to 15.
  17. The electronic device according to claim 16, further comprising:
    a control module connected to the sensor modules, wherein the control module stores a mapping relationship between the back focal distances of the sensor modules and object distance, and the control module is used to determine depth information of a real object according to the mapping relationship between the back focal distances of the sensor modules and object distance.
  18. A depth detection method, comprising:
    receiving a depth detection instruction;
    in response to the depth detection instruction, controlling multiple sensor modules to capture images of real objects in a current environment, the multiple sensor modules having different back focal distances; and
    determining, according to the back focal distances of the multiple sensor modules, depth information of the real objects in the image captured by each sensor module, so as to reconstruct three-dimensional object information.
  19. The depth detection method according to claim 18, further comprising:
    in response to the depth detection instruction, adjusting the back focal distances of the multiple sensor modules.
  20. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to claim 18 or 19.
PCT/CN2021/120705 2020-11-02 2021-09-26 Lens assembly, electronic device, depth detection method, and storage medium WO2022089113A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011204573.8 2020-11-02
CN202011204573.8A CN112285887B (zh) 2020-11-02 2020-11-02 Lens assembly, electronic device, depth detection method, and storage medium

Publications (1)

Publication Number Publication Date
WO2022089113A1 true WO2022089113A1 (zh) 2022-05-05

Family

ID=74353977

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/120705 2020-11-02 2021-09-26 Lens assembly, electronic device, depth detection method, and storage medium WO2022089113A1 (zh)

Country Status (2)

Country Link
CN (1) CN112285887B (zh)
WO (1) WO2022089113A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112285887B (zh) 2020-11-02 2022-11-08 Oppo广东移动通信有限公司 Lens assembly, electronic device, depth detection method, and storage medium
CN117348131A (zh) 2023-09-13 2024-01-05 杭州开亚科技合伙企业(有限合伙) Spatially multiplexed optical element and optical system

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009122895A (ja) 2007-11-14 2009-06-04 Method and apparatus for displaying a pseudo-three-dimensional image of a stereoscopic object responsive to viewer viewpoint movement
CN101636747A (zh) 2007-03-19 2010-01-27 Two-dimensional/three-dimensional digital information acquisition and display device
CN105049698A (zh) 2015-08-21 2015-11-11 Camera module and electronic device
CN109491176A (zh) 2019-01-09 2019-03-19 Large-depth-of-field imaging system and method based on prism beam splitting
CN209746285U (zh) 2019-05-17 2019-12-06 Prism beam splitting camera device
CN111327810A (zh) 2020-04-16 2020-06-23 Large-depth-of-field industrial camera and video camera
CN211152041U (zh) 2020-02-25 2020-07-31 Electronic device and camera assembly thereof
CN111726493A (zh) 2020-06-17 2020-09-29 Camera module and terminal device
CN112285887A (zh) 2020-11-02 2021-01-29 Lens assembly, electronic device, depth detection method, and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104935915B (zh) 2015-07-17 2018-05-11 珠海康弘发展有限公司 Imaging device, three-dimensional imaging system, and three-dimensional imaging method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101636747A (zh) 2007-03-19 2010-01-27 Two-dimensional/three-dimensional digital information acquisition and display device
JP2009122895A (ja) 2007-11-14 2009-06-04 Method and apparatus for displaying a pseudo-three-dimensional image of a stereoscopic object responsive to viewer viewpoint movement
CN105049698A (zh) 2015-08-21 2015-11-11 Camera module and electronic device
CN109491176A (zh) 2019-01-09 2019-03-19 Large-depth-of-field imaging system and method based on prism beam splitting
CN209746285U (zh) 2019-05-17 2019-12-06 Prism beam splitting camera device
CN211152041U (zh) 2020-02-25 2020-07-31 Electronic device and camera assembly thereof
CN111327810A (zh) 2020-04-16 2020-06-23 Large-depth-of-field industrial camera and video camera
CN111726493A (zh) 2020-06-17 2020-09-29 Camera module and terminal device
CN112285887A (zh) 2020-11-02 2021-01-29 Lens assembly, electronic device, depth detection method, and storage medium

Also Published As

Publication number Publication date
CN112285887B (zh) 2022-11-08
CN112285887A (zh) 2021-01-29

Similar Documents

Publication Publication Date Title
US20200326545A1 (en) Substrate-guided optical device
WO2019218733A1 (zh) Optical apparatus, module, device, and system for fingerprint recognition
CN210895483U (zh) Under-screen fingerprint recognition system, liquid crystal display fingerprint recognition apparatus, and electronic device
US20210044729A1 (en) Lens Module and Camera
WO2022089113A1 (zh) Lens assembly, electronic device, depth detection method, and storage medium
TWI687733B (zh) Imaging lens system, identification module, and electronic device
TW201901220A (zh) Photographing lens system, image capturing device, and electronic device
TWI657259B (zh) Lens system, projection device, sensing module, and electronic device
US10701252B2 (en) Imaging optical system, imaging system, and imaging apparatus
CN111290102A (zh) Lens assembly and electronic device including the same
CN211152041U (zh) Electronic device and camera assembly thereof
US20210041692A1 (en) Stray light suppression in eye-tracking imaging
WO2021052190A1 (zh) Electronic device
CN111866328A (zh) Camera module and mobile terminal
JP5846275B2 (ja) Optical system and imaging system
KR20210054768A (ko) Lens assembly and electronic device including the same
TWI565320B (zh) Integrated imaging and light-sensing optical device
CN114859511A (zh) Optical lens, camera module, and electronic device
KR102252287B1 (ko) Camera module using a small reflective part and optical apparatus for augmented reality using the same
CN114002814A (zh) Optical lens, camera module, and electronic device
CN111580255A (zh) Lens module and electronic device
US20210021743A1 (en) Handheld electronic device and head mounted electronic device
CN108227204B (zh) VR glasses with night vision function
CN212255859U (zh) Lens module and electronic device
CN115236830B (zh) Optical lens system and time-of-flight ranging sensing module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21884843

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21884843

Country of ref document: EP

Kind code of ref document: A1