CN110099225B - Array camera module, depth information acquisition method thereof and electronic equipment


Info

Publication number
CN110099225B
CN110099225B
Authority
CN
China
Prior art keywords
camera module
opening
array
image information
module
Prior art date
Legal status
Active
Application number
CN201810441714.4A
Other languages
Chinese (zh)
Other versions
CN110099225A (en)
Inventor
吴旭东
粟登超
Current Assignee
Ningbo Sunny Opotech Co Ltd
Original Assignee
Ningbo Sunny Opotech Co Ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Sunny Opotech Co Ltd filed Critical Ningbo Sunny Opotech Co Ltd
Publication of CN110099225A publication Critical patent/CN110099225A/en
Application granted granted Critical
Publication of CN110099225B publication Critical patent/CN110099225B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An array camera module, a depth information acquisition method thereof and an electronic device are provided. The array camera module comprises a first camera module, a second camera module and a third camera module. The first camera module and the second camera module are each an infrared camera module; they are arranged at an interval and used for respectively collecting first IR image information and second IR image information of a measured target. The third camera module is an RGB camera module arranged adjacent to the first camera module and used for collecting RGB image information of the measured target. The depth information of the measured target is obtained from the first IR image information and the second IR image information, and the RGB image information of the measured target can further be fused with the depth information to obtain RGB-D image information of the measured target. In this way, the imaging quality of the array camera module is improved.

Description

Array camera module, depth information acquisition method thereof and electronic equipment
Technical Field
The invention relates to the field of camera modules, in particular to an array camera module, a depth information acquisition method thereof and electronic equipment.
Background
In the current information age, how to accurately identify a person's identity and ensure information security has become a key social problem that must be solved. Against this background, technologies based on biometric recognition have gradually matured and are increasingly applied in various fields, for example in mobile terminals for unlocking, payment, and the like. Among biometric techniques, face recognition is one of the most popular, attracting much attention for its convenience, efficiency, and speed.
The existing face recognition technology has two basic technical directions: two-dimensional face image recognition and three-dimensional face image recognition. As the name implies, two-dimensional face image recognition collects two-dimensional image information of a detected face through a two-dimensional imaging module, such as an RGB camera module, and performs recognition and judgment by matching and comparison against a background face database. However, it is well known that human faces have extremely complex geometries. When an RGB camera module is used to collect two-dimensional image information of a detected face, much information about the face is lost, such as its absolute dimensions (the height of the nose, the depth of the eye sockets, etc.) and the parts that are invisible due to occlusion. In other words, the recognition accuracy of the two-dimensional face image recognition technology is not high in practical applications.
Compared with two-dimensional face image recognition, the three-dimensional face image recognition technology fully considers the complex three-dimensional structural characteristics of the face: it collects three-dimensional image information of the detected face and performs recognition and judgment by matching and comparison against a background face database. Therefore, the three-dimensional face image recognition technology has relatively high recognition accuracy. Existing camera modules with a three-dimensional imaging function include: depth information camera modules based on structured light technology, depth information camera modules based on TOF (Time of Flight) technology, and binocular depth information camera modules.
During operation, a depth information camera module based on structured light or TOF technology actively projects laser light or structured light onto the surface of the measured target, and obtains the depth information of the measured target by analyzing the flight time or deformation of the projected light. For this reason, the module structure of such a camera module is relatively complicated and costly. Meanwhile, since laser light must be actively projected during operation, and although the laser is set within the safety range for human eyes, the intensity of the emitted laser is affected in actual operation by many factors, such as temperature and humidity, which undoubtedly poses a safety hazard for users.
The existing binocular depth information camera module comprises two RGB camera modules. During operation, the two RGB camera modules respectively collect RGB images of the detected target, and the depth information of the detected target is obtained by a corresponding algorithm, such as a triangulation algorithm. However, the RGB camera modules place relatively high demands on the shooting environment and are especially susceptible to ambient light; in a dark environment (dark or dim light), they cannot accurately collect the depth information of the detected target, which limits the application of the binocular depth information camera module.
Disclosure of Invention
The invention mainly aims to provide an array camera module, a depth information acquisition method thereof and electronic equipment, wherein the array camera module comprises a first camera module and a second camera module, both of which are infrared camera modules and cooperate with each other to acquire the depth information of a detected target.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the array camera module has a relatively high dark shooting capability, that is, the array camera module still has a relatively good depth information collecting function in a dark environment.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the array camera module further includes a light supplementing device, and the light supplementing device is configured to supplement an illumination intensity of the array camera module when collecting depth information of a target to be detected, so as to ensure image collecting quality of the first camera module and the second camera module.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the array camera module can turn on/off the light supplement device based on external environment information, so that on one hand, the light supplement device can supplement imaging light timely based on external environment; on the other hand, when the light supplement device is not needed, the light supplement device is closed timely to save energy consumption.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the infrared light projected by the light supplementing device is consistent with the infrared light wavelength segments that can be sensed by the IR sensing chips of the first camera module and the second camera module, so that the shooting environment of the array camera module can be optimized by the infrared light projected by the light supplementing device, thereby enhancing the imaging quality of the first camera module and the second camera module and finally improving the accuracy of the depth information collected by the array camera module.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the light supplementing device is disposed between the first camera module and the second camera module, so that the imaging light supplemented by the light supplementing device can be relatively uniformly reflected to the first camera module and the second camera module, respectively, so as to ensure the imaging quality of the first camera module and the second camera module at the same time.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof, and an electronic device, wherein the first camera module and the second camera module respectively include a filter element, and the filter element is respectively disposed in a photosensitive path of the first camera module and a photosensitive path of the second camera module for filtering stray light, so as to improve the imaging quality of the array camera module and the precision of the collected depth information.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the array camera module further includes a third camera module, and the third camera module is an RGB camera module, so as to improve the imaging quality of the array camera module and the precision of the collected depth information by the RGB camera module.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the third camera module is configured to work synchronously with the first camera module and the second camera module to collect RGB image information of a target to be measured, so as to form RGB-D image information by fusing the RGB image information collected by the third camera module and the depth information of the target to be measured collected by the first camera module and the second camera module, thereby optimizing the imaging quality and the depth information collecting precision of the array camera module.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the array camera module further includes a bracket, and in an embodiment of the present invention, the bracket is used for positioning the first camera module and the second camera module and enhancing the structural strength of the first camera module and the second camera module.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the bracket is used for combining and positioning the first camera module, the second camera module and the third camera module, so as to enhance the structural strength of the first camera module, the second camera module and the third camera module by the bracket.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the bracket has a recess portion, the recess portion is located between the first camera module and the second camera module and integrally extends downward from a top surface of the bracket to form a reserved space between the first camera module and the second camera module, wherein when the array camera module is assembled in an electronic device, the reserved space can be used for installing other electronic components of the electronic device, so as to maximally save an assembly space of the electronic device.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof, and an electronic device, wherein in an embodiment of the present invention, the bracket has an opening, and the opening corresponds to a space between the first camera module and the second camera module, so as to form the reserved space between the first camera module and the second camera module, so that when the array camera module is assembled in an electronic device, the reserved space can be used for installing other electronic components of the electronic device, thereby maximally saving an assembly space of the electronic device.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof, and an electronic device, wherein in an embodiment of the present invention, the light supplement device is installed in the reserved space formed by the bracket and the array camera module, so that the array camera module has a more compact structure.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein the third camera module is adjacently disposed to the first camera module or the second camera module, so that a shooting angle of the third camera module is close to a shooting angle of the depth image information collected by the array camera module, which is beneficial for synthesizing subsequent images and obtaining RGB-D image information.
Another objective of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the third camera module and the first camera module or the second camera module have an integrated modular structure, so as to facilitate the installation and calibration of the array camera module.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the array camera module further includes an inner bracket, so that the third camera module and the first camera module or the second camera module are combined through the inner bracket, so that the third camera module and the first camera module or the second camera module have an integrated structure.
Another object of the present invention is to provide an array camera module, a depth information collecting method thereof and an electronic device, wherein in an embodiment of the present invention, the third camera module and the first camera module or the second camera module share a same base, so that the third camera module and the first camera module or the second camera module have an integrated structure.
Other advantages and features of the invention will become apparent from the following description and may be realized by means of the instrumentalities and combinations particularly pointed out in the appended claims.
In accordance with the present invention, the foregoing and other objects and advantages can be realized by an array camera module for collecting depth information of a target, comprising:
a first camera module; and
a second camera module, wherein the first camera module and the second camera module are each an infrared camera module and are used for respectively acquiring first IR image information and second IR image information of the detected target, wherein the first IR image information and the second IR image information are transmitted to an image processor, and the image processor processes the first IR image information and the second IR image information to acquire the depth information of the detected target.
In an embodiment of the invention, the array camera module further includes a light supplement device, and the light supplement device is configured to project an infrared light with a specific wavelength to the target.
In an embodiment of the invention, the light supplement device is configured to work synchronously with the first camera module and the second camera module.
In an embodiment of the invention, the light supplement device is located between the first camera module and the second camera module.
In an embodiment of the invention, the first camera module includes a first filter element, wherein the first filter element is located in a photosensitive path of the first camera module for filtering stray light.
In an embodiment of the invention, the second camera module includes a second filter element, wherein the second filter element is located in a photosensitive path of the second camera module for filtering stray light.
In an embodiment of the invention, the array camera module further includes a support, the support has a receiving cavity for receiving the first camera module and the second camera module therein, wherein the support has a first opening and a second opening, the first opening and the second opening are respectively communicated with the receiving cavity, the first opening corresponds to the first camera module to expose the first camera module, and the second opening corresponds to the second camera module to expose the second camera module.
In an embodiment of the invention, the bracket further has a third opening, and the third opening is located between the first opening and the second opening to form a reserved space between the first camera module and the second camera module.
In an embodiment of the invention, the first opening, the second opening and the third opening extend integrally to form an opening of the bracket.
In an embodiment of the present invention, the bracket further has a recessed portion located between the first camera module and the second camera module and integrally extending downward from the top surface of the bracket to form a reserved space between the first camera module and the second camera module.
In an embodiment of the present invention, the array camera module further includes a third camera module, and the third camera module is an RGB camera module for collecting RGB image information of the target to be measured.
In an embodiment of the invention, the image processor is communicably connected to the third camera module, and is configured to receive the RGB image information of the target to be detected collected by the third camera module and fuse the RGB image information of the target to be detected and the depth information of the target to obtain an RGB-D image information.
In an embodiment of the invention, the third camera module is adjacently disposed to the first camera module or the second camera module.
In an embodiment of the present invention, the third camera module and the first camera module or the second camera module disposed adjacent to the third camera module have an integrated structure.
In an embodiment of the invention, the array camera module further includes a support, the support has a receiving cavity for receiving the first camera module, the second camera module and the third camera module therein, wherein the support has a first opening, a second opening and a third opening, which are respectively communicated with the receiving cavity; the first opening corresponds to the first camera module to expose the first camera module, the second opening corresponds to the second camera module to expose the second camera module, and the third opening corresponds to the third camera module to expose the third camera module.
In an embodiment of the present invention, the bracket further has a fourth opening, and the fourth opening is located between the first opening and the second opening, so as to form a reserved space between the first camera module and the second camera module.
In an embodiment of the present invention, the first opening, the second opening, the third opening and the fourth opening integrally extend to form an opening of the bracket.
In an embodiment of the present invention, the array camera module further includes an inner bracket for combining the first camera module and the second camera module, so that the first camera module and the second camera module have an integrated structure.
In an embodiment of the present invention, the array camera module further includes an inner bracket for combining the first camera module and the third camera module, so that the first camera module and the third camera module have an integrated structure.
According to another aspect of the present invention, there is also provided a depth information collecting method, including the steps of:
S1, obtaining first IR image information of a detected target by a first camera module, wherein the first camera module is an infrared camera module;
S2, obtaining second IR image information of the detected target by a second camera module, wherein the second camera module is an infrared camera module; and
S3, processing the first IR image information and the second IR image information according to a preset algorithm to obtain depth information of the detected target.
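For illustration only, the sketch below shows one possible way to implement steps S1 to S3 with an off-the-shelf stereo block-matching routine; the function names, parameter values and the use of OpenCV are assumptions introduced here and are not part of the specification, which leaves the preset algorithm open.

```python
# Illustrative sketch of steps S1-S3 (not taken from the patent): estimating depth
# from two rectified IR frames captured by a pair of infrared camera modules
# separated by a known baseline. All names and parameter values are assumptions.
import cv2
import numpy as np

def acquire_depth(ir_first, ir_second, focal_px, baseline_m):
    """ir_first / ir_second: 8-bit grayscale IR frames from the first and second
    camera modules (already rectified); focal_px: focal length in pixels;
    baseline_m: preset distance between the two camera modules in meters."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(ir_first, ir_second).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan            # no match found -> depth unknown
    depth_m = focal_px * baseline_m / disparity   # triangulation: Z = f * B / d
    return depth_m
```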
In an embodiment of the present invention, in steps S1 and S2, the method further includes the step of:
S10, using a light supplement device to project infrared light onto the surface of the detected target.
In an embodiment of the present invention, the depth information collecting method further includes the steps of:
S4, obtaining RGB image information of the detected target by a third camera module, wherein the third camera module is an RGB camera module; and
S5, fusing the RGB image information and the depth information of the detected target to obtain RGB-D image information.
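As a minimal illustration of steps S4 and S5, the sketch below simply stacks a registered RGB frame with the depth map produced in step S3 to form four-channel RGB-D data; the function name and the assumption that the RGB frame is already registered to the depth map's viewpoint are hypothetical.

```python
# Illustrative sketch of steps S4-S5 (not taken from the patent): fusing RGB image
# information from the third camera module with the depth map obtained in step S3.
# Assumes the RGB frame has already been registered to the depth map's viewpoint.
import numpy as np

def fuse_rgbd(rgb, depth_m):
    """rgb: HxWx3 uint8 image; depth_m: HxW float32 depth map in meters."""
    assert rgb.shape[:2] == depth_m.shape, "RGB and depth must share the same pixel grid"
    rgbd = np.dstack([rgb.astype(np.float32), depth_m])  # HxWx4: R, G, B, D channels
    return rgbd
```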
According to another aspect of the present invention, the present invention also provides an electronic device, comprising:
an electronic device body; and
an array camera module, wherein the array camera module is assembled on the electronic device body and used for collecting depth information of a detected target.
Further objects and advantages of the invention will be fully apparent from the ensuing description and drawings.
These and other objects, features and advantages of the present invention will become more fully apparent from the following detailed description, the accompanying drawings and the claims.
Drawings
Fig. 1 is a perspective view of an array camera module according to a first preferred embodiment of the invention.
Fig. 2 is a block diagram of the array camera module according to the above preferred embodiment.
Fig. 3 is a schematic cross-sectional view of the array camera module according to the above preferred embodiment.
Fig. 4 is a schematic working diagram of a light supplement device of the array camera module according to the above preferred embodiment.
Fig. 5 is a schematic cross-sectional view of a modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 6 is a schematic cross-sectional view of another variation of the array camera module according to the above preferred embodiment.
Fig. 7 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 8 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 9 is a schematic cross-sectional view of another variation of the array camera module according to the above preferred embodiment.
Fig. 10 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 11 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 12 is a schematic cross-sectional view of another variation of the array camera module according to the above preferred embodiment.
Fig. 13 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 14 is a cross-sectional view of another variation of the array camera module according to the above preferred embodiment.
Fig. 15 is a block diagram of an array camera module according to a second preferred embodiment of the invention.
Fig. 16 is a cross-sectional view of the array camera module according to the above preferred embodiment.
Fig. 17 is a cross-sectional view of a modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 18 is a cross-sectional view of another variation of the array camera module according to the above preferred embodiment.
Fig. 19 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 20 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 21 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 22 is a cross-sectional view of another modified embodiment of the array camera module according to the above preferred embodiment.
Fig. 23A and 23B are schematic views illustrating an assembly process of the array camera module according to the above preferred embodiment.
Fig. 24 is a schematic view of another arrangement of the array camera module according to the above preferred embodiment.
Fig. 25 is a schematic flow chart of a depth information collecting method of the array camera module according to the present invention.
Fig. 26 is a schematic perspective view illustrating the array camera module provided by the present invention being assembled in an electronic device.
Fig. 27 is another schematic perspective view illustrating the array camera module provided by the present invention being assembled in an electronic device.
Fig. 28 is another schematic perspective view illustrating the array camera module provided by the present invention being assembled in an electronic device.
Fig. 29 is a schematic diagram illustrating how the reserved space of the array camera module fits with the electronic device body when the array camera module is assembled in the electronic device.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.
It should be understood that the terms "a" and "an" mean "at least one": the number of an element may be one in one embodiment and plural in another embodiment, and the terms "a" and "an" should not be interpreted as limiting the number.
Referring to fig. 1 to 4, an array camera module according to a first preferred embodiment of the present invention is illustrated, wherein the array camera module is used for collecting depth information of a measured object. For example, in a specific application of the present invention, the array camera module can be applied to a three-dimensional face recognition technology. In the application, the array camera module is arranged to acquire three-dimensional image information of the detected face, and further, the three-dimensional face recognition function can be realized by matching and comparing with a background three-dimensional face database.
As described above, the existing camera modules with a three-dimensional imaging function include: depth information camera modules based on structured light technology, depth information camera modules based on Time of Flight (TOF) technology, and binocular depth information camera modules. Limited by their imaging mechanisms, depth information camera modules based on structured light or TOF technology have relatively complicated and costly module structures and pose certain potential safety hazards. The existing binocular depth information camera module has a simple structure, including only two RGB camera modules, but it places relatively high demands on the shooting environment and is especially susceptible to ambient light. When it is in a dark environment (dark or low-light environment), the binocular depth information camera module cannot accurately acquire (or cannot acquire at all) the depth information of the detected target. Therefore, existing three-dimensional imaging solutions cannot balance module structure and depth information acquisition performance, and there remains room for improvement or for alternative solutions.
Correspondingly, as shown in fig. 2, the array camera module provided by the present invention includes a first camera module 10, a second camera module 20 and an image processor 40, wherein the first camera module 10 and the second camera module 20 are respectively communicably connected to the image processor 40 and cooperate with each other to collect depth information of a target to be measured. In particular, in the preferred embodiment of the present invention, the first camera module 10 and the second camera module 20 are each an infrared (IR) camera module for respectively collecting first IR image information and second IR image information of the measured target, and the image processor 40 then processes the first IR image information and the second IR image information according to a predetermined algorithm to obtain the depth information of the measured target.
In terms of module structure, the structure of the array camera module provided by the invention is similar to that of the existing binocular depth information camera module, and the difference is that in the invention, the first camera module 10 and the second camera module 20 of the array camera module are infrared camera modules instead of RGB camera modules. It will be appreciated that the infrared camera module has a relatively superior dark state shooting performance compared to the RGB camera module. That is, even in a dark environment (dark or low-light environment), the array camera module provided by the invention still has a relatively excellent depth information acquisition function, so that the application range of the array camera module provided by the invention is greatly expanded compared with the existing binocular depth information camera module.
More specifically, as shown in fig. 3, in the preferred embodiment of the present invention, the first camera module 10 includes a first photosensitive chip 11, a first optical lens 12 and a first circuit board 13; the first photosensitive chip 11 is electrically connected to the first circuit board 13, and the first optical lens 12 is held in a photosensitive path of the first photosensitive chip 11, so that the imaging light of the detected object collected by the first optical lens 12 can reach the first photosensitive chip 11 along the photosensitive path and an imaging reaction occurs at the first photosensitive chip 11. The second camera module 20 includes a second photosensitive chip 21, a second optical lens 22 and a second circuit board 23; the second photosensitive chip 21 is electrically connected to the second circuit board 23, and the second optical lens 22 is held in a photosensitive path of the second photosensitive chip 21, so that the imaging light of the detected object collected by the second optical lens 22 can reach the second photosensitive chip 21 along the photosensitive path and an imaging reaction occurs at the second photosensitive chip 21.
In particular, in the present invention, the first camera module 10 and the second camera module 20 are infrared camera modules; in other words, the first photosensitive chip 11 and the second photosensitive chip 21 are IR (infrared) photosensitive chips. Compared with a conventional RGB photosensitive chip, the IR photosensitive chip can sense infrared light of a specific wavelength band, for example 850nm, 940nm, etc. Therefore, even in a dark environment (dark or low-light environment), the IR photosensitive chip can still sense the infrared light of the specific wavelength band reflected from the surface of the measured object and generate the first IR image and the second IR image used for obtaining the depth information of the measured object. Accordingly, the array camera module provided by the invention not only works under good illumination conditions but also performs well in dark conditions, covering both day and night and meeting users' needs at any time.
In order to prevent stray light from affecting the imaging quality of the first camera module 10 and the second camera module 20, the first camera module 10 further includes a first filter element 14, and the first filter element 14 is held in the photosensitive path of the first photosensitive chip 11 for filtering the stray light out of the imaging light of the detected target collected by the first optical lens 12. Similarly, the second camera module 20 further includes a second filter element 24, and the second filter element 24 is held in the photosensitive path of the second photosensitive chip 21 for filtering the stray light out of the imaging light of the detected object collected by the second optical lens 22. In particular, in the preferred embodiment of the present invention, the filter elements (the first filter element 14 and the second filter element 24) are arranged to transmit only the infrared light of the specific wavelength band that the photosensitive chips (the first photosensitive chip 11 and the second photosensitive chip 21) can sense. Thus, when the first filter element 14 and the second filter element 24 are respectively disposed in the photosensitive paths of the first photosensitive chip 11 and the second photosensitive chip 21, only that infrared light can pass through the first filter element 14 and the second filter element 24, while light of the remaining wavelength bands is blocked or absorbed by them. In this way, the imaging quality of the first camera module 10 and the second camera module 20 is ensured.
As shown in fig. 3, in the preferred embodiment of the present invention, the filter elements (the first filter element 14 and the second filter element 24) have a sheet-like structure, and are supported inside the first camera module 10 and the second camera module 20. It should be easily understood that, in another embodiment of the present invention, the filter elements (the first filter element 14 and the second filter element 24) may be installed outside the first camera module 10 and the second camera module 20, and correspond to the light sensing paths corresponding to the first light sensing chip 11 and the second light sensing chip 21. For example, the filter elements (the first filter element 14 and the second filter element 24) may be installed at top sides of the first optical lens 12 and the second optical lens 22, so that imaging light from the outside is effectively filtered by the filter elements (the first filter element 14 and the second filter element 24) before entering the first optical lens 12 and the second optical lens 22, to ensure imaging quality of the first camera module 10 and the second camera module 20.
It should be noted that, in another embodiment of the present invention, the filter elements (the first filter element 14 and the second filter element 24) may also be implemented as a filter film, wherein the filter film may be coated at any position on the photosensitive paths of the first photosensitive chip 11 and the second photosensitive chip 21 for filtering the stray light out of the imaging light. For example, in a specific embodiment, the filter film may be coated directly on the top sides of the first photosensitive chip 11 and the second photosensitive chip 21, so that imaging light from the outside is effectively filtered before reaching the first photosensitive chip 11 and the second photosensitive chip 21 and producing an imaging reaction, so as to ensure the imaging quality of the first camera module 10 and the second camera module 20. In other words, in the preferred embodiment of the present invention, neither the form of the filter element (sheet-like or film-like) nor its mounting position limits the present invention.
Further, those skilled in the art will appreciate that, although the infrared light of the specific wavelength band required for imaging by the first photosensitive chip 11 and the second photosensitive chip 21 exists in nature, it is not abundant and is easily absorbed by other substances. Moreover, in some cases, for example when the shooting environment is in a dark state, the intensity of that naturally occurring infrared light is very likely to be insufficient. Therefore, in order to ensure that the first camera module 10 and the second camera module 20 can collect sufficient infrared light of the specific wavelength band, in the preferred embodiment of the present invention, the array camera module further includes a light supplement device 50, and the light supplement device 50 is used for supplementing the infrared light of the specific wavelength band required for imaging by the first camera module 10 and the second camera module 20. The light supplement device 50 may be implemented as a Vertical Cavity Surface Emitting Laser (VCSEL), an edge-emitting laser, or an LED.
In the preferred embodiment of the present invention, the light supplement device 50 can be electrically connected to the first circuit board 13 of the first camera module 10 or the second circuit board 23 of the second camera module 20, so as to provide the light supplement device 50 with electric energy required for operation through the first circuit board 13 and the second circuit board 23. When the light supplement device 50 is activated, the light supplement device 50 generates and projects the infrared light with a specific wavelength to the surface of the target to be measured, so as to supplement the infrared light required for imaging of the first camera module 10 and the second camera module 20. In other words, in the preferred embodiment of the present invention, the light supplement device 50 forms an active light source of the array camera module, so as to supplement the intensity of the imaging light required by the first camera module 10 and the second camera module 20.
Fig. 4 illustrates a schematic diagram of the light supplement device supplementing the imaging light of the first camera module and the second camera module. As shown in fig. 4, the infrared light of a specific wavelength generated by the light supplement device 50 is projected onto the surface of the target to be measured. The infrared light is then reflected at the surface of the detected target toward the first camera module and the second camera module respectively, wherein the infrared light reaching the first camera module passes through the first optical lens 12 and is filtered by the first filter element 14 before finally reaching the first photosensitive chip 11, and the infrared light reaching the second camera module 20 passes through the second optical lens 22 and is filtered by the second filter element 24 before reaching the second photosensitive chip 21.
Preferably, in the preferred embodiment of the present invention, the light supplement device 50 is disposed between the first camera module 10 and the second camera module 20. As shown in fig. 4, such an arrangement gives the first camera module 10, the second camera module 20 and the light supplement device 50 a relatively more compact structure. Those skilled in the art will appreciate that although the first camera module 10 and the second camera module 20 of the array camera module are infrared camera modules rather than RGB camera modules, the existing depth information extraction algorithms applied to binocular depth information camera modules, such as the triangulation algorithm, can still be applied to the array camera module provided by the present invention. Corresponding to the characteristics of the triangulation algorithm, a preset distance is set between the first camera module 10 and the second camera module 20, and the larger the preset distance is, the higher the depth information acquisition precision of the array camera module is. In other words, in the preferred embodiment of the present invention, there is a gap between the first camera module 10 and the second camera module 20. Accordingly, when the light supplement device 50 is installed between the first camera module 10 and the second camera module 20, this gap space can be effectively utilized, so that the array camera module has a more compact structure.
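For reference, the following relation is not reproduced from the specification but is the standard form of the triangulation measurement the paragraph above refers to. With focal length f, baseline B (the preset distance between the two camera modules) and disparity d, the depth Z and its sensitivity to a disparity error Δd are approximately:

Z = f · B / d,    ΔZ ≈ (Z² / (f · B)) · Δd

so a larger baseline B reduces the depth error caused by a given disparity error, which is why a larger preset distance yields higher depth information acquisition precision.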
In addition, it should be noted that disposing the light supplement device 50 between the first camera module 10 and the second camera module 20 helps ensure that the infrared light of the specific wavelength projected by the light supplement device 50 is uniformly distributed on the surface of the object to be measured and is readily sensed by the first camera module 10 and the second camera module 20, so as to ensure their image acquisition quality. In other words, when the light supplement device 50 is disposed between the first camera module 10 and the second camera module 20, the imaging light supplemented by the light supplement device 50 can be relatively uniformly reflected to the first camera module 10 and the second camera module 20 respectively, so as to ensure the imaging quality of both camera modules.
It should be noted that, in another embodiment of the present invention, the light supplement device 50 can be disposed at other positions, for example, an outer portion of the first camera module 10 or the second camera module 20 (refer to fig. 5), which is not limited by the present invention. Specifically, when the light supplement device 50 is disposed at other positions, the power supply circuit of the light supplement device 50 can be separately disposed, for example, the light supplement device 50 can be separately connected to an additional circuit board (not shown). Here, the configuration of the power supply circuit of the light supplement device 50 is not a key point of the present invention, and thus is not described in detail.
In addition, it is worth mentioning that, in a specific application scenario of the array camera module, for example when the array camera module is assembled into an electronic device (e.g. a smart phone) that is already provided with a component similar to the light supplement device 50, the array camera module itself need not provide the light supplement device 50. In other words, in the present invention, the light supplement device 50 is not an essential element and can be omitted in specific cases.
Further, in order to reduce energy consumption, in the preferred embodiment of the present invention, the operation mode of the light supplement device 50 may be set to work synchronously with the first camera module 10 and the second camera module 20. Alternatively, in another embodiment of the present invention, the operation mode of the light supplement device 50 may be intelligently adjusted based on the external environment. For example, when the intensity of the imaging light received by the first camera module 10 and the second camera module 20 is detected to be lower than a preset threshold, the light supplement device 50 is turned on, and when the detected intensity meets the preset threshold, the light supplement device 50 is kept off. Here, the array camera module intelligently turns the light supplement device 50 on or off based on the external environment: on one hand, the light supplement device 50 can supplement imaging light in a timely manner; on the other hand, when the light supplement device is not needed, it is turned off in a timely manner to further save energy consumption. It should be noted that, in this operation mode, the array camera module further includes a detection device (not shown in the figures) for detecting the intensity of the infrared light of the specific wavelength in the external environment, so that the light supplement device 50 is intelligently switched on or off according to the detection result of the detection device.
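A minimal sketch of the threshold-based control mode just described is given below; the detector reading, the threshold value, and the turn_on()/turn_off() interface of the light supplement device are hypothetical names introduced only for illustration.

```python
# Illustrative sketch (not taken from the patent) of the threshold-based control mode:
# the light supplement device is switched on only when the detected intensity of the
# specific IR wavelength band falls below a preset threshold.
def update_fill_light(detected_ir_intensity, threshold, fill_light):
    """detected_ir_intensity: reading from the detection device for the specific IR band;
    threshold: preset minimum intensity required for reliable imaging;
    fill_light: object exposing turn_on() / turn_off() (hypothetical interface)."""
    if detected_ir_intensity < threshold:
        fill_light.turn_on()    # supplement imaging light in dark or dim conditions
    else:
        fill_light.turn_off()   # save energy when ambient IR light is sufficient
```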
Further, in the preferred embodiment of the present invention, the array camera module can be implemented as an integrated array camera module. For example, as shown in fig. 3, in the preferred embodiment of the present invention, the first circuit board 13 of the first camera module 10 integrally extends to the second circuit board 23 of the second camera module 20, so that the first circuit board 13 and the second circuit board 23 have an integrated structure. That is, in the preferred embodiment of the present invention, the first camera module 10 and the second camera module 20 share a circuit board. Here, the first circuit board 13 and the second circuit board 23, having an integrated structure, form a positioning and mounting base surface for mounting and calibrating the first camera module 10 and the second camera module 20. It should be noted that, in another embodiment of the present invention (refer to fig. 7, 10 or 12), the first circuit board 13 and the second circuit board 23 may be configured separately; that is, in another embodiment of the present invention, the array camera module has a split structure.
Further, in the present invention, the manner of combining the first camera module 10 and the second camera module 20 (integrated or separate) may also be configured in other ways.
More specifically, in the present invention, the first camera module 10 further includes a first base 15, the first base 15 is mounted on the first circuit board 13, and the first optical lens 12 is mounted on the top side of the first base 15, so that the first optical lens 12 is held on the photosensitive path of the first photosensitive chip 11 by the first base 15. In particular, the first base 15 has a first light through hole 150, and the first light through hole 150 corresponds to at least a light sensing area of the first photosensitive chip 11, so that the photosensitive path of the first photosensitive chip 11 is defined through the first optical lens 12 and the first light through hole 150. The second camera module 20 further includes a second base 25, the second base 25 is mounted on the second circuit board 23, and the second optical lens 22 is mounted on the top side of the second base 25, so that the second optical lens 22 is held on the photosensitive path of the second photosensitive chip 21 by the second base 25. Similarly, the second base 25 forms a second light through hole 250, and the second light through hole 250 corresponds to at least a light sensing area of the second photosensitive chip 21, so that the photosensitive path of the second photosensitive chip 21 is defined through the second optical lens 22 and the second light through hole 250.
Accordingly, in the present invention, the molding structure of the array camera module can be adjusted through the combination of the first base 15 and the second base 25. In a variant embodiment of the preferred embodiment of the present invention, as shown in fig. 7, the first base 15 integrally extends to the second base 25, so that the first base 15 and the second base 25 have an integral structure; that is, the array camera module has an integrated structure. In other words, in this modified embodiment, the first camera module 10 and the second camera module 20 share a single base, so that the first camera module 10 and the second camera module 20 can be positioned, mounted and calibrated through the single base formed by the first base 15 and the second base 25. Accordingly, in another modified embodiment of the preferred embodiment of the present invention (refer to fig. 6, fig. 9, fig. 10, or fig. 12), the first base 15 and the second base 25 may be configured separately; in other words, the array camera module has a split structure.
In particular, in some embodiments of the present invention, the array camera module may be configured with both a one-piece circuit board and a one-piece base; that is, in some embodiments of the present invention (refer to fig. 3), the first circuit board 13 of the first camera module 10 and the second circuit board 23 of the second camera module 20 have a one-piece structure, and the first base 15 of the first camera module 10 and the second base 25 of the second camera module 20 have a one-piece structure. Here, the integrated circuit board and the integrated base complement each other to further optimize the mounting and fitting accuracy of the first camera module 10 and the second camera module 20.
It should be noted that, in the present invention, when the first base 15 and the second base 25 are split bases, the first base 15 and the second base 25 may be separately molded and respectively attached to the first circuit board 13 and the second circuit board 23 by a COB (Chip On Board) process (refer to fig. 3). Alternatively, when the array camera module has a one-piece base, the one-piece base formed by the first base 15 and the second base 25 may be formed separately and attached to the first circuit board 13 and the second circuit board 23 (or to corresponding positions of the one-piece circuit board) by a COB (Chip On Board) process, referring to fig. 7 and 11.
Of course, in other embodiments of the present invention, the first base 15 and the second base 25 may be mounted at the corresponding positions of the first circuit board 13 and the second circuit board 23 in other ways. For example, as shown in fig. 8, in the modified embodiment of the preferred embodiment of the present invention, the first base 15 and the second base 25 may be formed at the corresponding positions of the first circuit board 13 and the second circuit board 23 through a molding or press-molding process. In particular, when the first base 15 and the second base 25 are a one-piece base, the one-piece base (the first base 15 and the second base 25) may be formed by a MOB (Molding On Board), MOC (Molding On Chip) or MOG (Molding On Glass) process. Here, different from the conventional COB installation method, an integrated base formed by an integral molding process gives the first camera module and the second camera module a more compact structure and a smaller size.
In order to further ensure that the relative installation positions of the first camera module 10 and the second camera module 20 satisfy a certain relationship, for example that the optical axes of the first camera module 10 and the second camera module 20 are parallel or spaced by a predetermined distance, and in order to strengthen the structural strength of the first camera module 10 and the second camera module 20, the array camera module further includes a bracket 60, and the bracket 60 is fixed on the outer peripheries of the first camera module 10 and the second camera module 20 by a bonding adhesive layer for positioning the first camera module 10 and the second camera module 20. Here, it should be appreciated that, by means of the bracket 60, the first camera module 10 and the second camera module 20 still have an integral structure; in other words, in the present invention, when the array camera module is configured with any one of an integral circuit board, an integral base, or a common bracket, the array camera module is an integrated array camera module.
More specifically, as shown in fig. 3, in the embodiment of the present invention, the bracket 60 has a receiving cavity 61 for receiving the first camera module 10 and the second camera module 20 therein. Further, as shown in fig. 3, the bracket 60 has a first opening 601 and a second opening 602 that each communicate with the receiving cavity 61, wherein the first opening 601 corresponds to the first camera module 10 to expose the first camera module 10, and the second opening 602 corresponds to the second camera module 20 to expose the second camera module 20.
As described above, in the present invention, the array camera module may use a triangulation algorithm or the like to measure depth information. In accordance with the characteristics of the triangulation algorithm, a preset distance is set between the first camera module 10 and the second camera module 20, and the larger the preset distance is, the higher the depth information acquisition precision of the array camera module is. In other words, in the preferred embodiment of the present invention, there is a gap between the first camera module 10 and the second camera module 20. In particular, in the preferred embodiment of the present invention, the bracket 60 further has a third opening 603, wherein the third opening 603 is located between the first opening 601 and the second opening 602; that is, the third opening 603 is located between the first camera module 10 and the second camera module 20.
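For illustration only (this sketch is not part of the patent disclosure, and the focal length, baseline, and disparity figures below are assumed values), the effect of the preset distance on depth precision can be written out with the standard triangulation relation for a rectified pair of infrared camera modules, where the depth Z satisfies Z = f * B / d and the depth uncertainty grows roughly as Z^2 / (f * B):

# Minimal Python sketch of stereo triangulation between the first and second
# infrared camera modules; all numeric values are assumptions for illustration.

def depth_from_disparity(f_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair (f in pixels, B in mm, d in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point visible to both modules")
    return f_px * baseline_mm / disparity_px

def depth_error(f_px: float, baseline_mm: float, depth_mm: float, disparity_err_px: float = 0.25) -> float:
    """Depth uncertainty dZ ~ Z^2 / (f * B) * dd: a larger preset distance B
    gives a smaller error at the same depth."""
    return depth_mm ** 2 / (f_px * baseline_mm) * disparity_err_px

if __name__ == "__main__":
    f_px = 1400.0                                   # assumed focal length in pixels
    z = depth_from_disparity(f_px, baseline_mm=40.0, disparity_px=112.0)
    print(f"recovered depth: {z:.0f} mm")           # 500 mm
    for baseline_mm in (20.0, 40.0):                # two candidate preset distances
        err = depth_error(f_px, baseline_mm, depth_mm=500.0)
        print(f"baseline {baseline_mm} mm -> depth error about {err:.2f} mm at 0.5 m")

Doubling the baseline in this sketch halves the depth error at a given distance, which is the sense in which a larger preset distance raises the depth-acquisition precision of the array camera module.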
Here, the third opening 603 communicates with the receiving cavity 61, so that a reserved space 62 is defined between the first camera module 10 and the second camera module 20 through the third opening 603, so as to fully utilize the gap between the first camera module 10 and the second camera module 20. For example, the reserved space 62 may be used to install the supplementary lighting device 50. Alternatively, when the array camera module is assembled in an electronic device (e.g., a smart phone), the reserved space 62 may be used to mount other electronic components of the electronic device (e.g., a microphone), so as to save the assembly space of the electronic device as much as possible.
Fig. 11 shows a modified implementation of the bracket 60. As shown in fig. 11, in this modified implementation, the first opening 601, the second opening 602, and the third opening 603 extend integrally to form a single opening 600 of the bracket 60. Here, the bracket 60 has a rectangular-frame ("口"-shaped) structure whose peripheral wall defines the receiving cavity 61 and the opening 600, wherein the opening 600 communicates with the receiving cavity 61 to expose the first camera module 10 and the second camera module 20 and to form the reserved space 62 between the first camera module 10 and the second camera module 20.
Fig. 12 shows another modified implementation of the bracket 60 provided by the present invention, wherein in this modified implementation, the bracket 60 has a receiving cavity 61 for receiving the first camera module 10 and the second camera module 20 therein. Further, the bracket 60 has a first opening 601 and a second opening 602 that each communicate with the receiving cavity 61, wherein the first opening 601 corresponds to the first camera module 10 to expose the first camera module 10, and the second opening 602 corresponds to the second camera module 20 to expose the second camera module 20. In particular, the bracket 60 further has a recess 605, wherein the recess 605 is located between the first camera module 10 and the second camera module 20 and extends integrally downward from the top surface of the bracket 60, so as to form a reserved space 62 between the first camera module 10 and the second camera module 20. Similarly, the reserved space 62 can be used for installing the supplementary lighting device 50. Alternatively, when the array camera module is assembled in an electronic device (e.g., a smart phone), the reserved space 62 may be used to mount other electronic components of the electronic device (e.g., a microphone), so as to save the assembly space of the electronic device as much as possible.
Further, in the preferred embodiment of the present invention, the first camera module 10 further includes a first lens carrying element 16, wherein the first optical lens 12 is mounted on the first lens carrying element 16 and the first lens carrying element 16 is mounted on the first base 15, so that the first optical lens 12 is held in the photosensitive path of the first photosensitive chip 11. The second camera module 20 further includes a second lens carrying element 26, wherein the second optical lens 22 is mounted on the second lens carrying element 26 and the second lens carrying element 26 is mounted on the second base 25, so that the second optical lens 22 is held in the photosensitive path of the second photosensitive chip 21.
It should be noted that, in the present invention, the lens carrying elements 16, 26 can each be implemented as a supporting lens barrel 161, 261, so that the corresponding camera module is implemented as a fixed-focus camera module, or as a driving element 162, 262, so that the corresponding camera module is implemented as a moving-focus camera module. It should be appreciated that, in the present invention, the types of the first camera module 10 and the second camera module 20 of the array camera module may be combined in any manner, as shown in figs. 13 to 14, and the present invention is not limited in this respect.
It should be appreciated that, in the present invention, the structure of the array camera module is described above by way of example only, so that those skilled in the art can more fully understand the technical features of the array camera module provided by the present invention; the description is not intended to limit the present invention.
Further, as shown in figs. 15 to 22, an array camera module according to a second preferred embodiment of the present invention is illustrated, wherein the array camera module of the second preferred embodiment is a modified implementation of the first preferred embodiment.
As shown in fig. 15 and 16, in the preferred embodiment of the present invention, the array camera module further includes a third camera module 30, and the third camera module 30 is an RGB camera module, so that RGB image information of the target to be measured is collected by the RGB camera module. As described above, in the present invention, the first camera module 10A and the second camera module 20A of the array camera module are infrared camera modules. Those skilled in the art will understand that the image information of the target to be measured collected by an infrared camera module is gray-scale image information; therefore, the depth image information of the target to be measured extracted from the first IR image information and the second IR image information is gray-scale depth image information, and its imaging quality is limited. Moreover, neither the gray-scale image information nor the gray-scale depth image information provides a visual effect that satisfies the normal viewing requirements of the human eye. Therefore, in the preferred embodiment of the present invention, the imaging quality of the array camera module is improved by additionally configuring the RGB camera module (the third camera module 30) and fusing, through the image processor 40A, the RGB image information collected by the third camera module 30 with the depth information collected by the first camera module 10A and the second camera module 20A to obtain RGB-D image information.
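As a minimal sketch of what such a fusion step could look like in software (an assumption for illustration, not the actual implementation of the image processor 40A), the depth map recovered from the two IR images can, once registered to the viewpoint of the RGB camera, simply be stacked as a fourth channel beside the three color channels:

# Minimal Python/NumPy sketch: fuse a registered depth map with an RGB image
# into an RGB-D array. Image sizes and values are placeholders.
import numpy as np

def fuse_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:
    """rgb: (H, W, 3) uint8 image from the third camera module.
    depth: (H, W) float32 depth map derived from the first and second IR images,
           already warped into the RGB camera's frame.
    Returns an (H, W, 4) float32 RGB-D array."""
    if rgb.shape[:2] != depth.shape:
        raise ValueError("depth map must be registered to the RGB image size")
    return np.dstack([rgb.astype(np.float32), depth.astype(np.float32)])

rgb = np.zeros((480, 640, 3), dtype=np.uint8)            # placeholder RGB frame
depth = np.full((480, 640), 500.0, dtype=np.float32)     # placeholder: 0.5 m everywhere
print(fuse_rgbd(rgb, depth).shape)                       # (480, 640, 4)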
In particular, in order to facilitate the subsequent image synthesis for obtaining the RGB-D image information, in the preferred embodiment of the present invention the third camera module 30 is preferably disposed adjacent to the first camera module 10A or the second camera module 20A, so that the field of view of the third camera module 30 is consistent with the field of view of the depth image information collected by the array camera module. As described above, in order to improve the depth information collection accuracy of the array camera module, the first camera module 10A and the second camera module 20A are spaced by a predetermined distance, while, to facilitate image fusion, the third camera module 30 needs to be disposed adjacent to the first camera module 10A or the second camera module 20A. A particular arrangement of the camera modules of the array camera module is thereby formed. Similarly, in the preferred embodiment of the present invention, the array camera module can obtain an integrated structure through any one or more of a common bracket, an integrated circuit board, or an integrated base. For example, as shown in figs. 16 to 19, the array camera module is configured with an integrated circuit board, i.e., the first circuit board 13A of the first camera module 10A, the second circuit board 23A of the second camera module 20A, and a third circuit board 33 of the third camera module 30 extend integrally. That is, the first camera module 10A, the second camera module 20A, and the third camera module 30 share a circuit board. Here, the first circuit board 13A, the second circuit board 23A, and the third circuit board 33 have an integrated structure that defines a positioning and mounting base surface for mounting and calibrating the first camera module 10A, the second camera module 20A, and the third camera module 30.
Alternatively, in the present invention, the combination (whether integrated or split) among the first camera module 10A, the second camera module 20A, and the third camera module 30 may be configured in other ways so that the array camera module has an integrated structure. For example, the first base 15A of the first camera module 10A, the second base 25A of the second camera module 20A, and the third base 35 of the third camera module 30 may be configured as a one-piece base, that is, the first base 15A, the second base 25A, and the third base 35 may extend integrally into one another, so that the array camera module has a one-piece structure.
Further, in order to ensure that the relative installation positions of the first camera module 10A, the second camera module 20A, and the third camera module 30 satisfy a certain relationship, for example that the optical axes of the first camera module 10A, the second camera module 20A, and the third camera module 30 are parallel or spaced by a predetermined distance, and to strengthen the structural strength of the first camera module 10A, the second camera module 20A, and the third camera module 30, the array camera module further includes a bracket 60A, wherein the bracket 60A is fixed to the outer peripheries of the first camera module 10A, the second camera module 20A, and the third camera module 30 by a bonding adhesive layer, so as to position and reinforce the first camera module 10A, the second camera module 20A, and the third camera module 30.
As shown in fig. 20, in the preferred embodiment of the present invention, the bracket 60A has a receiving cavity 61A for receiving the first camera module 10A, the second camera module 20A, and the third camera module 30 therein. Further, the bracket 60A has a first opening 601A, a second opening 602A, and a third opening 603A that each communicate with the receiving cavity 61A, wherein the first opening 601A corresponds to the first camera module 10A to expose the first camera module 10A, the second opening 602A corresponds to the second camera module 20A to expose the second camera module 20A, and the third opening 603A corresponds to the third camera module 30 to expose the third camera module 30.
As described above, in the present invention, the array camera module may use a triangulation algorithm or the like to measure depth information. In accordance with the characteristics of the triangulation algorithm, a preset distance is set between the first camera module 10A and the second camera module 20A, and the larger the preset distance is, the higher the depth information acquisition precision of the array camera module is. In other words, in the preferred embodiment of the present invention, there is a gap between the first camera module 10A and the second camera module 20A. In particular, in the preferred embodiment of the present invention, the bracket 60A further has a fourth opening 604A, wherein the fourth opening 604A is located between the first opening 601A and the second opening 602A; that is, the fourth opening 604A is located between the first camera module 10A and the second camera module 20A.
Here, the fourth opening 604A communicates with the receiving cavity 61A, so that a reserved space 62A is defined between the first camera module 10A and the second camera module 20A through the fourth opening 604A, so as to fully utilize the gap between the first camera module 10A and the second camera module 20A. For example, the reserved space 62A may be used for installing the supplementary lighting device 50. Alternatively, when the array camera module is assembled in an electronic device (e.g., a smart phone), the reserved space 62A may be used to mount other electronic components of the electronic device (e.g., a microphone), so as to save the assembly space of the electronic device as much as possible.
Fig. 21 shows a modified implementation of the bracket 60A provided by the present invention, wherein in this modified implementation, the first opening 601A, the second opening 602A, the third opening 603A, and the fourth opening 604A extend integrally to form a single opening 600A of the bracket 60A. Here, the bracket 60A has a rectangular-frame ("口"-shaped) structure whose peripheral wall defines the receiving cavity 61A and the opening 600A, wherein the opening 600A communicates with the receiving cavity 61A to expose the first camera module 10A, the second camera module 20A, and the third camera module 30 and to form the reserved space 62A between the first camera module 10A and the second camera module 20A.
Fig. 22 shows another modified implementation of the bracket 60A provided by the present invention, wherein in this modified implementation, the bracket 60A has a receiving cavity 61A for receiving the first camera module 10A, the second camera module 20A, and the third camera module 30 therein. The bracket 60A has a first opening 601A, a second opening 602A, and a third opening 603A that each communicate with the receiving cavity 61A, wherein the first opening 601A corresponds to the first camera module 10A to expose the first camera module 10A, the second opening 602A corresponds to the second camera module 20A to expose the second camera module 20A, and the third opening 603A corresponds to the third camera module 30 to expose the third camera module 30. Further, the bracket 60A has a recess 605A, wherein the recess 605A is located between the first camera module 10A and the second camera module 20A and extends integrally downward from the top surface of the bracket 60A, so as to form a reserved space 62A between the first camera module 10A and the second camera module 20A. Similarly, the reserved space 62A can be used for installing the supplementary lighting device 50. Alternatively, when the array camera module is assembled in an electronic device (e.g., a smart phone), the reserved space 62A may be used to mount other electronic components of the electronic device (e.g., a microphone), so as to save the assembly space of the electronic device as much as possible.
As described above, in order to facilitate image fusion, the third camera module 30 needs to be disposed adjacent to the first camera module 10A or the second camera module 20A. Further, to facilitate calibration and installation, the third camera module 30 and the first camera module 10A, or the third camera module 30 and the second camera module 20A, may be pre-assembled to form an integrated modular structure. For convenience of description, the case in which the third camera module 30 is disposed adjacent to the first camera module 10A and forms an integrated structure with the first camera module 10A is taken as an example to describe the integrated configuration between the third camera module 30 and the first camera module 10A or the second camera module 20A.
Similarly, the first camera module 10A and the third camera module 30 can be combined through any one or more of a common circuit board, a common base, and a common bracket, so that the first camera module 10A and the third camera module 30 have an integrated structure.
More specifically, referring to fig. 16, the first camera module 10A and the third camera module 30 obtain an integrated structure by sharing a circuit board, that is, the first circuit board 13A of the first camera module 10A integrally extends to the third circuit board 33 of the third camera module 30, so that the first circuit board 13A and the third circuit board 33 have an integrated structure. Here, when the first circuit board 13A and the third circuit board 33 have an integrated structure, they define a positioning and mounting base surface for mounting and calibrating the first camera module 10A and the third camera module 30.
Referring to fig. 17, the first camera module 10A and the third camera module 30 obtain an integrated structure by sharing a base. That is, the first base 15A of the first camera module 10A integrally extends to the third base 35 of the third camera module 30. Here, since the first camera module 10A and the third camera module 30 share a single base, the first camera module 10A and the third camera module 30 can be positioned, mounted, and calibrated through the single base formed by the first base 15A and the third base 35. In particular, in some embodiments of the present invention, the array camera module may be configured with both a single circuit board and a single base, that is, the first circuit board 13A of the first camera module 10A and the third circuit board 33 of the third camera module 30 have a single structure, and the first base 15A of the first camera module 10A and the third base 35 of the third camera module 30 have a single structure. Here, the integrated circuit board and the integrated base complement each other to further improve the mounting and fitting accuracy of the first camera module 10A and the third camera module 30.
Alternatively, referring to figs. 18 and 19, the first camera module 10A and the third camera module 30 obtain an integrated structure by sharing a bracket. For example, as shown in fig. 18, in a modified implementation of the preferred embodiment of the present invention, the array camera module further includes an inner bracket 70, wherein the inner bracket 70 is fixed by a bonding adhesive layer to the outer peripheries of the first camera module 10A and the third camera module 30, or of the second camera module 20A and the third camera module 30, for positioning and packaging the first camera module 10A and the third camera module 30, or the second camera module 20A and the third camera module 30, so that the third camera module 30 forms an integrated modular structure with the first camera module 10A or with the second camera module 20A. Accordingly, as shown in figs. 22 to 23B, when the first camera module 10A and the third camera module 30 have an integrated structure, in the subsequent installation and calibration process, the third camera module 30 and the first camera module 10A can be installed as a whole, together with the second camera module 20A, inside the bracket 60A, and the first camera module 10A, the second camera module 20A, and the third camera module 30 are further positioned by the bracket 60A. Alternatively, the third camera module 30 and the second camera module 20A are installed as a whole, together with the first camera module 10A, inside the bracket 60A, and the bracket 60A is used to further position the first camera module 10A, the second camera module 20A, and the third camera module 30. It will be appreciated that, in this way, the installation and calibration of the array camera module can be effectively carried out in layers, so as to reduce the difficulty of assembly and calibration.
More specifically, as shown in figs. 23A and 23B, during the installation and calibration process, the third camera module 30 and the first camera module 10A (or the second camera module 20A) are first pre-assembled to form an integrated modular structure (through any one or more of a common circuit board, a common bracket, or a common base). Then, the third camera module 30 and the first camera module 10A (or the second camera module 20A) forming the integrated modular structure are mounted inside the bracket 60A together with the remaining second camera module 20A (or first camera module 10A), and the bracket 60A is used to further position the first camera module 10A, the second camera module 20A, and the third camera module 30. In this way, the installation and calibration of the array camera module are carried out in layers, which reduces the difficulty of assembly and calibration; in other words, the calibration among three camera modules is cleverly converted into calibrations between one camera module (or module group) and another, so as to reduce the calibration difficulty.
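The layered calibration described above can be pictured with rigid transforms (a hypothetical sketch, not the patent's calibration procedure): the extrinsics between the third and first camera modules are obtained at pre-assembly, the extrinsics between the first and second camera modules are obtained after mounting in the bracket 60A, and the remaining relation follows by composition, so no three-way calibration is ever needed:

# Minimal Python/NumPy sketch of composing pairwise extrinsic calibrations.
# T_a_b is a 4x4 homogeneous transform mapping points from frame b to frame a;
# the rotations and translations below are assumed placeholder values (in mm).
import numpy as np

def rigid(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

T_first_third = rigid(np.eye(3), np.array([8.0, 0.0, 0.0]))    # calibrated at pre-assembly
T_first_second = rigid(np.eye(3), np.array([40.0, 0.0, 0.0]))  # calibrated after bracket mounting

# second <- third = (second <- first) @ (first <- third)
T_second_third = np.linalg.inv(T_first_second) @ T_first_third
print(T_second_third[:3, 3])   # offset of the third module in the second module's frame: [-32. 0. 0.]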
It should be noted that, in the preferred embodiment of the present invention, the first camera module 10A, the second camera module 20A, and the third camera module 30 may be arranged in a row; in this case, the first camera module 10A, the second camera module 20A, and the third camera module 30 visually form a straight-line ("一"-shaped) arrangement. Alternatively, in another embodiment of the present invention, the third camera module 30 and the first camera module 10A are arranged longitudinally, and the second camera module 20A and the first camera module 10A are arranged transversely; in this case, the first camera module 10A, the second camera module 20A, and the third camera module 30 visually form an "L"-shaped arrangement, as shown in fig. 24. These arrangements are not intended to limit the scope of the present invention.
Further, as shown in fig. 25, the present invention also provides a depth information collecting method, which includes the steps of:
S1: obtaining first IR image information of a target to be measured by a first camera module 10, 10A, wherein the first camera module 10, 10A is an infrared camera module;
S2: obtaining second IR image information of the target to be measured by a second camera module 20, 20A, wherein the second camera module 20, 20A is an infrared camera module; and
S3: processing the first IR image information and the second IR image information according to a preset algorithm to obtain depth information of the target to be measured.
Accordingly, in steps S1 and S2, the method further includes the step of:
S10: projecting infrared light onto the surface of the target to be measured by a light supplement device 50, 50A.
In addition, in the second preferred embodiment of the present invention, the depth information collecting method further includes the following steps (see the illustrative sketch after step S5):
S4: obtaining RGB image information of the target to be measured by a third camera module 30, wherein the third camera module 30 is an RGB camera module; and
S5: fusing the RGB image information and the depth information of the target to be measured to obtain RGB-D image information.
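A hypothetical end-to-end sketch of steps S1 to S5 (and the lighting step S10) is given below; capture_ir, capture_rgb, project_ir_fill_light, compute_depth, and fuse_rgbd are placeholder names standing in for the hardware drivers and the image processor 40, 40A, not functions disclosed by the patent:

# Minimal Python sketch of the depth information collecting method.
import numpy as np

def project_ir_fill_light() -> None:                 # step S10
    pass                                             # trigger the supplementary lighting device 50, 50A

def capture_ir(module: str) -> np.ndarray:           # steps S1 and S2
    return np.zeros((480, 640), dtype=np.uint8)      # placeholder IR frame

def capture_rgb() -> np.ndarray:                     # step S4
    return np.zeros((480, 640, 3), dtype=np.uint8)   # placeholder RGB frame

def compute_depth(ir1: np.ndarray, ir2: np.ndarray) -> np.ndarray:   # step S3
    return np.zeros(ir1.shape, dtype=np.float32)     # e.g. triangulation on matched points

def fuse_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:     # step S5
    return np.dstack([rgb.astype(np.float32), depth])

project_ir_fill_light()
ir1, ir2 = capture_ir("first"), capture_ir("second")
depth = compute_depth(ir1, ir2)
rgbd = fuse_rgbd(capture_rgb(), depth)
print(rgbd.shape)   # (480, 640, 4)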
Further, as shown in fig. 26 to 28, the present invention also provides an electronic device 80, which includes:
an electronic device body 81; and
an array camera module 1, wherein the array camera module 1 is assembled on the electronic device body 81 and is used for acquiring depth information of a target to be measured. In particular, in the present invention, the electronic device body 81 has a front side 811 and a back side 812, and the array camera module 1 can be assembled on the front side 811 of the electronic device body 81 to be configured as a front camera module of the electronic device 80. Of course, in another embodiment of the present invention, the array camera module 1 may be assembled on the back side 812 of the electronic device body 81 to be configured as a rear camera module of the electronic device 80.
In particular, in the present invention, as shown in figs. 26 to 29, the array camera module 1 has the reserved space 62, 62A formed between the first camera module 10, 10A and the second camera module 20, 20A, so that when the array camera module 1 is assembled to the electronic device body 81 (for example, of a smart phone), the reserved space 62, 62A can be used for installing other electronic components of the electronic device 80 (for example, a microphone), so as to save the assembly space of the electronic device as much as possible.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the present invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (16)

1. An array camera module for collecting depth information of a target to be measured, characterized by comprising:
a first camera module;
a second camera module;
a third camera module; and
an inner bracket, wherein the first camera module and the second camera module are each an infrared camera module, wherein the first camera module and the second camera module are disposed at an interval for respectively collecting first IR image information and second IR image information of the target to be measured, wherein the third camera module is an RGB camera module, wherein the third camera module is disposed adjacent to the first camera module for collecting RGB image information of the target to be measured, wherein the first IR image information, the second IR image information, and the RGB image information are transmitted to an image processor, so that the image processor processes the first IR image information and the second IR image information to obtain the depth information of the target to be measured, and fuses the RGB image information of the target to be measured with the depth information of the target to be measured to obtain RGB-D image information of the target to be measured, and wherein the inner bracket is coupled to peripheral portions of the first camera module and the third camera module such that the first camera module and the third camera module have an integrated modular structure.
2. The array camera module of claim 1, wherein the third camera module comprises a third circuit board, and the first camera module comprises a first circuit board, and wherein the third circuit board integrally extends from the first circuit board.
3. The array camera module of claim 2, wherein the third camera module comprises a third base and the first camera module comprises a first base, wherein the third base extends integrally with the first base.
4. The array camera module of claim 3, wherein the first base and the third base are integrally formed with the first circuit board and the third circuit board by a molding process.
5. The array camera module of any one of claims 1 to 4, further comprising a bracket having a receiving cavity for receiving the first camera module, the second camera module, and the third camera module therein, wherein the bracket has a first opening, a second opening, and a third opening that each communicate with the receiving cavity, wherein the first opening corresponds to the first camera module to expose the first camera module, the second opening corresponds to the second camera module to expose the second camera module, and the third opening corresponds to the third camera module to expose the third camera module, wherein the bracket further has a fourth opening, wherein the fourth opening is located between the first opening and the second opening, and a reserved space is defined between the first camera module and the second camera module through the fourth opening.
6. The array camera module of claim 5, wherein the first opening, the second opening, the third opening, and the fourth opening integrally extend to form an opening, wherein the opening exposes the first camera module, the second camera module, and the third camera module and defines the reserved space between the first camera module and the second camera module.
7. The array camera module of any one of claims 1 to 4, further comprising a bracket having a receiving cavity for receiving the first camera module, the second camera module, and the third camera module therein, wherein the bracket has a first opening, a second opening, and a third opening that each communicate with the receiving cavity, wherein the first opening corresponds to the first camera module to expose the first camera module, the second opening corresponds to the second camera module to expose the second camera module, and the third opening corresponds to the third camera module to expose the third camera module, wherein the bracket further has a recess located between the first camera module and the second camera module and concavely formed on a top surface of the bracket, so as to define a reserved space between the first camera module and the second camera module through the recess.
8. The array camera module of any of claims 1-4, wherein the first camera module and the third camera module are arranged longitudinally and the second camera module and the first camera module are arranged laterally such that the first camera module, the second camera module and the third camera module are arranged in an "L" shape.
9. The array camera module of claim 7, wherein the array camera module further comprises a light supplement device configured to project infrared light having a specific wavelength toward the target to be measured.
10. The array camera module of claim 9, wherein the light compensating device is mounted in the reserved space formed between the first camera module and the second camera module.
11. An electronic device, comprising:
the array camera module according to any one of claims 1 to 10; and
an electronic device body, wherein the array camera module is assembled on the electronic device body and is used for collecting depth information of a target to be measured.
12. An array camera module assembling method, characterized by comprising the following steps:
arranging a third camera module adjacent to a first camera module, wherein the first camera module is an infrared camera module, the third camera module is an RGB camera module, and the first camera module and the third camera module have an integrated structure;
arranging a second camera module at an interval from the first camera module, wherein the second camera module is an infrared camera module;
forming the first camera module and the third camera module into an integrated modular structure by an inner bracket coupled to peripheral portions of the first camera module and the third camera module; and
combining the second camera module, the third camera module, and the first camera module into an integrated module through a bracket.
13. The method of assembling an array camera module of claim 12, wherein the third camera module comprises a third circuit board, and the first camera module comprises a first circuit board, wherein the third circuit board integrally extends from the first circuit board.
14. The method of assembling an array camera module of claim 12, wherein the third camera module includes a third base and the first camera module includes a first base, and wherein the third base extends integrally with the first base.
15. An array camera module assembling method, characterized by comprising the following steps:
arranging a third camera module adjacent to a second camera module, wherein the second camera module is an infrared camera module, the third camera module is an RGB camera module, and the second camera module and the third camera module have an integrated structure;
arranging a first camera module at an interval from the second camera module, wherein the first camera module is an infrared camera module;
forming the second camera module and the third camera module into an integrated modular structure by an inner bracket coupled to peripheral portions of the second camera module and the third camera module; and
combining the first camera module, the third camera module, and the second camera module into an integrated module through a bracket.
16. An array camera module for collecting depth information of a target to be measured, characterized by comprising:
a first camera module;
a second camera module;
a third camera module; and
an inner bracket, wherein the first camera module and the second camera module are each an infrared camera module, wherein the first camera module and the second camera module are disposed at an interval for respectively collecting first IR image information and second IR image information of the target to be measured, wherein the third camera module is an RGB camera module, wherein the third camera module is disposed adjacent to the second camera module for collecting RGB image information of the target to be measured, wherein the first IR image information, the second IR image information, and the RGB image information are transmitted to an image processor, so that the image processor processes the first IR image information and the second IR image information to obtain depth information of the target to be measured, and fuses the RGB image information of the target to be measured with the depth information of the target to be measured to obtain RGB-D image information of the target to be measured, and wherein the inner bracket is coupled to peripheral portions of the second camera module and the third camera module such that the second camera module and the third camera module have an integrated modular structure.
CN201810441714.4A 2018-01-31 2018-05-10 Array camera module, depth information acquisition method thereof and electronic equipment Active CN110099225B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2018100955311 2018-01-31
CN201810095531 2018-01-31

Publications (2)

Publication Number Publication Date
CN110099225A (en) 2019-08-06
CN110099225B (en) 2022-07-19

Family

ID=65481754

Family Applications (4)

Application Number Title Priority Date Filing Date
CN201810441741.1A Active CN110099226B (en) 2018-01-31 2018-05-10 Array camera module, depth information acquisition method thereof and electronic equipment
CN201820692243.XU Active CN208572262U (en) 2018-01-31 2018-05-10 Array camera module and its electronic equipment
CN201810441714.4A Active CN110099225B (en) 2018-01-31 2018-05-10 Array camera module, depth information acquisition method thereof and electronic equipment
CN201820697441.5U Active CN208572263U (en) 2018-01-31 2018-05-10 Array camera module and its electronic equipment

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN201810441741.1A Active CN110099226B (en) 2018-01-31 2018-05-10 Array camera module, depth information acquisition method thereof and electronic equipment
CN201820692243.XU Active CN208572262U (en) 2018-01-31 2018-05-10 Array camera module and its electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201820697441.5U Active CN208572263U (en) 2018-01-31 2018-05-10 Array camera module and its electronic equipment

Country Status (1)

Country Link
CN (4) CN110099226B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110099226B (en) * 2018-01-31 2024-04-09 宁波舜宇光电信息有限公司 Array camera module, depth information acquisition method thereof and electronic equipment
CN111757086A (en) * 2019-03-28 2020-10-09 杭州海康威视数字技术股份有限公司 Active binocular camera, RGB-D image determination method and device
TWI707163B (en) * 2019-05-06 2020-10-11 大陸商三贏科技(深圳)有限公司 Camera module
CN111901502A (en) * 2019-05-06 2020-11-06 三赢科技(深圳)有限公司 Camera module
CN209823807U (en) * 2019-07-09 2019-12-20 Oppo广东移动通信有限公司 Electronic device
CN112166375A (en) * 2019-07-29 2021-01-01 深圳市大疆创新科技有限公司 Shooting equipment, cloud platform device and unmanned aerial vehicle
CN111741195A (en) * 2020-06-24 2020-10-02 上海摩软通讯技术有限公司 Camera shooting assembly, display module and electronic equipment


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008164367A (en) * 2006-12-27 2008-07-17 Matsushita Electric Ind Co Ltd Solid body imaging device, camera, vehicle and surveillance device
CN102572229A (en) * 2010-12-29 2012-07-11 鸿富锦精密工业(深圳)有限公司 Camera module
KR101966975B1 (en) * 2012-09-03 2019-04-08 엘지이노텍 주식회사 Apparatus for stereo matching
US10349037B2 (en) * 2014-04-03 2019-07-09 Ams Sensors Singapore Pte. Ltd. Structured-stereo imaging assembly including separate imagers for different wavelengths
WO2017030507A1 (en) * 2015-08-19 2017-02-23 Heptagon Micro Optics Pte. Ltd. Generating a disparity map having reduced over-smoothing
CN106840034A (en) * 2015-12-04 2017-06-13 宁波舜宇光电信息有限公司 3 D scanning system and its application with the speckle projector
US9674504B1 (en) * 2015-12-22 2017-06-06 Aquifi, Inc. Depth perceptive trinocular camera system
TWI584634B (en) * 2016-03-08 2017-05-21 聚晶半導體股份有限公司 Electronic apparatus and method of generating depth map
CN105635548A (en) * 2016-03-29 2016-06-01 联想(北京)有限公司 Image pickup module set
CN207380428U (en) * 2016-10-14 2018-05-18 宁波舜宇光电信息有限公司 Array camera module based on integral packaging technique
CN106572340B (en) * 2016-10-27 2019-05-10 深圳奥比中光科技有限公司 Camera system, mobile terminal and image processing method
CN206698308U (en) * 2016-11-08 2017-12-01 聚晶半导体股份有限公司 Photographing module and camera device
CN110099226B (en) * 2018-01-31 2024-04-09 宁波舜宇光电信息有限公司 Array camera module, depth information acquisition method thereof and electronic equipment

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104272716A (en) * 2012-05-07 2015-01-07 Lg伊诺特有限公司 Camera module
CN203811871U (en) * 2014-03-14 2014-09-03 瑞声声学科技(苏州)有限公司 Lens module
CN206206918U (en) * 2016-10-13 2017-05-31 广东弘景光电科技股份有限公司 It is applied to the double supports taken the photograph in module of panorama
CN206294242U (en) * 2016-11-09 2017-06-30 昆山丘钛微电子科技有限公司 Focus cocircuit plate dual camera module
CN206323461U (en) * 2016-12-19 2017-07-11 广州视源电子科技股份有限公司 A kind of built-in camera device of interactive intelligent tablet computer

Also Published As

Publication number Publication date
CN110099226B (en) 2024-04-09
CN110099225A (en) 2019-08-06
CN110099226A (en) 2019-08-06
CN208572263U (en) 2019-03-01
CN208572262U (en) 2019-03-01

Similar Documents

Publication Publication Date Title
CN110099225B (en) Array camera module, depth information acquisition method thereof and electronic equipment
CN208044632U (en) The lower biometric devices of screen and electronic equipment
CN210578563U (en) Electronic device and vision system
EP1830565B1 (en) Image capturing apparatus
US7646423B2 (en) Image capture apparatus with illuminator and distance measuring light emitting device
TWI606309B (en) Optical imaging apparatus, in particular for computational imaging, having further functionality
CN109737868A (en) Flight time mould group and electronic equipment
CN108885696B (en) Under-screen biological feature recognition device and electronic equipment
CN109271916B (en) Electronic device, control method thereof, control device, and computer-readable storage medium
CN109076147A (en) Support the complex imaging system and mobile terminal of near infrared light and visual light imaging
US20160292506A1 (en) Cameras having an optical channel that includes spatially separated sensors for sensing different parts of the optical spectrum
JP2004535010A (en) Electronic pen, mounting part therefor and method of making the pen
CN106774656B (en) Sensor assembly, cover plate, mobile terminal and terminal control method
CN115516846A (en) Range-finding camera apparatus
EP3591578B1 (en) Under-screen biometric identification apparatus and electronic device
CN110568418A (en) Photoelectric module and electronic device
CN111263045A (en) Electronic equipment
US11989969B2 (en) Biometric information imaging device
CN210536823U (en) Depth image acquisition device
CN108093102B (en) Electronic device
CN212647496U (en) Palm image acquisition equipment
CN108040151A (en) Input and output module and electronic device
RU2781814C1 (en) Camera assembly and electronic apparatus
KR102505817B1 (en) 3d image acquisition device
WO2023070313A9 (en) Time-of-flight camera module and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant