CN114650343A - Image sensor and imaging device - Google Patents

Image sensor and imaging device

Info

Publication number
CN114650343A
CN114650343A (application CN202011481336.6A)
Authority
CN
China
Prior art keywords
pixel unit
pixel
light
unit
image sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011481336.6A
Other languages
Chinese (zh)
Inventor
陈秋萍
代郁峰
邹松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
XFusion Digital Technologies Co Ltd
Original Assignee
XFusion Digital Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by XFusion Digital Technologies Co Ltd
Priority to CN202011481336.6A
Publication of CN114650343A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/2224: Studio circuitry, devices or equipment related to virtual studio applications
    • H04N5/2226: Determination of depth image, e.g. for foreground/background separation
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/81: Camera processing pipelines for suppressing or minimising disturbance in the image signal generation
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/958: Computational photography systems for extended depth of field imaging
    • H04N23/959: Computational photography systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The application provides an image sensor and an imaging device. The image sensor comprises a plurality of pixel units arranged in an array, which can be classified by type into a first pixel unit, a second pixel unit, a third pixel unit, a fourth pixel unit and a fifth pixel unit, with multiple pixel units of each type. The first, second and third pixel units receive light of different colors. The fourth pixel unit receives the visible light in ambient light, and the fifth pixel unit is a TOF pixel unit for acquiring depth information of the target object. In this technical solution, the fourth pixel unit for receiving visible light in ambient light is added to the image sensor, which increases the photosensitivity of the image sensor, improves the brightness of the formed image, and thus improves the imaging quality of the image sensor.

Description

Image sensor and imaging device
Technical Field
The present disclosure relates to the field of image technologies, and particularly to an image sensor and an imaging device.
Background
One of the mainstream approaches in the field of 3D depth vision is the ToF (Time of Flight) technique. Its working principle is shown in fig. 1: a ToF detector comprises a transmitter 1 and a receiver 2. The transmitter 1 emits a light pulse, which is reflected when it encounters an obstacle; targets at different distances produce different round-trip times (e.g., d1 and d2 in fig. 1). The receiver 2 records the round-trip time of the light pulse, calculates the distance between the transmitter 1 and the target object, and generates a 3D information map of the target object. ToF technology is typically applied in an image sensor, which acquires the color information and depth information of a target object to obtain a color map with depth information. For example, a ToF detector may be mounted beside the main camera of a mobile phone, or on the front camera.
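To make the relation concrete, the following minimal sketch (not part of the patent; the function name and the 10 ns example are illustrative assumptions) computes the distance from the measured round-trip time of the pulse:

```python
# Distance from time of flight: the pulse travels to the target and back,
# so the one-way distance is half of (speed of light x round-trip time).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance between the ToF detector and the target, in meters."""
    return C * round_trip_time_s / 2.0

# A pulse that returns after 10 ns corresponds to a target about 1.5 m away.
print(tof_distance(10e-9))  # ~1.499
```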
The prior-art way of equipping an image sensor with a ToF detector is to integrate ToF pixels and color pixels on a single image sensor. However, because each color pixel uses only one spectral component of visible light as picture brightness, the color pixels can use at most 1/2 of the visible-light luminance. This results in poor photosensitivity of the image sensor in environments with low visible-light illuminance, and thus reduced image quality.
Disclosure of Invention
The application provides an image sensor and an imaging device, which are used to improve the imaging quality of the image sensor.
In a first aspect, an image sensor is provided. The image sensor includes a plurality of pixel units arranged in an array in a checkerboard configuration. The pixel units can be classified by type into a first pixel unit, a second pixel unit, a third pixel unit, a fourth pixel unit and a fifth pixel unit, with multiple pixel units of each type. The first, second and third pixel units are used to receive light of different colors in visible light; the fourth pixel unit is a pixel unit capable of receiving the visible light in ambient light; and the fifth pixel unit is a TOF pixel unit for acquiring depth information of the target object. In this technical solution, adding the fourth pixel unit for receiving visible light increases the photosensitivity of the image sensor, improves the brightness of the formed image, and thus improves the imaging quality of the image sensor.
In a specific embodiment, the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is between 25% and 75%. Providing different proportions of fourth pixel units improves the photosensitivity of the image sensor.
In a specific possible implementation, the sizes of the first, second, third, fourth and fifth pixel units may be set in different manners. In one scheme, the five pixel units are all equal in size. In another scheme, the first, second, third and fourth pixel units are equal in size, and the size of the fifth pixel unit is equal to the size of a rectangle formed by the outermost sides of four first pixel units arranged in an array. The pixel units can thus be arranged in different ways.
In a specific possible embodiment, the first, second and third pixel units may be pixel units that receive light of different colors. In one scheme, they are respectively a red pixel unit, a green pixel unit and a blue pixel unit; in another scheme, they are respectively a cyan pixel unit, a magenta pixel unit and a yellow pixel unit. Imaging is realized by the pixel units receiving light of different colors.
In a specific possible embodiment, the first, second, third, fourth and fifth pixel units may be arranged in different proportions. For example, when the five pixel units are equal in size, their arrangement ratios may be:
When the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is 25%, the ratio of the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit and the fifth pixel unit is: 2:2:2:1:1; or 3:3:3:2:1; or 4:4:4:3:1.
When the ratio is 50%, the ratio of the five pixel units is: 2:4:2:7:1; or 2:4:2:6:2; or 4:4:4:11:1; or 4:4:4:10:2.
When the ratio is 75%, the ratio of the five pixel units is: 1:2:1:11:1; or 1:2:1:10:2.
When the size of the fifth pixel unit is equal to the size of a rectangle formed by the first, second, third and fourth pixel units arranged in an array, the arrangement ratios of the five pixel units may be:
When the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is 25%, the ratio of the five pixel units is: 3:3:3:3:1; or 5:5:5:5:1; or 8:8:8:8:1; or 15:15:15:15:1.
When the ratio is 50%, the ratio of the five pixel units is: 6:14:8:28:2; or 8:14:6:28:2; or 8:12:8:28:2.
When the ratio is 75%, the ratio of the five pixel units is: 4:7:4:45:1; or 4:6:4:42:2; or 4:4:4:36:4.
In a specific embodiment, the image sensor further includes a filtering module disposed at the light incident side of the plurality of pixel units; the filtering module filters out stray light.
In a specific possible embodiment, the filtering module is a dual-passband filtering module that is transparent to the visible light and infrared light in ambient light, which ensures the filtering effect.
In a specific embodiment, the image sensor further comprises a lens for condensing light, located at the light incident side of the plurality of pixel units; the condensing lens increases the amount of incident light.
In a specific implementation, the image sensor further includes an image processing module. The image processing module performs interpolation according to the light received by the first pixel unit, the light received by the second pixel unit and the light received by the third pixel unit to obtain a color image; performs interpolation according to the visible light acquired by the fourth pixel unit to obtain the brightness information corresponding to each pixel unit; performs interpolation according to the infrared light acquired by the fifth pixel unit to obtain the depth information corresponding to each pixel unit; and obtains a depth color image from the color image and the brightness information and depth information corresponding to each pixel unit.
In a specific embodiment, the image processing module is further configured to determine infrared interference light rays irradiating the first pixel unit, the second pixel unit and the third pixel unit according to light rays received by the first pixel unit, light rays received by the second pixel unit, light rays received by the third pixel unit and visible light rays received by the fourth pixel unit; and calibrating the monochromatic light received by the first pixel unit, the monochromatic light received by the second pixel unit and the monochromatic light received by the third pixel unit according to the obtained infrared interference light, and obtaining the color image according to the calibrated monochromatic light. The imaging effect is improved.
In a second aspect, there is provided an imaging device comprising a housing, and an image sensor of any of the above arranged within the housing. In the technical scheme, the fourth pixel unit for receiving visible light is added in the image sensor, so that the photosensitivity of the image sensor is increased, the brightness of a formed image is improved, and the imaging quality of the image sensor is improved.
Drawings
FIG. 1 is a schematic diagram of the operation of a TOF detector;
fig. 2 is a schematic view of an application scenario of an image sensor according to an embodiment of the present application;
fig. 3 is an exploded view of an image sensor according to an embodiment of the present disclosure;
fig. 4 is a flowchart illustrating an operation of an image sensor according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a pixel array provided by an embodiment of the present application;
fig. 6 is a schematic diagram of a pixel unit according to an embodiment of the present disclosure;
fig. 7 is a circuit diagram of a color information collecting unit according to an embodiment of the present application;
fig. 8 is a circuit diagram of a depth information acquisition unit according to an embodiment of the present application;
fig. 9 is a circuit diagram of a luminance information collecting unit according to an embodiment of the present application;
fig. 10 is a schematic block diagram of a luminance information processing unit provided in an embodiment of the present application;
FIG. 11 is a flowchart of a color information processing unit according to an embodiment of the present application;
fig. 12 is a flowchart of a depth information processing unit according to an embodiment of the present application;
FIG. 13 is a flowchart of a depth color fusion image unit according to an embodiment of the present application;
FIG. 14 is a schematic view of another pixel array provided in the present application;
fig. 15 is a schematic view of another pixel array according to an embodiment of the present application.
Detailed Description
The embodiments of the present application will be further described with reference to the accompanying drawings.
First, an application scenario of the image sensor provided in the embodiment of the present application is described. The image sensor is applied to electronic devices with a photographing function, such as mobile phones, tablet computers and wearable electronic devices; of course, it can also be applied to other types of electronic devices with a shooting function. As shown in fig. 2, when the image sensor 4 provided in the embodiment of the present application is applied in a mobile terminal, the mobile terminal includes a housing 3 and the image sensor 4 disposed in the housing 3; the image sensor 4 is disposed on a main board 5 and electrically connected to it. When light from a target object enters the image sensor 4, the optical signal is converted into an electrical signal by the image sensor 4 and imaging processing is performed. To improve the imaging effect, prior-art image sensors introduce ToF technology and integrate TOF pixel units in the pixel array, so that a color map with depth information can be obtained. However, such an image sensor still has poor photosensitivity in low-visible-light environments, which affects the imaging effect. To this end, embodiments of the present application provide an image sensor, described in detail below with reference to the drawings and specific embodiments.
Referring to fig. 3, fig. 3 shows a schematic structural diagram of an image sensor provided in an embodiment of the present application. The image sensor comprises a lens 10, a filtering module 20, a pixel array 30 and an image processing module 40 arranged along the optical path. The pixel array 30 includes a color filter layer 31 and an image acquisition module 32. Light from the target object passes through the lens 10, the filtering module 20 and the color filter layer 31 in sequence and irradiates the image acquisition module 32. The image acquisition module 32 converts the optical signal into a digital signal, and the image processing module 40 performs imaging processing according to this digital signal to generate a depth color image.
It should be understood that the lens 10 and the filter module 20 are optional module structures, and when the image sensor is disposed, the lens 10 or the filter module 20 may be selectively disposed according to actual needs, and are not particularly limited in the embodiment of the present application.
Referring also to fig. 4, fig. 4 shows a flowchart of the imaging process of an image sensor provided in an embodiment of the present application. The mixed incident light, which spans many wavebands, is focused by the lens 10 and then irradiates the pixel array after passing through the filtering module 20. The filtering module 20 is arranged on the light incident side of the pixel array to filter out the stray light in the mixed incident light, leaving only the light the pixel array can use (namely the light coming from the target object and used for imaging), thereby improving the imaging effect. For example, if the light required by the pixel array lies in the visible band and an infrared band of the ambient light, the filtering module 20 may be a dual-passband filtering module whose two passbands are the visible band and the infrared band, respectively. If the infrared band corresponding to the pixel array is a specific narrow infrared band, the dual-passband filtering module can correspondingly transmit that specific narrow band. By transmitting visible and infrared light while blocking stray light of other bands, the dual-passband filtering module reduces the influence of stray light on imaging, reduces the optical interference and noise affecting subsequent modules, and improves the imaging quality of the image sensor.
The visible light and the infrared light irradiate the color filter layer 31 to form optical signals arranged in a matrix, the optical signals irradiate the image acquisition module 32, and the image acquisition module 32 converts the optical signals into digital signals to generate digital signals arranged in a matrix. The image processing module 40 performs imaging processing based on the digital signals arranged in the matrix to generate a depth color image.
Referring also to fig. 5, the pixel array includes a plurality of pixel units arranged in M rows and N columns, where M and N are positive integers greater than 2. The plurality of pixel units include different kinds of pixel units, and the pixel units may be classified into, for example, a first pixel unit 33(R), a second pixel unit 34(G), a third pixel unit 35(B), a fourth pixel unit 36(W), and a fifth pixel unit 37(TOF) according to the kinds. The number of the first pixel unit 33, the second pixel unit 34, the third pixel unit 35, the fourth pixel unit 36, and the fifth pixel unit 37 is plural. The plurality of first pixel units 33, second pixel units 34, third pixel units 35, fourth pixel units 36, and fifth pixel units 37 are arranged in a pixel array in an array manner.
The first pixel unit 33, the second pixel unit 34 and the third pixel unit 35 are used to receive light of different colors in visible light. For example, the first pixel unit 33 may be a red (R) pixel unit, the second pixel unit 34 a green (G) pixel unit, and the third pixel unit 35 a blue (B) pixel unit. Illustratively, as shown in fig. 5, R represents red light received by a red pixel unit, G represents green light received by a green pixel unit, and B represents blue light received by a blue pixel unit.
As an alternative, the first pixel unit 33, the second pixel unit 34, and the third pixel unit 35 may also be a cyan pixel unit, a magenta pixel unit, and a yellow pixel unit, respectively.
It should be understood that these are only two specific examples; according to the imaging requirements, the first pixel unit 33, the second pixel unit 34 and the third pixel unit 35 provided in the embodiments of the present application may also receive light of other colors.
The fourth pixel unit 36 is a pixel unit that can receive visible light rays in the ambient light, and W shown in fig. 5 denotes visible light received by the fourth pixel unit. For simplicity of description, visible light in ambient light is hereinafter simply referred to as visible light.
The fifth pixel cell 37 is a TOF pixel cell that acquires depth information of the target object. The TOF pixel cell may receive infrared light for detecting a target object distance.
As described above and referring to fig. 3, the pixel array 30 includes the color filter layer 31 and the image acquisition module 32, and comprises a first pixel unit, a second pixel unit, a third pixel unit, a fourth pixel unit and a fifth pixel unit. Referring to fig. 6, the first pixel unit includes a first filter region 311 and a first color information acquisition unit 321; the second pixel unit includes a second filter region 312 and a second color information acquisition unit 322; the third pixel unit includes a third filter region 313 and a third color information acquisition unit 323; the fourth pixel unit includes a fourth filter region 314 and a brightness information acquisition unit 324; and the fifth pixel unit includes a fifth filter region 315 and a depth information acquisition unit 325. The first filter region 311, the second filter region 312, the third filter region 313, the fourth filter region 314 and the fifth filter region 315 are all located on the color filter layer 31. The first color information acquisition unit 321, the second color information acquisition unit 322, the third color information acquisition unit 323, the brightness information acquisition unit 324 and the depth information acquisition unit 325 are all located in the image acquisition module 32.
As an optional scheme, the pixel array may further include a set of microlenses, with each filter region corresponding to one microlens; the microlens condenses light into its corresponding filter region, increasing the light received by each pixel unit.
When the visible light includes red, green and blue light, and the visible light and the infrared light irradiate the five pixel units, the first filter region 311 of the first pixel unit 33 transmits only red light; the second filter region 312 of the second pixel unit 34 transmits only green light; the third filter region 313 of the third pixel unit 35 transmits only blue light; the fourth filter region 314 of the fourth pixel unit 36 transmits red, green and blue light; and the fifth filter region 315 of the fifth pixel unit 37 transmits only infrared light. Through the filter regions of the five pixel units, the light related to the color information of the target object (red, green and blue monochromatic light), the light related to the depth information (infrared light) and the light related to the brightness information (visible light) are obtained. After passing through the corresponding filter region, the light irradiates the image acquisition module for further processing.
By function, the image acquisition module may be divided into the depth information acquisition unit 325, the color information acquisition units (the first color information acquisition unit 321, the second color information acquisition unit 322 and the third color information acquisition unit 323) and the brightness information acquisition unit 324. The image acquisition module acquires the optical signals carrying the depth, color and brightness of the target object, converts them into voltage signals and then into digital signals. With the structure shown in fig. 6, the first color information acquisition unit 321 of the first pixel unit 33 receives red light, the second color information acquisition unit 322 of the second pixel unit 34 receives green light, and the third color information acquisition unit 323 of the third pixel unit 35 receives blue light. The brightness information acquisition unit 324 of the fourth pixel unit 36 receives the red, green and blue light, and the depth information acquisition unit 325 of the fifth pixel unit 37 receives the infrared light.
Referring also to fig. 7, fig. 7 shows a circuit diagram of a color information acquisition unit provided in an embodiment of the present application. The first, second and third color information acquisition units are all conventional four-transistor (4T) pixel sensors. Each color information acquisition unit includes a transfer transistor M_TG and a floating diffusion region FD. The control terminal of the transfer transistor M_TG receives a control signal, its input terminal is connected to the photodiode PD, and its output terminal outputs a voltage signal when the control terminal receives the control signal: the transfer transistor M_TG transfers the voltage signal of the photodiode PD to the floating diffusion region FD for storage. The photogenerated carriers accumulated in the photodiode PD are transferred to the floating diffusion region FD and converted into a voltage signal, which is converted into a digital signal by an analog-to-digital converter (ADC) and then enters the color information processing unit. When there are a plurality of first, second and third pixel units, each pixel unit includes a photodiode PD for receiving an optical signal.
Referring also to fig. 8, fig. 8 shows a circuit diagram of a depth information collecting unit including a first voltage signal output module and a second voltage signal output module.
The first voltage signal output module includes a first transfer transistor M_TG1 and a first floating diffusion region FD1. The control terminal of M_TG1 receives a control signal, its input terminal is connected to the photodiode PD, and its output terminal outputs a first voltage signal when the control terminal receives the control signal; the first transfer transistor M_TG1 transfers the first voltage signal of the photodiode PD to the first floating diffusion region FD1 for storage.
The second voltage signal output module is used to convert the charges accumulated in the photodiode PD into a voltage signal. It includes a second transfer transistor M_TG2 and a second floating diffusion region FD2. The control terminal of M_TG2 receives a control signal, its input terminal is connected to the photodiode PD, and its output terminal outputs a second voltage signal when the control terminal receives the control signal; the second transfer transistor M_TG2 transfers the second voltage signal of the photodiode PD to the second floating diffusion region FD2 for storage.
When measuring depth information, the control signal sent to the first transfer transistor M_TG1 is set to the same phase as the modulated light, and the control signal sent to the second transfer transistor M_TG2 is set to be complementary in phase to the control signal sent to M_TG1. The modulated light is emitted toward the target object, and the photodiode PD stores the reflected infrared optical signal and transfers it to the floating diffusion regions, where it is converted into voltage signals. When detecting the distance of the target object, the time difference of the reflected signal is compared, thereby obtaining the depth information of the target object.
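As an illustration of this readout, the sketch below assumes one common pulsed two-tap demodulation scheme: FD1 integrates in phase with the emitted pulse and FD2 during the complementary phase. The patent only states that complementary control signals are used, so the formula, the names and the numbers here are assumptions.

```python
C = 299_792_458.0  # speed of light in m/s

def two_tap_depth(q1: float, q2: float, pulse_width_s: float) -> float:
    """Depth from the charge on the in-phase tap (q1, via M_TG1/FD1) and
    the complementary tap (q2, via M_TG2/FD2).

    A reflected pulse delayed by t splits its charge between the two
    integration windows in proportion to the overlap with each, so
    t = pulse_width * q2 / (q1 + q2), and depth = c * t / 2.
    """
    delay = pulse_width_s * q2 / (q1 + q2)
    return C * delay / 2.0

# Equal charge on both taps with a 30 ns pulse -> delay 15 ns -> ~2.25 m.
print(two_tap_depth(1.0, 1.0, 30e-9))
```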
Referring to fig. 9, fig. 9 shows a circuit diagram of a brightness information acquisition unit provided in the embodiment of the present application. The brightness information acquisition unit has the same structure as each color information acquisition unit; the only difference is the light it receives, so its acquisition principle is the same as that of the color information acquisition units and is not repeated here. The fourth pixel unit is introduced into the pixel array because it allows light of the three wavelengths red, green and blue to pass through, so its luminance is higher than that of pixel units receiving a single color (such as the first pixel unit receiving red light or the second pixel unit receiving green light). The image brightness output by the image sensor is therefore higher than that of a prior-art image sensor that acquires brightness information using only single-color light, which is especially effective for improving imaging quality in low-illuminance environments. Meanwhile, in terms of signal-to-noise ratio, the white pixel unit is 3 dB higher than the other pixel units, so introducing the fourth pixel unit can improve the image quality.
To facilitate understanding of the imaging principle of the image sensor provided in the embodiments of the present application, interpolation is first described. Interpolation uses known data to predict unknown data; image interpolation predicts the value of a given pixel unit from the information of the surrounding pixel units. Common interpolation algorithms fall into two categories, adaptive and non-adaptive: adaptive methods change their behavior depending on what is being interpolated (sharp edges versus smooth texture), while non-adaptive methods treat all pixel units the same. Non-adaptive algorithms include nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, spline interpolation and so on; a sketch of one of these is given below.
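For example, a minimal sketch of the non-adaptive bilinear method mentioned above (illustrative only, not the patent's implementation):

```python
import numpy as np

def bilinear(img: np.ndarray, y: float, x: float) -> float:
    """Interpolate a single-channel image at fractional coordinates (y, x):
    a distance-weighted blend of the four surrounding samples."""
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1 = min(y0 + 1, img.shape[0] - 1)
    x1 = min(x0 + 1, img.shape[1] - 1)
    dy, dx = y - y0, x - x0
    top = (1 - dx) * img[y0, x0] + dx * img[y0, x1]
    bottom = (1 - dx) * img[y1, x0] + dx * img[y1, x1]
    return float((1 - dy) * top + dy * bottom)

grid = np.array([[0.0, 10.0], [20.0, 30.0]])
print(bilinear(grid, 0.5, 0.5))  # 15.0, the average of the four neighbours
```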
The image processing module performs interpolation according to the light received by the first, second and third pixel units to obtain a color image; performs interpolation according to the visible light acquired by the fourth pixel units to obtain the brightness information corresponding to each pixel unit; performs interpolation according to the infrared light acquired by the fifth pixel units to obtain the depth information corresponding to each pixel unit; and obtains a depth color image by fusing the color image with the brightness information and depth information corresponding to each pixel unit. The image processing module comprises a depth information processing unit, a color image processing unit, a brightness information processing unit and a depth color fusion image unit. The depth information processing unit processes the depth information acquired by the image acquisition module, the color image processing unit processes the color information, and the brightness information processing unit processes the brightness information. The depth color fusion image unit fuses the processed color information, depth information and brightness information to obtain a depth color image.
Referring to fig. 10, fig. 10 shows a functional block diagram of the brightness information processing unit. The brightness information processing unit performs interpolation on the digital signals converted from the visible light received by the fourth pixel units. In the M-row, N-column pixel array, these digital signals and the positions of the fourth pixel units are fed into the interpolation operation, yielding the brightness information corresponding to each of the M × N pixel units.
Referring to fig. 11, fig. 11 shows a flowchart of the color information processing unit, taking red, green and blue as the example colors. The color information processing unit first obtains the red, green and blue monochromatic light, and then obtains, through interpolation, the red, green and blue full-resolution images corresponding to the M-row, N-column pixel array. The image processing module is also used to determine the infrared interference light irradiating the first, second and third pixel units according to the light received by the first, second and third pixel units and the visible light acquired by the fourth pixel unit, to calibrate the monochromatic light received by the first, second and third pixel units according to the obtained infrared interference light, and to obtain a color image from the calibrated light. The corresponding infrared component in each color's full-resolution image is then removed, restoring the visible component of the mixed light and yielding a color-corrected color image. The specific steps for recovering the visible components from the mixed light are as follows:
since the filter module can transmit visible light rays and infrared light rays, the light rays irradiated to each pixel unit comprise visible light rays and infrared light rays, and therefore the light rays received by the first pixel unit, the second pixel unit, the third pixel unit and the fourth pixel unit are mixed light rays. Illustratively, infrared light received by the first pixel unit, the second pixel unit and the third pixel unit may interfere with monochromatic light received by the first pixel unit, the second pixel unit and the third pixel unit. Meanwhile, the infrared light received by the fourth pixel unit may also interfere with the visible light received by the fourth pixel unit.
Let R_ij, G_ij, B_ij and W_ij denote, respectively, the mixed light received by the first pixel unit, the second pixel unit, the third pixel unit and the fourth pixel unit. These mixed-light values can be acquired directly by the color information acquisition units and the brightness information acquisition unit. Because adjacent pixel units receive essentially the same illumination, the amount of infrared light received by each pixel unit can be taken to be the same, denoted NIR_ij.
From the above description, the following formulas can be derived:
The mixed light received by the first pixel unit: R_ij = R'_ij + NIR_ij, where R'_ij is the red light irradiating the first pixel unit and NIR_ij is the infrared interference light it receives.
The mixed light received by the second pixel unit: G_ij = G'_ij + NIR_ij, where G'_ij is the green light irradiating the second pixel unit.
The mixed light received by the third pixel unit: B_ij = B'_ij + NIR_ij, where B'_ij is the blue light irradiating the third pixel unit.
The mixed light received by the fourth pixel unit: W_ij = W'_ij + NIR_ij, where W'_ij is the visible light irradiating the fourth pixel unit.
Summing the first three equations gives R_ij + G_ij + B_ij = R'_ij + G'_ij + B'_ij + 3·NIR_ij (Equation 1).
Since visible light is composed of red, green and blue light, R'_ij + G'_ij + B'_ij = W'_ij (Equation 2).
Substituting Equation 2 into Equation 1 and combining W_ij = W'_ij + NIR_ij gives R_ij + G_ij + B_ij - W_ij = 2·NIR_ij, i.e. NIR_ij = (R_ij + G_ij + B_ij - W_ij)/2, the amount of infrared light received by each pixel unit.
NIR_ij can therefore be obtained from the known mixed light R_ij, G_ij, B_ij and W_ij. With NIR_ij known, the red light received by the first pixel unit is R'_ij = R_ij - NIR_ij; the green light received by the second pixel unit is G'_ij = G_ij - NIR_ij; and the blue light received by the third pixel unit is B'_ij = B_ij - NIR_ij.
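Expressed as a short sketch (illustrative, not the patent's implementation), the separation amounts to estimating NIR_ij from the four mixed-light planes and subtracting it from each colour plane. It assumes the planes have already been aligned to a common grid by interpolation; the function name and sample values are assumptions.

```python
import numpy as np

def remove_ir(R, G, B, W):
    """Recover the visible components from the mixed-light planes.

    Per the derivation above: NIR = (R + G + B - W) / 2 at each pixel,
    then subtract NIR from each colour plane.
    """
    nir = (R + G + B - W) / 2.0
    return R - nir, G - nir, B - nir, nir

# Example: visible components (40, 80, 30) plus 20 units of IR on each plane;
# the white plane sees W' = 40 + 80 + 30 = 150 plus the same 20 units of IR.
R, G, B, W = (np.array([60.0]), np.array([100.0]),
              np.array([50.0]), np.array([170.0]))
print(remove_ir(R, G, B, W))  # visible 40, 80, 30 and NIR 20
```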
As can be seen from the above description, after the fourth pixel unit is introduced, infrared light and visible light in the mixed light can be separated by algorithm calculation during image processing, so that interference of infrared light is reduced, and imaging quality can be improved. Meanwhile, in order to improve the imaging effect, an additional filtering module for reducing infrared interference is not required to be arranged on the light incident side of the color pixel unit, so that the cost of the image sensor is reduced.
Similarly, when the first, second and third pixel units are a cyan pixel unit, a magenta pixel unit and a yellow pixel unit, respectively, the color information processing unit can also remove the infrared interference algorithmically. The specific process is as follows:
The image acquisition module provides the mixed light C_ij received by the first pixel unit, the mixed light M_ij received by the second pixel unit, the mixed light Y_ij received by the third pixel unit and the mixed light W_ij received by the fourth pixel unit.
The mixed light received by the first pixel unit: C_ij = C'_ij + NIR_ij, where C'_ij is the cyan light irradiating the first pixel unit.
The mixed light received by the second pixel unit: M_ij = M'_ij + NIR_ij, where M'_ij is the magenta light irradiating the second pixel unit.
The mixed light received by the third pixel unit: Y_ij = Y'_ij + NIR_ij, where Y'_ij is the yellow light irradiating the third pixel unit.
The mixed light received by the fourth pixel unit: W_ij = W'_ij + NIR_ij, where W'_ij is the visible light irradiating the fourth pixel unit.
Since cyan, magenta and yellow are complementary to red, green and blue, C'_ij = G'_ij + B'_ij, M'_ij = B'_ij + R'_ij and Y'_ij = G'_ij + R'_ij; together with R'_ij + G'_ij + B'_ij = W'_ij and the mixed-light formulas above, R'_ij = W'_ij - C'_ij = (W_ij - NIR_ij) - (C_ij - NIR_ij) = W_ij - C_ij. In the same way, G'_ij = W_ij - M_ij and B'_ij = W_ij - Y_ij.
As can be seen from the above description, with complementary-color (cyan, magenta and yellow) pixel units the monochromatic components are obtained directly by subtracting each mixed-light value from the mixed light received by the fourth pixel unit. Compared with three-primary-color (red, green and blue) pixel units, this saves one calculation step and simplifies the computation.
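A corresponding sketch for the complementary-colour case (again illustrative; the names and sample values are assumptions):

```python
def cmy_to_rgb(C, M, Y, W):
    """Per the derivation above: R' = W - C, G' = W - M, B' = W - Y,
    with no explicit NIR estimate required."""
    return W - C, W - M, W - Y

# Visible primaries (40, 80, 30) with 20 units of IR added to every plane.
r, g, b, nir = 40.0, 80.0, 30.0, 20.0
C = (g + b) + nir   # cyan filter passes green + blue
M = (b + r) + nir   # magenta filter passes blue + red
Y = (g + r) + nir   # yellow filter passes green + red
W = (r + g + b) + nir
print(cmy_to_rgb(C, M, Y, W))  # (40.0, 80.0, 30.0)
```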
Referring to fig. 12, fig. 12 shows a flowchart of the depth information processing unit. During processing, the depth information processing unit may employ a procedure similar to that of the color information processing unit described above. First, unprocessed infrared data are acquired through the depth information acquisition unit. Then the NIR_ij obtained by the above formula gives the infrared light with the visible-light interference subtracted. From this infrared light, a depth map of the M-row, N-column pixel array is obtained, i.e. the depth information corresponding to each pixel unit.
Referring to fig. 13, fig. 13 shows a flowchart of the depth color fusion image unit. After processing by the color image processing unit and the brightness information processing unit, the UV information (chroma) and Y information (luminance) of the visible-light color map are obtained, and fusing them improves the brightness and signal-to-noise ratio of the original color image. The result is then fused with the depth map obtained by the depth information processing unit to produce a high-quality depth color image.
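Schematically, and assuming a simple linear luminance blend that the patent does not specify, the fusion step might look like the following sketch (all names and the blend weight are assumptions):

```python
import numpy as np

def fuse_depth_color(y_color, u, v, y_white, depth, alpha=0.5):
    """Blend the colour-derived luminance with the higher-SNR luminance
    from the white pixels, keep the chroma planes, and attach the depth
    map as a fourth plane: output planes are (Y, U, V, D)."""
    y_fused = alpha * y_color + (1.0 - alpha) * y_white
    return np.stack([y_fused, u, v, depth], axis=-1)

h, w = 4, 4
out = fuse_depth_color(np.ones((h, w)), np.zeros((h, w)), np.zeros((h, w)),
                       np.full((h, w), 1.2), np.full((h, w), 2.25))
print(out.shape)  # (4, 4, 4)
```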
As can be seen from the above description, the fourth pixel unit is introduced into the pixel array. Because it allows light of the three wavelengths red, green and blue to pass through, its luminance is higher than that of the other pixel units, which receive monochromatic light (such as the first pixel unit receiving red light or the second pixel unit receiving green light). The brightness of the image output by the image sensor is therefore higher than that of a prior-art image sensor that acquires brightness information using only monochromatic light, which is especially effective for improving imaging quality in low-illuminance environments. Meanwhile, in terms of signal-to-noise ratio, the fourth pixel unit is 3 dB higher than the other pixel units, so introducing it improves the image quality. In addition, with the fourth pixel unit introduced, the infrared and visible components of the mixed light can be separated by algorithmic calculation during image processing, reducing optical interference and improving imaging quality. Moreover, no additional filtering module for reducing infrared interference needs to be arranged at the light incident side of the color pixel units, which reduces the cost of the image sensor.
The pixel array is not limited to the specific arrangement shown in fig. 5: any ratio of the number of fourth pixel units to the total number of pixel units between 25% and 75% can meet the requirements of the image sensor provided in the embodiment of the present application. Illustratively, the ratio may be 25%, 50%, 75% or another value in this range. These proportions are described below with reference to the accompanying drawings.
Referring to fig. 14, fig. 14 shows a specific case of a pixel array provided in an embodiment of the present application, and in the pixel array shown in fig. 14, the sizes of the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit, and the fifth pixel unit are equal.
When the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is 25%, the ratio of the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit and the fifth pixel unit is: 2:2:2:1:1; or 3:3:3:2:1; or 4:4:4:3:1.
When the ratio is 50%, the ratio of the five pixel units is: 2:4:2:7:1; or 2:4:2:6:2; or 4:4:4:11:1; or 4:4:4:10:2.
When the ratio is 75%, the ratio of the five pixel units is: 1:2:1:11:1; or 1:2:1:10:2.
It should be understood that the arrangement order among the pixels is not particularly limited in the embodiments of the present application, and the arrangement order of the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit, and the fifth pixel unit may be set as needed.
Referring to fig. 15, fig. 15 shows another pixel array provided in this embodiment of the present application, in which the first, second, third and fourth pixel units are equal in size, and the size of the fifth pixel unit is equal to the size of the rectangle formed by four such pixel units arranged in an array. In this image sensor, because the fourth pixel unit allows light of the three wavelengths red, green and blue to pass through, its luminance is higher than that of pixel units receiving a single color (such as the first pixel unit receiving red light or the second pixel unit receiving green light), so the image brightness output by the image sensor is higher than that of a prior-art image sensor that acquires brightness information using only single-color light. Meanwhile, in terms of signal-to-noise ratio, the white pixel unit is 3 dB higher than the other pixel units, so introducing the fourth pixel unit can improve the image quality. Besides, after the fourth pixel unit is introduced, the infrared and visible components of the mixed light can be separated by algorithmic calculation, reducing the influence of optical interference. In addition, the fifth pixel unit occupies the positions of 4 pixel units; its larger area improves the signal-to-noise ratio of the depth map and thus the precision of the measured depth.
When the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is 25%, the ratio of the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit and the fifth pixel unit is: 3:3:3:3:1; or 5:5:5:5:1; or 8:8:8:8:1; or 15:15:15:15:1.
When the ratio is 50%, the ratio of the five pixel units is: 6:14:8:28:2; or 8:14:6:28:2; or 8:12:8:28:2.
When the ratio is 75%, the ratio of the five pixel units is: 4:7:4:45:1; or 4:6:4:42:2; or 4:4:4:36:4.
The embodiment of the application further provides an imaging device. The imaging device is an electronic device with a photographing function, such as a mobile phone, a tablet computer or a wearable electronic device; other types of electronic devices with a shooting function are of course also possible. The imaging device includes a housing and the image sensor of any of the above disposed within the housing. In this technical solution, the fourth pixel unit for receiving visible light is added to the image sensor, which increases the photosensitivity of the image sensor, improves the brightness of the formed image, and thus improves the imaging quality of the image sensor.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (11)

1. An image sensor, comprising a plurality of pixel units arranged in an array, the pixel units comprising a first pixel unit, a second pixel unit, a third pixel unit, a fourth pixel unit and a fifth pixel unit; wherein,
the first pixel unit, the second pixel unit and the third pixel unit are pixel units respectively used for receiving light rays with different colors in visible light;
the fourth pixel unit is a pixel unit for receiving visible light rays in ambient light;
the fifth pixel unit is a TOF pixel unit for detecting the depth of the target object.
2. The image sensor of claim 1, wherein the ratio of the number of the fourth pixel units to the number of the plurality of pixel units is between 25% and 75%.
3. The image sensor of claim 2, wherein the first pixel unit, the second pixel unit, the third pixel unit, the fourth pixel unit and the fifth pixel unit are equal in size.
4. The image sensor of claim 2, wherein the first pixel unit, the second pixel unit, the third pixel unit and the fourth pixel unit are equal in size;
the size of the fifth pixel unit is equal to the size of a rectangle formed by the outermost sides of four first pixel units arranged in an array.
5. The image sensor as claimed in any one of claims 1 to 4, wherein the first pixel unit, the second pixel unit and the third pixel unit are respectively: a red pixel unit, a green pixel unit and a blue pixel unit; or,
the first pixel unit, the second pixel unit and the third pixel unit are respectively: cyan pixel unit, magenta pixel unit and yellow pixel unit.
6. The image sensor as claimed in any one of claims 1 to 5, further comprising a filter module disposed at a light incident side of the plurality of pixel units.
7. The image sensor of claim 6, wherein the filter module is a dual-band pass filter module that is transparent to visible and infrared light in ambient light.
8. The image sensor as claimed in any one of claims 1 to 7, further comprising a lens for condensing light, the lens being located at a light incident side of the plurality of pixel units.
9. The image sensor according to any one of claims 1 to 8, further comprising an image processing module, wherein the image processing module performs an interpolation operation according to the light received by the first pixel unit, the light received by the second pixel unit and the light received by the third pixel unit to obtain a color image; performs an interpolation operation according to the visible light acquired by the fourth pixel unit to obtain brightness information corresponding to each pixel unit; performs an interpolation operation according to the infrared light acquired by the fifth pixel unit to obtain depth information corresponding to each pixel unit; and obtains a depth color image according to the color image and the brightness information and depth information corresponding to each pixel unit.
10. The image sensor of claim 9, wherein the image processing module is further configured to determine the ir interference light applied to the first pixel unit, the second pixel unit, and the third pixel unit according to the light received by the first pixel unit, the light received by the second pixel unit, the light received by the third pixel unit, and the visible light received by the fourth pixel unit; and calibrating the monochromatic light received by the first pixel unit, the monochromatic light received by the second pixel unit and the monochromatic light received by the third pixel unit according to the obtained infrared interference light, and obtaining the color image according to the calibrated monochromatic light.
11. An imaging apparatus comprising a housing, and the image sensor according to any one of claims 1 to 10 disposed in the housing.
CN202011481336.6A 2020-12-15 2020-12-15 Image sensor and imaging device Pending CN114650343A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011481336.6A CN114650343A (en) 2020-12-15 2020-12-15 Image sensor and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011481336.6A CN114650343A (en) 2020-12-15 2020-12-15 Image sensor and imaging device

Publications (1)

Publication Number Publication Date
CN114650343A (en) 2022-06-21

Family

ID=81990096

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011481336.6A Pending CN114650343A (en) 2020-12-15 2020-12-15 Image sensor and imaging device

Country Status (1)

Country Link
CN (1) CN114650343A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination