WO2021184353A1 - Camera device - Google Patents

Camera device

Info

Publication number
WO2021184353A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
light
image sensor
resolution
filter
Prior art date
Application number
PCT/CN2020/080408
Other languages
English (en)
Chinese (zh)
Inventor
黄进新
汪鹏程
刘军
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to PCT/CN2020/080408 priority Critical patent/WO2021184353A1/fr
Priority to CN202080000505.1A priority patent/CN113728618A/zh
Priority to CN202410417378.5A priority patent/CN118264914A/zh
Publication of WO2021184353A1 publication Critical patent/WO2021184353A1/fr

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene

Definitions

  • the embodiments of the present application relate to the field of security monitoring, and in particular, to a camera device.
  • A low-illumination scene is a scene with insufficient light, such as outdoors at night or an indoor space without adequate lighting.
  • In such scenes, cameras are often used together with visible-light or infrared fill lights.
  • A visible-light fill light easily causes light pollution and is not conducive to covert monitoring; an infrared fill light produces a clear image but cannot record color.
  • the industry has begun to widely adopt a dual-light fusion architecture. Under this architecture, the camera uses two sensors to image infrared light and visible light separately, and then fuses the infrared image and the visible-light image to improve the camera's low-illumination imaging capability.
  • a light splitting prism is set in the camera, and the light splitting prism divides the incident light into visible light and infrared light according to the frequency spectrum.
  • the camera inputs the aforementioned visible light and infrared light into two identical image sensors respectively.
  • the image sensor receiving visible light outputs a color image, and the image sensor receiving infrared light outputs a grayscale image.
  • the size and resolution of the foregoing color image and grayscale image are the same.
  • the camera then fuses the aforementioned color image and grayscale image to obtain a target image.
  • the details and texture of the target image mainly come from the grayscale image, and the color information comes from the color image.
  • the embodiments of the present application provide a camera device, which is used to output a higher-quality image in a low-illuminance environment and reduce the lower limit of the operating illuminance of the camera device.
  • an embodiment of the present application provides a camera device, which includes an optical component, a first image sensor, a second image sensor, and an image processor.
  • the resolution of the first image sensor is smaller than the resolution of the second image sensor.
  • the optical component is a component for preprocessing the incident optical signal, and the optical component is used to receive the incident optical signal and process the incident optical signal into a first optical signal and a second optical signal.
  • the first image sensor is used to sense the first light signal to generate a first image; at the same time, the second image sensor is used to sense the second light signal to generate a second image.
  • the image information of the first image includes first color information and first brightness information; the image information of the second image includes second brightness information.
  • the image processor is configured to generate a target image based on the first image and the second image.
  • the color and brightness of the target image are determined by the image information of the first image and the image information of the second image.
  • the camera device uses two image sensors with different resolutions. Because image resolution is inversely related to color sensitivity and positively related to image sharpness, the first image output by the low-resolution first image sensor has higher color sensitivity, which helps ensure the true color of the first image, while the high-resolution second image sensor has more pixels, so the second image it outputs has higher definition and can present rich details and textures. Therefore, the target image generated from the foregoing two images can retain the advantages of both, and the camera device can work in an environment with lower light intensity.
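The resolution-versus-sensitivity trade-off above follows from simple geometry: for a fixed sensor area, fewer pixels means a larger area per pixel, and a larger pixel gathers more light. A small sketch with hypothetical numbers (the sensor area and pixel counts are illustrative, not from the patent):

```python
def pixel_area(sensor_area_mm2, resolution_pixels):
    """For a fixed sensor size, a lower resolution means a larger area per
    pixel, and a larger pixel gathers more light (higher sensitivity)."""
    return sensor_area_mm2 / resolution_pixels

# Hypothetical numbers: the same 24 mm^2 sensor area at 2 MP vs 8 MP.
area_low_res = pixel_area(24.0, 2_000_000)   # low-resolution first sensor
area_high_res = pixel_area(24.0, 8_000_000)  # high-resolution second sensor
```

With these numbers each pixel of the low-resolution sensor receives four times the light of a pixel on the high-resolution sensor, which is why the first image can keep truer color in dim scenes.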
  • the image processor is specifically configured to: adjust the resolution of the first image to be the same as the resolution of the second image to obtain the third image; then fuse the third image and the second image to obtain the target image.
  • the third image carries the first color information and the first brightness information.
  • since the resolution of the first image is smaller than the resolution of the second image, adjusting the resolution of the first image to be the same as that of the second image can be understood as increasing the resolution of the first image to the resolution of the second image to obtain a third image. Since the first image has higher color sensitivity, true colors, and higher brightness, the third image also has true colors and higher brightness. Therefore, the target image determined from the third image and the second image can retain the advantages of both, and the camera device can work in an environment with lower light intensity.
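As an illustrative sketch of this resolution-adjustment step (the patent does not fix a particular algorithm; nearest-neighbor up-sampling is assumed here purely for simplicity, where a real device would use interpolation or a super-resolution model):

```python
def upsample_nearest(image, target_h, target_w):
    """Nearest-neighbor up-sampling of a 2-D image (list of rows).

    A minimal stand-in for the step that turns the low-resolution first
    image into the third image at the second image's resolution.
    """
    src_h, src_w = len(image), len(image[0])
    return [
        [image[y * src_h // target_h][x * src_w // target_w]
         for x in range(target_w)]
        for y in range(target_h)
    ]

# A 2x2 "first image" up-sampled to 4x4 to match the "second image".
first = [[1, 2],
         [3, 4]]
third = upsample_nearest(first, 4, 4)
```

Each source pixel is simply replicated into a 2x2 block, so the color and brightness carried by the first image are preserved in the third image.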
  • the first optical signal includes visible light
  • the second optical signal includes visible light and infrared light.
  • the energy of the visible light in the first optical signal is greater than the energy of the visible light in the second optical signal
  • the frequency band of the visible light in the first optical signal is the same as the frequency band of the visible light in the second optical signal.
  • the ratio between the energy of the visible light in the first optical signal and the energy of the visible light in the second optical signal can be flexibly controlled according to actual application requirements.
  • the processing of the aforementioned optical component on the incident light signal is not only frequency division, but also energy division.
  • energy division refers to dividing the visible light in the incident light signal, which can also be understood as processing the signal so that the intensity of the visible light in the first light signal differs from that in the second light signal. Because the visible-light energies of the two signals differ, the two image sensors obtain images of different brightness when the light intensity is higher than the preset value (that is, during the day). Determining the target image from these two images of different brightness helps improve the dynamic range.
  • the first image sensor is a color image sensor
  • the second image sensor is a black-and-white image sensor. Because the filter matrix of a black-and-white image sensor has higher light transmittance than the color filter matrix of a color image sensor of the same specification, its photoelectric conversion efficiency is higher. Therefore, the brightness of the second image output by the second image sensor is improved, which helps improve the quality of the target image and allows the camera device to operate in a lower-light environment.
  • the imaging device further includes an infrared cut-off filter.
  • the camera device is also used to activate the infrared cut-off filter when the light intensity is higher than a preset value. The infrared cut-off filter is located between the optical component and the second image sensor and is used to filter out the infrared light in the second optical signal.
  • the second image sensor is specifically used to sense visible light in the second light signal to generate the second image.
  • the image processor is specifically configured to combine the first color information of the first image with the second brightness information of the second image to obtain the target image, the color of the target image is determined by the first color information, and the target The brightness of the image is determined by the second brightness information.
  • an infrared cut filter filters the infrared light out of the second light signal when the light intensity is high, so as to prevent the second image sensor from being affected by infrared light when sensing visible light.
  • the second image sensor is a black and white image sensor, and the second image determined by the second image sensor only sensing the visible light in the second light signal presents only brightness and no color.
  • the resolution of the first image and the second image are different, the first image provides rich colors (ie, first color information), and the second image provides high-brightness texture details (ie, second brightness information). Therefore, in some cases, part or all of the first color information in the first image and part or all of the second brightness information in the second image may be combined to obtain the target image.
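In YUV terms, the combination described above can be sketched as taking the chroma planes (U, V) from the up-sampled first image and the luma plane (Y) from the second image. This is a simplified per-pixel sketch; the patent does not specify the fusion logic at this level of detail:

```python
def combine_color_and_brightness(third_yuv, second_y):
    """Build the target image by pairing chroma from the (up-sampled)
    first image with luma from the second image.

    third_yuv: rows of (Y, U, V) tuples from the third image.
    second_y:  rows of Y (brightness) values from the second image.
    Both inputs are assumed to have the same resolution.
    """
    return [
        [(second_y[r][c], yuv[1], yuv[2])
         for c, yuv in enumerate(row)]
        for r, row in enumerate(third_yuv)
    ]

third = [[(100, 30, 40), (110, 31, 41)]]
second = [[200, 210]]
target = combine_color_and_brightness(third, second)
# The target's brightness comes from the second image, its color from the third.
```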
  • when the light intensity is lower than a preset value, the second image sensor is specifically used to sense the visible light and infrared light in the second light signal to generate the second image.
  • the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
  • the second image sensor simultaneously senses infrared light and a part of the visible light when the light intensity is low. Since the second image sensor is a black-and-white image sensor, the second image it outputs has only brightness. However, because the sensor senses visible light in addition to infrared light, the brightness of the resulting second image is greater than that of a second image produced by sensing infrared light alone. Therefore, the quality of the second image is improved, which in turn improves the quality of the target image.
  • the first image sensor and the second image sensor are both color image sensors.
  • the imaging device further includes an infrared cut filter.
  • the camera device is also used to activate the infrared cut-off filter when the light intensity is higher than a preset value. The infrared cut-off filter is located between the optical component and the second image sensor and is used to filter out the infrared light in the second optical signal.
  • the second image sensor is specifically configured to sense visible light in the second light signal to generate the second image, and the image information of the second image further includes second color information.
  • the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
  • an infrared cut filter filters the infrared light out of the second light signal when the light intensity is high, so as to prevent the second image sensor from being affected by infrared light when sensing visible light.
  • the second image sensor is a color image sensor
  • the second image determined by the second image sensor sensing visible light includes not only the second brightness information, but also the second color information.
  • the aforementioned third image and the aforementioned second image are two images with different brightness. Determining the target image based on the aforementioned two images with different brightness can improve the quality of the target image and also increase the dynamic range.
  • the imaging device further includes a visible light cutoff filter.
  • the camera device is also used to activate the visible light cut-off filter when the light intensity is lower than a preset value. The visible light cut-off filter is located between the optical component and the second image sensor and is used to filter out the visible light in the second optical signal.
  • the second image sensor is specifically used to sense infrared light in the second light signal to generate the second image.
  • the image processor is specifically configured to fuse the third image with the second image to obtain the target image.
  • a visible light cut-off filter filters out the visible light in the second light signal when the light intensity is low, so as to reduce the influence of visible light on the second image sensor when sensing infrared light.
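The day/night filter-switching rules described across these embodiments can be summarized in a small controller sketch. The function name, return values, and threshold handling are illustrative, not from the patent:

```python
def select_filter(light_intensity, threshold, second_sensor_is_color):
    """Choose which cut-off filter to place before the second image sensor.

    Above the threshold (daytime), an infrared cut filter keeps infrared
    out of the visible-light measurement. Below it (night), a color second
    sensor gets a visible-light cut filter so only infrared reaches it,
    while a black-and-white second sensor passes the full band to gather
    both visible and infrared light.
    """
    if light_intensity >= threshold:
        return "infrared-cut"
    return "visible-cut" if second_sensor_is_color else "none"
```

For example, with a threshold of 1 (in arbitrary units), a daytime reading of 10 selects the infrared cut filter regardless of sensor type, while a night reading of 0.5 selects the visible-light cut filter only when the second sensor is a color sensor.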
  • the second image sensor is a color image sensor
  • the second light signal only contains infrared light. Therefore, the second image only contains the second brightness information and no color information.
  • the optical assembly includes a lens and a dichroic prism, and the dichroic prism is located between the lens and the image sensor.
  • the lens is used to receive the incident light signal.
  • the light splitting prism is used for dividing the incident light signal received by the lens into the first light signal and the second light signal.
  • the lens is an infrared confocal lens.
  • the camera device also includes an infrared cut filter.
  • the optical assembly includes a first lens and a second lens.
  • the first lens and the second lens are used to jointly receive the incident light signal.
  • the focal length of the first lens is the same as the focal length of the second lens, but the aperture of the first lens is larger than the aperture of the second lens; an infrared cut filter is arranged between the first lens and the first image sensor, and the second lens is an infrared confocal lens.
  • the first lens is used to receive a part of the incident light signal and transmit the received light signal to the infrared cut filter.
  • the infrared cut-off filter is used to filter infrared light in the optical signal from the first lens to obtain the first optical signal, and transmit the first optical signal to the first image sensor.
  • the second lens is used to receive the remaining part of the incident light signal, and transmit the received light signal to the second image sensor as a second light signal.
  • binocular lenses with different apertures can be used to make the energy of the light signals reaching the two image sensors different. Because a larger aperture admits a greater luminous flux, the energy of the visible light in the first light signal output by the first lens is greater than the energy of the visible light in the second light signal output by the second lens.
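The aperture-to-flux relation can be made concrete: with focal length and scene held equal, the light gathered scales with the aperture area, i.e. with the square of the aperture diameter. The diameters below are illustrative:

```python
import math

def relative_luminous_flux(aperture_diameter):
    """Relative luminous flux through a lens, proportional to the
    aperture area (pi * r^2), all other factors being equal."""
    return math.pi * (aperture_diameter / 2.0) ** 2

# A first lens with twice the aperture diameter of the second lens
# gathers four times the light.
flux_first = relative_luminous_flux(2.0)
flux_second = relative_luminous_flux(1.0)
```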
  • an infrared cut filter is also arranged between the first lens and the first image sensor, so that only visible light but no infrared light is included in the first light signal.
  • the resolution of the second image is equal to the resolution of the target image.
  • an embodiment of the present application provides an image processor, which is connected to a memory in a camera device.
  • the memory is used to store data or programs processed by the image processor, such as the aforementioned first image, second image, and third image; the image processor is used to perform image processing on the first image, the second image, and the third image.
  • the camera device uses two image sensors with different resolutions. Because image resolution is inversely related to color sensitivity and positively related to image sharpness, the first image output by the low-resolution first image sensor has higher color sensitivity, which helps ensure the true color of the first image.
  • the high-resolution second image sensor has more pixels, so the second image it outputs has higher definition and can present rich details and textures. Therefore, the target image determined from the foregoing two images can retain the advantages of both, and the camera device can work in an environment with lower light intensity.
  • FIG. 1A is a schematic diagram of an embodiment of a camera device in an embodiment of the application.
  • FIG. 1B is a schematic diagram of another embodiment of the camera device in the embodiment of the application.
  • FIG. 1C is a schematic diagram of another embodiment of the imaging device in the embodiment of the application.
  • FIG. 2A is a schematic diagram of another embodiment of the camera device in the embodiment of the application.
  • FIG. 2B is a schematic diagram of an embodiment of an image processing flow in an embodiment of the application.
  • FIG. 2C is a schematic diagram of another embodiment of the image processing flow in the embodiment of the application.
  • FIG. 3A is a schematic diagram of another embodiment of the camera device in the embodiment of the application.
  • FIG. 3B is a schematic diagram of another embodiment of the image processing flow in the embodiment of the application.
  • FIG. 3C is a schematic diagram of another embodiment of the image processing flow in the embodiment of the application.
  • the embodiments of the present application provide a camera device, which is used to output a higher-quality image in a low-illuminance environment and reduce the lower limit of the operating illuminance of the camera device.
  • the camera device proposed in the embodiment of the present application can be applied to capturing images in a low-illumination/low-light environment.
  • a low-illuminance environment is an environment where the light intensity is lower than a certain value; it is generally measured by the visible-light energy received per unit area of the image sensor in the imaging device, in units of lux (Lx).
  • an illumination environment greater than 0 Lux and less than 1 Lux can be referred to as a low-illuminance environment.
  • the low-illuminance environment can be a dim outdoor street, for example, a street at night or on a rainy day; it can also be an interior with only weak light, for example, a store or warehouse. There is no limitation here.
  • the spatial resolution of some color images can be sacrificed in exchange for an increase in color sensitivity, thereby achieving the effect of reducing the lower limit of the operating illuminance of the camera device.
  • the lower limit of the original camera's working illumination requirement is 1 Lux, and it is difficult to obtain a color image acceptable to the human eye below 1 Lux.
  • the lower limit of the operating illuminance of the camera device can be reduced to 0.1 Lux, or even 0.01 Lux, which is not specifically limited here.
  • the optical component 103 is used to receive the incident light signal and process the incident light signal into a first light signal and a second light signal, wherein the incident light signal is emitted by an object photographed by the imaging device 10.
  • the optical component 103 is also used to control the first light signal to be directed to the first image sensor 101 and control the second light signal to be directed to the second image sensor 102.
  • the first image sensor 101 is used to sense the first light signal to generate a first image, the first image is a color image, and the image information of the first image includes first color information and first brightness information.
  • the second image sensor 102 is used to sense the second light signal to generate a second image. The second image is a color image or a grayscale image, and the image information of the second image includes second brightness information (when the second image is a color image, it also includes second color information).
  • the resolution of the first image sensor 101 is smaller than the resolution of the second image sensor 102.
  • the image processor 104 is configured to generate a target image based on the first image and the second image, and the color and brightness of the target image are determined by the image information of the first image and the image information of the second image.
  • the image processor 104 may be a system on chip (system on chip, SoC).
  • the resolution of the first image sensor 101 is smaller than the resolution of the second image sensor 102, the resolution of the first image is smaller than the resolution of the second image. It should also be understood that since the first image contains first color information, that is, the first image can present colors, the first image sensor 101 is a color image sensor.
  • the second image sensor 102 may be a color image sensor or a black-and-white image sensor. For details, please refer to the related description of the corresponding embodiments in FIG. 2A and FIG. 3A later, and details are not repeated here.
  • the imaging device 10 uses two image sensors with different resolutions. Because image resolution is inversely related to color sensitivity and positively related to image sharpness, the first image output by the low-resolution first image sensor 101 has high color sensitivity, which helps ensure the true color of the first image, while the high-resolution second image sensor 102 has more pixels, so the second image it outputs has higher definition and can present rich details and textures. Therefore, the target image determined from the foregoing two images can retain the advantages of both, and the camera device can work in an environment with lower light intensity.
  • the process of determining the target image by the image processor 104 may include the following:
  • the image processor 104 adjusts the resolution of the first image to be the same as the resolution of the second image to obtain a third image. Then, the image processor 104 determines the target image based on the third image and the second image. Optionally, the resolution of the target image is equal to the resolution of the second image.
  • the process of adjusting the aforementioned low-resolution first image to a high-resolution third image can not only preserve the color and brightness presented by the first image, but also facilitate the determination of the target image based on the third image and the second image.
  • saying that the resolutions of the aforementioned two images are the same can be understood as the resolutions being exactly the same, or as there being a small difference between them that is not enough to affect the subsequent processing of the two images; the specifics are not limited here.
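The loose reading above ("the same" up to a negligible difference) could be expressed as a tolerance check. The 1% bound is purely illustrative; the patent leaves the permissible difference open:

```python
def resolutions_match(res_a, res_b, tolerance=0.01):
    """Treat two (width, height) resolutions as 'the same' if each
    dimension differs by at most the given relative tolerance."""
    return all(
        abs(a - b) / max(a, b) <= tolerance
        for a, b in zip(res_a, res_b)
    )

exact = resolutions_match((1920, 1080), (1920, 1080))
close = resolutions_match((1920, 1080), (1918, 1080))  # tiny difference
far = resolutions_match((1920, 1080), (960, 540))      # clearly different
```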
  • the image processor 104 may adopt an up-sampling algorithm or a super-resolution algorithm.
  • the process by which the image processor 104 determines the target image based on the aforementioned third image and second image mainly refers to an image fusion process.
  • image fusion refers to an image processing technology that uses a specific algorithm to combine two or more images into a new image. The combined image retains the excellent characteristics of the original images (that is, the two or more images before synthesis), for example, brightness, clarity, and color.
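A minimal fusion sketch in this spirit is a per-pixel weighted average of two same-size brightness planes. This is only the simplest possible instance of the "specific algorithm" mentioned above; practical fusion methods are considerably more sophisticated (e.g. multi-scale or detail-preserving):

```python
def fuse_weighted(img_a, img_b, weight_a=0.5):
    """Fuse two same-size grayscale images by per-pixel weighted average."""
    weight_b = 1.0 - weight_a
    return [
        [weight_a * a + weight_b * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

# Equal weighting of two 1x2 brightness planes.
fused = fuse_weighted([[100, 200]], [[50, 100]], weight_a=0.5)
```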
  • the camera device 10 further includes an image signal processor (image signal processor, ISP) chip (not shown), and the ISP chip is located between the aforementioned image sensor and the image processor 104.
  • the aforementioned ISP chip may perform two-way processing, separately processing the first image output by the first image sensor 101 and the second image output by the second image sensor 102, and then sending the two processed images to the aforementioned image processor 104 for subsequent processing.
  • the ISP chip can perform multiple ISP processing on the aforementioned first image and second image, for example, 3D noise reduction, demosaicing, brightness correction, and color correction.
  • the aforementioned basic image processing can be adjusted according to actual application requirements. Specifically, the content of the aforementioned basic image processing is not limited here.
  • every version of the image between the output of the first image sensor 101 and the resolution adjustment is referred to as the first image.
  • every version of the image between the output of the second image sensor 102 and the image fusion or combination is referred to as the second image. That is to say, in practical applications, the original image output by an image sensor may go through a variety of processing procedures, but the embodiment of the present application does not limit these procedures.
  • if the RAW images output by the image sensors are fused, the first image is in RAW format; if the RGB images output by the ISP chip are fused, the first image is in RGB format; if the YUV images output by the ISP chip are fused, the first image is in YUV format (to reduce redundancy, the subsequent embodiments use only YUV-format images as the example for the first and second images).
  • the first image and the second image are not compressed or encoded; the fused image can be encoded into an image format that is easy for the human eye to recognize and occupies less storage space, such as JPEG (jpg), BMP, TGA, PNG, or GIF format.
  • the ISP chip may also perform image correction on the foregoing third image and the second image.
  • the ISP chip can correct the coordinate system of the third image to the coordinate system of the second image, so that the scenes in the foregoing two images are aligned, which can also be understood as aligning the texture details of the foregoing two images.
  • the ISP chip may use preset correction parameters to adjust the aforementioned two images, or may adaptively configure the correction parameters according to changes in the current temperature to complete image correction.
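The coordinate correction above maps the third image's coordinate system onto the second image's so their textures align. As a sketch, the simplest possible registration is a pure pixel translation; a real correction would use calibrated parameters (e.g. a homography), and the offsets here are illustrative:

```python
def shift_image(image, dy, dx, fill=0):
    """Translate a 2-D image by (dy, dx) pixels, filling exposed pixels.

    A stand-in for the geometric correction that aligns the third image
    with the second image; real devices would apply a calibrated warp,
    not a pure shift.
    """
    h, w = len(image), len(image[0])
    return [
        [image[y - dy][x - dx] if 0 <= y - dy < h and 0 <= x - dx < w else fill
         for x in range(w)]
        for y in range(h)
    ]

right = shift_image([[1, 2], [3, 4]], 0, 1)  # shift one pixel right
down = shift_image([[1, 2], [3, 4]], 1, 0)   # shift one pixel down
```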
  • the aforementioned first image sensor 101 or second image sensor 102 may be a CCD image sensor based on a charge-coupled device (CCD), a CMOS image sensor based on a complementary metal-oxide-semiconductor (CMOS), or another type of image sensor; this is not specifically limited here.
  • the camera device 10 adopts an asymmetric image sensor architecture (that is, the resolutions of the two image sensors in the camera device 10 are different), and higher color sensitivity (that is, better low-light sensitivity) can be obtained by reducing the spatial resolution of the first image sensor that senses visible light. Therefore, the camera device 10 can work better in a low-illuminance environment.
  • the optical component 103 is specifically used to process the incident light signal so that the energy of the visible light in the first light signal is greater than the energy of the visible light in the second light signal.
  • the first light signal includes visible light
  • the second light signal includes visible light and infrared light. It can also be understood that the optical component 103 performs frequency division and energy division on the incident light signal.
  • frequency division refers to dividing the incident light signal according to different frequencies, for example, dividing the incident light signal into visible light and infrared light.
  • the aforementioned energy is proportional to the square of the amplitude of the light wave, and the aforementioned visible light energy can also be understood as the intensity of visible light. Therefore, energy division can be understood as separating the visible light in the incident light signal into the first light signal and the second light signal by using physical structures such as coating of the lens.
  • the intensity of the visible light in the first light signal differs from the intensity of the visible light in the second light signal, and the intensity ratio between the visible light in the first light signal and the visible light in the second light signal is fixed.
  • the frequency band of the visible light in the first optical signal is the same as the frequency band of the visible light in the second optical signal; or, the first optical signal and the second optical signal contain visible light of the same frequency band, for example, both contain green light.
  • because the energy of the visible light in the first light signal differs from the energy of the visible light in the second light signal, when the light intensity is higher than the preset value (that is, during the day) the two image sensors obtain two images of different brightness. Determining the target image from these two images of different brightness helps improve the dynamic range.
  • the ratio between the energy of the visible light in the first optical signal and the energy of the visible light in the second optical signal can be flexibly controlled according to actual application requirements.
  • the first optical signal and the second optical signal contain the same visible-light frequency band, and the ratio of the energy of the visible light in the first optical signal to that in the second optical signal can be kept at 9:1, 8:2, 7:3, 6:4, 6.5:3.5, and so on.
  • the specifics are not limited here.
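As simple arithmetic, a fixed split ratio determines how the incident visible-light energy is divided between the two optical signals. The numbers below are illustrative only:

```python
def split_visible_energy(total_energy, ratio_first, ratio_second):
    """Divide incident visible-light energy between the two optical
    signals according to a fixed ratio set by the prism's coating."""
    total_ratio = ratio_first + ratio_second
    first = total_energy * ratio_first / total_ratio
    second = total_energy * ratio_second / total_ratio
    return first, second

# A 9:1 split of 1.0 unit of incident visible light.
e1, e2 = split_visible_energy(1.0, 9, 1)
```

Here 90% of the visible light goes to the first image sensor (for color) and 10% joins the infrared light at the second image sensor (for brightness and detail).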
  • optical component 103 may adopt any one of the following implementation modes:
  • the optical component 103 includes a dichroic prism 1031 and a lens 1032.
  • the dichroic prism 1031 can also be called a beam splitter: an optical device in which one or more films are coated on the surface of optical glass, using the refraction and reflection of light to split the incident light signal into two beams.
  • the dichroic prism 1031 is used to divide the incident optical signal into a first optical signal and a second optical signal. Wherein, the first light signal includes visible light, and the second light signal includes visible light and infrared light.
  • part of the visible light in the incident light signal passes through the coating layer to the first image sensor 101, while the other part of the visible light and all of the infrared light in the incident light signal are reflected at the coating layer and directed to the second image sensor 102.
  • if different optical components use optical glass of different thicknesses and coating layers of different compositions, then the ratio between the energy of the visible light in the first optical signal and the energy of the visible light in the second optical signal will also differ between those optical components.
  • the lens 1032 is an infrared confocal lens, and the infrared confocal lens is used to realize infrared confocal.
  • the optical component 103 is often used in conjunction with a filter.
  • a filter e.g., filter 1051
  • a filter e.g., Filter 1052
  • the aforementioned filter 1051 is an infrared cut filter
  • the first light signal entering the first image sensor 101 can be prevented from being mixed with infrared light.
  • when the aforementioned filter 1052 is an infrared cut filter, the infrared light in the second optical signal can be filtered out; when the aforementioned filter 1052 is a visible light cut filter, the visible light in the second optical signal can be filtered out; when the aforementioned filter 1052 is a piece of white glass (the material of the white glass is colorless, transparent glass, which does not filter light), both the visible light and the infrared light in the second optical signal can pass, that is, the optical signal of the full frequency band is allowed to pass through the white glass slide.
  • the redesigned splitting prism is combined with the filter to achieve both frequency division (in the first and second optical signals, the infrared cut filter keeps infrared light out of the first optical signal, so the infrared light frequency band and the visible light frequency band are divided) and energy division (the first and second light signals both include visible light of the same frequency band, but the energy of the two portions of visible light is different). Visible light and infrared light can thus be divided between the two image sensors while the ratio of the incident visible light entering each of the two image sensors is controlled. Therefore, the dynamic range of the finally output fused image can be increased both when the light intensity is high (for example, during the day) and when the light intensity is low (for example, at night), further making the colors of the target image more natural.
  • FIG. 1C it is another implementation of the aforementioned optical assembly 103.
  • the optical assembly 103 includes a first lens 1033 and a second lens 1034.
  • the first lens 1033 is used to converge a part of the incident light signal so that the output light signal illuminates the first image sensor 101, and the second lens 1034 is used for The remaining part of the incident light signal is converged to make the output light signal illuminate the second image sensor 102.
  • the focal length of the first lens 1033 is the same as the focal length of the second lens 1034.
  • the aperture of the first lens 1033 is larger than the aperture of the second lens 1034. Therefore, the luminous flux of the first lens 1033 is greater than the luminous flux of the second lens 1034.
  • the energy of the visible light output by the first lens 1033 is greater than the energy of the visible light output by the second lens 1034. It can also be understood that the light signals actually passed by the first lens 1033 and the second lens 1034 are different. In addition, for the explanation of the energy of the visible light, please refer to the relevant introduction of the corresponding embodiment in FIG. 1B, which will not be repeated here.
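The luminous-flux comparison above follows from aperture area: for two lenses with the same focal length, the light gathered scales with the square of the aperture diameter. A small illustrative sketch (the diameters are hypothetical values, not taken from the patent):

```python
# Illustrative only: relative luminous flux of two lenses with the same
# focal length is proportional to the square of the aperture diameter D.

def relative_flux(d1_mm: float, d2_mm: float) -> float:
    """Ratio of light gathered by lens 1 vs lens 2 (same focal length)."""
    return (d1_mm / d2_mm) ** 2

# A first lens with a 10 mm aperture gathers 4x the light of a
# second lens with a 5 mm aperture.
print(relative_flux(10.0, 5.0))  # 4.0
```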
  • a filter for example, a filter 1051
  • a filter may also be provided between the second lens 1034 and the second image sensor 102 (For example, filter 1052).
  • the aforementioned filter 1051 is an infrared cut filter for filtering infrared light in the optical signal from the first lens 1033, so that only visible light is included in the first optical signal sensed by the first image sensor 101 It does not contain infrared light.
  • when the aforementioned filter 1052 is an infrared cut filter, the infrared light in the second optical signal can be filtered out; when the aforementioned filter 1052 is a visible light cut filter, the visible light in the second optical signal can be filtered out; when the aforementioned filter 1052 is a piece of white glass (the material of the white glass is colorless, transparent glass, which does not filter light), both the visible light and the infrared light in the second optical signal can pass, that is, the full-wavelength optical signal is allowed to pass through the white glass slide.
  • the second lens 1034 is an infrared confocal lens.
  • two lenses with the same focal length but different aperture sizes are used to form a binocular lens to achieve energy distribution (different sensors receive different light energy). Since the size of the aperture determines the intensity of the light that can pass through the lens, using apertures of different sizes allows the energy of the visible light entering the two image sensors to be controlled so that it differs. Combined with the aforementioned asymmetric image sensor architecture, and using different filters under different light intensities, the dynamic range of the finally output fused image can be increased when the light intensity is high (for example, during the day), and the colors of the target image are made more natural when the light intensity is low (for example, at night).
  • the aforementioned optical component 103 in FIG. 1A and the optical components involved in the following may adopt the implementation shown in FIG. 1B or the implementation shown in FIG. 1C, which is not limited here.
  • in addition to the resolutions, the photosensitive performance of the two image sensors may also be different. The following cases are introduced separately:
  • the first image sensor is a color image sensor
  • the second image sensor is a black and white image sensor. Since the resolution of the first image sensor is smaller than the resolution of the second image sensor, the aforementioned first image sensor is referred to as a low-resolution color image sensor, and the aforementioned second image sensor is referred to as a high-resolution black-and-white image sensor.
  • the aforementioned low-resolution color image sensor can be a Bayer image sensor or a color image sensor in another format; the aforementioned high-resolution black-and-white image sensor can be a MONO format image sensor (mono image sensor) or a black-and-white image sensor in another format, which is not limited here.
  • An infrared cut filter is arranged between the low-resolution color image sensor and the optical component, and a dual-optical filter is arranged between the high-resolution black-and-white image sensor and the optical component. This dual optical filter is also called an IR-CUT automatic switching filter.
  • the IR-CUT automatic switching filter is provided with a photosensitive device, or the IR-CUT automatic switching filter is connected to a photosensitive device. The photosensitive device transmits the sensed light intensity to the camera device (specifically, to a filter control chip in the camera device), so that the camera device can control the IR-CUT automatic switching filter to switch automatically.
  • the dual light filter can also be switched to a white glass plate, allowing both visible light and infrared light to pass.
  • the dual-optical filter can be replaced with an infrared cut-off filter, and the infrared cut-off filter is controlled by the camera device to enable and disable the infrared cut-off filter.
  • the low-resolution color image sensor senses visible light in the first light signal and outputs a low-resolution color image.
  • the low-resolution color image includes first color information and first brightness information.
  • the first color information is used to indicate the color of the low-resolution color image
  • the first brightness information is used to indicate the brightness of the low-resolution color image. This embodiment does not limit the specific forms of the first brightness information and the first color information.
  • the dual-optical filter set between the high-resolution black-and-white image sensor and the optical assembly is switched to a white glass plate, and the high-resolution black-and-white image sensor senses visible light and infrared light in the second light signal and outputs a high-resolution grayscale image.
  • the high-resolution gray-scale image includes second brightness information, and the second brightness information is used to indicate the brightness of the high-resolution gray-scale image.
  • the second brightness information may be represented by a brightness component Y.
  • the embodiment of the present application does not limit the specific form of the second brightness information.
  • the image directly generated by the aforementioned image sensor is in the RAW format.
  • the RAW format is also divided into multiple types, such as Bayer RGGB, RYYB, RCCC, RCCB, RGBW, CMYW and other formats.
  • using the ISP chip, RAW images in various formats can be converted into RGB format images.
  • using the ISP chip, RAW format images can also be converted into YUV format images, or HSV format images, or Lab format images, or CMY format images, or YCbCr format images.
  • the ISP chip first converts RAW format images into RGB format images, and then converts RGB format images into YUV format images.
  • the ISP chip in the camera device can also perform basic image processing on the aforementioned low-resolution color images and high-resolution grayscale images, such as 3D noise reduction, demosaicing, brightness correction, and color correction.
  • the image processor adopts an up-sampling algorithm or a super-resolution algorithm to adjust the aforementioned low-resolution color image to a high-resolution color image, and the high-resolution color image has the same resolution as the aforementioned high-resolution grayscale image. Then, the image processor fuses the aforementioned high-resolution color image and high-resolution gray-scale image to obtain the target image.
  • the low-resolution RGB format image 201 is converted into a low-resolution YUV format image 202, and then the low-resolution YUV format image 202 is up-sampled to obtain a high-resolution YUV format image 203 . Then, the high-resolution YUV format image 203 and the high-resolution grayscale image 204 with only the Y component are fused to obtain a high-resolution YUV format image 205 (ie, the target image).
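The pipeline just described (convert the low-resolution RGB image to YUV, upsample it to the grayscale image's resolution, then fuse luminance) can be sketched in numpy. The patent does not disclose the fusion algorithm; simply replacing the upsampled Y plane with the high-resolution Y plane, nearest-neighbour upsampling, and the BT.601 conversion matrix are all simplifying assumptions made only for illustration:

```python
import numpy as np

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """BT.601 full-range RGB -> YUV conversion (assumed; patent does not specify)."""
    m = np.array([[ 0.299,  0.587,  0.114],
                  [-0.147, -0.289,  0.436],
                  [ 0.615, -0.515, -0.100]])
    return rgb @ m.T

def fuse(low_res_rgb: np.ndarray, high_res_y: np.ndarray) -> np.ndarray:
    """Upsample the low-res color image and take luminance from high_res_y."""
    scale = high_res_y.shape[0] // low_res_rgb.shape[0]
    yuv = rgb_to_yuv(low_res_rgb)
    # Nearest-neighbour upsampling; a real camera would use a better filter
    # or a super-resolution algorithm, as the description notes.
    yuv = yuv.repeat(scale, axis=0).repeat(scale, axis=1)
    yuv[..., 0] = high_res_y  # luminance from the grayscale sensor
    return yuv

low = np.random.rand(4, 4, 3)    # 4x4 low-resolution color image
high_y = np.random.rand(8, 8)    # 8x8 high-resolution luminance plane
target = fuse(low, high_y)
print(target.shape)  # (8, 8, 3)
```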
  • YUV formats mainly include YUV420, YUV422 and YUV444.
  • YUV444 means that each Y component corresponds to a set of UV components
  • YUV422 means that every two Y components share a set of UV components
  • YUV420 means that every four Y components share a set of UV components.
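The sample counts implied by these subsampling schemes can be made concrete: for a W×H image, YUV444 carries one U and one V sample per Y sample, YUV422 one UV pair per two Y samples, and YUV420 one UV pair per four. A small sketch (the 1920×1080 size is just an example):

```python
def yuv_samples(width: int, height: int, fmt: str) -> int:
    """Total number of stored samples (Y + U + V) for the given YUV format."""
    y = width * height
    uv_pairs = {"YUV444": y, "YUV422": y // 2, "YUV420": y // 4}[fmt]
    return y + 2 * uv_pairs

print(yuv_samples(1920, 1080, "YUV444"))  # 6220800
print(yuv_samples(1920, 1080, "YUV420"))  # 3110400  (half the storage)
```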
  • FIG. 2B uses YUV420 as an example, in practical applications, the format of the image can be adjusted according to specific requirements, which is not specifically limited here.
  • the aforementioned YUV format and the RGB format are merely different color coding methods; a change of coding format does not affect the colors presented by the image.
  • the aforementioned high-resolution black and white image sensor simultaneously senses infrared light and a part of visible light, instead of only sensing infrared light or only sensing visible light. Therefore, the brightness of the second image output by the second image sensor by sensing infrared light and visible light is greater than the brightness of the second image output by the second image sensor by sensing only infrared light. Therefore, the quality of the second image can be improved, thereby improving the quality of the target image.
  • the low-resolution color image sensor senses visible light in the first light signal and outputs a low-resolution color image.
  • the low-resolution color image includes first color information and first brightness information.
  • the first color information is used to indicate the color of the low-resolution color image
  • the first brightness information is used to indicate the brightness of the low-resolution color image.
  • the embodiment of the present application does not limit the specific form of the first brightness information and the first color information.
  • the first color information is a U/V component
  • the first brightness information is a Y component.
  • the dual-optical filter arranged between the high-resolution black and white image sensor and the optical assembly is switched to an infrared cut-off filter, and the infrared cut-off filter is used to filter the infrared light in the second optical signal. Therefore, the high-resolution black and white image sensor senses the visible light in the second light signal and outputs a high-resolution grayscale image.
  • the high-resolution gray-scale image includes second brightness information
  • the second brightness information is used to indicate the brightness of the high-resolution gray-scale image.
  • the second brightness information may be represented by a brightness component Y. Since the high-resolution black-and-white image sensor cannot record colors, the second image only presents brightness and cannot present colors.
  • the second image has only the luminance component Y and no chrominance component U/V. It should be understood that when the format of the aforementioned image sensor is different, the format of the output image will also be different, which has been described in detail in the foregoing, and will not be repeated here.
  • the image processor can combine the first color information of the low-resolution color image with the second brightness information of the high-resolution grayscale image to obtain the target image.
  • the color of the target image is determined by the first color information
  • the brightness of the target image is determined by the second brightness information.
  • the image processor may combine the color component (ie U/V component) of the aforementioned low-resolution color image with the brightness component (ie Y component) of the high-resolution grayscale image to obtain the target image.
  • a higher-quality target image can be obtained without using a complex fusion algorithm, which can reduce the amount of data processing of the image processor.
  • the ratio of the resolution of the color image sensor to the resolution of the black-and-white image sensor is 1:4, so the ratio of the resolution of the low-resolution color image (that is, the first image) to the resolution of the high-resolution grayscale image (that is, the second image) is also 1:4.
  • the aforementioned low-resolution color image and high-resolution gray-scale image are both expressed in the YUV format.
  • the aforementioned low-resolution color image adopts the YUV444 format
  • the aforementioned high-resolution gray-scale image adopts the YUV420 format.
  • the image processor can output the target image in the YUV420 format.
  • the low-resolution RGB format image 211 is converted into a low-resolution YUV444 format image 212.
  • the U/V component in the low-resolution YUV444 format image 212 and the Y component in the high-resolution grayscale image 213 are combined to obtain a high-resolution YUV420 format image 214 (that is, the target image).
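Under the assumption that the 1:4 ratio above refers to pixel count (so the grayscale image is 2× the color image in each dimension), the combination needs no resampling at all: the low-resolution YUV444 U/V planes are exactly the chroma planes of a high-resolution YUV420 image, in which each 2×2 block of Y samples shares one UV pair. A sketch of that direct combination:

```python
import numpy as np

def combine_yuv420(low_res_yuv444: np.ndarray, high_res_y: np.ndarray):
    """Return YUV420 planes: high-res Y plus U/V taken from the color image.

    Assumes the grayscale image is exactly 2x the color image in each
    dimension (i.e. a 1:4 pixel-count ratio)."""
    h, w, _ = low_res_yuv444.shape
    assert high_res_y.shape == (2 * h, 2 * w)
    u = low_res_yuv444[..., 1]  # one U sample per 2x2 block of Y samples
    v = low_res_yuv444[..., 2]  # one V sample per 2x2 block of Y samples
    return high_res_y, u, v

low = np.random.rand(4, 4, 3)    # low-resolution YUV444 color image
high_y = np.random.rand(8, 8)    # high-resolution luminance plane
y, u, v = combine_yuv420(low, high_y)
print(y.shape, u.shape, v.shape)  # (8, 8) (4, 4) (4, 4)
```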
  • the ratio of the resolution of the aforementioned low-resolution color image sensor to the resolution of the high-resolution black-and-white image sensor can also be another value, for example, 1:2 or 1:16, etc., which is not specifically limited here.
  • the image format used in the image fusion process will also be adaptively adjusted, so as to output a better quality target image while reducing the amount of calculation to output the target image.
  • the ratio of visible light in the first optical signal to visible light in the second optical signal in this embodiment is 3:2.
  • the aforementioned optical component splits the beam according to frequency spectrum and energy at the same time, wherein 60% of the visible light in the incident light signal is directed to the low-resolution color image sensor, and 40% of the visible light and 100% of the infrared light in the incident light signal are directed to the high-resolution black-and-white image sensor.
  • due to the effect of the filters, the light signals actually entering the two image sensors can be further adjusted.
  • the first image sensor adopts a low-resolution color image sensor
  • the second image sensor adopts a high-resolution black-and-white image sensor.
  • the asymmetric image sensor architecture can reduce the lower limit of the operating illuminance of the camera device.
  • because the black and white image sensor does not need the color filter matrix of a color image sensor of the same specification, it has higher light transmittance and higher photoelectric conversion efficiency. Therefore, the brightness of the high-resolution grayscale image output by the high-resolution black-and-white image sensor (that is, the brightness indicated by the second brightness information) can be improved, thereby helping to improve the quality of the target image. Therefore, it is possible to further make the camera device work in an environment with lower light intensity.
  • both the aforementioned first image sensor and the aforementioned second image sensor are color image sensors.
  • the color image sensor may be a Bayer image sensor or a color image sensor in another format. Since the resolution of the first image sensor is smaller than the resolution of the second image sensor, the aforementioned first image sensor is referred to as a low-resolution color image sensor, and the aforementioned second image sensor is referred to as a high-resolution color image sensor.
  • An infrared cut filter is arranged between the low-resolution color image sensor and the optical component, and a dual-optical filter is arranged between the high-resolution color image sensor and the optical component.
  • the dual optical filter has been introduced in the previous embodiment corresponding to FIG. 2A, and will not be repeated here.
  • the low-resolution color image sensor senses visible light in the first light signal and outputs a low-resolution color image.
  • the low-resolution color image includes first color information and first brightness information.
  • the first color information is used to indicate the color of the low-resolution color image
  • the first brightness information is used to indicate the brightness of the low-resolution color image.
  • the embodiment of the present application does not limit the specific form of the first brightness information and the first color information.
  • the first color information is the U/V component
  • the first brightness information is the Y component.
  • the dual-optical filter provided between the high-resolution color image sensor and the optical component is switched to a visible light cut-off filter, and the high-resolution color image sensor senses the infrared light in the second light signal and outputs a high-resolution grayscale image.
  • the high-resolution gray-scale image includes second brightness information, and the second brightness information is used to indicate the brightness of the high-resolution gray-scale image.
  • the second brightness information can be represented by a brightness component Y. It should be understood that, although the high-resolution color image sensor can record colors, the high-resolution color image sensor only senses infrared light and no visible light, so the second image only presents brightness and cannot present colors. Therefore, the second image has only the luminance component Y and no chrominance component U/V.
  • the ISP chip in the camera device can perform the aforementioned ISP processing on the aforementioned low-resolution color image and the aforementioned high-resolution color image respectively. For example, 3D noise reduction, demosaicing, brightness correction and color correction and other processing.
  • the ISP chip can also adjust the format of the aforementioned low-resolution color image and high-resolution color image, for example, adjusting the Bayer format to the YUV format, etc., which is not specifically limited here.
  • the image processor adopts an up-sampling algorithm or a super-resolution algorithm to adjust the aforementioned low-resolution color image to a high-resolution color image, the high-resolution color image having the same resolution as the aforementioned high-resolution grayscale image. Then, the image processor fuses the aforementioned high-resolution color image and high-resolution grayscale image to obtain the target image.
  • the low-resolution RGB format image 301 is converted into a low-resolution YUV format image 302, and then the low-resolution YUV format image 302 is up-sampled to obtain a high-resolution YUV format image 303 . Then, the high-resolution YUV format image 303 and the Y-component grayscale image 304 are merged to obtain a high-resolution YUV format image 305 (ie, the target image).
  • the aforementioned high-resolution color image sensor when the light intensity is low, only senses infrared light in the second light signal to generate a high-resolution grayscale image with only brightness. Combining the aforementioned low-resolution color image and high-resolution grayscale image can retain the advantages of the aforementioned two images and improve the quality of the target image.
  • the low-resolution color image sensor senses visible light in the first light signal and outputs a low-resolution color image.
  • the low-resolution color image includes first color information and first brightness information. Specifically, it is similar to the case where the light intensity is higher than the preset value, and will not be repeated here.
  • the dual-optical filter arranged between the high-resolution color image sensor and the optical assembly is switched to an infrared cut-off filter, and the infrared cut-off filter is used to filter out the infrared light in the second optical signal. Therefore, the high-resolution color image sensor senses the visible light in the second light signal and outputs a high-resolution color image.
  • the high-resolution color image (that is, the aforementioned second image) includes not only the second brightness information but also the second color information.
  • the second color information is used to indicate the color of the high-resolution color image
  • the second brightness information is used to indicate the brightness of the high-resolution color image.
  • the embodiment of the present application does not limit the specific form of the second brightness information and the second color information.
  • the second color information is a U/V component
  • the second brightness information is a Y component.
  • the ISP chip in the camera device can perform the aforementioned ISP processing on the aforementioned low-resolution color images and high-resolution color images. For example, 3D noise reduction, demosaicing, brightness correction and color correction and other processing.
  • the ISP chip can also adjust the format of the aforementioned low-resolution color image and high-resolution color image, for example, adjusting the Bayer format to the YUV format, etc., which is not specifically limited here.
  • the image processor adopts an up-sampling algorithm or a super-resolution algorithm to adjust the aforementioned low-resolution color image to a high-resolution color image, so that the aforementioned two high-resolution color images have the same resolution. Then, the image processor fuses the aforementioned two high-resolution color images to obtain the target image, which is helpful for improving the dynamic range of the target image.
  • the low-resolution RGB format image 311 is converted into a low-resolution YUV format image 312, and then the low-resolution YUV format image 312 is up-sampled to obtain a high-resolution YUV format image 313 .
  • the high-resolution RGB format image 314 is converted into the high-resolution YUV format image 315.
  • the high-resolution YUV format image 313 and the high-resolution YUV format image 315 are merged to obtain a high-resolution YUV format image 316 (that is, the target image).
  • the target image can have the advantages of the aforementioned two images, and the dynamic range is improved while the quality of the target image is improved.
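The patent does not disclose how the two same-resolution color images of different brightness are fused; a per-pixel exposure-weighted average is one common way to extend dynamic range, shown here purely as an illustrative assumption (luminance is assumed to lie in [0, 1] in channel 0):

```python
import numpy as np

def fuse_dynamic_range(bright: np.ndarray, dark: np.ndarray) -> np.ndarray:
    """Blend two YUV images, weighting each pixel by distance from clipping."""
    # Well-exposed pixels (luminance near 0.5) get high weight; pixels near
    # 0 or 1 (underexposed / saturated) get low weight.
    w_bright = 1.0 - np.abs(bright[..., 0] - 0.5) * 2.0
    w_dark = 1.0 - np.abs(dark[..., 0] - 0.5) * 2.0
    w_sum = np.clip(w_bright + w_dark, 1e-6, None)
    return (bright * w_bright[..., None]
            + dark * w_dark[..., None]) / w_sum[..., None]

a = np.clip(np.random.rand(8, 8, 3), 0.0, 1.0)  # brighter exposure
b = np.clip(np.random.rand(8, 8, 3), 0.0, 1.0)  # darker exposure
fused = fuse_dynamic_range(a, b)
print(fused.shape)  # (8, 8, 3)
```

Because the weights are normalized, each output pixel is a convex combination of the two inputs, so the fused image stays within the input value range.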
  • FIG. 3B and FIG. 3C use YUV420 as examples, in actual applications, the image format can be adjusted according to specific requirements, which is not specifically limited here.
  • the aforementioned YUV format and the RGB format are different color coding methods, the change of the coding format will not affect the colors presented by the image.
  • the ratio of visible light in the first optical signal to visible light in the second optical signal in this embodiment is 4:1.
  • the aforementioned optical component splits the beam according to frequency spectrum and energy at the same time, wherein 80% of the visible light in the incident light signal is directed to the low-resolution color image sensor, and 20% of the visible light and 100% of the infrared light in the incident light signal are directed to the high-resolution color image sensor. The filters further adjust the light signals actually entering the two image sensors.
  • compared with the aforementioned high-resolution black-and-white image sensor, the high-resolution color image sensor can output a color image. Since the energy of the visible light sensed by the low-resolution color image sensor differs from that sensed by the high-resolution color image sensor, the brightness of the output low-resolution color image and high-resolution color image also differs. Fusing the aforementioned two images can not only increase the dynamic range but also make the colors of the target image more realistic.
  • the present invention also provides an image processing method for performing the functions of the image processor in the foregoing embodiments, for example: adjusting the resolution of the foregoing first image to be the same as the resolution of the foregoing second image; and generating a target image based on the foregoing first image and the foregoing second image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present invention relate to a camera device using asymmetric image sensors. The camera device comprises two sensors with different resolutions: the image sensor with the low resolution generates a color image, and the image sensor with the high resolution generates a color image or a grayscale image. Color information and luminance information of the images generated by the two image sensors are acquired and fused to generate a target image, improving the imaging effect of a camera in a low-illumination scene.
PCT/CN2020/080408 2020-03-20 2020-03-20 Dispositif de caméra WO2021184353A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/080408 WO2021184353A1 (fr) 2020-03-20 2020-03-20 Dispositif de caméra
CN202080000505.1A CN113728618A (zh) 2020-03-20 2020-03-20 一种摄像装置
CN202410417378.5A CN118264914A (zh) 2020-03-20 2020-03-20 一种摄像装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/080408 WO2021184353A1 (fr) 2020-03-20 2020-03-20 Dispositif de caméra

Publications (1)

Publication Number Publication Date
WO2021184353A1 true WO2021184353A1 (fr) 2021-09-23

Family

ID=77769970

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/080408 WO2021184353A1 (fr) 2020-03-20 2020-03-20 Dispositif de caméra

Country Status (2)

Country Link
CN (2) CN113728618A (fr)
WO (1) WO2021184353A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102244791A (zh) * 2010-05-12 2011-11-16 索尼公司 图像处理设备、图像处理方法和程序
US20160205374A1 (en) * 2005-08-25 2016-07-14 Callahan Cellular L.L.C. Digital cameras with direct luminance and chrominance detection
CN107197168A (zh) * 2017-06-01 2017-09-22 松下电器(中国)有限公司苏州***网络研究开发分公司 图像采集方法以及应用该方法的图像采集***
CN107563971A (zh) * 2017-08-12 2018-01-09 四川精视科技有限公司 一种真彩高清夜视成像方法
CN107820066A (zh) * 2017-08-12 2018-03-20 四川聚强创新科技有限公司 一种低照度彩色摄像机
CN109040534A (zh) * 2017-06-12 2018-12-18 杭州海康威视数字技术股份有限公司 一种图像处理方法及图像采集设备
CN208890917U (zh) * 2018-11-14 2019-05-21 杭州海康威视数字技术股份有限公司 一种镜头组件及摄像机
CN110891138A (zh) * 2018-09-10 2020-03-17 杭州萤石软件有限公司 黑光全彩实现方法和黑光全彩摄像机

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105554483B (zh) * 2015-07-16 2018-05-15 宇龙计算机通信科技(深圳)有限公司 一种图像处理方法及终端
JP6717690B2 (ja) * 2016-07-01 2020-07-01 オリンパス株式会社 撮像装置、撮像方法、およびプログラム
CN106454149B (zh) * 2016-11-29 2019-03-19 Oppo广东移动通信有限公司 图像拍摄方法、装置及终端设备


Also Published As

Publication number Publication date
CN118264914A (zh) 2024-06-28
CN113728618A (zh) 2021-11-30

Similar Documents

Publication Publication Date Title
WO2021184362A1 (fr) Dispositif de photographie
CN108965654A (zh) 基于单传感器的双光谱摄像机***和图像处理方法
KR101428635B1 (ko) 듀얼 이미지 캡쳐 프로세싱
US8803994B2 (en) Adaptive spatial sampling using an imaging assembly having a tunable spectral response
CN110365878B (zh) 一种摄像装置和方法
CN108712608A (zh) 终端设备拍摄方法和装置
KR20100103504A (ko) 칼라-모자이크 이미저로부터 전정색 응답을 성취하는 방법 및 장치
WO2020168465A1 (fr) Dispositif et procédé de traitement d'image
WO2022222634A1 (fr) Procédé de traitement d'image, appareil de traitement d'image, dispositif électronique et support de stockage
CN104581103A (zh) 一种图片处理方法及装置
CN111970432A (zh) 一种图像处理方法及图像处理装置
WO2020146118A1 (fr) Balance automatique des blancs assistée par enroulement de lentille
CN113556526A (zh) 一种基于rgbw滤光阵列的彩色夜视设备色彩增强方法
TW201638620A (zh) 鏡頭模組陣列、影像感測裝置與數位縮放影像融合方法
US20060033824A1 (en) Sodium screen digital traveling matte methods and apparatus
CN109068111A (zh) 一种监控装置、监控方法和电子设备
WO2021184353A1 (fr) Dispositif de caméra
CN112217962B (zh) 摄像机及图像生成方法
CN107613183A (zh) 一种摄像头***和摄像头***的应用方法
Nonaka et al. Monocular color-IR imaging system applicable for various light environments
US20200228769A1 (en) Lens rolloff assisted auto white balance
JP5545596B2 (ja) 画像入力装置
CN114143443B (zh) 双传感器摄像***及其摄像方法
WO2022032666A1 (fr) Procédé de traitement d'image et appareil associé
CN208739306U (zh) 一种监控装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925757

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20925757

Country of ref document: EP

Kind code of ref document: A1