WO2020238905A1 - Image fusion device and method - Google Patents


Info

Publication number
WO2020238905A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image signal
target image
unit
target
Prior art date
Application number
PCT/CN2020/092364
Other languages
French (fr)
Chinese (zh)
Inventor
罗丽红
聂鑫鑫
於敏杰
Original Assignee
杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Publication of WO2020238905A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 23/56 Cameras or camera modules provided with illuminating means
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • H04N 23/743 Bracketing, i.e. taking a series of images with varying exposure conditions
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/81 Camera processing pipelines for suppressing or minimising disturbance in the image signal generation
    • H04N 23/84 Camera processing pipelines for processing colour signals
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • This application relates to the field of image processing technology, and in particular to an image fusion device and method.
  • Image fusion combines, according to a given criterion, the complementary information of different images of the same target, so that the fused image has better properties than any single image participating in the fusion and reflects the actual scene information more accurately.
  • The image fusion scheme in related technologies collects visible light images and non-visible light images through a single camera equipped with a light-splitting structure and two image sensors, performs registration processing, and then performs fusion to generate a fused image.
  • The light-splitting structure is used to decompose incident light into a visible light signal and a non-visible light signal.
  • The above-mentioned solution requires two image sensors and a complicated light-splitting structure; its design process is complicated and its cost is high.
  • The present application provides an image fusion device and method, which simplify the structure of image acquisition and thereby reduce cost.
  • this application provides an image fusion device, including:
  • the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
  • the light supplementer includes a first light supplement device, and the first light supplement device is used to perform near-infrared supplementary light, wherein near-infrared supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure;
  • the filter assembly includes a first filter, and the first filter is used to pass visible light and part of near-infrared light;
  • the processor includes a cache unit and an image processing unit
  • the buffer unit is configured to buffer the first target image signal when it learns that the first target image signal currently output by the image sensor needs to be buffered, and to synchronously output at least the buffered second target image signal to the image processing unit when it learns that the buffered second target image signal needs to be output synchronously; wherein, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
  • the image processing unit is configured to receive at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and to generate a color fusion image according to the first target image signal and the second target image signal.
  • this application provides an image fusion device, including:
  • the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
  • the light supplementer includes a first light supplement device, and the first light supplement device is used to perform near-infrared supplementary light, wherein near-infrared supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure;
  • the filter assembly includes a first filter, and the first filter is used to pass visible light and part of near-infrared light;
  • the processor includes a cache unit and an image processing unit
  • the image processing unit is configured to receive the first target image signal currently output by the image sensor, preprocess the first target image signal to obtain the first target image, synchronously output at least the first target image to the buffer unit for buffering when the first target image needs to be buffered, receive at least the second target image synchronously output by the buffer unit when the buffer unit needs to synchronously output the second target image it has buffered, and generate a color fusion image based on the first target image and the second target image; wherein, if the first target image signal is the first image signal, the first target image is the image generated after preprocessing the first image signal, the second target image is a buffered frame of the image generated after preprocessing the second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is the image generated after preprocessing the second image signal, and the second target image is a buffered frame of the image generated after preprocessing the first image signal;
  • the cache unit is configured to cache at least the first target image synchronously output by the image processing unit when it learns that the first target image needs to be cached, and to synchronously output at least the cached second target image to the image processing unit when it learns that the cached second target image needs to be output synchronously.
  • an embodiment of the present application provides an image fusion method, which is applied to an image fusion device.
  • the image fusion device includes an image sensor, a light supplementer, a filter component, and a processor, and the image sensor is located on the light-exit side of the filter component.
  • the light supplementer includes a first light supplement device
  • the filter component includes a first filter
  • the processor includes a buffer unit and an image processing unit
  • the method includes:
  • near-infrared supplementary light is performed by the first light supplement device, wherein near-infrared supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure; the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
  • the image sensor performs multiple exposures in a global exposure mode to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure and the second image signal is an image signal generated according to the second preset exposure;
  • when the buffer unit learns that the first target image signal currently output by the image sensor needs to be buffered, it buffers the first target image signal, and when it learns that the buffered second target image signal needs to be output synchronously, it synchronously outputs at least the buffered second target image signal to the image processing unit; wherein, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
  • the image processing unit receives at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and generates a color fusion image according to the first target image signal and the second target image signal.
  • an embodiment of the present application provides an image fusion method, which is applied to an image fusion device.
  • the image fusion device includes an image sensor, a light supplementer, a filter component, and a processor, and the image sensor is located on the light-exit side of the filter component.
  • the light supplementer includes a first light supplement device
  • the filter component includes a first filter
  • the processor includes a buffer unit and an image processing unit
  • the method includes:
  • near-infrared supplementary light is performed by the first light supplement device, wherein near-infrared supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure; the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
  • the image sensor performs multiple exposures in a global exposure mode to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure and the second image signal is an image signal generated according to the second preset exposure;
  • the image processing unit receives the first target image signal currently output by the image sensor and preprocesses the first target image signal to obtain the first target image; when the first target image needs to be cached, at least the first target image is synchronously output to the buffer unit for buffering, and when the buffer unit needs to synchronously output the second target image it has buffered, at least the second target image synchronously output by the buffer unit is received, and a color fusion image is generated based on the first target image and the second target image; wherein, if the first target image signal is the first image signal, the first target image is the image generated after preprocessing the first image signal, the second target image is a buffered frame of the image generated after preprocessing the second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is the image generated after preprocessing the second image signal, and the second target image is a buffered frame of the image generated after preprocessing the first image signal;
  • when the cache unit learns that the first target image needs to be cached, it caches at least the first target image synchronously output by the image processing unit, and when it learns that the cached second target image needs to be output synchronously, it synchronously outputs at least the cached second target image to the image processing unit.
  • the image fusion device and method provided by the embodiments of the present application include a filter component, a single image sensor, a light supplementer, and a processor; the image sensor is used to generate and output a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure and the second image signal is an image signal generated according to a second preset exposure; the light supplementer is used to perform near-infrared supplementary light, which is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure;
  • the filter assembly includes a first filter, and the first filter is used to pass visible light and part of the near-infrared light;
  • the processor includes a buffer unit and an image processing unit; the buffer unit is used to buffer the first target image signal when it learns that the first target image signal currently output by the image sensor needs to be buffered;
  • the structure of image acquisition is simple, which reduces cost; in any period of time, the first image signal containing near-infrared light information and the second image signal containing visible light information can be collected simultaneously through the first preset exposure and the second preset exposure, and the first image signal and the second image signal are subsequently fused, so the resulting color fusion image is of higher quality.
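The fill-light timing constraint above (near-infrared supplementary light present only during part of the first preset exposure, absent during the whole second preset exposure) can be sketched as a toy timeline. The frame timings below are illustrative assumptions, not values from the application:

```python
def fill_light_on(t, schedule):
    """Return True if the NIR fill light is on at time t (milliseconds)."""
    return any(start <= t < end for start, end in schedule)

# Assumed timing: the first preset exposure spans 0-10 ms and the second
# preset exposure spans 10-20 ms; the fill light covers only part of the
# first exposure (2-8 ms) and none of the second.
nir_schedule = [(2, 8)]
```

Checking a time inside the first exposure against one inside the second exposure reproduces the stated constraint: the fill light overlaps only the first preset exposure.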
  • FIG. 1 is a schematic structural diagram of a first image acquisition device provided by an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an image fusion device provided by an embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of another image fusion device provided by an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an image processing unit provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a principle of image caching provided by an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of an image preprocessing unit provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 8 is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 9A is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 9B is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 9C is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of another image fusion device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of yet another image fusion device provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of another image caching principle provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 15A is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 15B is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 15C is a schematic diagram of another image cache synchronization principle provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of an image fusion processing principle provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of the relationship between the wavelength and relative intensity of near-infrared supplementary light performed by a first light supplement device provided by an embodiment of the present application.
  • FIG. 18 is a schematic diagram of the relationship between the wavelength of the light passing through the first filter and the pass rate according to an embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of a second image acquisition device provided by an embodiment of the present application.
  • FIG. 20 is a schematic diagram of an RGB sensor provided by an embodiment of the present application.
  • FIG. 21 is a schematic diagram of an RGBW sensor provided by an embodiment of the present application.
  • FIG. 22 is a schematic diagram of an RCCB sensor provided by an embodiment of the present application.
  • FIG. 23 is a schematic diagram of a RYYB sensor provided by an embodiment of the present application.
  • FIG. 24 is a schematic diagram of a sensing curve of an image sensor according to an embodiment of the present application.
  • FIG. 25 is a schematic diagram of a rolling shutter exposure method provided by an embodiment of the present application.
  • FIG. 26 is a schematic diagram of a first preset exposure and a second preset exposure according to an embodiment of the present application.
  • FIG. 27 is a schematic diagram of a second type of first preset exposure and a second preset exposure provided by an embodiment of the present application.
  • FIG. 28 is a schematic diagram of a third type of first preset exposure and a second preset exposure provided by an embodiment of the present application.
  • FIG. 29 is a schematic diagram of the first rolling shutter exposure method and near-infrared light supplement provided by an embodiment of the present application.
  • FIG. 30 is a schematic diagram of a second rolling shutter exposure method and near-infrared light supplement provided by an embodiment of the present application.
  • FIG. 31 is a schematic diagram of a third rolling shutter exposure method and near-infrared light supplement provided by an embodiment of the present application.
  • FIG. 32 is a schematic structural diagram of a first joint noise reduction unit provided by an embodiment of the present application.
  • FIG. 33 is a schematic structural diagram of a second joint noise reduction unit provided by an embodiment of the present application.
  • FIG. 34 is a schematic structural diagram of a third type of joint noise reduction unit provided by an embodiment of the present application.
  • FIG. 35 is a schematic flowchart of an image fusion method provided by an embodiment of the present application.
  • FIG. 36 is a schematic flowchart of an image fusion method provided by an embodiment of the present application.
  • 01: image sensor; 02: light supplementer; 03: filter component; 04: lens;
  • 021: first light supplement device;
  • 022: second light supplement device;
  • 031: first filter;
  • 032: second filter;
  • 033: switching component.
  • FIG. 1 is a schematic structural diagram of an image acquisition device provided by an embodiment of the present application.
  • the image acquisition device includes an image sensor 01, a light supplementer 02, and a filter component 03.
  • the image sensor 01 is located on the light-exit side of the filter component 03.
  • the image sensor 01 is used to generate and output a first image signal and a second image signal through multiple exposures.
  • the first image signal is an image signal generated according to a first preset exposure
  • the second image signal is an image signal generated according to a second preset exposure
  • the first preset exposure and the second preset exposure are two of the multiple exposures.
  • the light supplementer 02 includes a first light supplement device 021.
  • the first light supplement device 021 is used to perform near-infrared supplementary light, wherein near-infrared supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure.
  • the filter assembly 03 includes a first filter 031.
  • the first filter 031 allows light in the visible light band and part of the near-infrared light to pass.
  • when the first light supplement device 021 performs near-infrared supplementary light, the intensity of the near-infrared light passing through the first filter 031 is higher than the intensity of the near-infrared light passing through the first filter 031 when the first light supplement device 021 does not perform near-infrared supplementary light.
  • the near-infrared light band passing through the first filter 031 may be part of the near-infrared band.
  • the image fusion device may include the image acquisition device, that is, the image sensor 01, the light supplementer 02, and the filter component 03, as well as a processor; the processor includes a buffer unit and an image processing unit.
  • the buffering unit is used to buffer the first target image signal when it learns that the first target image signal currently output by the image sensor needs to be buffered, and to synchronously output at least the buffered second target image signal to the image processing unit when it learns that the buffered second target image signal needs to be output synchronously; wherein, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
  • the image processing unit is configured to receive at least the first target image signal currently output by the image sensor, and at least the second target image signal synchronously output by the buffer unit, and generate a color fusion image according to the first target image signal and the second target image signal.
  • the processor may further include: a synchronization unit;
  • the synchronization unit is used to instruct the buffer unit to buffer the first target image signal when it determines that the first target image signal currently output by the image sensor needs to be buffered, and to instruct the buffer unit to synchronously output the second target image signal to the image processing unit when it determines, among the buffered image signals, that the second target image signal needs to be output synchronously.
  • the image processing unit may include: an image preprocessing unit and an image fusion unit;
  • An image preprocessing unit configured to preprocess the first target image signal to generate a first target image, and preprocess the second target image signal to generate a second target image;
  • the image fusion unit is used to perform fusion processing on the first target image and the second target image to obtain a color fusion image.
  • if the first target image signal is the first image signal, the first target image generated after preprocessing is a black-and-white image, and the second target image generated after preprocessing is a color image;
  • if the first target image signal is the second image signal, the first target image generated after preprocessing is a color image, and the second target image generated after preprocessing is a black-and-white image.
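The application does not fix a fusion algorithm at this point. One common way to fuse a black-and-white (NIR) image with a color image is to blend the NIR luminance into the color pixel's brightness while preserving its hue; the per-pixel sketch below is an assumption for illustration (BT.601 luma weights, hypothetical `weight` parameter), not the patented method:

```python
def fuse_pixel(nir_luma, color_rgb, weight=0.5):
    """Blend NIR luminance into a color pixel while keeping its hue."""
    r, g, b = color_rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b       # BT.601 luma of the color pixel
    fused_luma = weight * nir_luma + (1 - weight) * luma
    scale = fused_luma / luma if luma > 0 else 1.0  # rescale brightness
    return tuple(min(255.0, c * scale) for c in (r, g, b))
```

Scaling all three channels by the same factor changes brightness but not the channel ratios, which is why the hue of the color image survives the fusion.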
  • the first target image signal output earlier by the image sensor is stored in the buffer and is output to the image processing unit only after the image sensor outputs the second target image signal, so as to realize synchronization between the first target image signal and the second target image signal; the two are then processed by the image preprocessing unit and the image fusion unit of the image processing unit.
  • the image preprocessing unit includes: a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit;
  • the first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain a preprocessed first target image
  • a second preprocessing unit configured to perform a second preprocessing operation on the second target image signal to obtain a second target image
  • the joint noise reduction unit is used to filter the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, which are used for fusion processing to obtain a color fusion image.
  • the first preprocessing operation includes at least one of the following: image interpolation, gamma mapping, and color conversion; the second preprocessing operation includes at least one of the following: white balance, image interpolation, and gamma mapping.
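As a hedged illustration of the listed operations, the toy functions below act on single samples or RGB triples; the gains and the gamma value are made-up parameters for demonstration, not values from the application:

```python
def white_balance(rgb, gains=(1.2, 1.0, 0.8)):
    """Scale each channel by a per-channel white-balance gain (illustrative values)."""
    return tuple(c * g for c, g in zip(rgb, gains))

def gamma_map(value, gamma=2.2, max_val=255.0):
    """Map a linear sample into gamma space on a 0-255 scale."""
    return max_val * (value / max_val) ** (1.0 / gamma)

def interpolate(a, b, t):
    """Linear image interpolation between two neighbouring samples (0 <= t <= 1)."""
    return (1 - t) * a + t * b
```

In a real pipeline these would run over whole image arrays (and image interpolation would be 2-D demosaicing), but the per-sample forms show what each listed operation computes.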
  • the buffer unit may respectively store the first target image signal and fetch the second target image signal in one frame period.
  • the specific scheme is as follows:
  • the synchronization unit is used to determine that the first target image signal of each frame needs to be buffered and that the second target image signal needs to be output synchronously, where the second target image signal is the image signal previously buffered by the buffer unit;
  • when the buffer unit currently buffers the second image signal, it determines the previously buffered first image signal as the second target image signal and outputs it to the image preprocessing unit;
  • when the buffer unit currently buffers the first image signal, it determines the previously buffered second image signal as the second target image signal and outputs it to the image preprocessing unit.
  • the image sensor may alternately output the first image signal and the second image signal, or output them in another pattern, for example outputting one second image signal after several first image signals; this is not limited here.
  • FIG. 7 takes the image sensor alternately outputting the first image signal and the second image signal as an example for description.
  • when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to store the second image signal M-2 and to output the previously buffered first image signal M-3 from the buffer unit; the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-3 to obtain a color fusion image. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1 and to output the previously buffered second image signal M-2 from the buffer unit. When the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to store the second image signal M and to output the previously buffered first image signal M-1 from the buffer unit, and so on.
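The same-frame-period behaviour described above (store the current frame, read out the previously stored frame in the same frame period) can be simulated on frame labels. The list-based data model is an assumption for illustration; the labels follow the FIG. 7 example:

```python
def sync_same_period(frames):
    """Each frame period: pair the previously stored frame with the current one,
    then store the current frame in its place."""
    buffered, fused = None, []
    for frame in frames:
        if buffered is not None:
            fused.append((buffered, frame))  # previous frame pairs with current
        buffered = frame                     # current frame replaces it
    return fused

stream = ["M-3", "M-2", "M-1", "M"]          # alternating first/second signals
pairs = sync_same_period(stream)
```

Every frame period after the first yields one fused pair, so this variant produces one color fusion image per frame, matching the description.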
  • the buffer unit may store the first target image signal and fetch the second target image signal in different frame periods, respectively.
  • the specific solution is as follows:
  • the synchronization unit is used to determine that the first target image signal needs to be buffered when it is the first image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the second image signal, where the second target image signal is the most recently buffered first image signal among the image signals buffered by the buffer unit; wherein, if the first target image signal is the second image signal, the buffer unit determines the most recently buffered first image signal as the second target image signal and outputs it to the image preprocessing unit; if the first target image signal is the first image signal, the buffer unit buffers the first image signal; or,
  • the synchronization unit is used to determine that the first target image signal needs to be buffered when it is the second image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the first image signal, where the second target image signal is the most recently buffered second image signal among the image signals buffered by the buffer unit; wherein, if the first target image signal is the first image signal, the buffer unit determines the most recently buffered second image signal as the second target image signal and outputs it to the image preprocessing unit; if the first target image signal is the second image signal, the buffer unit buffers the second image signal.
  • Fig. 8 takes the image sensor alternately outputting the first image signal and the second image signal as an example for illustration.
  • when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-3 from the buffer unit, and the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-3 to obtain a color fusion image; when the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1, and the image processing unit does not perform processing; when the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-1 from the buffer unit, and the image processing unit performs fusion processing on the second image signal M and the first image signal M-1 to obtain a color fusion image, and so on.
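The buffering timing just described (first image signals are stored; each arriving second image signal is paired with the most recently buffered first image signal and fused) can be sketched as follows. The `(kind, frame)` stream and the `fuse` callback are illustrative stand-ins, not part of the patent:

```python
def synchronize_and_fuse(frames, fuse):
    """Pair each incoming second image signal with the most recently
    buffered first image signal, per the Fig. 8 timing: first image
    signals are buffered; second image signals trigger fusion."""
    buffered_first = None          # the buffer unit holds one first image signal
    fused = []
    for kind, frame in frames:     # kind is 'first' (NIR) or 'second' (visible)
        if kind == 'first':
            buffered_first = frame     # synchronization unit: store, no processing
        elif buffered_first is not None:
            fused.append(fuse(buffered_first, frame))   # synchronous output + fusion
    return fused
```

With an alternating stream first/second/first/second, every second image signal is fused with the first image signal of the preceding frame, matching the pairing of M with M-1 above.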
  • the image sensor outputs a second image signal at intervals of two first image signals, and the buffer unit only buffers the second image signal.
  • when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to buffer the second image signal M-2, and the image processing unit does not perform processing;
  • when the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to output the most recently buffered second image signal M-2, and the image processing unit fuses the second image signal M-2 and the first image signal M-1 to obtain a color fusion image;
  • when the image sensor outputs the first image signal M, the synchronization unit instructs the buffer unit to output the most recently buffered second image signal M-2, and the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M to obtain a color fusion image, and so on.
  • the first image signal may not be buffered in every frame; it may instead be stored at intervals of several first image signals. Referring to Fig. 9B, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-5 from the buffer unit, and the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-5 to obtain a color fusion image; when the image sensor outputs the second image signal M+2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-1 from the buffer unit, and the image processing unit performs fusion processing on the second image signal M+2 and the first image signal M-1 to obtain a color fusion image, and so on.
  • the first target image signal and the second target image signal can be synchronously output in one frame period.
  • the synchronization unit is used to determine that the first target image signal of each frame needs to be buffered, and that the most recently buffered second target image signal and the most recently buffered first target image signal need to be output synchronously;
  • if the buffer unit is currently buffering the second image signal, it outputs the most recently buffered first image signal and the most recently buffered second image signal;
  • if the buffer unit is currently buffering the first image signal, it outputs the most recently buffered first image signal and the most recently buffered second image signal.
  • Fig. 9C takes the image sensor alternately outputting the first image signal and the second image signal as an example for description.
  • when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to store the second image signal M-2, and the image processing unit does not perform processing; when the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1 and to output the most recently buffered first image signal M-3 and second image signal M-2 from the buffer unit, and the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-1 to obtain a color fusion image; when the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to store the second image signal M, and the image processing unit does not perform processing; when the image sensor outputs the first image signal M+1, the synchronization unit instructs the buffer unit to store the first image signal M+1 and to output the most recently buffered second image signal M and first image signal M-1 from the buffer unit, and so on.
  • multiple images with different spectral ranges are generated through multiple exposures of the image sensor and light supplementation, which expands the image acquisition capability of a single sensor and improves image quality in different scenarios;
  • the processor has an image cache function that can achieve synchronization between images with different exposure time periods, and an image fusion function that can generate fused images with an improved signal-to-noise ratio.
  • the joint noise reduction unit is specifically used for:
  • the first target image and the second target image are respectively subjected to joint filtering processing to obtain the first target image and the second target image after noise reduction.
  • the joint noise reduction unit includes a temporal noise reduction unit or a spatial noise reduction unit;
  • the temporal noise reduction unit is used to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain the denoised first target image, and perform temporal filtering on the second target image according to the motion estimation result to obtain the denoised second target image;
  • the spatial noise reduction unit is used to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain the denoised first target image, and perform spatial filtering on the second target image according to the edge estimation result to obtain the denoised second target image.
  • the joint noise reduction unit includes a temporal noise reduction unit and a spatial noise reduction unit;
  • the temporal noise reduction unit is used to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain a first temporal noise reduction image, and perform temporal filtering on the second target image according to the motion estimation result to obtain a second temporal noise reduction image;
  • the spatial noise reduction unit is used to perform edge estimation according to the first temporal noise reduction image and the second temporal noise reduction image to obtain an edge estimation result, perform spatial filtering on the first temporal noise reduction image according to the edge estimation result to obtain the denoised first target image, and perform spatial filtering on the second temporal noise reduction image according to the edge estimation result to obtain the denoised second target image;
  • alternatively, the spatial noise reduction unit is used to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain a first spatial noise reduction image, and perform spatial filtering on the second target image according to the edge estimation result to obtain a second spatial noise reduction image;
  • the temporal noise reduction unit is then used to perform motion estimation based on the first spatial noise reduction image and the second spatial noise reduction image to obtain a motion estimation result, perform temporal filtering on the first spatial noise reduction image according to the motion estimation result to obtain the denoised first target image, and perform temporal filtering on the second spatial noise reduction image according to the motion estimation result to obtain the denoised second target image.
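The temporal-then-spatial chain above can be sketched as follows on grayscale frames. The patent does not fix the motion or edge estimators, so this sketch uses assumed placeholders: per-pixel joint frame differencing for motion, a gradient-magnitude threshold for edges, and fixed `motion_thresh`/`blend` parameters:

```python
import numpy as np

def temporal_then_spatial(cur_vis, cur_nir, prev_vis, prev_nir,
                          motion_thresh=10.0, blend=0.5):
    """Sketch of joint temporal filtering followed by joint spatial filtering.
    Motion is estimated jointly from both streams (mean absolute frame
    difference); static pixels are temporally averaged.  Edges are then
    estimated from the filtered NIR image, and a 3x3 box filter is applied
    only on flat regions.  The concrete estimators are illustrative."""
    # Joint motion estimate: combine frame differences of both streams.
    diff = 0.5 * (np.abs(cur_vis - prev_vis) + np.abs(cur_nir - prev_nir))
    static = diff < motion_thresh
    # Temporal filtering: blend with the previous frame where static.
    t_vis = np.where(static, blend * cur_vis + (1 - blend) * prev_vis, cur_vis)
    t_nir = np.where(static, blend * cur_nir + (1 - blend) * prev_nir, cur_nir)
    # Joint edge estimate: gradient magnitude of the NIR image.
    gy, gx = np.gradient(t_nir)
    edges = np.hypot(gx, gy)
    flat = edges < np.median(edges)
    # Spatial filtering: 3x3 box filter applied only on flat regions.
    k = np.ones((3, 3)) / 9.0
    def box(img):
        p = np.pad(img, 1, mode='edge')
        out = sum(p[i:i + img.shape[0], j:j + img.shape[1]] * k[i, j]
                  for i in range(3) for j in range(3))
        return np.where(flat, out, img)
    return box(t_vis), box(t_nir)
```

Because both filters share one motion estimate and one edge estimate, structure visible in either spectral band protects the corresponding pixels in both outputs, which is the point of joint (rather than independent) noise reduction.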
  • the image fusion device may include an image acquisition device, that is, an image sensor 01, a light supplement 02, and a filter component 03, and a processor, where the processor includes a buffer unit and an image processing unit.
  • an image processing unit, configured to receive the first target image signal currently output by the image sensor, preprocess the first target image signal to obtain the first target image, synchronously output at least the first target image to the buffer unit for buffering when the first target image needs to be cached, receive at least the second target image synchronously output by the buffer unit when the buffer unit needs to synchronously output the second target image it has buffered, and generate a color fusion image based on the first target image and the second target image;
  • wherein, when the first target image signal is the first image signal, the first target image is the image generated after preprocessing the first image signal, and the second target image is a buffered frame of the image generated after preprocessing the second target image signal, the second target image signal being the second image signal; when the first target image signal is the second image signal, the first target image is the image generated after preprocessing the second image signal, and the second target image is a buffered frame of the image generated after preprocessing the second target image signal, the second target image signal being the first image signal.
  • the cache unit is used to cache at least the first target image synchronously output by the image processing unit when it learns that the first target image needs to be cached, and to synchronously output at least the cached second target image to the image processing unit when it learns that the cached second target image needs to be output synchronously.
  • the processor further includes a synchronization unit; the synchronization unit is used to determine that the first target image generated by the preprocessing of the image processing unit needs to be cached and instruct the cache unit to cache it, and to determine that the second target image needs to be output from the cached images and instruct the cache unit to synchronously output the second target image to the image processing unit.
  • the image processing unit includes: an image preprocessing unit and an image fusion unit;
  • An image preprocessing unit configured to preprocess the first target image signal to generate a first target image, and preprocess the second target image signal to generate a second target image;
  • the image fusion unit is used to perform fusion processing on the first target image and the second target image to obtain a color fusion image.
  • wherein, when the first target image signal is the first image signal, the first target image generated after preprocessing is a black and white image and the second target image generated after preprocessing is a color image; when the first target image signal is the second image signal, the first target image generated after preprocessing is a color image and the second target image generated after preprocessing is a black and white image.
  • the first target image signal first output by the image sensor is input to the image processing unit, and the buffer unit stores the first target image obtained after preprocessing; after the image sensor outputs the second target image signal, the second target image signal is input to the image processing unit, and the second target image is output to the image fusion unit after preprocessing by the image preprocessing unit; at the same time, the first target image stored in the buffer unit is output to the image fusion unit to achieve synchronization between the first target image and the second target image, and the image fusion unit then performs fusion processing to obtain a color fusion image.
  • the image preprocessing unit includes: a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit;
  • the first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain a preprocessed first target image
  • a second preprocessing unit configured to perform a second preprocessing operation on the second target image signal to obtain a second target image
  • the joint noise reduction unit is used to filter the first target image and the second target image to obtain the denoised first target image and second target image, and the denoised first target image and second target image are used for fusion processing to obtain a color fusion image.
  • the first preprocessing operation includes at least one of the following: image interpolation, gamma mapping, and color conversion; the second preprocessing operation includes at least one of the following: white balance, image interpolation, and gamma mapping.
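A minimal sketch of the two preprocessing chains listed above, applied to already-interpolated images in [0, 1]. The white-balance gains and the gamma value are placeholder parameters (real pipelines derive them per scene), and image interpolation and color conversion are omitted for brevity:

```python
import numpy as np

def second_preprocess(raw_rgb, wb_gains=(1.0, 1.0, 1.0), gamma=2.2):
    """Sketch of the second preprocessing chain: white balance followed by
    gamma mapping on an interpolated RGB image (H, W, 3) in [0, 1]."""
    balanced = np.clip(raw_rgb * np.asarray(wb_gains), 0.0, 1.0)  # white balance
    return balanced ** (1.0 / gamma)                              # gamma mapping

def first_preprocess(raw_gray, gamma=2.2):
    """Sketch of the first preprocessing chain: gamma mapping on the
    near-infrared image (interpolation/color conversion omitted)."""
    return np.clip(raw_gray, 0.0, 1.0) ** (1.0 / gamma)
```

The gamma exponent 1/2.2 brightens mid-tones before fusion; whether each listed operation is actually enabled is left open by the text ("at least one of the following").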
  • the buffer unit can respectively store the first target image and fetch the second target image in one frame period.
  • the synchronization unit is used to determine that the first target image of each frame needs to be cached, and the second target image needs to be output synchronously, and the second target image is the image previously cached by the cache unit;
  • if the buffer unit is currently buffering the image generated after preprocessing of the second image signal, the previously buffered image generated after preprocessing of the first image signal is determined to be the second target image and output to the image preprocessing unit; if the buffer unit is currently buffering the image generated after preprocessing of the first image signal, the previously buffered image generated after preprocessing of the second image signal is determined to be the second target image and output to the image preprocessing unit.
  • the image sensor may alternately output the first image signal and the second image signal, or output the first image signal and the second image signal in a manner such as outputting the second image signal after a few first image signals. This is not limited in the application embodiments.
  • Fig. 13 takes the image sensor alternately outputting the first image signal and the second image signal as an example for description.
  • when the image preprocessing unit in Fig. 13 outputs the color image M-2, the synchronization unit instructs the buffer unit to store the color image M-2 and output the previously buffered black and white image M-3, and the image fusion unit fuses the color image M-2 and the black and white image M-3 to obtain a color fusion image; when the image preprocessing unit outputs the black and white image M-1, the synchronization unit instructs the buffer unit to store the black and white image M-1 and output the previously buffered color image M-2; when the image preprocessing unit outputs the color image M, the synchronization unit instructs the buffer unit to store the color image M and output the previously buffered black and white image M-1, and so on.
  • the buffer unit can store the first target image and fetch the second target image in different frame periods, respectively.
  • the specific solution is as follows:
  • the synchronization unit is used to determine that, when the first target image is an image preprocessed from the first image signal, it needs to be buffered, and that, when the first target image is an image preprocessed from the second image signal, the second target image needs to be output synchronously, the second target image being the most recently buffered image preprocessed from the first image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the second image signal, the buffer unit determines the most recently buffered image preprocessed from the first image signal to be the second target image and outputs it to the image preprocessing unit; if the first target image is an image preprocessed from the first image signal, the buffer unit buffers it; or,
  • the synchronization unit is used to determine that, when the first target image is an image preprocessed from the second image signal, it needs to be buffered, and that, when the first target image is an image preprocessed from the first image signal, the second target image needs to be output synchronously, the second target image being the most recently buffered image preprocessed from the second image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the first image signal, the buffer unit determines the most recently buffered image preprocessed from the second image signal to be the second target image and outputs it to the image preprocessing unit; if the first target image is an image preprocessed from the second image signal, the buffer unit buffers it.
  • Figure 14 takes the image sensor alternately outputting the first image signal and the second image signal as an example for illustration.
  • when the image preprocessing unit outputs the color image M-2, the synchronization unit instructs the buffer unit to output the most recently buffered black and white image M-3, and the image fusion unit fuses the color image M-2 and the black and white image M-3 to obtain a color fusion image; when the image preprocessing unit outputs the black and white image M-1, the synchronization unit instructs the buffer unit to store the black and white image M-1, and the image fusion unit does not perform processing; when the image preprocessing unit outputs the color image M, the synchronization unit instructs the buffer unit to output the most recently buffered black and white image M-1, and the image fusion unit fuses the color image M and the black and white image M-1 to obtain a color fusion image, and so on.
  • Fig. 14 takes caching only the black and white image as an example for description; the case of caching only the color image is similar to Fig. 14 and will not be repeated here.
  • the image sensor outputs a second image signal at intervals of two first image signals, and the buffer unit only buffers the color image preprocessed by the second image signal.
  • when the image preprocessing unit outputs the color image M-2, the synchronization unit instructs the buffer unit to cache the color image M-2, and the image fusion unit does not perform processing; when the image preprocessing unit outputs the black and white image M-1, the synchronization unit instructs the buffer unit to output the most recently cached color image M-2, and the image fusion unit fuses the color image M-2 and the black and white image M-1 to obtain a color fusion image; when the image preprocessing unit outputs the black and white image M, the synchronization unit instructs the buffer unit to output the most recently cached color image M-2, and the image fusion unit performs fusion processing on the color image M-2 and the black and white image M to obtain a color fusion image, and so on.
  • the black and white image may not be buffered in every frame; it may instead be stored at intervals of several black and white images. Referring to Fig. 15B, when the image preprocessing unit outputs the color image M-2, the synchronization unit instructs the buffer unit to output the most recently buffered black and white image M-5 from the buffer unit, and the image processing unit performs fusion processing on the color image M-2 and the black and white image M-5 to obtain a color fusion image; when the image preprocessing unit outputs the black and white image M-1, the synchronization unit instructs the buffer unit to store the black and white image M-1, and the image processing unit does not perform processing; when the image preprocessing unit outputs the color image M, the synchronization unit instructs the buffer unit to output the most recently buffered black and white image M-1 from the buffer unit, and the image processing unit fuses the color image M and the black and white image M-1 to obtain a color fusion image; when the image preprocessing unit outputs the black and white image M+1, neither the buffer unit nor the image processing unit performs processing; when the image preprocessing unit outputs the color image M+2, the synchronization unit instructs the buffer unit to output the most recently buffered black and white image M-1 from the buffer unit, and the image processing unit performs fusion processing on the color image M+2 and the black and white image M-1 to obtain a color fusion image, and so on.
  • the first target image and the second target image may be synchronously output in one frame period.
  • the specific scheme is as follows:
  • the synchronization unit is used to determine that each frame of the first target image needs to be buffered, and that the most recently buffered second target image and the most recently buffered first target image need to be output synchronously;
  • if the buffer unit is currently buffering the image generated after preprocessing of the second image signal, it outputs the most recently buffered image generated after preprocessing of the first image signal and the most recently buffered image generated after preprocessing of the second image signal; if the buffer unit is currently buffering the image generated after preprocessing of the first image signal, it outputs the most recently buffered image generated after preprocessing of the second image signal and the most recently buffered image generated after preprocessing of the first image signal.
  • when the image preprocessing unit outputs the color image M-2, the synchronization unit instructs the buffer unit to store the color image M-2, and the image processing unit does not perform processing; when the image preprocessing unit outputs the black and white image M-1, the synchronization unit instructs the buffer unit to store the black and white image M-1 and to output the most recently buffered black and white image M-3 and color image M-2 from the buffer unit, and the image processing unit performs fusion processing on the color image M-2 and the black and white image M-1 to obtain a color fusion image; when the image preprocessing unit outputs the color image M, the synchronization unit instructs the buffer unit to store the color image M, and the image processing unit does not perform processing; when the image preprocessing unit outputs the black and white image M+1, the synchronization unit instructs the buffer unit to store the black and white image M+1 and to output the most recently buffered color image M and the most recently buffered black and white image M-1 from the buffer unit, and so on.
  • the joint noise reduction unit is specifically used for:
  • the first target image and the second target image are respectively subjected to joint filtering processing to obtain the first target image and the second target image after noise reduction.
  • the joint noise reduction unit includes a temporal noise reduction unit or a spatial noise reduction unit;
  • the temporal noise reduction unit is used to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain the denoised first target image, and perform temporal filtering on the second target image according to the motion estimation result to obtain the denoised second target image;
  • the spatial noise reduction unit is used to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain the denoised first target image, and perform spatial filtering on the second target image according to the edge estimation result to obtain the denoised second target image.
  • the joint noise reduction unit includes a temporal noise reduction unit and a spatial noise reduction unit;
  • the temporal noise reduction unit is used to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain a first temporal noise reduction image, and perform temporal filtering on the second target image according to the motion estimation result to obtain a second temporal noise reduction image;
  • the spatial noise reduction unit is used to perform edge estimation according to the first temporal noise reduction image and the second temporal noise reduction image to obtain an edge estimation result, perform spatial filtering on the first temporal noise reduction image according to the edge estimation result to obtain the denoised first target image, and perform spatial filtering on the second temporal noise reduction image according to the edge estimation result to obtain the denoised second target image;
  • alternatively, the spatial noise reduction unit is used to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain a first spatial noise reduction image, and perform spatial filtering on the second target image according to the edge estimation result to obtain a second spatial noise reduction image;
  • the temporal noise reduction unit is then used to perform motion estimation based on the first spatial noise reduction image and the second spatial noise reduction image to obtain a motion estimation result, perform temporal filtering on the first spatial noise reduction image according to the motion estimation result to obtain the denoised first target image, and perform temporal filtering on the second spatial noise reduction image according to the motion estimation result to obtain the denoised second target image.
  • the image fusion unit includes: a color extraction unit, a brightness extraction unit, and a fusion processing unit connected to the color extraction unit and the brightness extraction unit respectively;
  • the color extraction unit is used to extract the color signal of the image obtained after preprocessing of the second image signal;
  • the brightness extraction unit is used to extract the brightness signal of the image obtained after preprocessing of the second image signal;
  • the fusion processing unit is used to perform fusion processing on the color signal and the brightness signal, together with the image preprocessed from the first image signal, to obtain a color fusion image;
  • Fig. 16 takes the image preprocessed from the second image signal being a color image and the image preprocessed from the first image signal being a black and white image as an example.
  • the fusion processing unit is specifically used to perform fusion processing on the fused brightness image and the color signal of the image preprocessed from the second image signal to obtain a color fusion image.
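The fusion unit above can be sketched as follows: split the color image into a brightness signal and a color signal, blend its brightness with the black and white (near-infrared) image to form the fused brightness image, then recombine with the color signal. The BT.601 luma weights and the blend weight `w` are assumed choices, not fixed by the text:

```python
import numpy as np

def fuse_images(color_rgb, nir_gray, w=0.5):
    """Sketch of the fusion unit: brightness extraction, color extraction,
    brightness fusion, and recombination.  color_rgb is (H, W, 3) in [0, 1];
    nir_gray is (H, W) in [0, 1]."""
    # Brightness extraction: luma of the color image (BT.601 weights).
    luma = (0.299 * color_rgb[..., 0] + 0.587 * color_rgb[..., 1]
            + 0.114 * color_rgb[..., 2])
    # Color extraction: chroma as per-channel offsets from luma.
    chroma = color_rgb - luma[..., None]
    # Fusion processing: blend the two brightness signals.
    fused_luma = w * luma + (1.0 - w) * nir_gray
    # Recombine the fused brightness image with the color signal.
    return np.clip(fused_luma[..., None] + chroma, 0.0, 1.0)
```

Carrying chroma through unchanged preserves the scene's colors while the NIR brightness raises the signal-to-noise ratio in dark regions.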
  • the structure of the image acquisition device in the above scheme is simple, which reduces cost, and the first image signal containing near-infrared light information and the second image signal containing visible light information can be collected through the first preset exposure and the second preset exposure within any period of time; fusion is subsequently performed according to the first image signal and the second image signal, and the quality of the resulting color fusion image is higher;
  • the image processing unit has an image buffer function, which can realize synchronization between images with different exposure time periods.
  • noise reduction processing is performed on the color image to obtain a denoised color image, by taking a weighted average over a neighborhood of each pixel:
  • img_vis'(x, y) = Σ_{(i,j)∈S} weight(x+i, y+j) · img_vis(x+i, y+j) / Σ_{(i,j)∈S} weight(x+i, y+j)
  • where x and y represent the coordinates of the current pixel; img_vis(x+i, y+j) represents the pixel value of a pixel in the neighborhood of the current pixel in the color image; img_vis'(x, y) represents the pixel value of the current pixel in the denoised color image; img_nir(x+i, y+j) represents the pixel value of a pixel in the neighborhood of the current pixel in the black and white image; img_nir'(x, y) represents the pixel value of the current pixel in the denoised black and white image, obtained by the analogous formula; S represents the size of the neighborhood of the current pixel; weight(x+i, y+j) = weight_vis(x+i, y+j) + weight_nir(x+i, y+j), where weight_vis(x+i, y+j) is the weight contributed by the color image and weight_nir(x+i, y+j) is the weight contributed by the black and white image.
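A minimal sketch of this joint weighted-neighborhood filter. The text does not fix the form of weight_vis and weight_nir, so this sketch assumes Gaussian range weights for both; `radius` and `sigma` are illustrative parameters:

```python
import numpy as np

def joint_filter(img_vis, img_nir, radius=1, sigma=10.0):
    """Weighted neighborhood average with weight = weight_vis + weight_nir,
    over an S = (2*radius+1) x (2*radius+1) window.  Both inputs are 2-D
    arrays of the same shape (the color image treated per channel)."""
    h, w = img_vis.shape
    pv = np.pad(img_vis, radius, mode='edge')
    pn = np.pad(img_nir, radius, mode='edge')
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            sv = pv[radius + i:radius + i + h, radius + j:radius + j + w]
            sn = pn[radius + i:radius + i + h, radius + j:radius + j + w]
            # Assumed Gaussian range weights from each image's own differences.
            w_vis = np.exp(-((sv - img_vis) ** 2) / (2 * sigma ** 2))
            w_nir = np.exp(-((sn - img_nir) ** 2) / (2 * sigma ** 2))
            weight = w_vis + w_nir          # weight = weight_vis + weight_nir
            num += weight * sv
            den += weight
    return num / den
```

Because weight_nir enters the color-image average, edges that are clean in the black and white image suppress smoothing across them in the color image, even where the color image itself is too noisy to detect them.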
  • the image acquisition device may further include a lens 04.
  • the filter assembly 03 may be located between the lens 04 and the image sensor 01, with the image sensor 01 located on the light exit side of the filter assembly 03; alternatively, the lens 04 is located between the filter assembly 03 and the image sensor 01, with the image sensor 01 located on the light exit side of the lens 04.
  • the first filter 031 can be a filter film; in that case, the first filter 031 can be attached to the light exit side of the lens 04.
  • the image acquisition device may be a video camera, a snapshot camera, a face recognition camera, a code-reading camera, a vehicle-mounted camera, a panoramic detail camera, etc.
  • the light supplement 02 may be located in the image acquisition device or outside the image acquisition device.
  • the light supplement 02 can be a part of the image acquisition device or a device independent of the image acquisition device.
  • the light supplement 02 can communicate with the image acquisition device, which ensures that the exposure timing of the image sensor 01 in the image acquisition device and the near-infrared supplementary light timing of the first light supplement device 021 included in the light supplement 02 have a certain relationship; for example, there is near-infrared supplementary light in at least part of the exposure time period of the first preset exposure, and there is no near-infrared supplementary light in the exposure time period of the second preset exposure.
  • the first light supplement device 021 is a device that can emit near-infrared light, such as a near-infrared fill light; the first light supplement device 021 can perform near-infrared light supplementation in a stroboscopic manner or in another similar intermittent manner, which is not limited in the embodiments of the present application.
  • when the first light supplement device 021 performs near-infrared light supplementation in a stroboscopic manner, it can be controlled manually, or by a software program or a specific device, which is not limited in the embodiments of the present application.
  • the time period during which the first light supplement device 021 performs near-infrared light supplementation may coincide with, be greater than, or be less than the exposure time period of the first preset exposure, as long as there is near-infrared supplementary light in the entire exposure time period or part of the exposure time period of the first preset exposure, and no near-infrared supplementary light in the exposure time period of the second preset exposure.
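The timing constraint just stated reduces to an interval-overlap check: the fill-light interval must overlap at least part of the first preset exposure and none of the second. A sketch, with intervals as illustrative `(start, end)` tuples in arbitrary but consistent time units:

```python
def fill_light_timing_ok(fill, first_exp, second_exp):
    """True iff the near-infrared fill interval overlaps part of the first
    preset exposure and does not overlap the second preset exposure.
    All arguments are (start, end) tuples; the names are illustrative."""
    def overlaps(a, b):
        # Two half-open intervals overlap iff each starts before the other ends.
        return max(a[0], b[0]) < min(a[1], b[1])
    return overlaps(fill, first_exp) and not overlaps(fill, second_exp)
```

Whether the fill interval is shorter or longer than the first exposure is irrelevant, matching the "coincide with, be greater than, or be less than" language above.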
  • the exposure time period of the second preset exposure may be between the start exposure time and the end exposure time.
  • Time period, for the rolling shutter exposure mode, the exposure time period of the second preset exposure may be the time period between the start exposure time of the first row of effective images of the second image signal and the end exposure time of the last row of effective images, but it is not limited to this.
  • the exposure time period of the second preset exposure may also be the exposure time period corresponding to the target image in the second image signal, and the target image is a number of rows of effective images corresponding to the target object or target area in the second image signal.
  • the time period between the start exposure time and the end exposure time of several rows of effective images can be regarded as the exposure time period of the second preset exposure.
  • the near-infrared light incident on the surface of the object may be reflected by the object and enter the first filter 031.
  • the ambient light may include visible light and near-infrared light, and near-infrared light in the ambient light is also reflected by the object when it is incident on the surface of the object, thereby entering the first filter 031.
  • the near-infrared light that passes through the first filter 031 when there is near-infrared supplementary light may include the near-infrared light that enters the first filter 031 by the reflection of the object when the first supplementary light device 021 performs near-infrared supplementary light.
  • the near-infrared light passing through the first filter 031 when there is no near-infrared supplementary light may include the near-infrared light reflected by the object and entering the first filter 031 when the first supplementary light device 021 is not performing near-infrared supplementary light.
  • the near-infrared light passing through the first filter 031 when there is near-infrared supplementary light includes the near-infrared light that is emitted by the first light supplement device 021 and reflected by the object, as well as the near-infrared light in the ambient light that is reflected by the object; the near-infrared light passing through the first filter 031 when there is no near-infrared supplementary light includes only the near-infrared light in the ambient light that is reflected by the object.
  • the filter assembly 03 can be located between the lens 04 and the image sensor 01, and the image sensor 01 is located on the light-emitting side of the filter assembly 03 as an example.
  • the process by which the image acquisition device acquires the first image signal and the second image signal is as follows: when the image sensor 01 performs the first preset exposure, the first light supplement device 021 provides near-infrared supplementary light; after the ambient light in the shooting scene, together with the near-infrared light reflected by objects in the scene during the near-infrared supplementary light, passes through the lens 04 and the first filter 031, the image sensor 01 generates the first image signal through the first preset exposure. When the image sensor 01 performs the second preset exposure, the first light supplement device 021 provides no near-infrared supplementary light; after the ambient light in the shooting scene passes through the lens 04 and the first filter 031, the image sensor 01 generates the second image signal through the second preset exposure.
  • the values of M and N and the magnitude relationship between M and N can be set according to actual requirements. For example, the values of M and N may be equal or different.
  • multiple exposure refers to multiple exposures in one frame period, that is, the image sensor 01 performs multiple exposures in one frame period, thereby generating and outputting at least one frame of the first image signal and at least One frame of second image signal.
  • 1 second includes 25 frame periods, and the image sensor 01 performs multiple exposures in each frame period, thereby generating at least one frame of the first image signal and at least one frame of the second image signal, and the The first image signal and the second image signal are called a group of image signals, so that 25 groups of image signals are generated within 25 frame periods.
  • the first preset exposure and the second preset exposure may be two adjacent exposures among the multiple exposures in one frame period, or two non-adjacent exposures among them, which is not limited in the embodiments of this application.
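The frame-period arithmetic above (25 frame periods per second, each yielding one group of image signals) can be sketched as follows. This is a hypothetical illustration, not the patent's implementation; the function name, slot layout, and the assumption of one exposure pair per frame period are invented for clarity.

```python
# Hypothetical sketch: interleave the first preset exposure (with near-infrared
# fill light) and the second preset exposure (without) across 25 frame periods,
# one exposure pair per frame period.

def exposure_schedule(frame_periods=25, exposures_per_period=2):
    """Return (frame_index, exposure_kind) tuples for every exposure."""
    schedule = []
    for frame in range(frame_periods):
        for slot in range(exposures_per_period):
            # even slots: first preset exposure; odd slots: second
            kind = "first" if slot % 2 == 0 else "second"
            schedule.append((frame, kind))
    return schedule

schedule = exposure_schedule()
# one image-signal group (a first and a second image signal) per frame period
groups = len({frame for frame, _ in schedule})
```

With the defaults this yields 25 groups of image signals within 25 frame periods, matching the example in the text.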
  • since the intensity of the near-infrared light in the ambient light is lower than the intensity of the near-infrared light emitted by the first light supplement device 021, the intensity of the near-infrared light passing through the first filter 031 while the first light supplement device 021 performs near-infrared supplementary light is higher than the intensity of the near-infrared light passing through the first filter 031 while it does not.
  • the wavelength range within which the first light supplement device 021 performs near-infrared supplementary light may be a second reference wavelength range, which may be 700 nanometers to 800 nanometers, or 900 nanometers to 1000 nanometers, etc., so as to reduce interference from common 850 nm infrared lamps; this is not limited in the embodiments of the application.
  • the wavelength range of the near-infrared light incident on the first filter 031 may be the first reference wavelength range, and the first reference wavelength range is 650 nanometers to 1100 nanometers.
  • when there is near-infrared supplementary light, the near-infrared light passing through the first filter 031 includes both the near-infrared light reflected by the object into the first filter 031 while the first light supplement device 021 performs near-infrared supplementary light and the near-infrared light in the ambient light reflected by the object, so the intensity of the near-infrared light entering the filter assembly 03 is relatively strong at this time. When there is no near-infrared supplementary light, the near-infrared light passing through the first filter 031 includes only the near-infrared light in the ambient light reflected by the object into the filter assembly 03, so its intensity is relatively weak. Therefore, the intensity of the near-infrared light contained in the first image signal generated and output according to the first preset exposure is higher than the intensity of the near-infrared light contained in the second image signal generated and output according to the second preset exposure.
  • there are multiple choices for the center wavelength and/or wavelength range of the near-infrared supplementary light of the first light supplement device 021.
  • the center wavelength of the near-infrared supplementary light of the first light supplement device 021 may be designed, and the characteristics of the first filter 031 selected, so that the center wavelength and/or band width of the near-infrared light passing through the first filter 031 meet certain constraint conditions. These constraints mainly serve to keep the center wavelength of the near-infrared light passing through the first filter 031 as accurate as possible and its band width as narrow as possible, so as to avoid the wavelength interference introduced when the near-infrared band width is too wide.
  • the center wavelength of the near-infrared supplementary light of the first light supplement device 021 may be the average value within the wavelength range of highest energy in the spectrum of the near-infrared light emitted by the first light supplement device 021, or may be understood as the wavelength at the middle position of that highest-energy wavelength range; the set characteristic wavelength or the set characteristic wavelength range may be preset.
  • the center wavelength of the near-infrared supplementary light of the first light supplement device 021 may be any wavelength within the range of 750±10 nanometers; or any wavelength within the range of 780±10 nanometers; or any wavelength within the range of 940±10 nanometers. That is, the set characteristic wavelength range may be 750±10 nanometers, 780±10 nanometers, or 940±10 nanometers.
  • for example, when the center wavelength of the near-infrared supplementary light of the first light supplement device 021 is 940 nanometers, the relationship between the wavelength and the relative intensity of the supplementary light is as shown in FIG. 17: the wavelength range of the near-infrared supplementary light is 900 nanometers to 1000 nanometers, and the relative intensity of the near-infrared light peaks at 940 nanometers.
  • the above constraint conditions may include: the difference between the center wavelength of the near-infrared light passing through the first filter 031 and the center wavelength of the near-infrared supplementary light of the first light supplement device 021 lies within a wavelength fluctuation range, which, as an example, may be 0 to 20 nanometers. Here, the center wavelength of the near-infrared light passing through the first filter 031 may be the wavelength at the peak position in the near-infrared band of the near-infrared pass-rate curve of the first filter 031, or may be understood as the wavelength at the middle position of the near-infrared band over which the pass rate in that curve exceeds a certain threshold.
  • the above constraint conditions may include: the first band width may be smaller than the second band width.
  • the first waveband width refers to the waveband width of the near-infrared light passing through the first filter 031
  • the second waveband width refers to the waveband width of the near-infrared light blocked by the first filter 031.
  • the band width refers to the width of the wavelength range in which the light lies; for example, if the near-infrared light passing through the first filter 031 lies in the wavelength range of 700 nanometers to 800 nanometers, the first band width is 800 nanometers minus 700 nanometers, that is, 100 nanometers.
  • the wavelength band width of the near-infrared light passing through the first filter 031 is smaller than the wavelength band width of the near-infrared light blocked by the first filter 031.
  • FIG. 18 is a schematic diagram of the relationship between the wavelength and the pass rate of light that the first filter 031 can pass.
  • the wavelength band of the near-infrared light incident on the first filter 031 is 650 nanometers to 1100 nanometers.
  • the first filter 031 passes visible light with a wavelength of 380 nanometers to 650 nanometers and near-infrared light with a wavelength of 900 nanometers to 1000 nanometers, and blocks near-infrared light with a wavelength between 650 and 900 nanometers and between 1000 and 1100 nanometers. That is, the first band width is 1000 nanometers minus 900 nanometers, that is, 100 nanometers; the second band width is 900 nanometers minus 650 nanometers, plus 1100 nanometers minus 1000 nanometers, that is, 350 nanometers. Since 100 nanometers is smaller than 350 nanometers, the band width of the near-infrared light passing through the first filter 031 is smaller than the band width of the near-infrared light blocked by the first filter 031.
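The band-width arithmetic above can be checked directly. The figures below are the ones quoted in the text (pass band 900–1000 nm; blocked bands 650–900 nm and 1000–1100 nm out of an incident band of 650–1100 nm):

```python
# Worked check of the band widths quoted above for the first filter 031.
first_band_width = 1000 - 900                     # passed near-infrared, nm
second_band_width = (900 - 650) + (1100 - 1000)   # blocked near-infrared, nm

assert first_band_width == 100
assert second_band_width == 350
# the constraint: the passed band is narrower than the blocked band
assert first_band_width < second_band_width
```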
  • the above relationship curve is just an example.
  • for different filters, the wavelength range of the near-infrared light that can pass through the filter, and the wavelength range of the near-infrared light blocked by the filter, may differ.
  • the above constraint conditions may include: the half bandwidth of the near-infrared light passing through the first filter 031 is less than or equal to 50 nanometers.
  • the half bandwidth refers to the band width of near-infrared light with a pass rate greater than 50%.
  • the above constraint conditions may include: the third band width may be smaller than the reference band width.
  • the third waveband width refers to the waveband width of near-infrared light with a pass rate greater than a set ratio.
  • the reference waveband width may be any waveband width in the range of 50 nanometers to 100 nanometers.
  • the set ratio can be any ratio from 30% to 50%.
  • the set ratio can also be set to other ratios according to usage requirements, which is not limited in the embodiment of the application.
  • the band width of the near-infrared light whose pass rate is greater than the set ratio may be smaller than the reference band width.
  • the wavelength band of the near-infrared light incident on the first filter 031 is 650 nanometers to 1100 nanometers, the setting ratio is 30%, and the reference wavelength band width is 100 nanometers. It can be seen from FIG. 18 that in the wavelength band of near-infrared light from 650 nanometers to 1100 nanometers, the band width of near-infrared light with a pass rate greater than 30% is significantly less than 100 nanometers.
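A minimal sketch of how the half bandwidth and the third band width could be measured from a sampled pass-rate curve. The toy curve, the function name, and the 1 nm sampling are hypothetical illustrations, not the real characteristic of the first filter 031:

```python
# Hypothetical: measure the width of the wavelength region whose pass rate
# exceeds a threshold, from (wavelength_nm, pass_rate) samples at 1 nm spacing.

def band_width_above(curve, threshold):
    return sum(1 for _, rate in curve if rate > threshold)

# Toy pass-rate curve: high pass rate only between 915 and 960 nm, over an
# incident near-infrared band of 650 to 1100 nm.
curve = [(wl, 0.9 if 915 <= wl <= 960 else 0.05) for wl in range(650, 1101)]

half_bandwidth = band_width_above(curve, 0.5)    # pass rate > 50%
third_band_width = band_width_above(curve, 0.3)  # pass rate > set ratio (30%)

assert half_bandwidth <= 50       # constraint: half bandwidth <= 50 nm
assert third_band_width < 100     # constraint: below the reference band width
```

With threshold 0.5 the measured width is the half bandwidth; with the set ratio (here 30%) it is the third band width compared against the reference band width.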
  • since the first light supplement device 021 provides near-infrared supplementary light at least during part of the exposure time period of the first preset exposure and provides none during the entire exposure time period of the second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor 01, the first light supplement device 021 provides near-infrared supplementary light during the exposure time periods of some exposures of the image sensor 01 and not during the exposure time periods of the others.
  • the number of times the first light supplement device 021 performs supplementary light per unit time may be lower than the number of exposures of the image sensor 01 per unit time, with one or more exposures occurring in the interval between two adjacent supplementary light periods.
  • the light supplement 02 may also include a second light supplement device 022, which is used for visible light supplementary light. If the second light supplement device 022 provides visible light supplementary light during at least part of the exposure time of the first preset exposure, then both near-infrared and visible supplementary light are present during at least part of the exposure time period of the first preset exposure, and the mixed color of the two lights can be distinguished from the color of the red light in a traffic light, which prevents the human eye from confusing the color of the near-infrared supplementary light of the light supplement 02 with the color of the red light in a traffic light.
  • if the second light supplement device 022 provides visible light supplementary light during the exposure time period of the second preset exposure, then, since the intensity of the visible light during that period is not particularly high, the brightness of the visible light in the second image signal can also be increased, thereby ensuring the quality of image collection.
  • the second light supplement device 022 may perform visible light supplementary light in a constant-on mode; or it may perform visible light supplementary light in a stroboscopic manner in which visible supplementary light is present during at least part of the exposure time period of the first preset exposure and absent during the entire exposure time period of the second preset exposure; or it may perform visible light supplementary light in a stroboscopic manner in which visible supplementary light is absent during the entire exposure time period of the first preset exposure and present during part of the exposure time period of the second preset exposure.
  • when the second light supplement device 022 performs visible light supplementary light in a constant-on mode, it not only prevents the human eye from confusing the color of the near-infrared supplementary light of the first light supplement device 021 with the color of the red light in a traffic light, but also increases the brightness of the visible light in the second image signal, ensuring the quality of image collection.
  • when the second light supplement device 022 performs visible light supplementary light in a stroboscopic manner, it can prevent the human eye from confusing the color of the near-infrared supplementary light of the first light supplement device 021 with the color of the red light in a traffic light, or it can increase the brightness of the visible light in the second image signal to ensure the quality of image collection; it can also reduce the number of supplementary light operations of the second light supplement device 022, thereby prolonging its service life.
  • the switching component 033 is used to switch the second filter 032 onto the light incident side of the image sensor 01, which can also be understood as the second filter 032 replacing the first filter 031 at its position on the light incident side of the image sensor 01.
  • the first light supplement device 021 may be in a closed state or an open state.
  • the first image signal is generated and output by the first preset exposure
  • the second image signal is generated and output by the second preset exposure.
  • the first image signal and the second image signal can then be processed.
  • the purposes of the first image signal and the second image signal may be different, so in some embodiments, at least one exposure parameter of the first preset exposure and the second preset exposure may be different.
  • the at least one exposure parameter may include but is not limited to one or more of exposure time, analog gain, digital gain, and aperture size. Wherein, the exposure gain includes analog gain and/or digital gain.
  • when the first light supplement device 021 performs near-infrared supplementary light, the intensity of the near-infrared light sensed by the image sensor 01 is stronger, so the brightness of the near-infrared light contained in the first image signal generated and output accordingly is also higher; however, near-infrared light of higher brightness is not conducive to acquiring external scene information.
  • the exposure gain of the first preset exposure may be smaller than the exposure gain of the second preset exposure. In this way, when the first light supplement device 021 performs near-infrared supplementary light, the brightness of the near-infrared light contained in the first image signal generated and output by the image sensor 01 will not be excessively high due to the near-infrared supplementary light.
  • the longer the exposure time, the higher the brightness of the image signal obtained by the image sensor 01 and the longer the motion trail of moving objects in the external scene in the image signal; the shorter the exposure time, the lower the brightness of the image signal and the shorter the motion trail. Therefore, to ensure that the brightness of the near-infrared light contained in the first image signal is within an appropriate range and that moving objects in the external scene leave only a short motion trail in the first image signal, the exposure time of the first preset exposure may be less than the exposure time of the second preset exposure.
  • in this way, when the first light supplement device 021 performs near-infrared supplementary light, the brightness of the near-infrared light contained in the first image signal generated and output by the image sensor 01 will not be excessively high due to the supplementary light, and the shorter exposure time makes the motion trail of moving objects in the external scene shorter in the first image signal, which facilitates the recognition of moving objects.
  • the exposure time of the first preset exposure is 40 milliseconds
  • the exposure time of the second preset exposure is 60 milliseconds, and so on.
  • the exposure time of the first preset exposure may be less than, or equal to, the exposure time of the second preset exposure.
  • the exposure gain of the first preset exposure may be less than, or equal to, the exposure gain of the second preset exposure.
  • the purposes of the first image signal and the second image signal may be the same.
  • the exposure time of the first preset exposure may be equal to the exposure time of the second preset exposure; if the two exposure times differ, the image signal of the channel with the longer exposure time will exhibit motion trailing, resulting in different definitions of the two image signals.
  • the exposure gain of the first preset exposure may be equal to the exposure gain of the second preset exposure.
  • the exposure gain of the first preset exposure may be less than, or equal to, the exposure gain of the second preset exposure.
  • the exposure time of the first preset exposure may be less than, or equal to, the exposure time of the second preset exposure.
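The exposure-parameter relationships above can be summarized as a small consistency check. This is a sketch under the stated assumptions; the function name and dict keys are invented for illustration and are not part of the patent:

```python
# Hypothetical consistency check for the two preset exposures.

def exposure_params_ok(first, second, same_purpose=False):
    """first/second: dicts with 'time_ms' (exposure time) and 'gain'.

    Different purposes: the first preset exposure may use a shorter (or
    equal) exposure time and a smaller (or equal) exposure gain.
    Same purpose: equal exposure times avoid differing motion trails and
    hence differing definitions of the two image signals.
    """
    if same_purpose:
        return first["time_ms"] == second["time_ms"]
    return (first["time_ms"] <= second["time_ms"]
            and first["gain"] <= second["gain"])

# e.g. 40 ms for the first preset exposure, 60 ms for the second
ok = exposure_params_ok({"time_ms": 40, "gain": 1.0},
                        {"time_ms": 60, "gain": 2.0})
```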
  • the image sensor 01 may include multiple photosensitive channels, and each photosensitive channel may be used to sense at least one type of light in the visible light band and to sense light in the near-infrared band. That is, each photosensitive channel can not only sense at least one kind of light in the visible light band, but also can sense light in the near-infrared band. In a possible implementation, the multiple photosensitive channels can be used to sense at least two different visible light wavebands.
  • the plurality of photosensitive channels may include at least two of R photosensitive channels, G photosensitive channels, B photosensitive channels, Y photosensitive channels, W photosensitive channels, and C photosensitive channels.
  • the R photosensitive channel is used to sense the light in the red and near-infrared bands
  • the G photosensitive channel is used to sense the light in the green and near-infrared bands
  • the B photosensitive channel is used to sense the light in the blue and near-infrared bands.
  • the Y photosensitive channel is used to sense light in the yellow band and the near-infrared band.
  • W may be used to denote a photosensitive channel that senses light of the full waveband, and C may likewise denote a photosensitive channel that senses light of the full waveband; so when the multiple photosensitive channels include a channel for sensing light of the full waveband, this channel may be a W photosensitive channel or a C photosensitive channel. That is, in practical applications, the photosensitive channel for sensing light of the full waveband can be selected according to usage requirements.
  • the image sensor 01 may be an RGB sensor, RGBW sensor, or RCCB sensor, or RYYB sensor.
  • the distribution of the R, G, and B photosensitive channels in the RGB sensor is shown in FIG. 20, and the distribution of the R, G, B, and W photosensitive channels in the RGBW sensor is shown in FIG. 21.
  • the distribution of the R, C, and B photosensitive channels in the RCCB sensor is shown in FIG. 22, and the distribution of the R, Y, and B photosensitive channels in the RYYB sensor is shown in FIG. 23.
  • some photosensitive channels may only sense light in the near-infrared waveband, but not light in the visible light waveband.
  • the plurality of photosensitive channels may include at least two of R photosensitive channels, G photosensitive channels, B photosensitive channels, and IR photosensitive channels.
  • the R photosensitive channel is used to sense red light and near-infrared light
  • the G photosensitive channel is used to sense green light and near-infrared light
  • the B photosensitive channel is used to sense blue light and near-infrared light.
  • the IR photosensitive channel is used to sense light in the near-infrared band.
  • the image sensor 01 may be an RGBIR sensor, where each IR photosensitive channel in the RGBIR sensor can sense light in the near-infrared waveband, but not light in the visible light waveband.
  • when the image sensor 01 is an RGB sensor, compared with other image sensors such as RGBIR sensors, the RGB information it collects is more complete: some of the photosensitive channels of an RGBIR sensor cannot collect visible light, so the color details of the image collected by an RGB sensor are more accurate.
  • the multiple photosensitive channels included in the image sensor 01 may correspond to multiple sensing curves.
  • the R curve in FIG. 24 represents the sensing curve of the image sensor 01 for light in the red band, the G curve represents its sensing curve for light in the green band, the B curve represents its sensing curve for light in the blue band, the W (or C) curve represents its sensing curve for light of the full band, and the NIR (near-infrared) curve represents its sensing curve for light in the near-infrared band.
  • the image sensor 01 may adopt a global exposure method or a rolling shutter exposure method.
  • the global exposure mode means that the exposure start time of each row of effective images is the same, and the exposure end time of each row of effective images is the same.
  • the global exposure mode is an exposure mode in which all rows of effective images are exposed at the same time and the exposure ends at the same time.
  • the rolling shutter exposure mode means that the exposure time periods of different rows of the effective image do not completely coincide; that is, the exposure start time of a row of the effective image is later than the exposure start time of the previous row, and the exposure end time of a row of the effective image is later than the exposure end time of the previous row.
  • in the rolling shutter exposure mode, data can be output after each row of the effective image finishes its exposure; the time from the start of output of the first row of the effective image to the end of output of the last row of the effective image can therefore be expressed as the readout time.
  • FIG. 25 is a schematic diagram of the rolling shutter exposure mode. As can be seen from FIG. 25, the first row of the effective image begins exposure at time T1 and ends exposure at time T3; the second row begins exposure at time T2 and ends at time T4, with T2 shifted backward by a period relative to T1 and T4 shifted backward by a period relative to T3. In addition, the first row of the effective image ends exposure at time T3 and begins to output data, finishing at time T5; the nth row ends exposure at time T6 and begins to output data, finishing at time T7. The time between T3 and T7 is the readout time.
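The T1–T7 timeline above can be sketched numerically. The row delay, exposure time, and output time below are invented example values, not figures from the patent:

```python
# Hypothetical rolling-shutter timing: row r starts exposing r*row_delay after
# the first row; data output for a row begins when its exposure ends.

def rolling_shutter_times(rows, exposure, row_delay, output):
    """Per-row (start_exposure, end_exposure, end_output) times."""
    times = []
    for r in range(rows):
        start = r * row_delay        # T1 for the first row, T2 for the second
        end_exp = start + exposure   # T3 for the first row, T4 for the second
        end_out = end_exp + output   # T5 for the first row
        times.append((start, end_exp, end_out))
    return times

times = rolling_shutter_times(rows=4, exposure=10.0, row_delay=1.0, output=2.0)
# readout time: first row's start of output (T3) to last row's end of output (T7)
readout = times[-1][2] - times[0][1]
```

Each row's start and end are shifted backward by one row delay relative to the previous row, which reproduces the staggered pattern in FIG. 25.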
  • the time period of the near-infrared fill light has no intersection with the exposure time period of the nearest second preset exposure, and the time period of the near-infrared fill light is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared fill light intersects the exposure time period of the first preset exposure, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared fill light.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure, and the time period of the near-infrared fill light is a subset of the exposure time period of the first preset exposure.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure, and the time period of the near-infrared fill light intersects the exposure time period of the first preset exposure.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure, and the exposure time period of the first preset exposure is a subset of the time period of the near-infrared fill light.
  • the time period of the near-infrared fill light has no intersection with the exposure time period of the nearest second preset exposure.
  • the start time of the near-infrared fill light is not earlier than the exposure start time of the last row of the effective image in the first preset exposure, and the end time of the near-infrared fill light is not later than the exposure end time of the first row of the effective image in the first preset exposure.
  • the start time of the near-infrared fill light is not earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure, and not later than the exposure end time of the first row of the effective image in the first preset exposure; the end time of the near-infrared fill light is not earlier than the exposure start time of the last row of the effective image in the first preset exposure, and not later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure.
  • the start time of the near-infrared fill light is not earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure, and not later than the exposure start time of the first row of the effective image in the first preset exposure; the end time of the near-infrared fill light is not earlier than the exposure end time of the last row of the effective image in the first preset exposure, and not later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure, the start time of the near-infrared fill light is not earlier than the exposure start time of the last row of the effective image in the first preset exposure, and the end time of the near-infrared fill light is not later than the exposure end time of the first row of the effective image in the first preset exposure.
  • the time period of the near-infrared fill light does not overlap with the exposure time period of the nearest second preset exposure; the start time of the near-infrared fill light is not earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and not later than the exposure end time of the first line of the effective image in the first preset exposure, and the end time of the near-infrared fill light is not earlier than the exposure start time of the last line of the effective image in the first preset exposure and not later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure.
  • the time period of the near-infrared fill light does not overlap with the exposure time period of the nearest second preset exposure; the start time of the near-infrared fill light is not earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and not later than the exposure start time of the first line of the effective image in the first preset exposure, and the end time of the near-infrared fill light is not earlier than the exposure end time of the last line of the effective image in the first preset exposure and not later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure.
  • the sorting of the first preset exposure and the second preset exposure may not be limited to these examples.
  • the slanted dotted line indicates the start time of exposure
  • the slanted solid line indicates the end time of exposure
  • the vertical dotted line indicates the time period of the near-infrared fill light corresponding to the first preset exposure.
  • the multiple exposures may include odd exposures and even exposures.
  • the first preset exposure and the second preset exposure may include, but are not limited to, the following methods:
  • the first preset exposure is one exposure in an odd number of exposures
  • the second preset exposure is one exposure in an even number of exposures.
  • the multiple exposures may include the first preset exposure and the second preset exposure arranged in a parity order.
  • odd-numbered exposures such as the first, third, and fifth exposures in the multiple exposures are all the first preset exposure, and even-numbered exposures such as the second, fourth, and sixth exposures are all the second preset exposure.
  • the first preset exposure is one exposure in an even number of exposures
  • the second preset exposure is one exposure in an odd number of exposures.
  • the multiple exposures may include the first preset exposure and the second preset exposure arranged in odd-even order; for example, odd-numbered exposures such as the first, third, and fifth exposures in the multiple exposures are all the second preset exposure, and even-numbered exposures such as the second, fourth, and sixth exposures are all the first preset exposure.
  • the first preset exposure is one of the specified odd-numbered exposures
  • the second preset exposure is one of the exposures other than the specified odd-numbered exposures; that is, the second preset exposure may be an odd-numbered exposure or an even-numbered exposure in the multiple exposures.
  • the first preset exposure is one exposure in the specified even number of exposures
  • the second preset exposure is one of the exposures other than the specified even-numbered exposures; that is, the second preset exposure may be an odd-numbered exposure or an even-numbered exposure in the multiple exposures.
  • the first preset exposure is one exposure in the first exposure sequence
  • the second preset exposure is one exposure in the second exposure sequence.
  • the first preset exposure is one exposure in the second exposure sequence
  • the second preset exposure is one exposure in the first exposure sequence
  • the aforementioned multiple exposure includes multiple exposure sequences
  • the first exposure sequence and the second exposure sequence are the same exposure sequence or two different exposure sequences in the multiple exposure sequences
  • each exposure sequence includes N exposures
  • the N exposures include 1 first preset exposure and N-1 second preset exposures, or the N exposures include 1 second preset exposure and N-1 first preset exposures, where N is a positive integer greater than 2.
  • each exposure sequence includes 3 exposures, and these 3 exposures can include 1 first preset exposure and 2 second preset exposures.
  • the first exposure of each exposure sequence can be the first preset exposure, and the second and third exposures are the second preset exposure; that is, each exposure sequence can be expressed as: first preset exposure, second preset exposure, second preset exposure.
  • alternatively, these 3 exposures can include 1 second preset exposure and 2 first preset exposures, so that the first exposure of each exposure sequence can be the second preset exposure and the second and third exposures are the first preset exposure; that is, each exposure sequence can be expressed as: second preset exposure, first preset exposure, first preset exposure.
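The parity-based and sequence-based arrangements described above can be sketched as simple labeling functions. This is an illustrative sketch only; the function names and the "first"/"second" labels are hypothetical and not from the patent.

```python
# Illustrative sketch: label each exposure in a multiple-exposure run as the
# first preset exposure (near-infrared fill light on) or the second preset
# exposure (fill light off). Names are hypothetical, not from the patent.

def parity_schedule(num_exposures, first_is_odd=True):
    """Odd-even arrangement: odd-numbered exposures (1st, 3rd, 5th, ...) are
    the first preset exposure when first_is_odd is True, otherwise the second."""
    labels = []
    for n in range(1, num_exposures + 1):
        odd = (n % 2 == 1)
        labels.append("first" if odd == first_is_odd else "second")
    return labels

def sequence_schedule(num_sequences, pattern=("first", "second", "second")):
    """Sequence arrangement: each exposure sequence of N = 3 exposures contains
    1 first preset exposure and N-1 second preset exposures."""
    return list(pattern) * num_sequences

# Six exposures where odd-numbered exposures are the first preset exposure:
print(parity_schedule(6))    # ['first', 'second', 'first', 'second', 'first', 'second']
# Two exposure sequences of the form (first, second, second):
print(sequence_schedule(2))  # ['first', 'second', 'second', 'first', 'second', 'second']
```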
  • the first light supplement device 021 can be used to perform stroboscopic fill light so that the image sensor 01 generates and outputs the first image signal containing near-infrared brightness information and the second image signal containing visible light brightness information; since the first image signal and the second image signal are both acquired by the same image sensor 01, the viewpoint of the first image signal is the same as the viewpoint of the second image signal, so that complete information of the external scene can be obtained through the first image signal and the second image signal.
  • when the intensity of visible light is strong, such as during the day, the proportion of near-infrared light is relatively high and the color reproduction of the collected image is poor; in this case, the third image signal containing the visible light brightness information can be generated and output by the image sensor 01, so that images with better color reproduction can be collected even during the day, and the true color information of the external scene can be obtained efficiently and simply regardless of the intensity of visible light, or whether it is day or night.
  • the present application uses the exposure timing of the image sensor to control the near-infrared supplementary light timing of the light supplement device, so that near-infrared supplementary light is performed during the first preset exposure to generate the first image signal, and no near-infrared supplementary light is performed during the second preset exposure, generating the second image signal.
  • this data collection method can directly collect the first image signal and the second image signal with different brightness information while keeping the structure simple and reducing cost; that is, two different image signals can be acquired through one image sensor, which makes it easier and more efficient for the image acquisition device to acquire the first image signal and the second image signal.
  • the first image signal and the second image signal are both generated and output by the same image sensor, so the viewpoint corresponding to the first image signal is the same as the viewpoint corresponding to the second image signal; therefore, the information of the external scene can be jointly obtained through the first image signal and the second image signal, and since there is no difference between the two viewpoints, the image generated from the first image signal is not misaligned with the image generated from the second image signal.
  • noise reduction processing in some embodiments of the present application can refer to the following solutions:
  • the joint noise reduction unit may include a time domain noise reduction unit 021.
  • the temporal noise reduction unit 021 is configured to perform motion estimation according to the first image signal and the second image signal to obtain a motion estimation result, perform temporal filtering processing on the first image signal according to the motion estimation result to obtain a near-infrared light noise reduction image, and perform temporal filtering processing on the second image signal according to the motion estimation result to obtain a visible light noise reduction image.
  • the temporal noise reduction unit 021 may include a motion estimation unit 0211 and a temporal filtering unit 0212.
  • the motion estimation unit 0211 may be configured to generate a first frame difference image according to the first image signal and the first historical noise reduction image, and determine the first temporal filtering strength of each pixel in the first image signal according to the first frame difference image and multiple first set frame difference thresholds.
  • the temporal filtering unit 0212 is used to perform temporal filtering processing on the first image signal according to the first temporal filtering strength of each pixel to obtain a near-infrared light noise reduction image, and to perform temporal filtering processing on the second image signal according to the first temporal filtering strength of each pixel to obtain a visible light noise reduction image.
  • the motion estimation unit 0211 may perform difference processing on each pixel in the first image signal and the corresponding pixel in the first historical noise reduction image to obtain the original frame difference image, and use the original frame difference image as the first frame difference image.
  • the motion estimation unit 0211 may perform difference processing on each pixel in the first image signal and the corresponding pixel in the first historical noise reduction image to obtain the original frame difference image. Afterwards, the original frame difference image is processed to obtain the first frame difference image.
  • processing the original frame difference image may refer to performing spatial smoothing processing or block quantization processing on the original frame difference image.
  • the motion estimation unit 0211 may determine the first temporal filtering intensity of each pixel according to each pixel in the first frame difference image and multiple first set frame difference thresholds.
  • each pixel in the first frame difference image corresponds to a first set frame difference threshold
  • the first set frame difference threshold corresponding to each pixel point may be the same or different.
  • the first set frame difference threshold corresponding to each pixel can be set by an external user.
  • the motion estimation unit 0211 may perform difference processing between the previous frame image of the first image signal and the first historical noise reduction image to obtain the first noise intensity image, and determine the first set frame difference threshold of the pixel at the corresponding position in the first frame difference image according to the noise intensity of each pixel in the first noise intensity image.
  • the first set frame difference threshold corresponding to each pixel point can also be determined in other ways, which is not limited in the embodiment of the present application.
  • the motion estimation unit 0211 can determine the first temporal filtering strength of the corresponding pixel through the following formula (1) according to the frame difference of the pixel and the first set frame difference threshold corresponding to the pixel.
  • (x, y) is the position of the pixel in the image; α_nir(x, y) refers to the first temporal filtering strength of the pixel with coordinates (x, y); dif_nir(x, y) refers to the frame difference of the pixel in the first frame difference image; dif_thr_nir(x, y) refers to the first set frame difference threshold corresponding to the pixel.
  • the smaller the frame difference of a pixel relative to the first set frame difference threshold, the more the pixel tends to be stationary, that is, the smaller the motion level corresponding to the pixel. From the above formula (1), it can be seen that for any pixel, the smaller the frame difference of the pixel relative to the first set frame difference threshold, the greater the first temporal filtering strength of the pixel.
  • the motion level is used to indicate the intensity of motion; the higher the motion level, the more intense the motion.
  • the value of the first temporal filtering strength can be between 0 and 1.
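The motion estimation step above can be sketched as follows. Formula (1) itself is not reproduced in this text, so the mapping below is only one plausible form consistent with the stated behavior: the strength lies between 0 and 1 and grows as the frame difference shrinks relative to the first set frame difference threshold. All names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: the patent's formula (1) is not reproduced in the
# text, so a plausible mapping is assumed here: strength in [0, 1], decreasing
# as the frame difference grows relative to its per-pixel threshold.

def temporal_filter_strength(frame, history, frame_diff_threshold):
    """Motion estimation: per-pixel frame difference against a per-pixel
    threshold, mapped to a per-pixel temporal filtering strength."""
    frame_diff = np.abs(frame.astype(np.float64) - history.astype(np.float64))
    # Smaller frame difference -> more stationary pixel -> larger strength.
    strength = np.clip(1.0 - frame_diff / frame_diff_threshold, 0.0, 1.0)
    return frame_diff, strength

frame = np.array([[10.0, 50.0], [10.0, 200.0]])    # current first image signal
history = np.array([[10.0, 10.0], [12.0, 10.0]])   # first historical noise reduction image
thr = np.full((2, 2), 40.0)  # hypothetical first set frame difference thresholds
diff, alpha = temporal_filter_strength(frame, history, thr)
# A stationary pixel (zero difference) gets full strength 1.0;
# a strongly moving pixel (difference >= threshold) gets strength 0.0.
```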
  • the temporal filtering unit 0212 may directly perform temporal filtering processing on the first image signal and the second image signal according to the first temporal filtering strength, so as to obtain the near-infrared light noise reduction image and the visible light noise reduction image.
  • since the first image signal is a near-infrared light image with a high signal-to-noise ratio, using the first temporal filtering strength of each pixel in the first image signal to perform temporal filtering on the second image signal can more accurately distinguish the noise from the effective information in the image, thereby avoiding the loss of image detail information and image smearing in the denoised image.
  • the motion estimation unit 0211 may generate at least one first frame difference image according to the first image signal and at least one first historical noise reduction image, and determine the first temporal filtering strength of each pixel in the first image signal according to the at least one first frame difference image and the multiple first set frame difference thresholds corresponding to each first frame difference image.
  • the at least one historical noise reduction image refers to an image obtained by performing noise reduction on the first N frames of the first image signal.
  • for any one of the at least one first historical noise reduction image, the motion estimation unit 0211 may determine the corresponding first frame difference image according to that first historical noise reduction image and the first image signal, with reference to the related implementation described above; after that, the motion estimation unit 0211 can determine the temporal filtering strength of each pixel in each first frame difference image according to that first frame difference image and the multiple first set frame difference thresholds corresponding to it, referring to the aforementioned related implementation manners.
  • the motion estimation unit 0211 may fuse the temporal filtering strengths of the corresponding pixels in each first frame difference image, so as to obtain the first temporal filtering strength corresponding to each pixel; or, for any pixel, the motion estimation unit 0211 may select, from the at least one temporal filtering strength of the pixel in the at least one first frame difference image, the temporal filtering strength representing the highest motion level, and use the selected temporal filtering strength as the first temporal filtering strength of the pixel.
  • the motion estimation unit 0211 may generate the first frame difference image according to the first image signal and the first historical noise reduction image, and determine the first temporal filtering strength of each pixel in the first image signal according to the first frame difference image and multiple first set frame difference thresholds, where the first historical noise reduction image refers to an image obtained after noise reduction is performed on any one of the first N frames of the first image signal; the motion estimation unit 0211 is also used to generate a second frame difference image according to the second image signal and the second historical noise reduction image, and determine the second temporal filtering strength of each pixel in the second image signal according to the second frame difference image and multiple second set frame difference thresholds, where the second historical noise reduction image refers to an image obtained after noise reduction is performed on any one of the first N frames of the second image signal; the motion estimation unit 0211 is further used to determine the joint temporal filtering strength of each pixel according to the first temporal filtering strength of each pixel in the first image signal and the second temporal filtering strength of each pixel in the second image signal.
  • the temporal filtering unit 0212 is used to perform temporal filtering processing on the first image signal according to the first temporal filtering strength or the joint temporal filtering strength of each pixel to obtain a near-infrared light noise reduction image, and to perform temporal filtering processing on the second image signal according to the joint temporal filtering strength of each pixel to obtain a visible light noise reduction image.
  • the motion estimation unit 0211 can not only determine the first temporal filtering strength of each pixel in the first image signal through the implementation described above, but also determine the second temporal filtering strength of each pixel in the second image signal.
  • the motion estimation unit 0211 may first perform difference processing on each pixel in the second image signal and the corresponding pixel in the second historical noise reduction image to obtain the second frame difference image; here, the first image signal and the second image signal are aligned.
  • the motion estimation unit 0211 may determine the second temporal filtering intensity of each pixel according to each pixel in the second frame difference image and multiple second set frame difference thresholds.
  • each pixel in the second frame difference image corresponds to a second set frame difference threshold, that is, the multiple second set frame difference thresholds correspond one-to-one to the pixels in the second frame difference image.
  • the second set frame difference threshold corresponding to each pixel may be the same or different. In a possible implementation manner, the second set frame difference threshold corresponding to each pixel point can be set by an external user.
  • the motion estimation unit 0211 may perform difference processing between the previous frame image of the second image signal and the second historical noise reduction image to obtain a second noise intensity image, and determine the second set frame difference threshold of the pixel at the corresponding position in the second frame difference image according to the noise intensity of each pixel in the second noise intensity image.
  • the second set frame difference threshold corresponding to each pixel point can also be determined in other ways, which is not limited in the embodiment of the present application.
  • the motion estimation unit 0211 can determine the second temporal filtering strength of the corresponding pixel through the following formula (2) according to the frame difference of the pixel and the second set frame difference threshold corresponding to the pixel, where α_vis(x, y) refers to the second temporal filtering strength of the pixel with coordinates (x, y), dif_vis(x, y) refers to the frame difference of the pixel in the second frame difference image, and dif_thr_vis(x, y) refers to the second set frame difference threshold corresponding to the pixel.
  • the motion estimation unit 0211 may weight the first temporal filtering strength and the second temporal filtering strength of each pixel, thereby obtaining the joint temporal filtering strength of each pixel.
  • the determined joint time domain weight of each pixel point is the motion estimation result of the first image signal and the second image signal.
  • the motion estimation unit 0211 may weight the first temporal filtering strength and the second temporal filtering strength of each pixel by the following formula (3), thereby obtaining the joint temporal filtering of each pixel strength.
  • in formula (3), the neighborhood range centered on the pixel with coordinates (x, y) is the local image area centered on that pixel; (x+i, y+j) refers to the pixel coordinates in the local image area; the first temporal filtering strength and the second temporal filtering strength are taken within this local image area; and α_fus(x, y) refers to the joint temporal filtering strength of the pixel with coordinates (x, y).
  • the proportions of the first temporal filtering strength and the second temporal filtering strength within the joint temporal filtering strength are adjusted according to the first temporal filtering strength and the second temporal filtering strength in the local image area; that is, the higher the local motion level indicated by a temporal filtering strength, the larger its proportion in the joint temporal filtering strength.
  • the first temporal filtering strength can be used to indicate the motion level of pixels in the first image signal
  • the second temporal filtering strength can be used to indicate the motion level of pixels in the second image signal.
  • the joint time-domain filtering strength determined by the above-mentioned method simultaneously fuses the first time-domain filtering strength and the second time-domain filtering strength, that is, the joint time-domain filtering strength also takes into account that the pixel point appears in the first image signal. The movement trend of and the movement trend shown in the second image signal.
  • the joint time domain filtering strength can more accurately characterize the motion trend of the pixel points.
  • the subsequent time domain filtering is performed with the joint time domain filtering strength. At this time, image noise can be removed more effectively, and problems such as image tailing caused by misjudgment of the motion level of pixels can be avoided.
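The joint-strength computation described above can be sketched as follows. Formula (3) is not reproduced in this text, so the weighting below is only one plausible form consistent with the description: the strength map indicating the higher local motion level (per formula (1), the smaller strength values) receives the larger proportion. The window size and the exact weighting form are assumptions.

```python
import numpy as np

# Illustrative sketch only: formula (3) is not reproduced in the text; a locally
# weighted mix of the two strength maps is assumed, where the map indicating
# more local motion (smaller strength values) gets the larger proportion.

def local_sum(a, radius=1):
    """Box sum over a (2*radius+1)^2 neighborhood with edge padding."""
    p = np.pad(a, radius, mode="edge")
    h, w = a.shape
    out = np.zeros_like(a, dtype=np.float64)
    for di in range(2 * radius + 1):
        for dj in range(2 * radius + 1):
            out += p[di:di + h, dj:dj + w]
    return out

def joint_temporal_strength(alpha_nir, alpha_vis, radius=1, eps=1e-9):
    # Local motion level of each map: larger where the strengths are smaller.
    motion_nir = local_sum(1.0 - alpha_nir, radius)
    motion_vis = local_sum(1.0 - alpha_vis, radius)
    w_nir = motion_nir / (motion_nir + motion_vis + eps)
    w_vis = 1.0 - w_nir
    return w_nir * alpha_nir + w_vis * alpha_vis

alpha_nir = np.full((4, 4), 0.9)   # mostly stationary in the first image signal
alpha_vis = np.full((4, 4), 0.3)   # stronger motion indicated in the second image signal
alpha_fus = joint_temporal_strength(alpha_nir, alpha_vis)
# The visible-light map dominates here because it indicates more motion.
```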
  • alternatively, for any pixel, the motion estimation unit may select one of the first temporal filtering strength and the second temporal filtering strength of the pixel as the joint temporal filtering strength of the pixel; for example, the one of the two temporal filtering strengths that represents the higher motion level of the pixel can be selected as the joint temporal filtering strength.
  • the time-domain filtering unit 0212 may perform time-domain filtering processing on the first image signal and the second image signal respectively according to the joint time-domain filtering strength, so as to obtain near-infrared light drop Noise image and visible light noise reduction image.
  • the temporal filtering unit 0212 may perform temporal weighting processing on each pixel in the first image signal and the first historical noise reduction image through the following formula (4) according to the joint temporal filtering strength of each pixel, so as to obtain the near-infrared light noise reduction image, and perform temporal weighting processing on each pixel in the second image signal and the second historical noise reduction image through the following formula (5) according to the joint temporal filtering strength of each pixel, so as to obtain the visible light noise reduction image; here, α_fus(x, y) refers to the joint temporal filtering strength of the pixel with coordinates (x, y), I_nir(x, y, t) refers to the pixel with coordinates (x, y) in the first image signal, and I_vis(x, y, t) refers to the pixel with coordinates (x, y) in the second image signal.
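Formulas (4) and (5) are likewise not reproduced in this text. A common recursive temporal-weighting form consistent with the description is sketched below, under the assumption that the joint strength weights the historical noise-reduced image against the current frame, so stationary pixels (large strength) are averaged more heavily with history.

```python
import numpy as np

# Illustrative sketch only: formulas (4)/(5) are not reproduced in the text.
# A standard recursive temporal filter is assumed: the joint strength blends
# the historical noise-reduced image with the current frame.

def temporal_filter(current, history, alpha_fus):
    """Temporal weighting of the current frame against the historical
    noise-reduced image; alpha_fus is the per-pixel joint strength."""
    current = current.astype(np.float64)
    history = history.astype(np.float64)
    return alpha_fus * history + (1.0 - alpha_fus) * current

cur = np.array([[100.0, 100.0]])
hist = np.array([[80.0, 80.0]])
alpha = np.array([[1.0, 0.5]])  # fully stationary vs. moderately moving pixel
out = temporal_filter(cur, hist, alpha)
# out -> [[80., 90.]]: the stationary pixel keeps the history value,
# the moving pixel is a 50/50 blend.
```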
  • the temporal filtering unit 0212 may also perform temporal filtering processing on the first image signal according to the first temporal filtering strength of each pixel to obtain the near-infrared light noise reduction image, and perform temporal filtering processing on the second image signal according to the joint temporal filtering strength of each pixel to obtain the visible light noise reduction image.
  • the joint noise reduction unit may include a spatial noise reduction unit 022.
  • the spatial noise reduction unit 022 is configured to perform edge estimation according to the first image signal and the second image signal to obtain an edge estimation result, and perform spatial filtering processing on the first image signal according to the edge estimation result to obtain a near-infrared light noise reduction image , Performing spatial filtering processing on the second image signal according to the edge estimation result to obtain a visible light noise reduction image.
  • the spatial noise reduction unit 022 may include an edge estimation unit 0221 and a spatial filtering unit 0222.
  • the edge estimation unit 0221 is used to determine the first spatial filtering strength of each pixel in the first image signal; the spatial filtering unit 0222 is used to perform spatial filtering processing on the first image signal according to the first spatial filtering strength corresponding to each pixel to obtain a near-infrared light noise reduction image, and to perform spatial filtering processing on the second image signal according to the first spatial filtering strength corresponding to each pixel to obtain a visible light noise reduction image.
  • the edge estimation unit 0221 may determine the first spatial filtering intensity of the corresponding pixel according to the difference between each pixel of the first image signal and other pixels in its neighborhood. Wherein, the edge estimation unit 0221 can generate the first spatial filtering intensity of each pixel through the following formula (6).
  • in formula (6), the neighborhood range centered on the pixel with coordinates (x, y) is the local image area centered on that pixel; (x+i, y+j) refers to the pixel coordinates in the local image area; img_nir(x, y) refers to the pixel value of the pixel with coordinates (x, y) in the first image signal; δ1 and δ2 refer to the standard deviations of the Gaussian distributions; and the first spatial filtering strength of the pixel with coordinates (x, y) is determined according to the difference between it and the pixel (x+i, y+j) in the local image area.
  • the spatial filtering unit 0222 may perform spatial filtering on the first image signal and the second image signal according to the multiple first spatial filtering strengths of each pixel, so as to obtain the near-infrared light noise reduction image and the visible light noise reduction image.
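Formula (6) is not reproduced in this text. Given the symbols described (a local image area, per-pixel differences, and two Gaussian standard deviations δ1 and δ2), a bilateral-style weight is a reasonable sketch: closeness in space and closeness in pixel value both raise the filtering strength. This is an assumption for illustration, not the patent's exact formula.

```python
import numpy as np

# Illustrative sketch only: a bilateral-style first spatial filtering strength
# is assumed from the symbols described around formula (6). Parameter names
# are hypothetical.

def spatial_strengths(img, x, y, radius, sigma1, sigma2):
    """First spatial filtering strengths of pixel (x, y) w.r.t. each pixel
    (x+i, y+j) in its local image area."""
    weights = {}
    center = float(img[x, y])
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            xi, yj = x + i, y + j
            if 0 <= xi < img.shape[0] and 0 <= yj < img.shape[1]:
                spatial = np.exp(-(i * i + j * j) / (2.0 * sigma1 ** 2))
                rng = np.exp(-((center - float(img[xi, yj])) ** 2) / (2.0 * sigma2 ** 2))
                weights[(xi, yj)] = spatial * rng
    return weights

img = np.array([[10.0, 10.0, 200.0],
                [10.0, 10.0, 200.0],
                [10.0, 10.0, 200.0]])
w = spatial_strengths(img, 1, 1, radius=1, sigma1=1.5, sigma2=20.0)
# The weight across the strong edge (rightmost column) is far smaller than
# the weight within the flat area, so edges are filtered weakly.
```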
  • the edge estimation unit 0221 is used to determine the first spatial filtering strength of each pixel in the first image signal, and determine the second spatial filtering strength of each pixel in the second image signal; Perform local information extraction on the image signal to obtain the first local information, and perform local information extraction on the second image signal to obtain the second local information; according to the first spatial filtering strength, the second spatial filtering strength, the first local information and the second local The information determines the joint spatial filtering strength corresponding to each pixel; the spatial filtering unit 0222 is used to perform spatial filtering on the first image signal according to the first spatial filtering strength corresponding to each pixel to obtain a near-infrared light noise reduction image, Perform spatial filtering processing on the second image signal according to the joint spatial filtering intensity corresponding to each pixel to obtain a visible light denoising image.
  • the first local information and the second local information include at least one of local gradient information, local brightness information, and local information entropy.
  • the edge estimation unit 0221 can not only determine the first spatial filtering strength of each pixel in the first image signal through the implementation described above, but also determine the second spatial filtering strength of each pixel in the second image signal.
  • the edge estimation unit 0221 can determine the second spatial filtering strength of the corresponding pixel according to the difference between each pixel of the second image signal and the other pixels in its neighborhood; the edge estimation unit 0221 can generate the second spatial filtering strength of each pixel through the following formula (7).
  • in formula (7), the neighborhood range centered on the pixel with coordinates (x, y) is the local image area centered on that pixel; (x+i, y+j) refers to the pixel coordinates in the local image area; img_vis(x, y) refers to the pixel value of the pixel with coordinates (x, y) in the second image signal; δ1 and δ2 refer to the standard deviations of the Gaussian distributions; and the second spatial filtering strength of the pixel with coordinates (x, y) is determined according to the difference between it and the pixel (x+i, y+j) in the local image area.
  • the edge estimation unit 0221 may use the Sobel edge detection operator to perform convolution processing on the first image signal and the second image signal to obtain the first texture image and the second texture image respectively, and use these as weights to weight the multiple first spatial filtering strengths and the multiple second spatial filtering strengths of each pixel, so as to generate the multiple joint spatial filtering strengths of each pixel in the local image area.
  • the first texture image is the first local information
  • the second texture image is the second local information.
  • the Sobel edge detection operator is shown in the following equation (8).
  • the edge estimation unit 0221 can generate the joint spatial filtering strength through the following equation (9).
  • sobel_H refers to the Sobel edge detection operator in the horizontal direction; sobel_V refers to the Sobel edge detection operator in the vertical direction; β_fus(x+i, y+j) refers to the joint spatial filtering strength of any pixel in the neighborhood of the pixel with coordinates (x, y); and the texture information of the pixel with coordinates (x, y) is taken from the first texture image and from the second texture image respectively.
  • when determining the joint spatial filtering strength, the corresponding processing is performed by the edge detection operator; therefore, the smaller the final joint spatial filtering strengths of a pixel, the larger the difference between that pixel and the other pixels in the local image area. It can be seen that, in the embodiment of the application, in areas of the image where the brightness difference between adjacent pixels is larger, the joint spatial filtering strength is smaller, and in areas where the brightness difference between adjacent pixels is smaller, the joint spatial filtering strength is relatively larger. That is, in the embodiment of the present application, when performing spatial filtering, a weaker filtering strength is used for edges and a stronger filtering strength is used for non-edges, thereby improving the noise reduction effect.
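Equations (8) and (9) are not reproduced in this text. The sketch below uses the standard horizontal and vertical Sobel kernels to build texture (gradient-magnitude) images, then weights the two spatial filtering strengths by the texture values as described above; the exact weighting form is an assumption, and all function names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: standard Sobel kernels for the texture images, and
# an assumed texture-weighted mix of the two spatial filtering strengths.

SOBEL_H = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_V = SOBEL_H.T

def convolve3(img, kernel):
    """3x3 convolution with edge padding (correlation form, symmetric kernels)."""
    p = np.pad(img.astype(np.float64), 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + h, j:j + w]
    return out

def texture(img):
    """Gradient-magnitude texture image from the two Sobel responses."""
    return np.abs(convolve3(img, SOBEL_H)) + np.abs(convolve3(img, SOBEL_V))

def joint_spatial_strength(beta_nir, beta_vis, tex_nir, tex_vis, eps=1e-9):
    """Weight the two strengths by the texture values (assumed form)."""
    w_nir = tex_nir / (tex_nir + tex_vis + eps)
    return w_nir * beta_nir + (1.0 - w_nir) * beta_vis

img = np.zeros((5, 5))
img[:, 3:] = 100.0            # vertical edge between columns 2 and 3
tex = texture(img)
# The texture response is large along the edge and zero in flat regions.
```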
  • the spatial filtering unit 0222 may perform spatial filtering processing on the first repaired image and the second repaired image respectively according to the joint spatial filtering strength, thereby obtaining a near-infrared light noise reduction image and a visible light noise reduction image.
  • the spatial filtering unit 0222 may perform spatial filtering processing on the first image signal according to the first spatial filtering strength of each pixel, and perform spatial filtering processing on the second image signal according to the joint spatial filtering strength of each pixel.
  • the spatial filtering unit 0222 may perform spatial weighting processing on each pixel in the first image signal according to the first spatial filtering strength of each pixel through the following formula (10), thereby obtaining a near-infrared light noise reduction image, and perform spatial weighting processing on each pixel in the second image signal according to the joint spatial filtering strength through the following formula (11), thereby obtaining a visible light noise reduction image.
  • where I nir (x+i, y+j) refers to a pixel in the neighborhood Ω of the pixel with coordinates (x, y) in the first image signal, and β nir (x+i, y+j) is the first spatial filtering strength of that pixel within the neighborhood range centered on the coordinates (x, y);
  • I vis (x+i, y+j) refers to a pixel in the neighborhood Ω of the pixel with coordinates (x, y) in the second image signal, and β fus (x+i, y+j) is the joint spatial filtering strength of that pixel in the neighborhood.
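Since formulas (10) and (11) are rendered as images in the original filing, the sketch below assumes the usual form of such spatial weighting: each output pixel is the weighted average of its neighborhood, with the per-pixel filtering strength β as the weight. The function and parameter names are illustrative.

```python
import numpy as np

def spatial_filter(image, strength, radius=1):
    """
    Normalized weighted-neighborhood filtering in the spirit of formulas
    (10)/(11): each output pixel is the average of its neighborhood,
    weighted by the per-pixel filtering strength (beta).
    """
    img = np.pad(image.astype(float), radius, mode="edge")
    beta = np.pad(strength.astype(float), radius, mode="edge")
    h, w = image.shape
    out = np.zeros((h, w))
    k = 2 * radius + 1
    for y in range(h):
        for x in range(w):
            win = img[y:y + k, x:x + k]    # neighborhood pixels I(x+i, y+j)
            wgt = beta[y:y + k, x:x + k]   # filtering strengths beta(x+i, y+j)
            out[y, x] = np.sum(win * wgt) / np.sum(wgt)
    return out
```

With uniform strength this reduces to a box blur; where the strength is small (edges), distant neighbors contribute little, preserving the edge.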
  • the image noise reduction unit 02 may also include the above-mentioned temporal noise reduction unit 021 and the spatial noise reduction unit 022 at the same time.
  • In this case, the temporal noise reduction unit 021 may first perform temporal filtering on the first image signal and the second image signal to obtain a first temporal noise reduction image and a second temporal noise reduction image. After that, the spatial noise reduction unit 022 performs spatial filtering on the obtained first temporal noise reduction image and second temporal noise reduction image, thereby obtaining a near-infrared light noise reduction image and a visible light noise reduction image.
  • the spatial noise reduction unit 022 may first perform spatial filtering on the first image signal and the second image signal to obtain the first spatial noise reduction image and the second spatial noise reduction image. After that, the time domain noise reduction unit 021 performs time domain filtering on the obtained first spatial domain noise reduction image and the second spatial domain noise reduction image, thereby obtaining a near-infrared light noise reduction image and a visible light noise reduction image.
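The two cascade orders described above (temporal then spatial, or spatial then temporal) can be sketched as follows. The filters here are simple placeholders, a frame blend and a box blur, not the patent's actual motion- and edge-guided filters.

```python
import numpy as np

def temporal_blend(cur, prev, alpha=0.5):
    # Placeholder temporal filter: blend current frame with previous frame.
    return alpha * cur + (1 - alpha) * prev

def box_blur(img):
    # Placeholder spatial filter: 3x3 box blur with edge padding.
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(p[dy:dy + h, dx:dx + w]
               for dy in range(3) for dx in range(3)) / 9.0

def denoise(cur, prev, temporal_first=True):
    """Apply the two units in either order, as the text describes."""
    if temporal_first:                     # unit 021 first, then unit 022
        return box_blur(temporal_blend(cur, prev))
    # unit 022 first, then unit 021
    return temporal_blend(box_blur(cur), box_blur(prev))
```

Both orderings produce a noise-reduced frame; which order works better depends on the noise characteristics and motion content of the scene.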
  • In the embodiment of the present application, the image acquisition device generates multiple images with different spectral ranges through multiple exposures of the image sensor and the stroboscopic supplementary light of the light supplement device, which expands the spectral range that the image sensor can receive, extends the image acquisition capability of a single sensor, and improves the image quality in different scenes.
  • the embodiment of the present application also provides an image fusion method, which is applied to the image fusion device provided in the embodiment shown in FIGS. 1-34.
  • The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor; the image sensor is located at the light exit side of the filter assembly, the light supplement includes a first light supplement device, the filter assembly includes a first filter, and the processor includes a buffer unit and an image processing unit. Referring to Figure 35, the method includes:
  • Step 3201: Perform near-infrared supplement light through the first light supplement device, wherein the near-infrared supplement light is performed at least during a partial exposure time period of the first preset exposure and is not performed during the exposure time period of the second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
  • Step 3202: Pass visible light and part of the near-infrared light through the first filter;
  • Step 3203: Perform multiple exposures through the image sensor in a global exposure mode to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure, and the second image signal is an image signal generated according to the second preset exposure;
  • Step 3204: Through the buffer unit, when it is learned that the first target image signal currently output by the image sensor needs to be buffered, buffer the first target image signal, and when it is learned that the buffered second target image signal needs to be output synchronously, synchronously output at least the buffered second target image signal to the image processing unit; wherein, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
  • Step 3205: Receive, through the image processing unit, at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and generate a color fusion image according to the first target image signal and the second target image signal.
  • When the synchronization unit determines that the first target image signal currently output by the image sensor needs to be buffered, it instructs the buffer unit to buffer the first target image signal, and when it determines from the buffered image signals that the second target image signal needs to be output synchronously, it instructs the buffer unit to synchronously output the second target image signal to the image processing unit.
  • the image processing unit generating a color fusion image according to the first target image signal and the second target image signal includes:
  • An image preprocessing unit preprocesses the first target image signal to generate a first target image, and preprocesses the second target image signal to generate a second target image;
  • the image fusion unit performs fusion processing on the first target image and the second target image to obtain the color fusion image.
  • the image preprocessing unit preprocesses the first target image signal to generate a first target image, and preprocesses the second target image signal to generate a second target image, including :
  • the first preprocessing unit performs a first preprocessing operation on the first target image signal to obtain a preprocessed first target image
  • the second preprocessing unit performs a second preprocessing operation on the second target image signal to obtain a second target image
  • the joint noise reduction unit performs filtering processing on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, and the noise-reduced first target image and second target image are used for fusion processing to obtain the color fusion image.
  • When the synchronization unit determines that the first target image signal currently output by the image sensor needs to be buffered, instructing the buffer unit to buffer the first target image signal, and when determining from the buffered image signals that the second target image signal needs to be output synchronously, instructing the buffer unit to synchronously output the second target image signal to the image processing unit includes:
  • the synchronization unit determines that each frame of the first target image signal needs to be buffered, and needs to output the second target image signal synchronously, and the second target image signal is the image signal buffered by the buffer unit last time;
  • wherein the buffer unit currently buffers the second image signal and determines the previously buffered first image signal as the second target image signal and outputs it to the image preprocessing unit; or
  • the buffer unit currently buffers the first image signal and determines the previously buffered second image signal as the second target image signal and outputs it to the image preprocessing unit.
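The cache-then-pair behavior described above can be sketched as follows. Because the first and second image signals alternate, the previously buffered frame is always of the other exposure type; the class and method names here are illustrative only.

```python
class BufferUnit:
    """
    Sketch of the buffering scheme: every incoming first target image
    signal is cached, and the frame cached on the previous round (the
    other exposure type) is returned for synchronous output.
    """
    def __init__(self):
        self.last = None

    def push(self, frame):
        paired = self.last   # previously buffered frame (second target image signal), or None
        self.last = frame    # cache the current first target image signal
        return paired        # output synchronously with the current frame
```

On the very first frame there is nothing to pair with; from then on, each frame is output together with the previous frame of the opposite type.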
  • When the synchronization unit determines that the first target image signal currently output by the image sensor needs to be buffered, instructing the buffer unit to buffer the first target image signal, and when determining from the buffered image signals that the second target image signal needs to be output synchronously, instructing the buffer unit to synchronously output the second target image signal to the image processing unit includes:
  • The synchronization unit determines that when the first target image signal is a first image signal, it needs to be buffered, and that when the first target image signal is a second image signal, the second target image signal needs to be output synchronously, where the second target image signal is the most recently buffered first image signal among the image signals buffered by the buffer unit; wherein, if the first target image signal is the second image signal, the buffer unit determines the most recently buffered first image signal as the second target image signal and outputs it to the image preprocessing unit; if the first target image signal is the first image signal, the buffer unit buffers the first image signal; or,
  • The synchronization unit determines that when the first target image signal is a second image signal, it needs to be buffered, and that when the first target image signal is a first image signal, the second target image signal needs to be output synchronously, where the second target image signal is the most recently buffered second image signal among the image signals buffered by the buffer unit; wherein, if the first target image signal is the first image signal, the buffer unit determines the most recently buffered second image signal as the second target image signal and outputs it to the image preprocessing unit; if the first target image signal is the second image signal, the buffer unit buffers the second image signal.
  • the image fusion unit performs fusion processing on the first target image and the second target image to obtain the color fusion image, including:
  • the color extraction unit extracts the color signal of the image preprocessed by the second image signal
  • the brightness extraction unit extracts the brightness signal of the image preprocessed by the second image signal
  • the fusion processing unit performs fusion processing on the color signal and the brightness signal of the image preprocessed by the first image signal and the image preprocessed by the second image signal to obtain the color fusion image.
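The color/brightness extraction and fusion described above can be sketched as follows. The RGB-to-luma/chroma weights and the blend factor `w` are assumptions for illustration, not taken from the patent; the sketch only shows the stated structure of extracting brightness and color signals and recombining them with the near-infrared brightness.

```python
import numpy as np

def fuse(nir_luma, vis_rgb, w=0.5):
    """
    Illustrative fusion: extract brightness (y) and color (u, v) signals
    from the visible-light image, blend the brightness with the
    near-infrared luminance, and recombine into an RGB fusion image.
    """
    r, g, b = vis_rgb[..., 0], vis_rgb[..., 1], vis_rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # brightness signal (BT.601 weights, assumed)
    u = b - y                                # color signals
    v = r - y
    y_fused = w * nir_luma + (1 - w) * y     # fused brightness
    r2 = y_fused + v
    b2 = y_fused + u
    g2 = (y_fused - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.stack([r2, g2, b2], axis=-1)
```

With `w = 0` the visible image is reproduced unchanged; with `w > 0` the near-infrared brightness (typically cleaner in low light) replaces part of the visible brightness while the color signals are preserved.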
  • The fusion processing unit performing fusion processing on the color signal and brightness signal of the image preprocessed by the first image signal and the color signal and brightness signal of the image preprocessed by the second image signal to obtain the color fusion image includes:
  • the joint noise reduction unit performs filtering processing on the first target image and the second target image to obtain the first target image and the second target image after noise reduction, including:
  • the first target image and the second target image are respectively subjected to joint filtering processing to obtain the first target image and the second target image after noise reduction .
  • the joint noise reduction unit performs filtering processing on the first target image and the second target image to obtain the first target image and the second target image after noise reduction, including:
  • the temporal noise reduction unit performs motion estimation according to the first target image and the second target image to obtain a motion estimation result, and performs temporal filtering on the first target image according to the motion estimation result to obtain The first target image after noise reduction, performing temporal filtering on the second target image according to the motion estimation result to obtain the second target image after noise reduction;
  • the spatial noise reduction unit performs edge estimation according to the first target image and the second target image to obtain an edge estimation result, performs spatial filtering on the first target image according to the edge estimation result to obtain the noise-reduced first target image, and performs spatial filtering on the second target image according to the edge estimation result to obtain the noise-reduced second target image.
  • the joint noise reduction unit performs filtering processing on the first target image and the second target image to obtain the first target image and the second target image after noise reduction, including:
  • the temporal noise reduction unit performs motion estimation according to the first target image and the second target image to obtain a motion estimation result, and performs temporal filtering on the first target image according to the motion estimation result to obtain the first target image A temporal noise reduction image, performing temporal filtering on the second target image according to the motion estimation result to obtain a second temporal noise reduction image;
  • the spatial noise reduction unit performs edge estimation according to the first temporal noise reduction image and the second temporal noise reduction image to obtain an edge estimation result, and performs noise reduction on the first temporal domain according to the edge estimation result Performing spatial filtering on the image to obtain the denoised first target image, and performing spatial filtering on the second temporal denoised image according to the edge estimation result to obtain the denoised second target image;
  • the spatial noise reduction unit performs edge estimation according to the first target image and the second target image to obtain an edge estimation result, and performs spatial filtering on the first target image according to the edge estimation result to obtain a first spatial domain Denoising an image, performing spatial filtering on the second target image according to the edge estimation result to obtain a second spatial denoising image;
  • the temporal noise reduction unit performs motion estimation according to the first spatial noise reduction image and the second spatial noise reduction image to obtain a motion estimation result, performs temporal filtering on the first spatial noise reduction image according to the motion estimation result to obtain the noise-reduced first target image, and performs temporal filtering on the second spatial noise reduction image according to the motion estimation result to obtain the noise-reduced second target image.
  • the filter assembly may further include a second filter and a switching component.
  • The second filter may also be switched to the light incident side of the image sensor through the switching component. After the second filter is switched to the light incident side of the image sensor, the second filter allows light in the visible light band to pass and blocks light in the near-infrared light band. After the second filter passes the visible light and blocks the light in the near-infrared light band, exposure is performed by the image sensor to generate and output a third image signal.
  • the light fill device may further include a second light fill device.
  • the first light filter included in the filter assembly allows light in the visible light band and part of the near-infrared light to pass through.
  • the second light supplement device performs visible light supplement light.
  • the intensity of the near-infrared light that passes through the first filter when the first light supplement device performs near-infrared supplement light is higher than the intensity of the near-infrared light that passes through the first filter when the first light supplement device does not perform near-infrared supplement light.
  • the wavelength range of the near-infrared light incident on the first filter is the first reference wavelength range
  • the first reference wavelength range is 650 nanometers to 1100 nanometers.
  • the center wavelength of the near-infrared supplement light performed by the first light supplement device is the set characteristic wavelength or falls within the set characteristic wavelength range, and the center wavelength and/or band width of the near-infrared light passing through the first filter meets the constraints.
  • the center wavelength of the near-infrared supplement light performed by the first light supplement device is any wavelength within the wavelength range of 750±10 nanometers; or
  • the center wavelength of the near-infrared supplement light performed by the first light supplement device is any wavelength within the wavelength range of 780±10 nanometers; or
  • the center wavelength of the near-infrared supplement light performed by the first light supplement device is any wavelength within the wavelength range of 940±10 nanometers.
  • the constraints include:
  • the difference between the center wavelength of the near-infrared light passing through the first filter and the center wavelength of the near-infrared light supplemented by the first light supplement device is within the wavelength fluctuation range, and the wavelength fluctuation range is 0-20 nanometers.
  • the constraints include:
  • the half bandwidth of the near-infrared light passing through the first filter is less than or equal to 50 nanometers.
  • the constraints include:
  • the first waveband width is smaller than the second waveband width; where the first waveband width refers to the waveband width of the near-infrared light passing through the first filter, and the second waveband width refers to the waveband width of the near-infrared light blocked by the first filter.
  • constraints are:
  • the third waveband width is smaller than the reference waveband width.
  • the third waveband width refers to the waveband width of near-infrared light whose pass rate is greater than a set ratio.
  • the reference waveband width is any waveband width in the range of 50nm to 150nm.
  • the set ratio is any ratio within a ratio range of 30% to 50%.
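The passband constraints above can be checked against a filter's transmission curve as sketched below. The transmission curve in the test is a made-up example, and the chosen set ratio (0.4) and reference width (100 nm) are arbitrary values within the ranges the text allows.

```python
import numpy as np

def third_waveband_width(wavelengths_nm, transmission, set_ratio=0.4):
    """
    Width of the near-infrared band (taken as 650-1100 nm, per the first
    reference wavelength range above) whose pass rate exceeds `set_ratio`:
    the 'third waveband width' of the constraint.
    """
    nir = (wavelengths_nm >= 650) & (wavelengths_nm <= 1100)
    passing = nir & (transmission > set_ratio)
    if not passing.any():
        return 0.0
    wl = wavelengths_nm[passing]
    return float(wl.max() - wl.min())

def meets_constraint(width_nm, reference_nm=100.0):
    # Constraint: third waveband width smaller than the reference width
    # (any width in the 50-150 nm range; 100 nm assumed here).
    return width_nm < reference_nm
```

A narrow passband around the supplement-light wavelength satisfies the constraint; a broad one does not.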
  • At least one exposure parameter of the first preset exposure and the second preset exposure is different, the at least one exposure parameter is one or more of exposure time, exposure gain, and aperture size, and the exposure gain includes analog gain and/or digital gain.
  • the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure.
  • At least one exposure parameter of the first preset exposure and the second preset exposure is the same, the at least one exposure parameter includes one or more of exposure time, exposure gain, and aperture size, and the exposure gain includes analog gain and/or digital gain.
  • the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure.
  • the image sensor includes a plurality of light-sensing channels, and each light-sensing channel is used to sense at least one type of light in the visible light waveband and light in the near-infrared waveband.
  • multiple photosensitive channels are used to sense at least two different visible light wavelength bands.
  • the multiple photosensitive channels include at least two of R photosensitive channels, G photosensitive channels, B photosensitive channels, Y photosensitive channels, W photosensitive channels, and C photosensitive channels;
  • the R photosensitive channel is used to sense the light in the red and near-infrared bands
  • the G photosensitive channel is used to sense the light in the green and near-infrared bands
  • the B photosensitive channel is used to sense the light in the blue and near-infrared bands.
  • the Y photosensitive channel is used to sense the light in the yellow band and the near-infrared band
  • the W photosensitive channel is used to sense the full band of light
  • the C photosensitive channel is used to sense the full band of light.
  • the image sensor is an RGB sensor, an RGBW sensor, or an RCCB sensor, or an RYYB sensor.
  • the second light supplement device is used to perform visible light supplement light in a constant light mode
  • the second light supplement device is used to perform visible light supplement light in a stroboscopic manner, wherein there is visible light supplement light at least during a part of the exposure time period of the first preset exposure, and during the entire exposure time period of the second preset exposure There is no visible light fill light; or
  • the second light supplement device is used to perform visible light supplement light in a stroboscopic manner, wherein at least there is no visible light supplement light during the entire exposure time period of the first preset exposure, and there is no visible light supplement light during the partial exposure time period of the second preset exposure Visible light fill light.
  • the number of supplement light operations performed by the first light supplement device in a unit time length is lower than the number of exposures of the image sensor in the unit time length, wherein one or more exposures are spaced within the interval between every two adjacent supplement light operations.
  • the image sensor uses a global exposure method for multiple exposures.
  • the time period of the near-infrared supplement light does not intersect with the exposure time period of the nearest second preset exposure, and the time period of the near-infrared supplement light is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared supplement light and the exposure time period of the first preset exposure overlap, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared supplement light.
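The global-exposure timing rule above can be sketched as an interval check: the supplement-light interval must not intersect any nearby second-preset-exposure interval, and must intersect (as a subset, an overlap, or a superset) the first-preset-exposure interval. Intervals here are `(start, end)` pairs in arbitrary time units; the function name is illustrative.

```python
def fill_light_valid(fill, first_exp, second_exps):
    """
    Check the global-exposure timing rule: no intersection with any
    second preset exposure, and some intersection with the first preset
    exposure (subset, overlap, and superset all intersect).
    """
    def intersects(a, b):
        return a[0] < b[1] and b[0] < a[1]

    if any(intersects(fill, s) for s in second_exps):
        return False
    return intersects(fill, first_exp)
```

For example, a supplement-light pulse entirely inside the first preset exposure passes the check, while one spilling into a second preset exposure fails.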
  • the image sensor adopts rolling shutter exposure for multiple exposures.
  • the time period of the near-infrared supplement light does not intersect with the exposure time period of the nearest second preset exposure;
  • the start time of the near-infrared supplement light is no earlier than the exposure start time of the last line of the effective image in the first preset exposure, and the end time of the near-infrared supplement light is no later than the exposure end time of the first line of the effective image in the first preset exposure; or,
  • the start time of the near-infrared supplement light is no earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first line of the effective image of the first preset exposure, and the end time of the near-infrared supplement light is no earlier than the exposure start time of the last line of the effective image in the first preset exposure and no later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure; or,
  • the start time of the near-infrared supplement light is no earlier than the exposure end time of the last line of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first line of the effective image of the first preset exposure, and the end time of the near-infrared supplement light is no earlier than the exposure end time of the last line of the effective image in the first preset exposure and no later than the exposure start time of the first line of the effective image of the nearest second preset exposure after the first preset exposure.
  • multiple exposures include odd exposures and even exposures
  • the first preset exposure is one exposure in odd-numbered exposures
  • the second preset exposure is one exposure in even-numbered exposures
  • the first preset exposure is one exposure in an even number of exposures
  • the second preset exposure is one exposure in an odd number of exposures
  • the first preset exposure is one of the specified odd exposures
  • the second preset exposure is one of the exposures except the specified odd exposures
  • the first preset exposure is one of the specified even-numbered exposures
  • the second preset exposure is one of the other exposures except the specified even-numbered exposures
  • the first preset exposure is one exposure in the first exposure sequence
  • the second preset exposure is one exposure in the second exposure sequence
  • the first preset exposure is one exposure in the second exposure sequence
  • the second preset exposure is one exposure in the first exposure sequence
  • the multiple exposure includes multiple exposure sequences
  • the first exposure sequence and the second exposure sequence are one exposure sequence or two exposure sequences among the multiple exposure sequences
  • each exposure sequence includes N exposures
  • the N exposures include 1 first preset exposure and N-1 second preset exposures, or the N exposures include 1 second preset exposure and N-1 first preset exposures, and N is a positive integer greater than 2.
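The exposure arrangements above can be sketched by labeling each exposure as a first or second preset exposure. Two of the listed patterns are shown: odd exposures as first preset exposures, and sequences of one first preset exposure followed by N-1 second preset exposures (N = 4 is an assumed example; the text only requires N > 2).

```python
def exposure_types(n_frames, pattern="odd_even", N=4):
    """
    Label exposures 1..n_frames as 'first' or 'second' preset exposures.
    'odd_even': odd exposures are first preset, even are second preset.
    'sequence': each N-exposure sequence has 1 first preset exposure
                followed by N-1 second preset exposures.
    """
    if pattern == "odd_even":
        return ["first" if i % 2 == 1 else "second"
                for i in range(1, n_frames + 1)]
    return ["first" if (i - 1) % N == 0 else "second"
            for i in range(1, n_frames + 1)]
```

The other listed arrangements (even exposures as first preset, designated exposures, swapped sequences) follow the same labeling idea with the roles reversed.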
  • the embodiment of the present application also provides an image fusion method, which is applied to the image fusion device provided in the embodiment shown in FIGS. 1-35.
  • The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor; the image sensor is located at the light exit side of the filter assembly, the light supplement includes a first light supplement device, the filter assembly includes a first filter, and the processor includes a buffer unit and an image processing unit. Referring to Figure 36, the method includes:
  • Step 3301: Perform near-infrared supplement light through the first light supplement device, wherein the near-infrared supplement light is performed at least during a partial exposure time period of the first preset exposure and is not performed during the exposure time period of the second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
  • Step 3302: Pass visible light and part of the near-infrared light through the first filter;
  • Step 3303: Perform multiple exposures through the image sensor in a global exposure mode to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure, and the second image signal is an image signal generated according to the second preset exposure;
  • Step 3304: Receive, through the image processing unit, the first target image signal currently output by the image sensor and preprocess the first target image signal to obtain a first target image; when the first target image needs to be cached, synchronously output at least the first target image to the buffer unit for caching, and when the buffer unit needs to synchronously output the second target image it has cached, receive at least the second target image synchronously output by the buffer unit, and generate a color fusion image according to the first target image and the second target image;
  • wherein, the first target image signal is the first image signal, the first target image is an image generated after the first image signal is preprocessed, the second target image is a buffered frame of the image generated after the second target image signal is preprocessed, and the second target image signal is the second image signal; or, the first target image signal is the second image signal, the first target image is an image generated after the second image signal is preprocessed, the second target image is a buffered frame of the image generated after the second target image signal is preprocessed, and the second target image signal is the first image signal;
  • Step 3305: When the cache unit learns that the first target image needs to be cached, cache at least the first target image synchronously output by the image processing unit, and when it learns that the cached second target image needs to be output synchronously, synchronously output at least the cached second target image to the image processing unit.
  • the method further includes:
  • the synchronization unit determines that the first target image preprocessed and generated by the image processing unit needs to be cached, it instructs the cache unit to cache the first target image, and determines from the cached images that the second target image needs to be output synchronously At this time, the cache unit is instructed to synchronously output the second target image to the image processing unit.
  • The image processing unit receiving the first target image signal currently output by the image sensor, preprocessing the first target image signal to obtain the first target image, synchronously outputting the first target image to the buffer unit for caching when the first target image needs to be cached, receiving the second target image synchronously output by the buffer unit when the buffer unit needs to synchronously output the second target image it has cached, and generating a color fusion image according to the first target image and the second target image includes:
  • the image preprocessing unit preprocesses the first target image signal to generate a first target image, and preprocesses the second target image signal to generate a second target image;
  • the image fusion unit performs fusion processing on the first target image and the second target image to obtain the color fusion image.
  • the image preprocessing unit preprocesses the first target image signal to generate a first target image, and preprocesses the second target image signal to generate a second target image, including:
  • the first preprocessing unit performs a first preprocessing operation on the first target image signal to obtain a preprocessed first target image
  • the second preprocessing unit performs a second preprocessing operation on the second target image signal to obtain a second target image
  • the joint noise reduction unit performs filtering processing on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, and the noise-reduced first target image and second target image are used for fusion processing to obtain the color fusion image.
  • instructing the buffer unit to synchronously output the second target image to the image processing unit includes:
  • the synchronization unit determines that each frame of the first target image needs to be buffered, and needs to output the second target image synchronously, and the second target image is the image buffered by the buffer unit last time;
  • wherein the buffering unit currently buffers the image generated after preprocessing of the second image signal, and determines the image generated after preprocessing of the previously buffered first image signal as the second target image and outputs it to the image preprocessing unit; or
  • the buffering unit currently buffers the image generated after preprocessing of the first image signal, and determines the image generated after preprocessing of the previously buffered second image signal as the second target image and outputs it to the image preprocessing unit.
  • instructing the buffer unit to synchronously output the second target image to the image processing unit includes:
  • when the synchronization unit determines that the first target image is an image preprocessed from the first image signal, the image needs to be buffered; when it determines that the first target image is an image preprocessed from the second image signal, the buffered data needs to be output synchronously;
  • the second target image is the most recently buffered image preprocessed from the first image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the second image signal, the buffer unit determines the most recently buffered image preprocessed from the first image signal to be the second target image and outputs it to the image preprocessing unit; if the first target image is an image preprocessed from the first image signal, the buffer unit buffers that image; or,
  • when the synchronization unit determines that the first target image is an image preprocessed from the second image signal, the image needs to be buffered; when it determines that the first target image is an image preprocessed from the first image signal, the buffered data needs to be output synchronously;
  • the second target image is the most recently buffered image preprocessed from the second image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the first image signal, the buffer unit determines the most recently buffered image preprocessed from the second image signal to be the second target image and outputs it to the image preprocessing unit; if the first target image is an image preprocessed from the second image signal, the buffer unit buffers that image.
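The second strategy differs from the alternating scheme: only frames from one signal type are buffered, and a frame of the other type triggers synchronous output. A minimal sketch, with hypothetical names, assuming first-signal frames are the ones buffered:

```python
class SelectiveBuffer:
    """Buffers only first-signal frames; second-signal frames trigger output."""

    def __init__(self):
        self._buffered = None  # most recently buffered first-signal frame

    def push(self, kind, frame):
        if kind == 'first':
            self._buffered = frame   # buffer it; nothing is output this cycle
            return None
        # a second-signal frame arrived: pair it with the last buffered frame
        if self._buffered is None:
            return None              # no first-signal frame buffered yet
        return (frame, self._buffered)
```

Note that consecutive second-signal frames all pair with the same buffered first-signal frame until a newer one arrives, which matches the "most recently buffered" wording above.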
  • the filter assembly may further include a second filter and a switching component.
  • the second filter may also be switched to the light-incident side of the image sensor through the switching component. After the second filter is switched to the light-incident side of the image sensor, the second filter allows light in the visible light band to pass and blocks light in the near-infrared band, and the image sensor then performs exposure to generate and output a third image signal.
  • the light fill device may further include a second light fill device.
  • the first light filter included in the filter assembly allows light in the visible light band and part of the near-infrared light to pass through.
  • the second light supplement device performs visible light supplement light.
  • the intensity of the near-infrared light that passes through the first filter when the first light supplement device performs near-infrared supplementation is higher than the intensity of the near-infrared light that passes through the first filter when the first light supplement device does not perform near-infrared supplementation.
  • the wavelength range of the near-infrared light incident on the first filter is the first reference wavelength range
  • the first reference wavelength range is 650 nanometers to 1100 nanometers.
  • when the center wavelength of the near-infrared supplement light performed by the first light supplement device is the set characteristic wavelength or falls within the set characteristic wavelength range, the center wavelength and/or the band width of the near-infrared light passing through the first filter meets the constraints.
  • the center wavelength of the near-infrared supplement light performed by the first light supplement device is any wavelength within the wavelength range of 750 ⁇ 10 nanometers;
  • the center wavelength of the first light supplement device for near-infrared supplement light is any wavelength within the wavelength range of 780 ⁇ 10 nanometers; or
  • the center wavelength of the first light supplement device for near-infrared supplement light is any wavelength within the wavelength range of 940 ⁇ 10 nanometers.
  • the constraints include:
  • the difference between the center wavelength of the near-infrared light passing through the first filter and the center wavelength of the near-infrared light supplemented by the first light supplement device is within the wavelength fluctuation range, and the wavelength fluctuation range is 0-20 nanometers.
  • the constraints include:
  • the half bandwidth of the near-infrared light passing through the first filter is less than or equal to 50 nanometers.
  • the constraints include:
  • the first waveband width is smaller than the second waveband width; where the first waveband width refers to the waveband width of the near-infrared light passing through the first filter, and the second waveband width refers to the waveband width of the near-infrared light blocked by the first filter.
  • constraints are:
  • the third waveband width is smaller than the reference waveband width.
  • the third waveband width refers to the waveband width of near-infrared light whose pass rate is greater than a set ratio.
  • the reference waveband width is any waveband width in the range of 50 nm to 150 nm.
  • the set ratio is any ratio within a ratio range of 30% to 50%.
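The waveband-width constraint above can be made concrete numerically: given a sampled transmission curve of the first filter, compute the width of the wavelength span whose pass rate exceeds the set ratio and compare it with the reference width. This is an illustrative sketch with made-up sample values, not data from the application.

```python
def band_width_above(curve, ratio):
    """Width (nm) of the wavelength span whose pass rate exceeds `ratio`.

    `curve` is a list of (wavelength_nm, pass_rate) samples sorted by
    wavelength; the width is taken between the outermost passing samples.
    """
    passing = [wl for wl, p in curve if p > ratio]
    return (max(passing) - min(passing)) if passing else 0.0

def meets_constraint(curve, ratio=0.4, reference_width=100.0):
    """Third waveband width must be smaller than the reference width."""
    return band_width_above(curve, ratio) < reference_width
```

With a set ratio of 40% and a reference width of 100 nm (both within the ranges stated above), a filter whose transmission exceeds 40% only over a 40 nm span would satisfy the constraint.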
  • At least one exposure parameter of the first preset exposure and the second preset exposure is different; the at least one exposure parameter is one or more of exposure time, exposure gain, and aperture size, and the exposure gain includes analog gain and/or digital gain.
  • the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure.
  • At least one exposure parameter of the first preset exposure and the second preset exposure is the same; the at least one exposure parameter includes one or more of exposure time, exposure gain, and aperture size, and the exposure gain includes analog gain and/or digital gain.
  • the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure.
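To make the parameter relationships above concrete, here is a purely illustrative configuration (field names and values are hypothetical) in which the two preset exposures share an exposure time while the first, NIR-supplemented exposure uses a smaller exposure gain than the second:

```python
from dataclasses import dataclass

@dataclass
class PresetExposure:
    exposure_time_us: int   # shared parameter in this example
    analog_gain: float
    digital_gain: float

    @property
    def exposure_gain(self):
        # exposure gain combines analog and digital gain
        return self.analog_gain * self.digital_gain

first = PresetExposure(exposure_time_us=4000, analog_gain=1.0, digital_gain=1.0)
second = PresetExposure(exposure_time_us=4000, analog_gain=4.0, digital_gain=2.0)

assert first.exposure_time_us == second.exposure_time_us  # same exposure time
assert first.exposure_gain < second.exposure_gain         # differing gain
```

Keeping exposure time equal while lowering the first exposure's gain is one way to satisfy both bullets simultaneously; other parameter splits (aperture, exposure time) are equally allowed by the text.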
  • the image sensor includes a plurality of light-sensing channels, and each light-sensing channel is used to sense at least one type of light in the visible light waveband and light in the near-infrared waveband.
  • multiple photosensitive channels are used to sense at least two different visible light wavelength bands.
  • the multiple photosensitive channels include at least two of R photosensitive channels, G photosensitive channels, B photosensitive channels, Y photosensitive channels, W photosensitive channels, and C photosensitive channels;
  • the R photosensitive channel is used to sense the light in the red and near-infrared bands
  • the G photosensitive channel is used to sense the light in the green and near-infrared bands
  • the B photosensitive channel is used to sense the light in the blue and near-infrared bands.
  • the Y photosensitive channel is used to sense light in the yellow band and the near-infrared band
  • the W photosensitive channel is used to sense the full band of light
  • the C photosensitive channel is used to sense the full band of light.
  • the image sensor is an RGB sensor, an RGBW sensor, or an RCCB sensor, or an RYYB sensor.
  • the second light supplement device is used to perform visible light supplement light in a constant light mode
  • the second light supplement device is used to perform visible light supplement light in a stroboscopic manner, wherein visible light supplement light exists at least during part of the exposure time period of the first preset exposure and does not exist during the entire exposure time period of the second preset exposure; or
  • the second light supplement device is used to perform visible light supplement light in a stroboscopic manner, wherein visible light supplement light does not exist during at least the entire exposure time period of the first preset exposure and exists during part of the exposure time period of the second preset exposure.
  • the number of supplementary light operations performed by the first light supplement device per unit time is lower than the number of exposures of the image sensor per unit time, wherein one or more exposures are spaced within the interval between every two adjacent supplementary light operations.
  • the image sensor uses a global exposure method for multiple exposures.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure; the time period of the near-infrared fill light is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared fill light overlaps the exposure time period of the first preset exposure, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared fill light.
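Under global exposure, all three allowed relationships with the first preset exposure (subset, overlap, superset) reduce to "the fill-light interval intersects the first exposure", while the forbidden one is "it intersects the nearest second exposure". A hedged sketch with hypothetical helper names:

```python
def intersects(a, b):
    """Closed intervals (start, end) share at least one instant."""
    return a[0] <= b[1] and b[0] <= a[1]

def fill_light_valid(fill, first_exposure, second_exposure):
    """Fill light must touch the first preset exposure and miss the second."""
    return intersects(fill, first_exposure) and not intersects(fill, second_exposure)
```

This collapses the case analysis in the bullet: subset, partial overlap, and superset of the first exposure all pass the same `intersects` check.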
  • the image sensor adopts rolling shutter exposure for multiple exposures.
  • the time period of the near-infrared fill light does not intersect the exposure time period of the nearest second preset exposure;
  • the start time of the near-infrared fill light is no earlier than the exposure start time of the last effective image line in the first preset exposure, and the end time of the near-infrared fill light is no later than the exposure end time of the first effective image line in the first preset exposure; or
  • the start time of the near-infrared fill light is no earlier than the exposure end time of the last effective image line of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first effective image line of the first preset exposure, and the end time of the near-infrared fill light is no earlier than the exposure start time of the last effective image line of the first preset exposure and no later than the exposure start time of the first effective image line of the nearest second preset exposure after the first preset exposure; or
  • the start time of the near-infrared fill light is no earlier than the exposure end time of the last effective image line of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first effective image line of the first preset exposure, and the end time of the near-infrared fill light is no earlier than the exposure end time of the last effective image line of the first preset exposure and no later than the exposure start time of the first effective image line of the nearest second preset exposure after the first preset exposure.
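The first rolling-shutter constraint above guarantees that every row of the first preset exposure sees the full fill-light interval: under a rolling shutter the first row starts exposing earliest and the last row finishes latest, so the window common to all rows runs from the last row's exposure start to the first row's exposure end. A hypothetical check (names and timing values are illustrative):

```python
def fill_light_within_all_rows(fill_start, fill_end,
                               first_row_exposure_end, last_row_exposure_start):
    """True if the fill-light interval lies inside the window common to all
    effective rows of the first preset exposure (rolling shutter)."""
    return (fill_start >= last_row_exposure_start
            and fill_end <= first_row_exposure_end)
```

If the fill light starts before the last row begins exposing, early rows receive more NIR light than late rows, producing a brightness gradient across the frame; the constraint rules that out.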
  • multiple exposures include odd exposures and even exposures
  • the first preset exposure is one exposure in odd-numbered exposures
  • the second preset exposure is one exposure in even-numbered exposures
  • the first preset exposure is one exposure in an even number of exposures
  • the second preset exposure is one exposure in an odd number of exposures
  • the first preset exposure is one of the specified odd-numbered exposures
  • the second preset exposure is one of the other exposures except the specified odd-numbered exposures
  • the first preset exposure is one of the specified even-numbered exposures
  • the second preset exposure is one of the other exposures except the specified even-numbered exposures
  • the first preset exposure is one exposure in the first exposure sequence
  • the second preset exposure is one exposure in the second exposure sequence
  • the first preset exposure is one exposure in the second exposure sequence
  • the second preset exposure is one exposure in the first exposure sequence
  • the multiple exposure includes multiple exposure sequences
  • the first exposure sequence and the second exposure sequence are one exposure sequence or two exposure sequences among the multiple exposure sequences
  • each exposure sequence includes N exposures
  • N exposures include 1 first preset exposure and N-1 second preset exposures, or N exposures include 1 second preset exposure and N-1 first preset exposures, where N is a positive integer greater than 2.
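The simplest arrangement described above, alternating the two preset exposures by parity, can be sketched as a tiny scheduler. The function name and the 1-based exposure numbering are assumptions for illustration:

```python
def preset_for(exposure_index, first_on_odd=True):
    """Return which preset exposure ('first' or 'second') an exposure uses,
    alternating by parity of its 1-based index."""
    is_odd = exposure_index % 2 == 1
    return 'first' if (is_odd == first_on_odd) else 'second'
```

With `first_on_odd=True`, odd exposures get the NIR-supplemented first preset exposure and even exposures the second, matching the first odd/even bullet; flipping the flag gives the mirrored arrangement.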

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The present application provides an image fusion device and method. The device comprises: an image sensor, a light filling device, and a processor. The image sensor is configured to generate and output a first image signal and a second image signal by means of a plurality of exposures. The light filling device is configured to perform near-infrared light filling, wherein the near-infrared light filling exists at least in a part of an exposure period of a first preset exposure and does not exist in an exposure period of a second preset exposure. The processor comprises a buffering unit and an image processing unit. The buffering unit is configured to perform buffering when learning that a first target image signal output by the image sensor needs to be buffered and to perform synchronous output when learning that a buffered second target image signal needs to be synchronously output. The image processing unit is configured to receive the first target image signal currently output by the image sensor, to receive the second target image signal synchronously output by the buffering unit, and to generate a high-quality color fused image according to the first target image signal and the second target image signal.

Description

Image fusion device and method
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on May 31, 2019, with application number 2019104735175 and entitled "Image Fusion Apparatus and Method", the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the field of image processing technology, and in particular to an image fusion device and method.
Background
In low-illumination scenes, to ensure that captured images contain as much image information as possible, it is usually necessary to perform image fusion on the images collected by an image acquisition device. Image fusion combines the complementary information of different images of the same target according to a certain criterion, so that the fused image has better properties than any single image participating in the fusion and reflects the actual scene more accurately.
A related image fusion scheme collects a visible light image and a non-visible light image through a single camera, a light-splitting structure, and two image sensors, performs registration, and then performs fusion to generate a fused image. The light-splitting structure is used to decompose incident light into a visible light signal and a non-visible light signal. This scheme requires two image sensors and a complicated light-splitting structure design; the process is complex and the cost is high.
Summary of the invention
The present application provides an image fusion device and method, which simplify the structure of image acquisition and thereby reduce cost.
In a first aspect, the present application provides an image fusion device, including:
a lens, a filter assembly, a single image sensor, a light supplement, and a processor, where the image sensor is located on the light exit side of the filter assembly;
the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, where the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
the light supplement includes a first light supplement device, and the first light supplement device is configured to perform near-infrared supplementary light, where near-infrared supplementary light exists at least during part of the exposure time period of the first preset exposure and does not exist during the exposure time period of the second preset exposure;
the filter assembly includes a first filter, and the first filter is configured to pass light in the visible light band and part of the near-infrared light;
the processor includes a buffer unit and an image processing unit;
the buffer unit is configured to buffer the first target image signal when it learns that the first target image signal currently output by the image sensor needs to be buffered, and to synchronously output at least the buffered second target image signal to the image processing unit when it learns that the buffered second target image signal needs to be output synchronously; where, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
the image processing unit is configured to receive at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and to generate a color fusion image according to the first target image signal and the second target image signal.
In a second aspect, the present application provides an image fusion device, including:
a lens, a filter assembly, a single image sensor, a light supplement, and a processor, where the image sensor is located on the light exit side of the filter assembly;
the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, where the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
the light supplement includes a first light supplement device, and the first light supplement device is configured to perform near-infrared supplementary light, where near-infrared supplementary light exists at least during part of the exposure time period of the first preset exposure and does not exist during the exposure time period of the second preset exposure;
the filter assembly includes a first filter, and the first filter is configured to pass light in the visible light band and part of the near-infrared light;
the processor includes a buffer unit and an image processing unit;
the image processing unit is configured to receive the first target image signal currently output by the image sensor and preprocess it to obtain a first target image; when the first target image needs to be buffered, to synchronously output at least the first target image to the buffer unit for buffering; and when the buffer unit needs to synchronously output the second target image it has buffered, to receive at least the second target image synchronously output by the buffer unit and generate a color fusion image according to the first target image and the second target image; where, if the first target image signal is the first image signal, the first target image is an image generated by preprocessing the first image signal, the second target image is a buffered frame of an image generated by preprocessing a second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is an image generated by preprocessing the second image signal, the second target image is a buffered frame of an image generated by preprocessing the second target image signal, and the second target image signal is the first image signal;
the buffer unit is configured to buffer at least the first target image synchronously output by the image processing unit when it learns that the first target image needs to be buffered, and to synchronously output at least the buffered second target image to the image processing unit when it learns that the buffered second target image needs to be output synchronously.
In a third aspect, an embodiment of the present application provides an image fusion method applied to an image fusion device. The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor; the image sensor is located on the light exit side of the filter assembly, the light supplement includes a first light supplement device, the filter assembly includes a first filter, and the processor includes a buffer unit and an image processing unit. The method includes:
performing near-infrared supplementary light through the first light supplement device, where near-infrared supplementary light is performed at least during part of the exposure time period of a first preset exposure and is not performed during the exposure time period of a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
passing light in the visible light band and part of the near-infrared light through the first filter;
performing multiple exposures through the image sensor in a global exposure manner to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure and the second image signal is an image signal generated according to the second preset exposure;
buffering, through the buffer unit, the first target image signal when it is learned that the first target image signal currently output by the image sensor needs to be buffered, and synchronously outputting at least the buffered second target image signal to the image processing unit when it is learned that the buffered second target image signal needs to be output synchronously; where, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
receiving, through the image processing unit, at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and generating a color fusion image according to the first target image signal and the second target image signal.
In a fourth aspect, an embodiment of the present application provides an image fusion method applied to an image fusion device. The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor; the image sensor is located on the light exit side of the filter assembly, the light supplement includes a first light supplement device, the filter assembly includes a first filter, and the processor includes a buffer unit and an image processing unit. The method includes:
performing near-infrared supplementary light through the first light supplement device, where near-infrared supplementary light is performed at least during part of the exposure time period of a first preset exposure and is not performed during the exposure time period of a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures of the image sensor;
passing light in the visible light band and part of the near-infrared light through the first filter;
performing multiple exposures through the image sensor in a global exposure manner to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure and the second image signal is an image signal generated according to the second preset exposure;
receiving, through the image processing unit, the first target image signal currently output by the image sensor and preprocessing it to obtain a first target image; when the first target image needs to be buffered, synchronously outputting at least the first target image to the buffer unit for buffering; and when the buffer unit needs to synchronously output the second target image it has buffered, receiving at least the second target image synchronously output by the buffer unit and generating a color fusion image according to the first target image and the second target image; where, if the first target image signal is the first image signal, the first target image is an image generated by preprocessing the first image signal, the second target image is a buffered frame of an image generated by preprocessing a second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is an image generated by preprocessing the second image signal, the second target image is a buffered frame of an image generated by preprocessing the second target image signal, and the second target image signal is the first image signal;
buffering, through the buffer unit, at least the first target image synchronously output by the image processing unit when it is learned that the first target image needs to be buffered, and synchronously outputting at least the buffered second target image to the image processing unit when it is learned that the buffered second target image needs to be output synchronously.
The image fusion device and method provided by the embodiments of the present application include a filter assembly, a single image sensor, a light supplement, and a processor. The image sensor generates and outputs a first image signal and a second image signal through multiple exposures, where the first image signal is generated according to a first preset exposure and the second image signal is generated according to a second preset exposure. The light supplement performs near-infrared supplementary light, which exists at least during part of the exposure time period of the first preset exposure and does not exist during the exposure time period of the second preset exposure. The filter assembly includes a first filter that passes light in the visible light band and part of the near-infrared light. The processor includes a buffer unit and an image processing unit: the buffer unit buffers the first target image signal currently output by the image sensor when it learns that the signal needs to be buffered, and synchronously outputs the buffered second target image signal to the image processing unit when synchronous output is needed, where, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, and vice versa; the image processing unit receives the first target image signal currently output by the image sensor and the second target image signal synchronously output by the buffer unit, and generates a color fusion image according to the two. In this solution the image acquisition structure is simple, which reduces cost; moreover, within any period of time, a first image signal containing near-infrared information and a second image signal containing visible light information can both be collected through the first preset exposure and the second preset exposure, and the subsequent fusion of the two signals yields a high-quality color fusion image.
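To illustrate the complementary-information idea behind the fusion step, a minimal per-pixel sketch: take brightness from the NIR-supplemented frame (well exposed in low light) and hue from the visible frame. This is an assumption-laden toy, not the fusion algorithm of the application; the luminance weights are the common Rec. 601 coefficients and the pixel format is (R, G, B) in 0-255.

```python
def fuse_pixel(nir_luma, visible_rgb):
    """Replace the visible pixel's brightness with the NIR luminance,
    keeping its hue. Purely illustrative of luminance/chroma fusion."""
    r, g, b = visible_rgb
    vis_luma = 0.299 * r + 0.587 * g + 0.114 * b   # Rec. 601 luminance
    if vis_luma == 0:
        return (nir_luma, nir_luma, nir_luma)       # no chroma information
    scale = nir_luma / vis_luma                     # brightness correction
    return tuple(min(255.0, c * scale) for c in (r, g, b))
```

Even this toy shows why the two preset exposures must come from the same time window: if the frames were captured far apart, the hue and brightness being combined would describe different scene states.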
Description of the drawings
The drawings herein are incorporated into and constitute a part of this specification; they illustrate embodiments consistent with the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Fig. 1 is a schematic structural diagram of a first image acquisition device provided by an embodiment of the present application;
Fig. 2 is a schematic structural diagram of an image fusion device provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of another image fusion device provided by an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an image processing unit provided by an embodiment of the present application;
Fig. 5 is a schematic diagram of an image buffering principle provided by an embodiment of the present application;
Fig. 6 is a schematic structural diagram of an image preprocessing unit provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of an image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 8 is a schematic diagram of another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 9A is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 9B is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 9C is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 10 is a schematic structural diagram of another image fusion device provided by an embodiment of the present application;
Fig. 11 is a schematic structural diagram of yet another image fusion device provided by an embodiment of the present application;
Fig. 12 is a schematic diagram of another image buffering principle provided by an embodiment of the present application;
Fig. 13 is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 14 is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 15A is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 15B is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 15C is a schematic diagram of yet another image buffering and synchronization principle provided by an embodiment of the present application;
Fig. 16 is a schematic diagram of an image fusion processing principle provided by an embodiment of the present application;
Fig. 17 is a schematic diagram of the relationship between the wavelength and the relative intensity of the near-infrared supplemental light performed by a first light supplement device provided by an embodiment of the present application;
Fig. 18 is a schematic diagram of the relationship between the wavelength and the pass rate of the light passing through a first filter provided by an embodiment of the present application;
Fig. 19 is a schematic structural diagram of a second image acquisition device provided by an embodiment of the present application;
Fig. 20 is a schematic diagram of an RGB sensor provided by an embodiment of the present application;
Fig. 21 is a schematic diagram of an RGBW sensor provided by an embodiment of the present application;
Fig. 22 is a schematic diagram of an RCCB sensor provided by an embodiment of the present application;
Fig. 23 is a schematic diagram of an RYYB sensor provided by an embodiment of the present application;
Fig. 24 is a schematic diagram of a sensing curve of an image sensor provided by an embodiment of the present application;
Fig. 25 is a schematic diagram of a rolling shutter exposure mode provided by an embodiment of the present application;
Fig. 26 is a schematic diagram of a first type of first preset exposure and second preset exposure provided by an embodiment of the present application;
Fig. 27 is a schematic diagram of a second type of first preset exposure and second preset exposure provided by an embodiment of the present application;
Fig. 28 is a schematic diagram of a third type of first preset exposure and second preset exposure provided by an embodiment of the present application;
Fig. 29 is a schematic diagram of a first rolling shutter exposure mode and near-infrared supplemental lighting provided by an embodiment of the present application;
Fig. 30 is a schematic diagram of a second rolling shutter exposure mode and near-infrared supplemental lighting provided by an embodiment of the present application;
Fig. 31 is a schematic diagram of a third rolling shutter exposure mode and near-infrared supplemental lighting provided by an embodiment of the present application;
Fig. 32 is a schematic structural diagram of a first joint noise reduction unit provided by an embodiment of the present application;
Fig. 33 is a schematic structural diagram of a second joint noise reduction unit provided by an embodiment of the present application;
Fig. 34 is a schematic structural diagram of a third joint noise reduction unit provided by an embodiment of the present application;
Fig. 35 is a schematic flowchart of an image fusion method provided by an embodiment of the present application;
Fig. 36 is a schematic flowchart of an image fusion method provided by an embodiment of the present application.
Description of reference signs:
01: image sensor; 02: light supplement; 03: filter assembly; 04: lens;
021: first light supplement device; 022: second light supplement device; 031: first filter; 032: second filter; 033: switching component.
The drawings above illustrate specific embodiments of the present disclosure, which are described in more detail below. These drawings and the written description are not intended to limit the scope of the disclosed concepts in any way, but rather to explain the concepts of the present disclosure to those skilled in the art by reference to specific embodiments.
Detailed description of embodiments
Exemplary embodiments are described in detail here, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of devices and methods consistent with some aspects of the present disclosure as detailed in the appended claims.
The terms "including" and "having" in the specification and claims of the present application and in the above drawings, as well as any variations thereof, are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device that comprises a series of steps or units is not limited to the listed steps or units, but optionally also includes steps or units that are not listed, or optionally also includes other steps or units inherent to such a process, method, product, or device.
Fig. 1 is a schematic structural diagram of an image acquisition device provided by an embodiment of the present application. Referring to Fig. 1, the image acquisition device includes an image sensor 01, a light supplement 02, and a filter assembly 03; the image sensor 01 is located on the light-exit side of the filter assembly 03. The image sensor 01 is configured to generate and output a first image signal and a second image signal through multiple exposures, where the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures. The light supplement 02 includes a first light supplement device 021, which is configured to perform near-infrared supplemental lighting, where near-infrared supplemental light is present during at least part of the exposure period of the first preset exposure and absent during the exposure period of the second preset exposure. The filter assembly 03 includes a first filter 031, which passes light in the visible band and part of the near-infrared light; the intensity of the near-infrared light passing through the first filter 031 when the first light supplement device 021 performs near-infrared supplemental lighting is higher than the intensity of the near-infrared light passing through the first filter 031 when the first light supplement device 021 does not. The near-infrared band passing through the first filter 031 may be a partial near-infrared band.
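As an illustration only (not part of the claimed embodiments), the exposure/supplemental-light timing described above can be sketched as follows; the alternating even/odd frame assignment and the function name are assumptions of this sketch:

```python
# Hypothetical sketch: even-numbered frames use the first preset exposure,
# during which near-infrared supplemental light is on; odd-numbered frames
# use the second preset exposure, during which it is off.

def nir_fill_light_on(frame_index: int) -> bool:
    """True when the frame uses the first preset exposure (NIR light on)."""
    return frame_index % 2 == 0

schedule = [("first" if nir_fill_light_on(i) else "second", nir_fill_light_on(i))
            for i in range(4)]
```

Under this assumed schedule, the first preset exposure always has supplemental light and the second preset exposure never does, matching the timing constraint stated above.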
In an embodiment of the present application, referring to Fig. 2, the image fusion device may include the image acquisition device, that is, the image sensor 01, the light supplement 02, and the filter assembly 03, as well as a processor; the processor includes a buffer unit and an image processing unit.
The buffer unit is configured to buffer the first target image signal currently output by the image sensor upon learning that it needs to be buffered, and, upon learning that a buffered second target image signal needs to be output synchronously, to output at least the buffered second target image signal synchronously to the image processing unit. If the first target image signal is a first image signal, the second target image signal is a buffered frame of the second image signal; or, if the first target image signal is a second image signal, the second target image signal is a buffered frame of the first image signal.
The image processing unit is configured to receive at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and to generate a color fusion image according to the first target image signal and the second target image signal.
Further, referring to Fig. 3, the processor may also include a synchronization unit.
The synchronization unit is configured to instruct the buffer unit to buffer the first target image signal when it determines that the first target image signal currently output by the image sensor needs to be buffered, and to instruct the buffer unit to output the second target image signal synchronously to the image processing unit when it determines, from the buffered image signals, that a second target image signal needs to be output synchronously.
Referring to Fig. 4, the image processing unit may include an image preprocessing unit and an image fusion unit.
The image preprocessing unit is configured to preprocess the first target image signal to generate a first target image, and to preprocess the second target image signal to generate a second target image.
The image fusion unit is configured to perform fusion processing on the first target image and the second target image to obtain a color fusion image.
If the first target image signal is a first image signal, the first target image generated after preprocessing is a black-and-white image, and when the second target image signal is a second image signal, the second target image generated after preprocessing is a color image.
If the first target image signal is a second image signal, the first target image generated after preprocessing is a color image, and when the second target image signal is a first image signal, the second target image generated after preprocessing is a black-and-white image.
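The specification does not fix a particular fusion algorithm at this point, so the following is only a minimal sketch of one plausible approach: take luminance from the black-and-white (near-infrared) target image, chrominance from the color target image, and blend. The blend weight `w`, the `fuse_images` name, and the BT.601 luma coefficients are assumptions of this sketch, not part of the described embodiments.

```python
import numpy as np

def fuse_images(nir_gray: np.ndarray, color_img: np.ndarray, w: float = 0.5) -> np.ndarray:
    """Blend the NIR image into the color image's luminance; chrominance
    ratios of the color image are preserved. Inputs are floats in [0, 1]."""
    # Luminance of the color image (ITU-R BT.601 weights, channels R, G, B).
    luma = (0.299 * color_img[..., 0]
            + 0.587 * color_img[..., 1]
            + 0.114 * color_img[..., 2])
    fused_luma = w * nir_gray + (1.0 - w) * luma
    # Rescale each channel so its luminance matches the fused luminance.
    scale = fused_luma / np.maximum(luma, 1e-6)
    return np.clip(color_img * scale[..., None], 0.0, 1.0)

gray = np.full((2, 2), 0.8)        # bright NIR image
color = np.full((2, 2, 3), 0.4)    # darker color image
fused = fuse_images(gray, color)
```

In this example the fused image keeps the gray color cast of the input but lifts its brightness toward the NIR image, illustrating the signal-to-noise benefit the fusion unit is intended to provide.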
Specifically, referring to Fig. 5, in this embodiment, the first target image signal output first by the image sensor is stored in the buffer; after the image sensor outputs the second target image signal, the buffered signal is output to the image processing unit, thereby achieving synchronization between the first target image signal and the second target image signal, which are then processed by the image preprocessing unit and the image fusion unit of the image processing unit.
Further, referring to Fig. 6, the image preprocessing unit includes a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit.
The first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain a preprocessed first target image.
The second preprocessing unit is configured to perform a second preprocessing operation on the second target image signal to obtain a second target image.
The joint noise reduction unit is configured to filter the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, which are used for fusion processing to obtain a color fusion image.
The first preprocessing operation includes at least one of image interpolation, gamma mapping, and color conversion; the second preprocessing operation includes at least one of white balance, image interpolation, and gamma mapping.
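As a rough illustration of two of the preprocessing operations listed above (white balance and gamma mapping; image interpolation and color conversion are omitted), the sketch below assumes floating-point images in [0, 1]; the gain and gamma values are purely illustrative placeholders:

```python
import numpy as np

def white_balance(img: np.ndarray, gains=(2.0, 1.0, 1.5)) -> np.ndarray:
    """Apply per-channel R/G/B gains; the gain values are illustrative."""
    return np.clip(img * np.asarray(gains), 0.0, 1.0)

def gamma_map(img: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Standard power-law gamma encoding of a linear image."""
    return np.power(np.clip(img, 0.0, 1.0), 1.0 / gamma)

raw = np.full((2, 2, 3), 0.25)         # toy linear raw image
out = gamma_map(white_balance(raw))    # second-preprocessing-style chain
```

The operations compose in the order the unit applies them; swapping in image interpolation (demosaicing) before white balance would complete the chain for a raw mosaic input.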
In an embodiment of the present application, referring to Fig. 7, the buffer unit may both store the first target image signal and fetch the second target image signal within one frame period. The specific scheme is as follows:
The synchronization unit is configured to determine that the first target image signal of every frame needs to be buffered and that a second target image signal needs to be output synchronously, the second target image signal being the image signal most recently buffered by the buffer unit.
If the first target image signal is a second image signal, the buffer unit buffers the current second image signal and outputs the previously buffered first image signal, determined as the second target image signal, to the image preprocessing unit.
If the first target image signal is a first image signal, the buffer unit buffers the current first image signal and outputs the previously buffered second image signal, determined as the second target image signal, to the image preprocessing unit.
The image sensor may output the first image signal and the second image signal alternately, or may output a second image signal after every few first image signals, and so on; the embodiments of the present application do not limit this.
Fig. 7 is described taking the case where the image sensor alternately outputs the first image signal and the second image signal as an example. In Fig. 7, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to store the second image signal M-2 and to output the previously buffered first image signal M-3 from the buffer unit; at this time, the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-3 to obtain a color fusion image. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1 and to output the previously buffered second image signal M-2 from the buffer unit. When the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to store the second image signal M and to output the previously buffered first image signal M-1 from the buffer unit, and so on.
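The one-frame-period store/fetch behaviour of Fig. 7 can be simulated with a one-slot cache; the frame labels below are illustrative, not taken from the figures:

```python
def simulate_sync(frames):
    """Every incoming frame is cached; the previously cached frame is
    output alongside it, so each frame (after the first) is paired with
    its predecessor, mirroring the Fig. 7 scheme."""
    cache = None
    pairs = []
    for frame in frames:
        if cache is not None:
            pairs.append((frame, cache))  # current output + previously cached
        cache = frame
    return pairs

pairs = simulate_sync(["M-3 (first)", "M-2 (second)", "M-1 (first)", "M (second)"])
```

Each pair contains one first image signal and one second image signal, so the image processing unit can fuse every frame after the first, which is why this scheme keeps the fusion frame rate equal to the sensor frame rate.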
In another embodiment of the present application, referring to Fig. 8, Fig. 9A, Fig. 9B, and Fig. 9C, the buffer unit may store the first target image signal and fetch the second target image signal in different frame periods. The specific scheme is as follows:
The synchronization unit is configured to determine that the first target image signal needs to be buffered when it is a first image signal, and, when the first target image signal is a second image signal, to determine that a second target image signal needs to be output synchronously, the second target image signal being the most recently buffered first image signal among the image signals buffered by the buffer unit. If the first target image signal is a second image signal, the buffer unit outputs the most recently buffered first image signal, determined as the second target image signal, to the image preprocessing unit; if the first target image signal is a first image signal, the buffer unit buffers the first image signal. Alternatively:
The synchronization unit is configured to determine that the first target image signal needs to be buffered when it is a second image signal, and, when the first target image signal is a first image signal, to determine that a second target image signal needs to be output synchronously, the second target image signal being the most recently buffered second image signal among the second image signals buffered by the buffer unit. If the first target image signal is a first image signal, the buffer unit outputs the most recently buffered second image signal, determined as the second target image signal, to the image preprocessing unit; if the first target image signal is a second image signal, the buffer unit buffers the second image signal.
Fig. 8 is described taking the case where the image sensor alternately outputs the first image signal and the second image signal as an example. In Fig. 8, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-3; at this time, the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-3 to obtain a color fusion image. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1, and the image processing unit performs no processing. When the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-1; at this time, the image processing unit performs fusion processing on the second image signal M and the first image signal M-1 to obtain a color fusion image, and so on.
Fig. 8 is described taking the case where only the first image signal is buffered as an example; the case where only the second image signal is buffered is similar and is not repeated here.
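The Fig. 8 scheme (buffer only first image signals; fuse only when a second image signal arrives) can be simulated as follows; the tuple representation and labels are assumptions of this sketch:

```python
def simulate_sync_first_only(frames):
    """Buffer only 'first' image signals; when a 'second' image signal
    arrives, pair it with the most recently buffered 'first' signal.
    Frames that merely fill the buffer produce no fusion pair."""
    cached_first = None
    pairs = []
    for name, kind in frames:
        if kind == "first":
            cached_first = name            # buffered; no processing this frame
        elif cached_first is not None:     # kind == "second"
            pairs.append((name, cached_first))
    return pairs

pairs = simulate_sync_first_only(
    [("M-3", "first"), ("M-2", "second"), ("M-1", "first"), ("M", "second")])
```

Unlike the Fig. 7 scheme, fusion happens only on frames carrying a second image signal, so with alternating output the fusion frame rate is half the sensor frame rate.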
Referring to Fig. 9A, the image sensor outputs one second image signal after every two first image signals, and the buffer unit buffers only the second image signal. In Fig. 9A, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to buffer the second image signal M-2, and the image processing unit performs no processing. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to output the most recently buffered second image signal M-2, and the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-1 to obtain a color fusion image. When the image sensor outputs the first image signal M, the synchronization unit instructs the buffer unit to output the most recently buffered second image signal M-2 again; at this time, the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M to obtain a color fusion image, and so on.
Further, in other embodiments of the present application, the first image signal need not be buffered every frame; it may instead be stored after every few first image signals. Referring to Fig. 9B, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-5; at this time, the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-5 to obtain a color fusion image. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1, and the image processing unit performs no processing. When the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-1; at this time, the image processing unit performs fusion processing on the second image signal M and the first image signal M-1 to obtain a color fusion image. When the image sensor outputs the first image signal M+1, neither the buffer unit nor the image processing unit performs processing. When the image sensor outputs the second image signal M+2, the synchronization unit instructs the buffer unit to output the most recently buffered first image signal M-1; at this time, the image processing unit performs fusion processing on the second image signal M+2 and the first image signal M-1 to obtain a color fusion image, and so on.
Further, in other embodiments of the present application, the first target image signal and the second target image signal may be output synchronously within one frame period. The specific scheme is as follows:
Referring to Fig. 9C, the synchronization unit is configured to determine that the first target image signal of every frame needs to be buffered, and that the most recently buffered second target image signal and the most recently buffered first target image signal need to be output synchronously.
If the first target image signal is a second image signal, the buffer unit buffers the current second image signal and outputs the most recently buffered first image signal and the most recently buffered second image signal.
If the first target image signal is a first image signal, the buffer unit buffers the current first image signal and outputs the most recently buffered first image signal and the most recently buffered second image signal.
Fig. 9C is described taking the case where the image sensor alternately outputs the first image signal and the second image signal as an example. In Fig. 9C, when the image sensor outputs the second image signal M-2, the synchronization unit instructs the buffer unit to store the second image signal M-2, and the image processing unit performs no processing. When the image sensor outputs the first image signal M-1, the synchronization unit instructs the buffer unit to store the first image signal M-1 and to output the most recently buffered first image signal M-3 and second image signal M-2 from the buffer unit; at this time, the image processing unit performs fusion processing on the second image signal M-2 and the first image signal M-1 to obtain a color fusion image. When the image sensor outputs the second image signal M, the synchronization unit instructs the buffer unit to store the second image signal M, and the image processing unit performs no processing. When the image sensor outputs the first image signal M+1, the synchronization unit instructs the buffer unit to store the first image signal M+1 and to output the most recently buffered second image signal M and first image signal M-1 from the buffer unit, and so on.
In the embodiments of the present application, multiple exposures of the image sensor combined with supplemental lighting from the light supplement generate multiple images with different spectral ranges, extending the image acquisition capability of a single sensor and improving image quality in different scenarios. The processor has an image buffering function that achieves synchronization between images with different exposure periods, and an image fusion function that generates fusion images with an improved signal-to-noise ratio.
In an embodiment of the present application, the joint noise reduction unit is specifically configured to:
perform joint filtering on the first target image and the second target image, respectively, according to the correlation between the first target image and the second target image, to obtain the noise-reduced first target image and second target image.
In an embodiment of the present application, the joint noise reduction unit includes a temporal noise reduction unit or a spatial noise reduction unit.
The temporal noise reduction unit is configured to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, to perform temporal filtering on the first target image according to the motion estimation result to obtain the noise-reduced first target image, and to perform temporal filtering on the second target image according to the motion estimation result to obtain the noise-reduced second target image.
The spatial noise reduction unit is configured to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, to perform spatial filtering on the first target image according to the edge estimation result to obtain the noise-reduced first target image, and to perform spatial filtering on the second target image according to the edge estimation result to obtain the noise-reduced second target image.
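As a minimal illustration of motion-guided temporal filtering and edge-guided spatial filtering: the actual motion and edge estimators are not specified here, so both guidance maps are taken as given per-pixel values in [0, 1], and the blend weightings are assumptions of this sketch:

```python
import numpy as np

def temporal_filter(cur, prev, motion):
    """Blend the current frame with the previous one; where the motion
    estimate is high, trust the current frame more to avoid ghosting."""
    alpha = 0.5 * (1.0 - motion)          # illustrative blend weight
    return (1.0 - alpha) * cur + alpha * prev

def spatial_filter(img, edge):
    """3x3 box blur suppressed near edges: edge=1 keeps the pixel
    unchanged, edge=0 fully blurs it."""
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    blur = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    return edge * img + (1.0 - edge) * blur

flat = np.full((3, 3), 0.5)
denoised = spatial_filter(flat, np.zeros((3, 3)))  # flat region, no edges
```

Sharing one motion (or edge) estimate across both target images, as the unit above does, keeps the two filtered images consistent so the later fusion step does not mix differently-smoothed content.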
In another embodiment of the present application, the joint noise reduction unit includes both a temporal noise reduction unit and a spatial noise reduction unit.
The temporal noise reduction unit is configured to perform motion estimation based on the first target image and the second target image to obtain a motion estimation result, to temporally filter the first target image according to that result to obtain a first temporally noise-reduced image, and to temporally filter the second target image according to that result to obtain a second temporally noise-reduced image.
The spatial noise reduction unit is configured to perform edge estimation based on the first temporally noise-reduced image and the second temporally noise-reduced image to obtain an edge estimation result, to spatially filter the first temporally noise-reduced image according to that result to obtain the noise-reduced first target image, and to spatially filter the second temporally noise-reduced image according to that result to obtain the noise-reduced second target image.
Alternatively,
the spatial noise reduction unit is configured to perform edge estimation based on the first target image and the second target image to obtain an edge estimation result, to spatially filter the first target image according to that result to obtain a first spatially noise-reduced image, and to spatially filter the second target image according to that result to obtain a second spatially noise-reduced image;
the temporal noise reduction unit is then configured to perform motion estimation based on the first spatially noise-reduced image and the second spatially noise-reduced image to obtain a motion estimation result, to temporally filter the first spatially noise-reduced image according to that result to obtain the noise-reduced first target image, and to temporally filter the second spatially noise-reduced image according to that result to obtain the noise-reduced second target image.
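The two orderings above (temporal filtering first, then spatial, or the reverse) can be sketched as follows, on 1-D signals for brevity. The joint estimator and the guided filter below are simplified stand-ins and all function names are hypothetical; the patent does not fix their exact form. The structural point it does fix, and which the sketch preserves, is that one estimate computed jointly from both images drives the filtering of both images in each stage.

```python
# Hypothetical minimal sketch of two-stage joint noise reduction.

def joint_estimate(img1, img2):
    # Joint per-pixel statistic from BOTH images (stands in for the
    # motion-estimation or edge-estimation result).
    return [abs(a - b) for a, b in zip(img1, img2)]

def guided_filter(img, estimate, strength=0.5):
    # Smooth each pixel toward its local mean, backing off where the joint
    # estimate is large (i.e. near suspected motion or edges).
    out = []
    for k in range(len(img)):
        lo, hi = max(0, k - 1), min(len(img), k + 2)
        local_mean = sum(img[lo:hi]) / (hi - lo)
        w = strength / (1.0 + estimate[k])
        out.append((1.0 - w) * img[k] + w * local_mean)
    return out

def joint_denoise(img1, img2):
    # Stage 1 ("temporal" in the first ordering): one joint estimate,
    # both images filtered with it.
    est = joint_estimate(img1, img2)
    t1, t2 = guided_filter(img1, est), guided_filter(img2, est)
    # Stage 2 ("spatial"): re-estimate from the stage-1 outputs, filter again.
    est2 = joint_estimate(t1, t2)
    return guided_filter(t1, est2), guided_filter(t2, est2)
```

Swapping the two stages yields the alternative ordering described in the embodiment above.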
In another embodiment of the present application, referring to FIG. 10, the image fusion device may include an image acquisition apparatus, comprising an image sensor 01, a light supplement device 02, and a filter assembly 03, together with a processor comprising a buffer unit and an image processing unit.
The image processing unit is configured to receive the first target image signal currently output by the image sensor and preprocess it to obtain the first target image; to synchronously output at least the first target image to the buffer unit for buffering when the first target image needs to be buffered; to receive at least the second target image synchronously output by the buffer unit when the buffer unit needs to output a buffered second target image; and to generate a color fusion image from the first target image and the second target image. If the first target image signal is the first image signal, the first target image is the image generated by preprocessing the first image signal, and the second target image is a buffered frame generated by preprocessing the second target image signal, the second target image signal being the second image signal. If the first target image signal is the second image signal, the first target image is the image generated by preprocessing the second image signal, and the second target image is a buffered frame generated by preprocessing the second target image signal, the second target image signal being the first image signal.
The buffer unit is configured to buffer at least the first target image synchronously output by the image processing unit upon learning that the first target image needs to be buffered, and to synchronously output at least the buffered second target image to the image processing unit upon learning that the buffered second target image needs to be output synchronously.
Referring to FIG. 11, the processor further includes a synchronization unit. The synchronization unit is configured to instruct the buffer unit to buffer the first target image upon determining that the first target image generated by the preprocessing of the image processing unit needs to be buffered, and, upon determining from the buffered images that the second target image needs to be output synchronously, to instruct the buffer unit to synchronously output the second target image to the image processing unit.
Referring to FIG. 4, the image processing unit includes an image preprocessing unit and an image fusion unit.
The image preprocessing unit is configured to preprocess the first target image signal to generate the first target image, and to preprocess the second target image signal to generate the second target image.
The image fusion unit is configured to fuse the first target image and the second target image to obtain a color fusion image.
If the first target image signal is the first image signal, the first target image generated by preprocessing is a black-and-white image; when the second target image signal is the second image signal, the second target image generated by preprocessing is a color image.
If the first target image signal is the second image signal, the first target image generated by preprocessing is a color image; when the second target image signal is the first image signal, the second target image generated by preprocessing is a black-and-white image.
Specifically, referring to FIG. 12, in this embodiment the first target image signal, output first by the image sensor, is input to the image processing unit. After the image preprocessing unit preprocesses it, the buffer unit stores the resulting first target image. Once the image sensor outputs the second target image signal, that signal is sent to the image processing unit; when the image preprocessing unit outputs the preprocessed second target image to the image fusion unit, the first target image stored in the buffer unit is output to the image fusion unit at the same time. This achieves synchronization between the first target image and the second target image, after which the image fusion unit performs fusion processing to obtain the color fusion image.
Referring to FIG. 6, the image preprocessing unit includes a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit.
The first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain the preprocessed first target image.
The second preprocessing unit is configured to perform a second preprocessing operation on the second target image signal to obtain the second target image.
The joint noise reduction unit is configured to filter the first target image and the second target image to obtain the noise-reduced first target image and the noise-reduced second target image, which are then fused to obtain the color fusion image.
The first preprocessing operation includes at least one of image interpolation, gamma mapping, and color conversion; the second preprocessing operation includes at least one of white balance, image interpolation, and gamma mapping.
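The listed operations can be illustrated per pixel as follows. The white-balance gains, the gamma value, and the RGB-to-luminance coefficients are assumed example parameters for the sketch, not values from the patent.

```python
# Illustrative per-pixel versions of the named preprocessing operations.
# All numeric parameters are assumptions.

def white_balance(rgb, gains=(1.8, 1.0, 1.6)):
    # Scale each channel by a per-channel gain (example gains).
    return tuple(c * g for c, g in zip(rgb, gains))

def gamma_map(v, gamma=1 / 2.2, vmax=255.0):
    # Map a linear value through a power-law gamma curve.
    return vmax * (min(v, vmax) / vmax) ** gamma

def color_convert(rgb):
    # RGB -> luminance (BT.601-style coefficients), e.g. when the first
    # target image is to be output as a black-and-white image.
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b
```

Image interpolation (demosaicing) is omitted here because it operates on a pixel neighborhood rather than a single pixel.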
In an embodiment of the present application, referring to FIG. 13, the buffer unit can store the first target image and retrieve the second target image within the same frame period. The specific scheme is as follows:
The synchronization unit determines that every frame of the first target image needs to be buffered and that the second target image needs to be output synchronously, the second target image being the image most recently buffered by the buffer unit.
If the first target image is an image generated by preprocessing the second image signal, the buffer unit buffers that image and outputs the previously buffered image generated by preprocessing the first image signal as the second target image to the image preprocessing unit.
If the first target image is an image generated by preprocessing the first image signal, the buffer unit buffers that image and outputs the previously buffered image generated by preprocessing the second image signal as the second target image to the image preprocessing unit.
The image sensor may output the first image signal and the second image signal alternately, or may output one second image signal after every few first image signals; the embodiments of the present application do not limit the output pattern.
FIG. 13 takes alternating output of the first image signal and the second image signal as an example. When the image preprocessing unit outputs color image M-2, the synchronization unit instructs the buffer unit to store color image M-2 and to output the previously buffered black-and-white image M-3; the image fusion unit then fuses color image M-2 with black-and-white image M-3 to obtain a color fusion image. When the image preprocessing unit outputs black-and-white image M-1, the synchronization unit instructs the buffer unit to store black-and-white image M-1 and to output the previously buffered color image M-2. When the image preprocessing unit outputs color image M, the synchronization unit instructs the buffer unit to store color image M and to output the previously buffered black-and-white image M-1, and so on.
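The FIG. 13 scheme can be sketched as a one-slot buffer that is both read and written in every frame period, so the fusion unit always pairs the current frame with its immediate predecessor. The frame labels and function name below are illustrative, not from the patent.

```python
# Sketch of the per-frame-period buffering scheme (alternating sensor output).

def run_sync(frames):
    cache = None   # the buffer unit holds exactly one image at a time
    fused = []
    for frame in frames:
        if cache is not None:
            fused.append((frame, cache))  # fuse current frame with previous
        cache = frame                     # then buffer the current frame
    return fused

pairs = run_sync(["mono M-3", "color M-2", "mono M-1", "color M"])
# With alternating input, every fused pair combines one color image and one
# black-and-white image from adjacent frame periods.
```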
In another embodiment of the present application, referring to FIG. 14, FIG. 15A, FIG. 15B, and FIG. 15C, the buffer unit can store the first target image and retrieve the second target image in different frame periods. The specific schemes are as follows:
The synchronization unit determines that the first target image needs to be buffered when it is an image preprocessed from the first image signal, and that the second target image needs to be output synchronously when the first target image is an image preprocessed from the second image signal, the second target image being the most recently buffered image preprocessed from the first image signal. In this case, if the first target image is an image preprocessed from the second image signal, the buffer unit outputs the most recently buffered image preprocessed from the first image signal as the second target image to the image preprocessing unit; if the first target image is an image preprocessed from the first image signal, the buffer unit buffers it. Alternatively,
the synchronization unit determines that the first target image needs to be buffered when it is an image preprocessed from the second image signal, and that the second target image needs to be output synchronously when the first target image is an image preprocessed from the first image signal, the second target image being the most recently buffered image preprocessed from the second image signal. In this case, if the first target image is an image preprocessed from the first image signal, the buffer unit outputs the most recently buffered image preprocessed from the second image signal as the second target image to the image preprocessing unit; if the first target image is an image preprocessed from the second image signal, the buffer unit buffers it.
FIG. 14 takes alternating output of the first image signal and the second image signal as an example. When the image preprocessing unit outputs color image M-2, the synchronization unit instructs the buffer unit to output the most recently buffered black-and-white image M-3, and the image fusion unit fuses color image M-2 with black-and-white image M-3 to obtain a color fusion image. When the image preprocessing unit outputs black-and-white image M-1, the synchronization unit instructs the buffer unit to store black-and-white image M-1, and the image fusion unit performs no processing. When the image preprocessing unit outputs color image M, the synchronization unit instructs the buffer unit to output the most recently buffered black-and-white image M-1, and the image fusion unit fuses color image M with black-and-white image M-1 to obtain a color fusion image, and so on.
FIG. 14 illustrates the case where only black-and-white images are buffered; buffering only color images works similarly and is not described again here.
Referring to FIG. 15A, the image sensor outputs one second image signal after every two first image signals, and the buffer unit buffers only the color image preprocessed from the second image signal. In FIG. 15A, when the image preprocessing unit outputs color image M-2, the synchronization unit instructs the buffer unit to buffer color image M-2, and the image fusion unit performs no processing. When the image preprocessing unit outputs black-and-white image M-1, the synchronization unit instructs the buffer unit to output the most recently buffered color image M-2, and the image fusion unit fuses color image M-2 with black-and-white image M-1 to obtain a color fusion image. When the image preprocessing unit outputs black-and-white image M, the synchronization unit again instructs the buffer unit to output the most recently buffered color image M-2, and the image fusion unit fuses color image M-2 with black-and-white image M to obtain a color fusion image, and so on.
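The FIG. 15A behavior can be sketched as a one-slot buffer that is written only on color frames and read on black-and-white frames, so one cached color image may be fused with several subsequent black-and-white images. The frame labels, naming convention, and function name are illustrative, not from the patent.

```python
# Sketch of the interval scheme: buffer only color frames, fuse on mono frames.

def run_sync_color_cached(frames):
    cache = None
    fused = []
    for frame in frames:
        if frame.startswith("color"):
            cache = frame                 # only color frames are buffered
        elif cache is not None:
            fused.append((cache, frame))  # fuse latest color with current mono
    return fused

pairs = run_sync_color_cached(["color M-2", "mono M-1", "mono M", "color M+1"])
# color M-2 is reused for both mono M-1 and mono M.
```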
Further, in other embodiments of the present application, black-and-white images need not be buffered in every frame; they may instead be stored at intervals of several black-and-white images. Referring to FIG. 15B, when the image preprocessing unit outputs color image M-2, the synchronization unit instructs the buffer unit to output the most recently buffered black-and-white image M-5, and the image processing unit fuses color image M-2 with black-and-white image M-5 to obtain a color fusion image. When the image preprocessing unit outputs black-and-white image M-1, the synchronization unit instructs the buffer unit to store black-and-white image M-1, and the image processing unit performs no processing. When the image preprocessing unit outputs color image M, the synchronization unit instructs the buffer unit to output the most recently buffered black-and-white image M-1, and the image processing unit fuses color image M with black-and-white image M-1 to obtain a color fusion image. When the image preprocessing unit outputs black-and-white image M+1, neither the buffer unit nor the image processing unit performs any processing. When the image preprocessing unit outputs color image M+2, the synchronization unit instructs the buffer unit to output the most recently buffered black-and-white image M-1, and the image processing unit fuses color image M+2 with black-and-white image M-1 to obtain a color fusion image, and so on.
Further, in other embodiments of the present application, the first target image and the second target image may be output synchronously within one frame period. The specific scheme is as follows:
Referring to FIG. 15C, the synchronization unit determines that every frame of the first target image needs to be buffered and that the most recently buffered second target image and the most recently buffered first target image need to be output synchronously.
If the first target image is an image generated by preprocessing the second image signal, the buffer unit buffers it and outputs both the most recently buffered image generated by preprocessing the first image signal and the most recently buffered image generated by preprocessing the second image signal.
If the first target image is an image generated by preprocessing the first image signal, the buffer unit buffers it and outputs both the most recently buffered image generated by preprocessing the second image signal and the most recently buffered image generated by preprocessing the first image signal.
Referring to FIG. 15C, when the image preprocessing unit outputs color image M-2, the synchronization unit instructs the buffer unit to store color image M-2, and the image processing unit performs no processing. When the image preprocessing unit outputs black-and-white image M-1, the synchronization unit instructs the buffer unit to store black-and-white image M-1 and to output the most recently buffered black-and-white image M-3 and color image M-2 from the buffer unit; the image processing unit then fuses color image M-2 with black-and-white image M-1 to obtain a color fusion image. When the image preprocessing unit outputs color image M, the synchronization unit instructs the buffer unit to store color image M, and the image processing unit performs no processing. When the image preprocessing unit outputs black-and-white image M+1, the synchronization unit instructs the buffer unit to store black-and-white image M+1 and to output the most recently buffered color image M and the most recently buffered black-and-white image M-1 from the buffer unit, and so on.
In an embodiment of the present application, the joint noise reduction unit is specifically configured to:
perform joint filtering on the first target image and the second target image respectively, according to the correlation between the first target image and the second target image, to obtain the noise-reduced first target image and the noise-reduced second target image.
In an embodiment of the present application, the joint noise reduction unit includes a temporal noise reduction unit or a spatial noise reduction unit.
The temporal noise reduction unit is configured to perform motion estimation based on the first target image and the second target image to obtain a motion estimation result, to temporally filter the first target image according to the motion estimation result to obtain the noise-reduced first target image, and to temporally filter the second target image according to the motion estimation result to obtain the noise-reduced second target image.
The spatial noise reduction unit is configured to perform edge estimation based on the first target image and the second target image to obtain an edge estimation result, to spatially filter the first target image according to the edge estimation result to obtain the noise-reduced first target image, and to spatially filter the second target image according to the edge estimation result to obtain the noise-reduced second target image.
In another embodiment of the present application, the joint noise reduction unit includes both a temporal noise reduction unit and a spatial noise reduction unit.
The temporal noise reduction unit is configured to perform motion estimation based on the first target image and the second target image to obtain a motion estimation result, to temporally filter the first target image according to that result to obtain a first temporally noise-reduced image, and to temporally filter the second target image according to that result to obtain a second temporally noise-reduced image.
The spatial noise reduction unit is configured to perform edge estimation based on the first temporally noise-reduced image and the second temporally noise-reduced image to obtain an edge estimation result, to spatially filter the first temporally noise-reduced image according to that result to obtain the noise-reduced first target image, and to spatially filter the second temporally noise-reduced image according to that result to obtain the noise-reduced second target image.
Alternatively,
the spatial noise reduction unit is configured to perform edge estimation based on the first target image and the second target image to obtain an edge estimation result, to spatially filter the first target image according to that result to obtain a first spatially noise-reduced image, and to spatially filter the second target image according to that result to obtain a second spatially noise-reduced image;
the temporal noise reduction unit is then configured to perform motion estimation based on the first spatially noise-reduced image and the second spatially noise-reduced image to obtain a motion estimation result, to temporally filter the first spatially noise-reduced image according to that result to obtain the noise-reduced first target image, and to temporally filter the second spatially noise-reduced image according to that result to obtain the noise-reduced second target image.
Further, in an embodiment of the present application, referring to FIG. 16, the image fusion unit includes a color extraction unit, a luminance extraction unit, and a fusion processing unit connected to both the color extraction unit and the luminance extraction unit.
The color extraction unit is configured to extract the color signal of the image preprocessed from the second image signal.
The luminance extraction unit is configured to extract the luminance signal of the image preprocessed from the second image signal.
The fusion processing unit is configured to fuse the image preprocessed from the first image signal with the color signal and the luminance signal of the image preprocessed from the second image signal to obtain the color fusion image.
In FIG. 16, the image preprocessed from the second image signal is a color image and the image preprocessed from the first image signal is a black-and-white image, by way of example.
Further, the fusion processing unit is specifically configured to:
perform weighted fusion on the luminance signal of the image preprocessed from the second image signal and the image preprocessed from the first image signal to obtain a fused luminance image; and
fuse the fused luminance image with the color signal of the image preprocessed from the second image signal to obtain the color fusion image.
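The two steps above can be sketched per pixel as follows. The YUV-style luminance/chroma split (BT.601-like luma coefficients, simple chroma differences) and the fixed blend weight `w` are assumptions for the sketch; the patent specifies only that the luminance is fused by weighting and then recombined with the color signal.

```python
# Hypothetical per-pixel sketch of the two fusion steps.

def fuse_pixel(rgb, nir, w=0.5):
    r, g, b = rgb
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance of the color image
    u, v = b - y, r - y                     # chroma (the color signal)
    y_f = (1.0 - w) * y + w * nir           # step 1: weighted luminance fusion
    # step 2: recombine the fused luminance with the original chroma
    b_f, r_f = y_f + u, y_f + v
    g_f = (y_f - 0.299 * r_f - 0.114 * b_f) / 0.587
    return (r_f, g_f, b_f)
```

For a gray input pixel the chroma terms vanish, so the fused pixel reduces to the blended luminance itself.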
In the above scheme, the image acquisition structure is simple, which reduces cost, and the first image signal containing near-infrared light information and the second image signal containing visible light information can both be acquired, through the first preset exposure and the second preset exposure, within any period of time. Fusion processing is subsequently performed based on the first image signal and the second image signal, yielding a color fusion image of high quality. Moreover, the image processing unit has an image buffering function, enabling synchronization between images captured in different exposure periods.
In an embodiment of the present application, noise reduction is performed on the color image according to the following formula (a), obtaining the noise-reduced color image:

img_vis'(x,y) = [ Σ_{(i,j)∈S} weight(x+i,y+j) · img_vis(x+i,y+j) ] / [ Σ_{(i,j)∈S} weight(x+i,y+j) ]    (a)

Noise reduction is performed on the black-and-white image according to the following formula (b), obtaining the noise-reduced black-and-white image:

img_nir'(x,y) = [ Σ_{(i,j)∈S} weight(x+i,y+j) · img_nir(x+i,y+j) ] / [ Σ_{(i,j)∈S} weight(x+i,y+j) ]    (b)

Here x, y are the coordinates of any current pixel; img_vis(x+i,y+j) is the value of a pixel in the neighborhood of the current pixel in the color image; img_vis'(x,y) is the noise-reduced value of the current pixel in the color image; img_nir(x+i,y+j) is the value of a pixel in the neighborhood of the current pixel in the black-and-white image; img_nir'(x,y) is the noise-reduced value of the current pixel in the black-and-white image; and S is the neighborhood of the current pixel. The shared weight is weight(x+i,y+j) = weight_vis(x+i,y+j) + weight_nir(x+i,y+j), where weight_vis(x+i,y+j) is the weight contributed by the color image and weight_nir(x+i,y+j) is the weight contributed by the black-and-white image. Both weight_vis(x+i,y+j) and weight_nir(x+i,y+j) can be computed as

weight = exp( −(f_xy − f_ij)² / (2·δ₁²) ) · exp( −(i² + j²) / (2·δ₂²) )

where f_xy is the value of the current pixel in the respective image, f_ij is the value of the neighborhood pixel at offset (i, j), i and j are the neighborhood pixel offsets, and δ₁, δ₂ are standard deviations of Gaussian distributions.
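Formulas (a) and (b) describe a joint bilateral filter: both images are averaged with one shared weight map, so structure visible in either image is preserved in both. A direct, unoptimized transcription follows; the border handling by clamping, and the assignment of δ₁ to the value-similarity term and δ₂ to the spatial term, are assumptions of this sketch.

```python
import math

def bilateral_weight(f_xy, f_ij, i, j, d1, d2):
    # Value-similarity (range) term times spatial-distance term.
    range_w = math.exp(-((f_xy - f_ij) ** 2) / (2.0 * d1 ** 2))
    space_w = math.exp(-(i ** 2 + j ** 2) / (2.0 * d2 ** 2))
    return range_w * space_w

def joint_bilateral_denoise(img_vis, img_nir, radius=1, d1=10.0, d2=1.0):
    h, w = len(img_vis), len(img_vis[0])
    out_vis = [[0.0] * w for _ in range(h)]
    out_nir = [[0.0] * w for _ in range(h)]
    for x in range(h):
        for y in range(w):
            num_vis = num_nir = den = 0.0
            for i in range(-radius, radius + 1):
                for j in range(-radius, radius + 1):
                    xi = min(max(x + i, 0), h - 1)   # clamp at the border
                    yj = min(max(y + j, 0), w - 1)
                    # weight = weight_vis + weight_nir, shared by both images
                    wgt = (bilateral_weight(img_vis[x][y], img_vis[xi][yj], i, j, d1, d2)
                           + bilateral_weight(img_nir[x][y], img_nir[xi][yj], i, j, d1, d2))
                    num_vis += wgt * img_vis[xi][yj]
                    num_nir += wgt * img_nir[xi][yj]
                    den += wgt
            out_vis[x][y] = num_vis / den   # formula (a)
            out_nir[x][y] = num_nir / den   # formula (b)
    return out_vis, out_nir
```

Because the denominator is the sum of the same weights used in the numerator, a flat region passes through unchanged while noise in either image is averaged down.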
在本申请实施例中,参见图1,图像采集装置还可以包括镜头04,此时,滤光组件03可以位于镜头04和图像传感器01之间,且图像传感器01位于滤光组件03的出光侧。或者,镜头04位于滤光组件03与图像传感器01之间,且图像传感器01位于镜头04的出光侧。作为一种示例,第一滤光片031可以是滤光薄膜,这样,当滤光组件03位于镜头04和图像传感器01之间时,第一滤光片031可以贴在镜头04的出光侧的表面,或者,当镜头04位于滤光组件03与图像传感器01之间时,第一滤光片031可以贴在镜头04的入光侧的表面。In the embodiment of the present application, referring to FIG. 1, the image acquisition device may further include a lens 04. In this case, the filter assembly 03 may be located between the lens 04 and the image sensor 01, with the image sensor 01 located on the light exit side of the filter assembly 03. Alternatively, the lens 04 is located between the filter assembly 03 and the image sensor 01, with the image sensor 01 located on the light exit side of the lens 04. As an example, the first filter 031 may be a filter film; in this way, when the filter assembly 03 is located between the lens 04 and the image sensor 01, the first filter 031 may be attached to the surface on the light exit side of the lens 04, or, when the lens 04 is located between the filter assembly 03 and the image sensor 01, the first filter 031 may be attached to the surface on the light incident side of the lens 04.
作为一种示例,图像采集装置可以是摄像机、抓拍机、人脸识别相机、读码相机、车载相机、全景细节相机等。As an example, the image acquisition device may be a video camera, a snapshot camera, a face recognition camera, a code reading camera, a vehicle-mounted camera, a panoramic detail camera, etc.
作为另一种示例,补光器02可以位于图像采集装置内,也可以位于图像采集装置的外部。补光器02可以为图像采集装置的一部分,也可以为独立于图像采集装置的一个器件。当补光器02位于图像采集装置的外部时,补光器02可以与图像采集装置进行通信连接,可以保证图像采集设备中的图像传感器01的曝光时序与补光器02包括的第一补光装置021的近红外补光时序存在一定的关系,如至少在第一预设曝光的部分曝光时间段内存在近红外补光,在第二预设曝光的曝光时间段内不存在近红外补光。As another example, the light supplement 02 may be located inside the image acquisition device or outside it. The light supplement 02 may be a part of the image acquisition device, or a device independent of the image acquisition device. When the light supplement 02 is located outside the image acquisition device, it can be communicatively connected to the image acquisition device, which ensures a certain relationship between the exposure timing of the image sensor 01 in the image acquisition device and the near-infrared fill-light timing of the first light supplement device 021 included in the light supplement 02; for example, near-infrared fill light is present during at least part of the exposure time period of the first preset exposure, and absent during the exposure time period of the second preset exposure.
另外,第一补光装置021为可以发出近红外光的装置,例如近红外补光灯等,第一补光装置021可以以频闪方式进行近红外补光,也可以以类似频闪的其他方式进行近红外补光,本申请实施例对此不做限定。在一些示例中,当第一补光装置021以频闪方式进行近红外补光时,可以通过手动方式来控制第一补光装置021以频闪方式进行近红外补光,也可以通过软件程序或特定设备来控制第一补光装置021以频闪方式进行近红外补光,本申请实施例对此不做限定。其中,第一补光装置021进行近红外补光的时间段可以与第一预设曝光的曝光时间段重合,也可以大于第一预设曝光的曝光时间段或者小于第一预设曝光的曝光时间段,只要在第一预设曝光的整个曝光时间段或者部分曝光时间段内存在近红外补光,而在第二预设曝光的曝光时间段内不存在近红外补光即可。In addition, the first light supplement device 021 is a device that can emit near-infrared light, such as a near-infrared fill lamp. The first light supplement device 021 may perform near-infrared fill light in a stroboscopic manner, or in other manners similar to strobing, which is not limited in the embodiments of the present application. In some examples, when the first light supplement device 021 performs near-infrared fill light in a stroboscopic manner, it may be controlled manually, or by a software program or a specific device, which is likewise not limited in the embodiments of the present application. The time period during which the first light supplement device 021 performs near-infrared fill light may coincide with the exposure time period of the first preset exposure, or may be longer or shorter than it, as long as near-infrared fill light is present during the entire exposure time period or part of the exposure time period of the first preset exposure and absent during the exposure time period of the second preset exposure.
需要说明的是,第二预设曝光的曝光时间段内不存在近红外补光,对于全局曝光方式来说,第二预设曝光的曝光时间段可以是开始曝光时刻和结束曝光时刻之间的时间段,对于卷帘曝光方式来说,第二预设曝光的曝光时间段可以是第二图像信号第一行有效图像的开始曝光时刻与最后一行有效图像的结束曝光时刻之间的时间段,但并不局限于此。例如,第二预设曝光的曝光时间段也可以是第二图像信号中目标图像对应的曝光时间段,目标图像为第二图像信号中与目标对象或目标区域所对应的若干行有效图像,这若干行有效图像的开始曝光时刻与结束曝光时刻之间的时间段可以看作第二预设曝光的曝光时间段。It should be noted that no near-infrared fill light is present during the exposure time period of the second preset exposure. For the global exposure mode, the exposure time period of the second preset exposure may be the time period between the exposure start time and the exposure end time; for the rolling shutter exposure mode, it may be the time period between the exposure start time of the first row of the effective image of the second image signal and the exposure end time of the last row of the effective image, but it is not limited to this. For example, the exposure time period of the second preset exposure may also be the exposure time period corresponding to a target image in the second image signal, where the target image is the several rows of the effective image corresponding to a target object or target area in the second image signal; the time period between the exposure start time and the exposure end time of these rows can be regarded as the exposure time period of the second preset exposure.
需要说明的另一点是,由于第一补光装置021在对外部场景进行近红外补光时,入射到物体表面的近红外光可能会被物体反射,从而进入到第一滤光片031中。并且由于通常情况下,环境光可以包括可见光和近红外光,且环境光中的近红外光入射到物体表面时也会被物体反射,从而进入到第一滤光片031中。因此,在存在近红外补光时通过第一滤光片031的近红外光可以包括第一补光装置021进行近红外补光时经物体反射进入第一滤光片031的近红外光,在不存在近红外补光时通过第一滤光片031的近红外光可以包括第一补光装置021未进行近红外补光时经物体反射进入第一滤光片031的近红外光。也即是,在存在近红外补光时通过第一滤光片031的近红外光包括第一补光装置021发出的且经物体反射后的近红外光,以及环境光中经物体反射后的近红外光,在不存在近红外补光时通过第一滤光片031的近红外光包括环境光中经物体反射后的近红外光。Another point that needs to be explained is that, when the first light supplement device 021 performs near-infrared light supplementation on an external scene, the near-infrared light incident on the surface of the object may be reflected by the object and enter the first filter 031. And because under normal circumstances, the ambient light may include visible light and near-infrared light, and near-infrared light in the ambient light is also reflected by the object when it is incident on the surface of the object, thereby entering the first filter 031. Therefore, the near-infrared light that passes through the first filter 031 when there is near-infrared supplementary light may include the near-infrared light that enters the first filter 031 by the reflection of the object when the first supplementary light device 021 performs near-infrared supplementary light. The near-infrared light passing through the first filter 031 when there is no near-infrared supplementary light may include the near-infrared light reflected by the object and entering the first filter 031 when the first supplementary light device 021 is not performing near-infrared supplementary light. 
That is, the near-infrared light passing through the first filter 031 when near-infrared fill light is present includes the near-infrared light emitted by the first light supplement device 021 and reflected by objects, as well as the near-infrared light in the ambient light reflected by objects; the near-infrared light passing through the first filter 031 when no near-infrared fill light is present includes the near-infrared light in the ambient light reflected by objects.
以图像采集装置中,滤光组件03可以位于镜头04和图像传感器01之间,且图像传感器01位于滤光组件03的出光侧的结构特征为例,图像采集装置采集第一图像信号和第二图像信号的过程为:在图像传感器01进行第一预设曝光时,第一补光装置021存在近红外补光,此时拍摄场景中的环境光和第一补光装置进行近红外补光时被场景中物体反射的近红外光经由镜头04、第一滤光片031之后,由图像传感器01通过第一预设曝光产生第一图像信号;在图像传感器01进行第二预设曝光时,第一补光装置021不存在近红外补光,此时拍摄场景中的环境光经由镜头04、第一滤光片031之后,由图像传感器01通过第二预设曝光产生第二图像信号,在图像采集的一个帧周期内可以有M个第一预设曝光和N个第二预设曝光,第一预设曝光和第二预设曝光之间可以有多种组合的排序,在图像采集的一个帧周期中,M和N的取值以及M和N的大小关系可以根据实际需求设置,例如,M和N的取值可相等,也可不相同。Taking the structure in which the filter assembly 03 is located between the lens 04 and the image sensor 01, with the image sensor 01 on the light exit side of the filter assembly 03, as an example, the image acquisition device acquires the first image signal and the second image signal as follows. When the image sensor 01 performs the first preset exposure, the first light supplement device 021 provides near-infrared fill light; the ambient light in the scene and the near-infrared light reflected by objects in the scene during the fill light pass through the lens 04 and the first filter 031, and the image sensor 01 generates the first image signal through the first preset exposure. When the image sensor 01 performs the second preset exposure, the first light supplement device 021 provides no near-infrared fill light; the ambient light in the scene passes through the lens 04 and the first filter 031, and the image sensor 01 generates the second image signal through the second preset exposure. There may be M first preset exposures and N second preset exposures in one frame period of image acquisition, and the first and second preset exposures may be ordered in many combinations. Within one frame period, the values of M and N and their relative magnitudes can be set according to actual requirements; for example, M and N may be equal or different.
在一些实施例中,多次曝光是指一个帧周期内的多次曝光,也即是,图像传感器01在一个帧周期内进行多次曝光,从而产生并输出至少一帧第一图像信号和至少一帧第二图像信号。例如,1秒内包括25个帧周期,图像传感器01在每个帧周期内进行多次曝光,从而产生至少一帧第一图像信号和至少一帧第二图像信号,将一个帧周期内产生的第一图像信号和第二图像信号称为一组图像信号,这样,25个帧周期内就会产生25组图像信号。其中,第一预设曝光和第二预设曝光可以是一个帧周期内该多次曝光中相邻的两次曝光,也可以是一个帧周期内该多次曝光中不相邻的两次曝光,本申请实施例对此不做限定。In some embodiments, multiple exposures refer to multiple exposures within one frame period; that is, the image sensor 01 performs multiple exposures within one frame period, thereby generating and outputting at least one frame of the first image signal and at least one frame of the second image signal. For example, one second includes 25 frame periods, and the image sensor 01 performs multiple exposures in each frame period, thereby generating at least one frame of the first image signal and at least one frame of the second image signal; the first image signal and the second image signal generated within one frame period are called a group of image signals, so 25 groups of image signals are generated within the 25 frame periods. The first preset exposure and the second preset exposure may be two adjacent exposures among the multiple exposures within one frame period, or two non-adjacent exposures among them, which is not limited in the embodiments of the present application.
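The frame-period scheduling described above, with M first preset exposures and N second preset exposures per frame period in any interleaving, can be sketched as follows; the function name and the two example orderings are illustrative assumptions, not part of the embodiment.

```python
def frame_exposure_plan(m, n, pattern="alternate"):
    """Order of exposures in one frame period: 'first' exposures receive
    near-infrared fill light, 'second' exposures do not. The embodiment
    allows many orderings; two simple options are shown here."""
    if pattern == "alternate":
        plan = []
        for k in range(max(m, n)):
            if k < m:
                plan.append("first")
            if k < n:
                plan.append("second")
        return plan
    # "grouped": all first preset exposures, then all second ones
    return ["first"] * m + ["second"] * n

# 25 frame periods per second, each producing one group of image signals
plans = [frame_exposure_plan(1, 2) for _ in range(25)]
```

Each plan yields at least one frame of each image signal per frame period, matching the "one group of image signals per frame period" description.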
另外,由于环境光中的近红外光的强度低于第一补光装置021发出的近红外光的强度,因此,第一补光装置021进行近红外补光时通过第一滤光片031的近红外光的强度高于第一补光装置021未进行近红外补光时通过第一滤光片031的近红外光的强度。In addition, since the intensity of the near-infrared light in the ambient light is lower than the intensity of the near-infrared light emitted by the first light supplement device 021, the intensity of the near-infrared light passing through the first filter 031 while the first light supplement device 021 performs near-infrared fill light is higher than the intensity of the near-infrared light passing through the first filter 031 while it does not.
其中,第一补光装置021进行近红外补光的波段范围可以为第二参考波段范围,第二参考波段范围可以为700纳米~800纳米,或者900纳米~1000纳米等,以减轻常见850nm红外灯的影响,本申请实施例对此不做限定。The wavelength range in which the first light supplement device 021 performs near-infrared fill light may be a second reference wavelength range, which may be 700 nm to 800 nm, or 900 nm to 1000 nm, etc., so as to reduce the influence of common 850 nm infrared lamps; this is not limited in the embodiments of the present application.
另外,入射到第一滤光片031的近红外光的波段范围可以为第一参考波段范围,第一参考波段范围为650纳米~1100纳米。In addition, the wavelength range of the near-infrared light incident on the first filter 031 may be the first reference wavelength range, and the first reference wavelength range is 650 nanometers to 1100 nanometers.
由于在存在近红外补光时,通过第一滤光片031的近红外光可以包括第一补光装置021进行近红外光补光时经物体反射进入第一滤光片031的近红外光,以及环境光中的经物体反射后的近红外光。所以此时进入滤光组件03的近红外光的强度较强。但是,在不存在近红外补光时,通过第一滤光片031的近红外光包括环境光中经物体反射进入滤光组件03的近红外光。由于没有第一补光装置021进行补光的近红外光,所以此时通过第一滤光片031的近红外光的强度较弱。因此,根据第一预设曝光产生并输出的第一图像信号包括的近红外光的强度,要高于根据第二预设曝光产生并输出的第二图像信号包括的近红外光的强度。When near-infrared fill light is present, the near-infrared light passing through the first filter 031 may include the near-infrared light reflected by objects into the first filter 031 during the fill light of the first light supplement device 021, as well as the near-infrared light in the ambient light reflected by objects, so the intensity of the near-infrared light entering the filter assembly 03 is relatively strong at this time. However, when no near-infrared fill light is present, the near-infrared light passing through the first filter 031 includes only the near-infrared light in the ambient light reflected by objects into the filter assembly 03; since there is no near-infrared light supplemented by the first light supplement device 021, the intensity of the near-infrared light passing through the first filter 031 is weak at this time. Therefore, the intensity of the near-infrared light included in the first image signal generated and output according to the first preset exposure is higher than the intensity of the near-infrared light included in the second image signal generated and output according to the second preset exposure.
第一补光装置021进行近红外补光的中心波长和/或波段范围可以有多种选择,本申请实施例中,为了使第一补光装置021和第一滤光片031有更好的配合,可以对第一补光装置021进行近红外补光的中心波长进行设计,以及对第一滤光片031的特性进行选择,从而使得在第一补光装置021进行近红外补光的中心波长为设定特征波长或者落在设定特征波长范围时,通过第一滤光片031的近红外光的中心波长和/或波段宽度可以达到约束条件。该约束条件主要是用来约束通过第一滤光片031的近红外光的中心波长尽可能准确,以及通过第一滤光片031的近红外光的波段宽度尽可能窄,从而避免出现因近红外光波段宽度过宽而引入波长干扰。There are multiple choices for the center wavelength and/or wavelength range of the near-infrared fill light of the first light supplement device 021. In the embodiments of the present application, in order to make the first light supplement device 021 and the first filter 031 cooperate better, the center wavelength of the near-infrared fill light of the first light supplement device 021 can be designed, and the characteristics of the first filter 031 can be selected, so that when the center wavelength of the near-infrared fill light of the first light supplement device 021 is a set characteristic wavelength or falls within a set characteristic wavelength range, the center wavelength and/or band width of the near-infrared light passing through the first filter 031 meets a constraint condition. The constraint condition is mainly used to ensure that the center wavelength of the near-infrared light passing through the first filter 031 is as accurate as possible, and that its band width is as narrow as possible, so as to avoid wavelength interference introduced by an excessively wide near-infrared band.
其中,第一补光装置021进行近红外补光的中心波长可以为第一补光装置021发出的近红外光的光谱中能量最大的波长范围内的平均值,也可以理解为第一补光装置021发出的近红外光的光谱中能量超过一定阈值的波长范围内的中间位置处的波长。Here, the center wavelength of the near-infrared fill light of the first light supplement device 021 may be the average over the wavelength range with the highest energy in the spectrum of the near-infrared light emitted by the first light supplement device 021, or may be understood as the wavelength at the middle position of the wavelength range in which the energy in that spectrum exceeds a certain threshold.
其中,设定特征波长或者设定特征波长范围可以预先设置。作为一种示例,第一补光装置021进行近红外补光的中心波长可以为750±10纳米的波长范围内的任一波长;或者,第一补光装置021进行近红外补光的中心波长为780±10纳米的波长范围内的任一波长;或者,第一补光装置021进行近红外补光的中心波长为940±10纳米的波长范围内的任一波长。也即是,设定特征波长范围可以为750±10纳米的波长范围、或者780±10纳米的波长范围、或者940±10纳米的波长范围。示例性地,第一补光装置021进行近红外补光的中心波长为940纳米,第一补光装置021进行近红外补光的波长和相对强度之间的关系如图17所示。从图17可以看出,第一补光装置021进行近红外补光的波段范围为900纳米~1000纳米,其中,在940纳米处,近红外光的相对强度最高。Among them, the set characteristic wavelength or the set characteristic wavelength range can be preset. As an example, the center wavelength of the first light supplement device 021 for near-infrared supplement light may be any wavelength within the wavelength range of 750±10 nanometers; or, the center wavelength of the first light supplement device 021 for near-infrared supplement light It is any wavelength in the wavelength range of 780±10 nanometers; or, the center wavelength of the first light supplement device 021 for near-infrared supplement light is any wavelength in the wavelength range of 940±10 nanometers. That is, the set characteristic wavelength range may be a wavelength range of 750±10 nanometers, or a wavelength range of 780±10 nanometers, or a wavelength range of 940±10 nanometers. Exemplarily, the center wavelength of the near-infrared supplement light performed by the first light supplement device 021 is 940 nanometers, and the relationship between the wavelength and the relative intensity of the near-infrared supplement light performed by the first light supplement device 021 is shown in FIG. 17. It can be seen from FIG. 17 that the wavelength range of the first light supplement device 021 for near-infrared supplement light is 900 nanometers to 1000 nanometers, and the relative intensity of near-infrared light is highest at 940 nanometers.
由于在存在近红外补光时,通过第一滤光片031的近红外光大部分为第一补光装置021进行近红外补光时经物体反射进入第一滤光片031的近红外光,因此,在一些实施例中,上述约束条件可以包括:通过第一滤光片031的近红外光的中心波长与第一补光装置021进行近红外补光的中心波长之间的差值位于波长波动范围内,作为一种示例,波长波动范围可以为0~20纳米。Since, when near-infrared fill light is present, most of the near-infrared light passing through the first filter 031 is the near-infrared light reflected by objects into the first filter 031 during the fill light of the first light supplement device 021, in some embodiments the above constraint condition may include: the difference between the center wavelength of the near-infrared light passing through the first filter 031 and the center wavelength of the near-infrared fill light of the first light supplement device 021 lies within a wavelength fluctuation range. As an example, the wavelength fluctuation range may be 0 to 20 nanometers.
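The wavelength-fluctuation constraint above reduces to a one-line check; the function name and the numeric examples below are hypothetical illustrations, using the 0–20 nm fluctuation range given as an example in the text.

```python
def center_wavelength_ok(filter_center_nm, fill_center_nm, max_drift_nm=20.0):
    # constraint: the center wavelength of the NIR light passed by the
    # filter and the center wavelength of the NIR fill light may differ
    # by at most the wavelength fluctuation range (0-20 nm in the example)
    return abs(filter_center_nm - fill_center_nm) <= max_drift_nm

assert center_wavelength_ok(945.0, 940.0)       # 5 nm drift: acceptable
assert not center_wavelength_ok(990.0, 940.0)   # 50 nm drift: rejected
```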
其中,通过第一滤光片031的近红外光的中心波长可以为第一滤光片031的近红外光通过率曲线中的近红外波段范围内波峰位置处的波长,也可以理解为第一滤光片031的近红外光通过率曲线中通过率超过一定阈值的近红外波段范围内的中间位置处的波长。Here, the center wavelength of the near-infrared light passing through the first filter 031 may be the wavelength at the peak position within the near-infrared band of the near-infrared transmittance curve of the first filter 031, or may be understood as the wavelength at the middle position of the near-infrared band in which the transmittance exceeds a certain threshold in that curve.
为了避免通过第一滤光片031的近红外光的波段宽度过宽而引入波长干扰,在一些实施例中,上述约束条件可以包括:第一波段宽度可以小于第二波段宽度。其中,第一波段宽度是指通过第一滤光片031的近红外光的波段宽度,第二波段宽度是指被第一滤光片031阻挡的近红外光的波段宽度。应当理解的是,波段宽度是指光线的波长所处的波长范围的宽度。例如,通过第一滤光片031的近红外光的波长所处的波长范围为700纳米~800纳米,那么第一波段宽度为800纳米减去700纳米,即100纳米。换句话说,通过第一滤光片031的近红外光的波段宽度小于第一滤光片031阻挡的近红外光的波段宽度。In order to avoid the introduction of wavelength interference due to the excessively wide band width of the near-infrared light passing through the first filter 031, in some embodiments, the above constraint conditions may include: the first band width may be smaller than the second band width. The first waveband width refers to the waveband width of the near-infrared light passing through the first filter 031, and the second waveband width refers to the waveband width of the near-infrared light blocked by the first filter 031. It should be understood that the wavelength band width refers to the width of the wavelength range in which the wavelength of light lies. For example, if the wavelength of the near-infrared light passing through the first filter 031 is in the wavelength range of 700 nanometers to 800 nanometers, then the first wavelength band width is 800 nanometers minus 700 nanometers, that is, 100 nanometers. In other words, the wavelength band width of the near-infrared light passing through the first filter 031 is smaller than the wavelength band width of the near-infrared light blocked by the first filter 031.
例如,参见图18,图18为第一滤光片031可以通过的光的波长与通过率之间的关系的一种示意图。入射到第一滤光片031的近红外光的波段为650纳米~1100纳米,第一滤光片031可以使波长位于380纳米~650纳米的可见光通过,以及波长位于900纳米~1000纳米的近红外光通过,阻挡波长位于650纳米~900纳米以及1000纳米~1100纳米的近红外光。也即是,第一波段宽度为1000纳米减去900纳米,即100纳米。第二波段宽度为900纳米减去650纳米,加上1100纳米减去1000纳米,即350纳米。100纳米小于350纳米,即通过第一滤光片031的近红外光的波段宽度小于第一滤光片031阻挡的近红外光的波段宽度。以上关系曲线仅是一种示例,对于不同的滤光片,能够通过滤光片的近红外波段的波段范围可以有所不同,被滤光片阻挡的近红外光的波段范围也可以有所不同。For example, referring to FIG. 18, which is a schematic diagram of the relationship between the wavelength of light that the first filter 031 can pass and its pass rate: the near-infrared light incident on the first filter 031 spans 650 nm to 1100 nm; the first filter 031 passes visible light with wavelengths of 380 nm to 650 nm and near-infrared light with wavelengths of 900 nm to 1000 nm, and blocks near-infrared light with wavelengths of 650 nm to 900 nm and 1000 nm to 1100 nm. That is, the first band width is 1000 nm minus 900 nm, i.e., 100 nm. The second band width is 900 nm minus 650 nm plus 1100 nm minus 1000 nm, i.e., 350 nm. Since 100 nm is less than 350 nm, the band width of the near-infrared light passed by the first filter 031 is smaller than that of the near-infrared light it blocks. The above curve is only an example; for different filters, the near-infrared band that can pass through the filter may differ, as may the near-infrared band blocked by the filter.
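The band-width arithmetic in the FIG. 18 example can be checked with a short helper. The interval lists below follow the pass band implied by the stated widths (a 900–1000 nm pass band, with 650–900 nm and 1000–1100 nm blocked); the helper name is an illustrative assumption.

```python
def band_width(intervals):
    # total width (in nm) of a set of (lo, hi) wavelength intervals
    return sum(hi - lo for lo, hi in intervals)

passed_nir = [(900, 1000)]                 # NIR passed by the first filter
blocked_nir = [(650, 900), (1000, 1100)]   # NIR blocked by the first filter

first_band_width = band_width(passed_nir)    # 1000 - 900 = 100 nm
second_band_width = band_width(blocked_nir)  # 250 + 100 = 350 nm
assert first_band_width < second_band_width  # the constraint in the text
```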
为了避免在非近红外补光的时间段内,通过第一滤光片031的近红外光的波段宽度过宽而引入波长干扰,在一些实施例中,上述约束条件可以包括:通过第一滤光片031的近红外光的半带宽小于或等于50纳米。其中,半带宽是指通过率大于50%的近红外光的波段宽度。In order to avoid the introduction of wavelength interference due to the excessively wide band width of the near-infrared light passing through the first filter 031 during the non-near-infrared supplementary light period, in some embodiments, the above constraint conditions may include: passing the first filter The half bandwidth of the near-infrared light of the light sheet 031 is less than or equal to 50 nanometers. Among them, the half bandwidth refers to the band width of near-infrared light with a pass rate greater than 50%.
为了避免通过第一滤光片031的近红外光的波段宽度过宽而引入波长干扰,在一些实施例中,上述约束条件可以包括:第三波段宽度可以小于参考波段宽度。其中,第三波段宽度是指通过率大于设定比例的近红外光的波段宽度,作为一种示例,参考波段宽度可以为50纳米~100纳米的波段范围内的任一波段宽度。设定比例可以为30%~50%中的任一比例,当然设定比例还可以根据使用需求设置为其他比例,本申请实施例对此不做限定。换句话说,通过率大于设定比例的近红外光的波段宽度可以小于参考波段宽度。In order to avoid the introduction of wavelength interference due to the excessively wide band width of the near-infrared light passing through the first filter 031, in some embodiments, the above constraint conditions may include: the third band width may be smaller than the reference band width. Wherein, the third waveband width refers to the waveband width of near-infrared light with a pass rate greater than a set ratio. As an example, the reference waveband width may be any waveband width in the range of 50 nanometers to 100 nanometers. The set ratio can be any ratio from 30% to 50%. Of course, the set ratio can also be set to other ratios according to usage requirements, which is not limited in the embodiment of the application. In other words, the band width of the near-infrared light whose pass rate is greater than the set ratio may be smaller than the reference band width.
例如,参见图18,入射到第一滤光片031的近红外光的波段为650纳米~1100纳米,设定比例为30%,参考波段宽度为100纳米。从图18可以看出,在650纳米~1100纳米的近红外光的波段中,通过率大于30%的近红外光的波段宽度明显小于100纳米。For example, referring to FIG. 18, the wavelength band of the near-infrared light incident on the first filter 031 is 650 nanometers to 1100 nanometers, the setting ratio is 30%, and the reference wavelength band width is 100 nanometers. It can be seen from FIG. 18 that in the wavelength band of near-infrared light from 650 nanometers to 1100 nanometers, the band width of near-infrared light with a pass rate greater than 30% is significantly less than 100 nanometers.
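Both the half bandwidth (pass rate above 50%) and the third band width (pass rate above a set ratio such as 30%) can be computed the same way from a sampled transmittance curve. The Gaussian-shaped curve below is a toy assumption, not the actual FIG. 18 data, and the function name is hypothetical.

```python
import numpy as np

def width_above_threshold(wavelengths, pass_rate, threshold):
    """Approximate band width (nm) over which the transmittance curve
    exceeds `threshold`, from uniformly sampled (wavelength, pass rate) data."""
    step = wavelengths[1] - wavelengths[0]  # uniform sampling assumed
    return float(np.count_nonzero(pass_rate > threshold) * step)

# toy transmittance curve: a Gaussian-shaped NIR pass band at 950 nm
wl = np.arange(650.0, 1100.0, 1.0)
curve = np.exp(-((wl - 950.0) ** 2) / (2 * 20.0 ** 2))

half_bandwidth = width_above_threshold(wl, curve, 0.5)   # pass rate > 50%
third_width = width_above_threshold(wl, curve, 0.3)      # pass rate > 30%
assert half_bandwidth <= 50.0   # the half-bandwidth constraint in the text
assert third_width < 100.0      # below the 100 nm reference band width
```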
由于第一补光装置021至少在第一预设曝光的部分曝光时间段内提供近红外补光,在第二预设曝光的整个曝光时间段内不提供近红外补光,而第一预设曝光和第二预设曝光为图像传感器01的多次曝光中的其中两次曝光,也即是,第一补光装置021在图像传感器01的部分曝光的曝光时间段内提供近红外补光,在图像传感器01的另外一部分曝光的曝光时间段内不提供近红外补光。所以,第一补光装置021在单位时间长度内的补光次数可以低于图像传感器01在该单位时间长度内的曝光次数,其中,每相邻两次补光的间隔时间段内,间隔一次或多次曝光。Since the first light supplement device 021 provides near-infrared fill light at least during part of the exposure time period of the first preset exposure and provides none during the entire exposure time period of the second preset exposure, and the first and second preset exposures are two of the multiple exposures of the image sensor 01, the first light supplement device 021 provides near-infrared fill light during the exposure time periods of some exposures of the image sensor 01 and not during those of the others. Therefore, the number of fill-light operations of the first light supplement device 021 per unit time may be lower than the number of exposures of the image sensor 01 per unit time, with one or more exposures occurring in the interval between every two adjacent fill-light operations.
在一种可能的实现方式中,由于人眼容易将第一补光装置021进行近红外光补光的颜色与交通灯中的红灯的颜色混淆,所以,参见图19,补光器02还可以包括第二补光装置022,第二补光装置022用于进行可见光补光。这样,如果第二补光装置022至少在第一预设曝光的部分曝光时间提供可见光补光,也即是,至少在第一预设曝光的部分曝光时间段内存在近红外补光和可见光补光,这两种光的混合颜色可以区别于交通灯中的红灯的颜色,从而避免了人眼将补光器02进行近红外补光的颜色与交通灯中的红灯的颜色混淆。另外,如果第二补光装置022在第二预设曝光的曝光时间段内提供可见光补光,由于第二预设曝光的曝光时间段内可见光的强度不是特别高,因此,在第二预设曝光的曝光时间段内进行可见光补光时,还可以提高第二图像信号中的可见光的亮度,进而保证图像采集的质量。In a possible implementation, since human eyes easily confuse the color of the near-infrared fill light of the first light supplement device 021 with the color of the red light in a traffic light, referring to FIG. 19, the light supplement 02 may further include a second light supplement device 022, which is used to perform visible fill light. In this way, if the second light supplement device 022 provides visible fill light during at least part of the exposure time period of the first preset exposure, that is, near-infrared fill light and visible fill light are both present during at least part of that time period, the mixed color of the two lights can be distinguished from the color of the red light in a traffic light, thereby preventing human eyes from confusing the color of the near-infrared fill light of the light supplement 02 with the color of the red light in a traffic light. In addition, if the second light supplement device 022 provides visible fill light during the exposure time period of the second preset exposure, since the intensity of visible light during that time period is not particularly high, the visible fill light can also increase the brightness of the visible light in the second image signal, thereby ensuring the quality of image acquisition.
在一些实施例中,第二补光装置022可以用于以常亮方式进行可见光补光;或者,第二补光装置022可以用于以频闪方式进行可见光补光,其中,至少在第一预设曝光的部分曝光时间段内存在可见光补光,在第二预设曝光的整个曝光时间段内不存在可见光补光;或者,第二补光装置022可以用于以频闪方式进行可见光补光,其中,至少在第一预设曝光的整个曝光时间段内不存在可见光补光,在第二预设曝光的部分曝光时间段内存在可见光补光。当第二补光装置022以常亮方式进行可见光补光时,不仅可以避免人眼将第一补光装置021进行近红外补光的颜色与交通灯中的红灯的颜色混淆,还可以提高第二图像信号中的可见光的亮度,进而保证图像采集的质量。当第二补光装置022以频闪方式进行可见光补光时,可以避免人眼将第一补光装置021进行近红外补光的颜色与交通灯中的红灯的颜色混淆,或者,可以提高第二图像信号中的可见光的亮度,进而保证图像采集的质量,而且还可以减少第二补光装置022的补光次数,从而延长第二补光装置022的使用寿命。In some embodiments, the second light supplement device 022 may perform visible fill light in an always-on manner; or it may perform visible fill light in a stroboscopic manner, where visible fill light is present during at least part of the exposure time period of the first preset exposure and absent during the entire exposure time period of the second preset exposure; or it may perform visible fill light in a stroboscopic manner where visible fill light is absent during at least the entire exposure time period of the first preset exposure and present during part of the exposure time period of the second preset exposure. When the second light supplement device 022 performs visible fill light in the always-on manner, this not only prevents human eyes from confusing the color of the near-infrared fill light of the first light supplement device 021 with the color of the red light in a traffic light, but also increases the brightness of the visible light in the second image signal, thereby ensuring the quality of image acquisition. When the second light supplement device 022 performs visible fill light in the stroboscopic manner, this prevents human eyes from confusing the color of the near-infrared fill light of the first light supplement device 021 with the color of the red light in a traffic light, or increases the brightness of the visible light in the second image signal, thereby ensuring the quality of image acquisition; it also reduces the number of fill-light operations of the second light supplement device 022, thereby prolonging its service life.
需要说明的是,切换部件033用于将第二滤光片032切换到图像传感器01的入光侧,也可以理解为第二滤光片032替换第一滤光片031在图像传感器01的入光侧的位置。在第二滤光片032切换到图像传感器01的入光侧之后,第一补光装置021可以处于关闭状态也可以处于开启状态。通过增加切换部件和第二滤光片,能够兼容多种图像采集功能,提高了灵活性。It should be noted that the switching component 033 is used to switch the second filter 032 to the light incident side of the image sensor 01, which can also be understood as the second filter 032 replacing the first filter 031 at the position on the light incident side of the image sensor 01. After the second filter 032 is switched to the light incident side of the image sensor 01, the first light supplement device 021 may be in an off state or an on state. By adding the switching component and the second filter, multiple image acquisition functions can be supported, which improves flexibility.
第一图像信号是第一预设曝光产生并输出的,第二图像信号是第二预设曝光产生并输出的,在产生并输出第一图像信号和第二图像信号之后,可以对第一图像信号和第二图像信号进行处理。在某些情况下,第一图像信号和第二图像信号的用途可能不同,所以在一些实施例中,第一预设曝光与第二预设曝光的至少一个曝光参数可以不同。作为一种示例,该至少一个曝光参数可以包括但不限于曝光时间、模拟增益、数字增益、光圈大小中的一种或多种。其中,曝光增益包括模拟增益和/或数字增益。The first image signal is generated and output by the first preset exposure, and the second image signal is generated and output by the second preset exposure; after they are generated and output, the first image signal and the second image signal can be processed. In some cases, the purposes of the first image signal and the second image signal may differ, so in some embodiments at least one exposure parameter of the first preset exposure and the second preset exposure may differ. As an example, the at least one exposure parameter may include, but is not limited to, one or more of exposure time, analog gain, digital gain, and aperture size, where exposure gain includes analog gain and/or digital gain.
在一些实施例中,可以理解的是,与第二预设曝光相比,在存在近红外补光时,图像传感器01感应到的近红外光的强度较强,相应地产生并输出的第一图像信号包括的近红外光的亮度也会较高。但是较高亮度的近红外光不利于外部场景信息的获取。而且在一些实施例中,曝光增益越大,图像传感器01输出的图像信号的亮度越高,曝光增益越小,图像传感器01输出的图像信号的亮度越低,因此,为了保证第一图像信号包含的近红外光的亮度在合适的范围内,在第一预设曝光和第二预设曝光的至少一个曝光参数不同的情况下,作为一种示例,第一预设曝光的曝光增益可以小于第二预设曝光的曝光增益。这样,在第一补光装置021进行近红外补光时,图像传感器01产生并输出的第一图像信号包含的近红外光的亮度,不会因第一补光装置021进行近红外补光而过高。In some embodiments, it can be understood that, compared with the second preset exposure, when near-infrared fill light is present, the intensity of the near-infrared light sensed by the image sensor 01 is stronger, and the first image signal generated and output accordingly includes near-infrared light of higher brightness. However, near-infrared light of excessive brightness is not conducive to acquiring external scene information. Moreover, in some embodiments, the greater the exposure gain, the higher the brightness of the image signal output by the image sensor 01, and the smaller the exposure gain, the lower that brightness. Therefore, to keep the brightness of the near-infrared light contained in the first image signal within a suitable range, when at least one exposure parameter of the first preset exposure differs from that of the second preset exposure, as an example, the exposure gain of the first preset exposure may be smaller than the exposure gain of the second preset exposure. In this way, when the first light supplement device 021 performs near-infrared fill light, the brightness of the near-infrared light contained in the first image signal generated and output by the image sensor 01 will not become too high because of the near-infrared fill light.
In other embodiments, the longer the exposure time, the higher the brightness of the image signal obtained by the image sensor 01 and the longer the motion smear of moving objects in the external scene within that image signal; the shorter the exposure time, the lower that brightness and the shorter the motion smear. Therefore, to keep the near-infrared brightness contained in the first image signal within a suitable range while keeping the motion smear of moving objects in the first image signal short, the exposure time of the first preset exposure may, as an example, be less than the exposure time of the second preset exposure when at least one exposure parameter of the two preset exposures differs. In this way, when the first fill-light device 021 performs near-infrared fill light, the near-infrared brightness contained in the first image signal generated and output by the image sensor 01 will not become too high because of that fill light, and the shorter exposure time keeps the motion smear of moving objects in the first image signal short, which facilitates recognition of moving objects. Exemplarily, the exposure time of the first preset exposure is 40 milliseconds, the exposure time of the second preset exposure is 60 milliseconds, and so on.
It is worth noting that, in some embodiments, when the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure, the exposure time of the first preset exposure may be less than, or equal to, the exposure time of the second preset exposure. Similarly, when the exposure time of the first preset exposure is less than the exposure time of the second preset exposure, the exposure gain of the first preset exposure may be smaller than, or equal to, the exposure gain of the second preset exposure.
In other embodiments, the first image signal and the second image signal may serve the same purpose. For example, when both image signals are used for intelligent analysis, at least one exposure parameter of the first preset exposure and the second preset exposure may be the same, so that a face or target under analysis has the same sharpness while in motion. As an example, the exposure time of the first preset exposure may equal the exposure time of the second preset exposure; if the two exposure times differed, the image signal with the longer exposure time would show motion smear, giving the two image signals different sharpness. Similarly, as another example, the exposure gain of the first preset exposure may equal the exposure gain of the second preset exposure.
It is worth noting that, in some embodiments, when the exposure time of the first preset exposure equals the exposure time of the second preset exposure, the exposure gain of the first preset exposure may be smaller than, or equal to, the exposure gain of the second preset exposure. Similarly, when the exposure gain of the first preset exposure equals the exposure gain of the second preset exposure, the exposure time of the first preset exposure may be less than, or equal to, the exposure time of the second preset exposure.
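The admissible gain/time relations described above can be summarized as a small consistency check. This is a minimal sketch under stated assumptions: the parameter names and the dict-based interface are illustrative, not part of the embodiments.

```python
def exposure_params_valid(first, second, same_purpose=False):
    """Check the relations between the two preset exposures described above.

    first / second: dicts with keys "gain" (exposure gain) and "time"
    (exposure time) for the first and second preset exposures.
    When the two image signals serve different purposes, the first preset
    exposure's gain and time must each be <= the second's, with at least
    one strictly smaller; when they serve the same purpose, at least one
    parameter must be equal.
    """
    gain_ok = first["gain"] <= second["gain"]
    time_ok = first["time"] <= second["time"]
    if not (gain_ok and time_ok):
        return False
    if same_purpose:
        # e.g. equal exposure times so a moving target has equal sharpness
        return first["gain"] == second["gain"] or first["time"] == second["time"]
    # at least one parameter strictly smaller keeps the NIR brightness moderate
    return first["gain"] < second["gain"] or first["time"] < second["time"]

# Example from the text: 40 ms first exposure vs 60 ms second exposure.
print(exposure_params_valid({"gain": 2.0, "time": 40}, {"gain": 2.0, "time": 60}))
```

A configuration that raises the first exposure's gain above the second's would be rejected, matching the constraint that the fill-lit exposure must not be brighter than necessary.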
The image sensor 01 may include multiple photosensitive channels, each of which can sense light in at least one visible waveband as well as light in the near-infrared waveband. That is, each photosensitive channel senses both light in at least one visible waveband and light in the near-infrared waveband. In a possible implementation, the multiple photosensitive channels may be used to sense at least two different visible wavebands.
In some embodiments, the multiple photosensitive channels may include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, a Y photosensitive channel, a W photosensitive channel, and a C photosensitive channel. The R photosensitive channel senses light in the red and near-infrared wavebands, the G photosensitive channel senses light in the green and near-infrared wavebands, the B photosensitive channel senses light in the blue and near-infrared wavebands, and the Y photosensitive channel senses light in the yellow and near-infrared wavebands. Since in some embodiments W denotes the channel that senses full-waveband light, while in other embodiments C denotes it, when the multiple photosensitive channels include a channel for sensing full-waveband light, that channel may be a W photosensitive channel or a C photosensitive channel; in practical applications, the full-waveband channel can be chosen according to usage requirements. Exemplarily, the image sensor 01 may be an RGB sensor, an RGBW sensor, an RCCB sensor, or an RYYB sensor. The distribution of the R, G, and B photosensitive channels in the RGB sensor is shown in Figure 20; the distribution of the R, G, B, and W photosensitive channels in the RGBW sensor is shown in Figure 21; the distribution of the R, C, and B photosensitive channels in the RCCB sensor is shown in Figure 22; and the distribution of the R, Y, and B photosensitive channels in the RYYB sensor is shown in Figure 23.
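Channel layouts of this kind are conventionally specified as a small repeating tile. The 2×2 tiles below are illustrative assumptions for the sensor types named above, not reproductions of Figures 20 to 23; they only show how a tile expands into a full channel map.

```python
# Illustrative 2x2 repeating tiles for the sensor types named above
# (assumed for the sketch; the actual distributions are in Figures 20-23).
MOSAICS = {
    "RGB":  [["R", "G"], ["G", "B"]],   # classic Bayer RGGB arrangement
    "RGBW": [["R", "G"], ["W", "B"]],
    "RCCB": [["R", "C"], ["C", "B"]],
    "RYYB": [["R", "Y"], ["Y", "B"]],
}

def tile(sensor, rows, cols):
    """Expand the 2x2 tile of `sensor` into a rows x cols channel map."""
    t = MOSAICS[sensor]
    return [[t[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in tile("RCCB", 4, 4):
    print(" ".join(row))
```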
In other embodiments, some photosensitive channels may sense only light in the near-infrared waveband and not light in the visible wavebands. As an example, the multiple photosensitive channels may include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, and an IR photosensitive channel. The R photosensitive channel senses light in the red and near-infrared wavebands, the G photosensitive channel senses light in the green and near-infrared wavebands, the B photosensitive channel senses light in the blue and near-infrared wavebands, and the IR photosensitive channel senses light in the near-infrared waveband.
For example, the image sensor 01 may be an RGBIR sensor, in which each IR photosensitive channel senses light in the near-infrared waveband but not light in the visible wavebands.
When the image sensor 01 is an RGB sensor, the RGB information it collects is more complete than that of other image sensors such as an RGBIR sensor, because some photosensitive channels of an RGBIR sensor cannot collect visible light; the color details of the image collected by the RGB sensor are therefore more accurate.
It should be noted that the multiple photosensitive channels included in the image sensor 01 may correspond to multiple sensing curves. Exemplarily, referring to Figure 24, the R curve represents the sensing curve of the image sensor 01 for light in the red waveband, the G curve the sensing curve for light in the green waveband, the B curve the sensing curve for light in the blue waveband, the W (or C) curve the sensing curve for full-waveband light, and the NIR (near-infrared) curve the sensing curve for light in the near-infrared waveband.
As an example, the image sensor 01 may adopt a global exposure mode or a rolling-shutter exposure mode. In the global exposure mode, the exposure start time is the same for every row of the effective image, and so is the exposure end time; in other words, global exposure is a mode in which all rows of the effective image start exposure simultaneously and finish exposure simultaneously. In the rolling-shutter exposure mode, the exposure periods of different rows of the effective image do not fully coincide: each row starts exposure later than the previous row and ends exposure later than the previous row. In addition, in the rolling-shutter mode each row can output its data once its exposure ends, so the time from the moment the first row of the effective image starts outputting data to the moment the last row finishes outputting data can be expressed as the readout time.
Exemplarily, Figure 25 is a schematic diagram of a rolling-shutter exposure mode. As can be seen from Figure 25, the first row of the effective image starts exposure at time T1 and ends exposure at time T3, while the second row starts exposure at time T2 and ends exposure at time T4; T2 is shifted back by a period relative to T1, and T4 is shifted back by a period relative to T3. In addition, the first row ends exposure and starts outputting data at time T3, finishing the output at time T5, and the n-th row ends exposure and starts outputting data at time T6, finishing the output at time T7; the time between T3 and T7 is the readout time.
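Under the rolling-shutter timing just described, the readout time runs from the first row's exposure end (T3, when it begins outputting data) to the last row's output end (T7). A minimal sketch with illustrative timestamps; the millisecond values are assumptions for the example, not from Figure 25:

```python
def readout_time(rows):
    """rows: list of (exposure_end, output_end) per effective-image row,
    in row order. In rolling-shutter mode each row starts outputting data
    as soon as its exposure ends, so the readout time is the span from the
    first row's exposure end to the last row's output end."""
    first_output_start = rows[0][0]   # T3 in Figure 25
    last_output_end = rows[-1][1]     # T7 in Figure 25
    return last_output_end - first_output_start

# Illustrative timestamps in ms: three rows staggered as in Figure 25.
rows = [(30, 34), (32, 36), (34, 38)]
print(readout_time(rows))  # 8
```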
In some embodiments, when the image sensor 01 performs multiple exposures in the global exposure mode, then for any round of near-infrared fill light, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill-light period is a subset of the exposure period of the first preset exposure, or the fill-light period intersects the exposure period of the first preset exposure, or the exposure period of the first preset exposure is a subset of the fill-light period. In this way, near-infrared fill light exists during at least part of the exposure period of the first preset exposure and during none of the exposure period of the second preset exposure, so the second preset exposure is not affected.
For example, referring to Figure 26, for any round of near-infrared fill light, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill-light period is a subset of the exposure period of the first preset exposure. Referring to Figure 27, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill-light period intersects the exposure period of the first preset exposure. Referring to Figure 28, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the exposure period of the first preset exposure is a subset of the fill-light period.
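For the global exposure mode, the three admissible relations above all reduce to one pair of interval conditions: the fill-light period must be disjoint from the nearest second preset exposure, and must have a non-empty intersection with the first preset exposure (subset, superset, and partial overlap all qualify). A hedged sketch using half-open intervals:

```python
def overlaps(a, b):
    """Non-empty intersection of half-open intervals [start, end)."""
    return max(a[0], b[0]) < min(a[1], b[1])

def fill_light_ok_global(fill, first_exp, second_exp):
    """fill, first_exp, second_exp: (start, end) time intervals.
    The near-infrared fill-light period must not intersect the nearest
    second preset exposure and must intersect the first preset exposure
    (as a subset, a superset, or a partial overlap)."""
    return overlaps(fill, first_exp) and not overlaps(fill, second_exp)

# Fill light as a subset of the first exposure, as in Figure 26.
print(fill_light_ok_global((12, 18), (10, 20), (30, 40)))
```

The Figure 27 and Figure 28 cases (partial overlap and superset) pass the same check, since any non-empty intersection with the first exposure is accepted.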
In other embodiments, when the image sensor 01 performs multiple exposures in the rolling-shutter exposure mode, then for any round of near-infrared fill light, the fill-light period has no intersection with the exposure period of the nearest second preset exposure. Furthermore, the fill light starts no earlier than the exposure start time of the last row of the effective image in the first preset exposure, and ends no later than the exposure end time of the first row of the effective image in the first preset exposure. Alternatively, the fill light starts no earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first row of the effective image in the first preset exposure, and ends no earlier than the exposure start time of the last row of the effective image in the first preset exposure and no later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure. Alternatively, the fill light starts no earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first row of the effective image in the first preset exposure, and ends no earlier than the exposure end time of the last row of the effective image in the first preset exposure and no later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure.
For example, referring to Figure 29, for any round of near-infrared fill light, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill light starts no earlier than the exposure start time of the last row of the effective image in the first preset exposure and ends no later than the exposure end time of the first row of the effective image in the first preset exposure. Referring to Figure 30, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill light starts no earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first row of the effective image in the first preset exposure, and ends no earlier than the exposure start time of the last row of the effective image in the first preset exposure and no later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure. Referring to Figure 31, the fill-light period has no intersection with the exposure period of the nearest second preset exposure, and the fill light starts no earlier than the exposure end time of the last row of the effective image of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first row of the effective image in the first preset exposure, and ends no earlier than the exposure end time of the last row of the effective image in the first preset exposure and no later than the exposure start time of the first row of the effective image of the nearest second preset exposure after the first preset exposure. Figures 29 to 31 are only examples, and the ordering of the first preset exposure and the second preset exposure is not limited to these examples. In Figures 29 to 31, for the first preset exposure and the second preset exposure, the slanted dotted line indicates the exposure start time and the slanted solid line indicates the exposure end time; for the first preset exposure, the span between the vertical dotted lines indicates the near-infrared fill-light period corresponding to the first preset exposure.
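The first of the three rolling-shutter alternatives — the fill light confined to the window in which every row of the first preset exposure is integrating at once, as in Figure 29 — can be sketched as a simple interval check. The timestamps are illustrative assumptions; the other two alternatives relax the bounds toward the neighbouring second preset exposures.

```python
def fill_light_ok_rolling(fill_start, fill_end,
                          last_row_exp_start, first_row_exp_end):
    """First rolling-shutter alternative: the near-infrared fill light
    must start no earlier than the exposure START of the LAST row of the
    first preset exposure and end no later than the exposure END of the
    FIRST row, i.e. it lies inside the window where all rows of the first
    preset exposure are integrating simultaneously."""
    return last_row_exp_start <= fill_start and fill_end <= first_row_exp_end

# Rows staggered as in Figure 29 (illustrative times): the last row starts
# exposure at t=8 and the first row ends exposure at t=14, so a fill-light
# pulse over [9, 13] qualifies.
print(fill_light_ok_rolling(9, 13, 8, 14))  # True
```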
The multiple exposures may include odd-numbered exposures and even-numbered exposures, so the first preset exposure and the second preset exposure may be arranged in, but are not limited to, the following ways:
In a first possible implementation, the first preset exposure is one of the odd-numbered exposures and the second preset exposure is one of the even-numbered exposures, so the multiple exposures include first and second preset exposures arranged in odd-even order. For example, the odd-numbered exposures such as the 1st, 3rd, and 5th exposures in the multiple exposures are all first preset exposures, and the even-numbered exposures such as the 2nd, 4th, and 6th exposures are all second preset exposures.
In a second possible implementation, the first preset exposure is one of the even-numbered exposures and the second preset exposure is one of the odd-numbered exposures, so the multiple exposures include first and second preset exposures arranged in odd-even order. For example, the odd-numbered exposures such as the 1st, 3rd, and 5th exposures in the multiple exposures are all second preset exposures, and the even-numbered exposures such as the 2nd, 4th, and 6th exposures are all first preset exposures.
In a third possible implementation, the first preset exposure is one of specified odd-numbered exposures, and the second preset exposure is one of the exposures other than the specified odd-numbered exposures; that is, the second preset exposure may be an odd-numbered or an even-numbered exposure among the multiple exposures.
In a fourth possible implementation, the first preset exposure is one of specified even-numbered exposures, and the second preset exposure is one of the exposures other than the specified even-numbered exposures; that is, the second preset exposure may be an odd-numbered or an even-numbered exposure among the multiple exposures.
In a fifth possible implementation, the first preset exposure is one exposure in a first exposure sequence, and the second preset exposure is one exposure in a second exposure sequence.
In a sixth possible implementation, the first preset exposure is one exposure in the second exposure sequence, and the second preset exposure is one exposure in the first exposure sequence.
The multiple exposures include multiple exposure sequences. The first exposure sequence and the second exposure sequence are the same exposure sequence, or two different exposure sequences, among the multiple exposure sequences. Each exposure sequence includes N exposures, which comprise one first preset exposure and N−1 second preset exposures, or one second preset exposure and N−1 first preset exposures, where N is a positive integer greater than 2.
For example, each exposure sequence includes 3 exposures, which may comprise one first preset exposure and two second preset exposures; in this case the 1st exposure of each sequence may be the first preset exposure and the 2nd and 3rd exposures the second preset exposure, so each exposure sequence can be expressed as: first preset exposure, second preset exposure, second preset exposure. Alternatively, the 3 exposures may comprise one second preset exposure and two first preset exposures; in this case the 1st exposure of each sequence may be the second preset exposure and the 2nd and 3rd exposures the first preset exposure, so each exposure sequence can be expressed as: second preset exposure, first preset exposure, first preset exposure.
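The orderings above can be sketched as simple schedule generators: `parity_schedule` covers the first two alternatives (odd-even alternation) and `sequence_schedule` the fifth and sixth (repeated exposure sequences). The labels "first"/"second" stand for the two preset exposures; the function names are illustrative.

```python
def parity_schedule(n, first_on_odd=True):
    """First/second alternatives: the odd-numbered exposures (1st, 3rd, ...)
    are one preset exposure and the even-numbered ones are the other."""
    odd, even = ("first", "second") if first_on_odd else ("second", "first")
    return [odd if i % 2 == 1 else even for i in range(1, n + 1)]

def sequence_schedule(n_sequences, pattern=("first", "second", "second")):
    """Fifth/sixth alternatives: repeat an exposure sequence of N exposures,
    e.g. one first preset exposure followed by N-1 second preset exposures."""
    return list(pattern) * n_sequences

print(parity_schedule(6))
print(sequence_schedule(2))
```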
The foregoing provides only six possible implementations of the first preset exposure and the second preset exposure; practical applications are not limited to these six, and the embodiments of the present application do not limit this.
In summary, when the visible-light intensity in the ambient light is weak, for example at night, the stroboscopic fill light of the first fill-light device 021 enables the image sensor 01 to generate and output a first image signal containing near-infrared brightness information and a second image signal containing visible-light brightness information. Because both image signals are acquired by the same image sensor 01, the viewpoint of the first image signal is the same as that of the second image signal, so complete information about the external scene can be obtained from the two image signals. When the visible-light intensity is strong, for example during the day, the proportion of near-infrared light is relatively high and the color reproduction of the collected image is poor; the image sensor 01 can then generate and output a third image signal containing visible-light brightness information, so that images with good color reproduction are collected even during the day. The true color information of the external scene can thus be obtained efficiently and simply regardless of the visible-light intensity, that is, regardless of day or night.
The present application uses the exposure timing of the image sensor to control the near-infrared fill-light timing of the fill-light device, so that near-infrared fill light is performed during the first preset exposure, producing the first image signal, and is not performed during the second preset exposure, producing the second image signal. This data-collection scheme directly collects first and second image signals with different brightness information while keeping the structure simple and the cost low: two different image signals are obtained through a single image sensor, which makes the image acquisition device simpler and the acquisition of the first and second image signals more efficient. Moreover, because the first image signal and the second image signal are both generated and output by the same image sensor, the viewpoint corresponding to the first image signal is the same as the viewpoint corresponding to the second image signal. The information of the external scene can therefore be obtained jointly from the two image signals, without the misalignment between the images generated from the first and second image signals that would otherwise result from differing viewpoints.
Further, the noise reduction processing in some embodiments of the present application may refer to the following scheme:
In some possible implementations, referring to Figure 32, the joint noise reduction unit may include a temporal noise reduction unit 021. The temporal noise reduction unit 021 performs motion estimation based on the first image signal and the second image signal to obtain a motion estimation result, performs temporal filtering on the first image signal according to the motion estimation result to obtain a near-infrared denoised image, and performs temporal filtering on the second image signal according to the motion estimation result to obtain a visible-light denoised image.
It should be noted that, referring to Figure 33, the temporal noise reduction unit 021 may include a motion estimation unit 0211 and a temporal filtering unit 0212.
In some examples, the motion estimation unit 0211 may generate a first frame-difference image from the first image signal and a first historical noise-reduction image, and determine a first temporal filtering strength for each pixel of the first image signal from the first frame-difference image and multiple first set frame-difference thresholds, where the first historical noise-reduction image is an image obtained by denoising any one of the N frames preceding the first image signal. The temporal filtering unit 0212 performs temporal filtering on the first image signal according to the first temporal filtering strength of each pixel to obtain the near-infrared denoised image, and performs temporal filtering on the second image signal according to the first temporal filtering strength of each pixel to obtain the visible-light denoised image.
Exemplarily, the motion estimation unit 0211 may compute, pixel by pixel, the difference between the first image signal and the first historical noise-reduced image to obtain an original frame-difference image, and use this original frame-difference image as the first frame-difference image.
Alternatively, the motion estimation unit 0211 may compute, pixel by pixel, the difference between the first image signal and the first historical noise-reduced image to obtain an original frame-difference image, and then process the original frame-difference image to obtain the first frame-difference image. Here, processing the original frame-difference image may refer to applying spatial smoothing or block-wise quantization to it.
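The frame-difference construction just described can be sketched in NumPy as follows. This is an illustrative sketch, not the patent's exact processing: the function names are invented here, and the 3x3 box mean stands in for the unspecified spatial smoothing step.

```python
import numpy as np

def frame_difference(current, history):
    """Original frame-difference image: per-pixel absolute difference
    between the current frame and a historical noise-reduced frame."""
    return np.abs(current.astype(np.float32) - history.astype(np.float32))

def smooth_3x3(img):
    """Optional spatial smoothing of the raw frame-difference image,
    here a 3x3 box mean with edge replication (an illustrative choice)."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros((h, w), dtype=np.float32)
    for di in range(3):
        for dj in range(3):
            out += padded[di:di + h, dj:dj + w]
    return out / 9.0

cur = np.array([[10.0, 10.0], [10.0, 50.0]], dtype=np.float32)
hist = np.array([[10.0, 10.0], [10.0, 10.0]], dtype=np.float32)
dif = frame_difference(cur, hist)  # only the bottom-right pixel changed
```

Either `dif` itself or `smooth_3x3(dif)` can then serve as the first frame-difference image.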
After the first frame-difference image is obtained, the motion estimation unit 0211 may determine the first temporal filtering strength of each pixel according to each pixel of the first frame-difference image and the plurality of first set frame-difference thresholds. Each pixel of the first frame-difference image corresponds to one first set frame-difference threshold, and the thresholds of different pixels may be the same or different. In one possible implementation, the first set frame-difference threshold of each pixel can be set by an external user. In another possible implementation, the motion estimation unit 0211 may compute the difference between the previous frame of the first image signal and the first historical noise-reduced image to obtain a first noise-intensity image, and determine the first set frame-difference threshold of the pixel at the corresponding position of the first frame-difference image according to the noise intensity of each pixel of the first noise-intensity image. Of course, the first set frame-difference threshold of each pixel may also be determined in other ways, which is not limited in the embodiments of the present application.
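The noise-adaptive threshold variant can be sketched as follows. The linear gain/floor mapping below is an assumption of this sketch; the patent only states that the per-pixel threshold is determined from the local noise intensity.

```python
import numpy as np

def noise_intensity(prev_frame, history_denoised):
    """First noise-intensity image: difference between the previous raw
    frame of the first image signal and the first historical
    noise-reduced image."""
    return np.abs(prev_frame.astype(np.float32) - history_denoised.astype(np.float32))

def per_pixel_threshold(noise, gain=2.0, floor=4.0):
    """Map noise intensity to a per-pixel frame-difference threshold.
    The 'gain' and 'floor' parameters are illustrative assumptions."""
    return gain * noise + floor
```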
For each pixel of the first frame-difference image, the motion estimation unit 0211 may determine the first temporal filtering strength of the corresponding pixel through the following formula (1), according to the frame difference of that pixel and the first set frame-difference threshold corresponding to that pixel.
(Formula (1), published as an image in the original document, maps the frame difference dif_nir(x, y) and the threshold dif_thr_nir(x, y) of each pixel to its first temporal filtering strength α_nir(x, y).)
Here, (x, y) is the position of a pixel in the image; α_nir(x, y) is the first temporal filtering strength of the pixel with coordinates (x, y); dif_nir(x, y) is the frame difference of that pixel in the first frame-difference image; and dif_thr_nir(x, y) is the first set frame-difference threshold corresponding to that pixel.
It should be noted that, for each pixel of the first frame-difference image, the smaller the frame difference of the pixel relative to the first set frame-difference threshold, the more the pixel tends to be stationary, that is, the smaller the motion level of the pixel. From formula (1), for any pixel, the smaller its frame difference relative to the first set frame-difference threshold, the larger its first temporal filtering strength. Here, the motion level indicates the intensity of motion: the larger the motion level, the more intense the motion. The first temporal filtering strength may take values between 0 and 1.
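Formula (1) itself is published as an image, so the mapping below is only a stand-in that reproduces the stated properties: strength 1 for a static pixel, decreasing as the frame difference grows relative to the per-pixel threshold, clipped to [0, 1].

```python
import numpy as np

def temporal_strength(dif, dif_thr):
    """Illustrative stand-in for formula (1): monotonically decreasing
    in dif relative to dif_thr, bounded in [0, 1]. The exact curve used
    by the patent is not reproduced here."""
    return np.clip(1.0 - dif / np.maximum(dif_thr, 1e-6), 0.0, 1.0)
```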
After the first temporal filtering strength of each pixel of the first image signal is determined, the temporal filtering unit 0212 may directly perform temporal filtering on the first image signal and the second image signal according to the first temporal filtering strength, to obtain the near-infrared noise-reduced image and the visible-light noise-reduced image respectively.
It should be noted that, when the image quality of the first image signal is clearly better than that of the second image signal, since the first image signal is a near-infrared image with a high signal-to-noise ratio, performing temporal filtering on the second image signal with the first temporal filtering strength of each pixel of the first image signal can distinguish noise from valid information in the image more accurately, thereby avoiding the loss of image detail and image smearing in the noise-reduced image.
It should be noted that, in some possible cases, the motion estimation unit 0211 may generate at least one first frame-difference image according to the first image signal and at least one first historical noise-reduced image, and determine the first temporal filtering strength of each pixel of the first image signal according to the at least one frame-difference image and the plurality of first set frame-difference thresholds corresponding to each frame-difference image.
Here, the at least one historical noise-reduced image refers to images obtained by performing noise reduction on the N frames preceding the first image signal. For each first historical noise-reduced image among the at least one first historical noise-reduced image, the motion estimation unit 0211 may determine the corresponding first frame-difference image from that historical image and the first image signal, with reference to the implementations described above. The motion estimation unit 0211 may then determine the temporal filtering strength of each pixel of each first frame-difference image according to that frame-difference image and its corresponding plurality of first set frame-difference thresholds, again with reference to the implementations described above. After that, the motion estimation unit 0211 may fuse the temporal filtering strengths of corresponding pixels across the first frame-difference images to obtain the first temporal filtering strength of each pixel. Alternatively, for any pixel, the motion estimation unit 0211 may select, from the at least one temporal filtering strength of that pixel in the at least one first frame-difference image, the strength that represents the largest motion level, and use the selected strength as the first temporal filtering strength of that pixel.
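Because a smaller strength encodes a larger motion level, selecting the strength that represents the largest motion level across several historical frames amounts to an element-wise minimum. A small sketch with toy values:

```python
import numpy as np

# Per-pixel strengths computed against three historical noise-reduced
# frames (toy values for a 1x2 image).
strengths = [
    np.array([[0.9, 0.4]]),
    np.array([[0.7, 0.8]]),
    np.array([[0.8, 0.6]]),
]

# Largest motion level corresponds to the smallest filtering strength
# at each pixel, so the selection is an element-wise minimum.
first_strength = np.minimum.reduce(strengths)
```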
In other examples, the motion estimation unit 0211 may generate a first frame-difference image according to the first image signal and a first historical noise-reduced image, and determine the first temporal filtering strength of each pixel of the first image signal according to the first frame-difference image and a plurality of first set frame-difference thresholds, where the first historical noise-reduced image is an image obtained by performing noise reduction on any one of the N frames preceding the first image signal. The motion estimation unit 0211 is further configured to generate a second frame-difference image according to the second image signal and a second historical noise-reduced image, and to determine a second temporal filtering strength for each pixel of the second image signal according to the second frame-difference image and a plurality of second set frame-difference thresholds, where the second historical noise-reduced image is an image obtained by performing noise reduction on any one of the N frames preceding the second image signal. The motion estimation unit 0211 is further configured to determine a joint temporal filtering strength for each pixel according to the first temporal filtering strength of each pixel of the first image signal and the second temporal filtering strength of each pixel of the second image signal. The temporal filtering unit 0212 is configured to perform temporal filtering on the first image signal according to the first temporal filtering strength or the joint temporal filtering strength of each pixel to obtain the near-infrared noise-reduced image, and to perform temporal filtering on the second image signal according to the joint temporal filtering strength of each pixel to obtain the visible-light noise-reduced image.
That is, the motion estimation unit 0211 can not only determine the first temporal filtering strength of each pixel of the first image signal through the implementations described above, but also determine the second temporal filtering strength of each pixel of the second image signal.
When determining the second temporal filtering strength of each pixel, the motion estimation unit 0211 may first compute, pixel by pixel, the difference between the second image signal and the second historical noise-reduced image to obtain the second frame-difference image. In addition, the first image signal and the second image signal are aligned.
After the second frame-difference image is obtained, the motion estimation unit 0211 may determine the second temporal filtering strength of each pixel according to each pixel of the second frame-difference image and a plurality of second set frame-difference thresholds. Each pixel of the second frame-difference image corresponds to one second set frame-difference threshold, that is, the plurality of second set frame-difference thresholds correspond one-to-one to the pixels of the second frame-difference image. The second set frame-difference thresholds of different pixels may be the same or different. In one possible implementation, the second set frame-difference threshold of each pixel can be set by an external user. In another possible implementation, the motion estimation unit 0211 may compute the difference between the previous frame of the second image signal and the second historical noise-reduced image to obtain a second noise-intensity image, and determine the second set frame-difference threshold of the pixel at the corresponding position of the second frame-difference image according to the noise intensity of each pixel of the second noise-intensity image. Of course, the second set frame-difference threshold of each pixel may also be determined in other ways, which is not limited in the embodiments of the present application.
For each pixel of the second frame-difference image, the motion estimation unit 0211 may determine the second temporal filtering strength of the corresponding pixel through the following formula (2), according to the frame difference of that pixel and the second set frame-difference threshold corresponding to that pixel.
(Formula (2), published as an image in the original document, maps the frame difference dif_vis(x, y) and the threshold dif_thr_vis(x, y) of each pixel to its second temporal filtering strength α_vis(x, y).)
Here, α_vis(x, y) is the second temporal filtering strength of the pixel with coordinates (x, y); dif_vis(x, y) is the frame difference of that pixel in the second frame-difference image; and dif_thr_vis(x, y) is the second set frame-difference threshold corresponding to that pixel.
It should be noted that, for each pixel of the second frame-difference image, the smaller the frame difference of the pixel relative to the second set frame-difference threshold, the more the pixel tends to be stationary, that is, the smaller the motion level of the pixel. From formula (2), for any pixel, the smaller its frame difference relative to the second set frame-difference threshold, the larger its second temporal filtering strength. In summary, in the embodiments of the present application, the smaller the motion level of a pixel, the larger the value of its second temporal filtering strength. The second temporal filtering strength may take values between 0 and 1.
After the first temporal filtering strength and the second temporal filtering strength of each pixel are determined, the motion estimation unit 0211 may weight the first temporal filtering strength and the second temporal filtering strength of each pixel to obtain a joint temporal weight for each pixel. The joint temporal weight determined for each pixel is then the motion estimation result for the first image signal and the second image signal.
Exemplarily, the motion estimation unit 0211 may weight the first temporal filtering strength and the second temporal filtering strength of each pixel through the following formula (3) to obtain the joint temporal filtering strength of each pixel.
(Formula (3), published as an image in the original document, combines the first and second temporal filtering strengths over the neighborhood Ω into the joint temporal filtering strength α_fus(x, y).)
Here, Ω is the neighborhood centered on the pixel with coordinates (x, y), that is, the local image region centered on that pixel; (x+i, y+j) denotes pixel coordinates within this local region; α_nir(x+i, y+j) denotes the first temporal filtering strengths within the local region centered on (x, y); α_vis(x+i, y+j) denotes the second temporal filtering strengths within that region; and α_fus(x, y) is the joint temporal filtering strength of the pixel with coordinates (x, y). The shares of the first and second temporal filtering strengths in the joint strength are adjusted by the first and second temporal filtering strengths within the local region: the side with the larger local motion level contributes the larger share.
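The published formula (3) is an image, so the sketch below assumes one plausible weighting with the stated behaviour: each signal's share of the joint strength grows with its local motion level, measured here as the sum of (1 - strength) over the neighborhood Ω. The function name and the exact weighting form are assumptions of this sketch.

```python
import numpy as np

def joint_temporal_strength(a_nir, a_vis, radius=1):
    """Illustrative stand-in for formula (3): the signal with the larger
    local motion level (larger summed 1 - strength over the neighborhood)
    dominates the joint strength."""
    h, w = a_nir.shape
    out = np.empty_like(a_nir)
    pn = np.pad(a_nir, radius, mode="edge")
    pv = np.pad(a_vis, radius, mode="edge")
    for y in range(h):
        for x in range(w):
            wn = pn[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            wv = pv[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            m_nir = np.sum(1.0 - wn)   # local motion level, NIR signal
            m_vis = np.sum(1.0 - wv)   # local motion level, visible signal
            denom = m_nir + m_vis
            if denom < 1e-6:           # both sides fully static
                out[y, x] = 1.0
            else:
                out[y, x] = (m_nir * a_nir[y, x] + m_vis * a_vis[y, x]) / denom
    return out
```

With a moving NIR signal and a static visible signal, the joint strength follows the NIR side, matching the "larger local motion level, larger share" behaviour described above.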
It should be noted that the first temporal filtering strength can represent the motion level of a pixel in the first image signal, and the second temporal filtering strength can represent the motion level of a pixel in the second image signal. The joint temporal filtering strength determined as above fuses both, that is, it takes into account both the motion tendency the pixel exhibits in the first image signal and the motion tendency it exhibits in the second image signal. Compared with the first or second temporal filtering strength alone, the joint temporal filtering strength can therefore characterize the motion tendency of a pixel more accurately; performing temporal filtering with it removes image noise more effectively and avoids problems such as image smearing caused by misjudging the motion level of a pixel.
In some examples, after the first temporal filtering strength and the second temporal filtering strength of each pixel are determined, for any pixel, the motion estimation unit may select one of that pixel's two strengths as the joint temporal filtering weight of the pixel. When selecting, the strength that represents the larger motion level of the pixel may be chosen as the joint temporal filtering strength.
After the joint temporal filtering strength of each pixel is determined, the temporal filtering unit 0212 may perform temporal filtering on the first image signal and the second image signal respectively according to the joint temporal filtering strength, to obtain the near-infrared noise-reduced image and the visible-light noise-reduced image.
Exemplarily, according to the joint temporal filtering strength of each pixel, the temporal filtering unit 0212 may perform temporal weighting on each pixel of the first image signal and the first historical noise-reduced image through the following formula (4) to obtain the near-infrared noise-reduced image, and perform temporal weighting on each pixel of the second image signal and the second historical noise-reduced image through the following formula (5) to obtain the visible-light noise-reduced image.
(Formulas (4) and (5), published as images in the original document, weight the current pixel against the corresponding historical noise-reduced pixel using the joint temporal filtering strength α_fus(x, y): formula (4) combines I_nir(x, y, t) with the first historical noise-reduced image to give the near-infrared noise-reduced pixel, and formula (5) combines I_vis(x, y, t) with the second historical noise-reduced image to give the visible-light noise-reduced pixel.)
Here, Ĩ_nir(x, y, t) denotes the pixel with coordinates (x, y) in the near-infrared noise-reduced image; Ĩ_nir(x, y, t-1) denotes the pixel with coordinates (x, y) in the first historical noise-reduced image; α_fus(x, y) is the joint temporal filtering strength of the pixel with coordinates (x, y); I_nir(x, y, t) denotes the pixel with coordinates (x, y) in the first image signal; Ĩ_vis(x, y, t) denotes the pixel with coordinates (x, y) in the visible-light noise-reduced image; Ĩ_vis(x, y, t-1) denotes the pixel with coordinates (x, y) in the second historical noise-reduced image; and I_vis(x, y, t) denotes the pixel with coordinates (x, y) in the second image signal.
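Formulas (4) and (5) are published as images; the recursive blend below is a common reading consistent with the symbol list above (strength near 1 leans on the history, strength near 0 keeps the current sample), offered as an assumption rather than the patent's exact equation.

```python
import numpy as np

def temporal_filter(current, history_denoised, alpha_fus):
    """Temporal weighting of the current frame against the historical
    noise-reduced frame. With alpha_fus in [0, 1], a static pixel
    (alpha_fus near 1) is filtered strongly toward the history, while a
    moving pixel (alpha_fus near 0) keeps the current sample."""
    return alpha_fus * history_denoised + (1.0 - alpha_fus) * current

cur = np.array([[100.0, 100.0]])
hist = np.array([[60.0, 60.0]])
alpha = np.array([[1.0, 0.25]])
out = temporal_filter(cur, hist, alpha)
```

The same function applies to both signals: formula (4) corresponds to the near-infrared pair and formula (5) to the visible-light pair, both driven by the same α_fus.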
Alternatively, considering that the first image signal is a near-infrared signal with a high signal-to-noise ratio, the temporal filtering unit 0212 may also perform temporal filtering on the first image signal according to the first temporal filtering strength of each pixel to obtain the near-infrared image, and perform temporal filtering on the second image signal according to the joint temporal filtering strength of each pixel to obtain the visible-light image.
It should be noted that, from the relationship between temporal filtering strength and motion level introduced above, in the embodiments of the present application, regions of the first image signal and the second image signal with more intense motion can be filtered with a smaller temporal filtering strength.
In other possible implementations, referring to FIG. 32, the joint noise reduction unit may include a spatial noise reduction unit 022. The spatial noise reduction unit 022 is configured to perform edge estimation according to the first image signal and the second image signal to obtain an edge estimation result, perform spatial filtering on the first image signal according to the edge estimation result to obtain the near-infrared noise-reduced image, and perform spatial filtering on the second image signal according to the edge estimation result to obtain the visible-light noise-reduced image.
It should be noted that, referring to FIG. 34, the spatial noise reduction unit 022 may include an edge estimation unit 0221 and a spatial filtering unit 0222.
In some examples, the edge estimation unit 0221 is configured to determine a first spatial filtering strength for each pixel of the first image signal; the spatial filtering unit 0222 is configured to perform spatial filtering on the first image signal according to the first spatial filtering strength corresponding to each pixel to obtain the near-infrared noise-reduced image, and to perform spatial filtering on the second image signal according to the first spatial filtering strength corresponding to each pixel to obtain the visible-light noise-reduced image.
Exemplarily, the edge estimation unit 0221 may determine the first spatial filtering strength of each pixel according to the difference between that pixel of the first image signal and the other pixels in its neighborhood. The edge estimation unit 0221 may generate the first spatial filtering strengths of each pixel through the following formula (6).
(Formula (6), published as an image in the original document, determines the first spatial filtering strengths of each pixel from the pixel differences of the first image signal within Ω, using Gaussians with standard deviations δ1 and δ2.)
Here, Ω is the neighborhood centered on the pixel with coordinates (x, y), that is, the local image region centered on that pixel; (x+i, y+j) denotes pixel coordinates within this local region; img_nir(x, y) is the value of the pixel with coordinates (x, y) in the first image signal; δ1 and δ2 are Gaussian standard deviations; and β_nir(x+i, y+j) denotes the first spatial filtering strength determined for the pixel with coordinates (x, y) within the local region according to its difference from the pixel (x+i, y+j).
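Since formula (6) is published as an image, the sketch below assumes a bilateral-style weight consistent with the listed symbols: one Gaussian (standard deviation δ1) over the spatial offset (i, j) and one (δ2) over the intensity difference. The function name and the neighborhood handling are illustrative.

```python
import numpy as np

def spatial_strengths(img, x, y, radius=1, delta1=1.0, delta2=10.0):
    """Illustrative bilateral-style reading of formula (6): for the pixel
    at (x, y), each neighbour offset (i, j) in the window gets a strength
    from a spatial Gaussian (std delta1) times an intensity-difference
    Gaussian (std delta2). Returns a dict keyed by (i, j)."""
    h, w = img.shape
    out = {}
    for i in range(-radius, radius + 1):
        for j in range(-radius, radius + 1):
            yy = min(max(y + j, 0), h - 1)   # clamp at the image border
            xx = min(max(x + i, 0), w - 1)
            d = float(img[y, x]) - float(img[yy, xx])
            out[(i, j)] = (np.exp(-(i * i + j * j) / (2 * delta1 ** 2))
                           * np.exp(-(d * d) / (2 * delta2 ** 2)))
    return out
```

As the text notes, the strengths shrink when the center pixel differs strongly from its neighbours, so edges are filtered less than flat regions.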
After the plurality of first spatial filtering strengths of each pixel are determined, the spatial filtering unit 0222 may perform spatial filtering on the first image signal and the second image signal respectively according to those strengths, to obtain the near-infrared noise-reduced image and the visible-light noise-reduced image.
In other examples, the edge estimation unit 0221 is configured to determine a first spatial filtering strength for each pixel of the first image signal and a second spatial filtering strength for each pixel of the second image signal; to perform local information extraction on the first image signal to obtain first local information, and on the second image signal to obtain second local information; and to determine a joint spatial filtering strength for each pixel according to the first spatial filtering strength, the second spatial filtering strength, the first local information, and the second local information. The spatial filtering unit 0222 is configured to perform spatial filtering on the first image signal according to the first spatial filtering strength corresponding to each pixel to obtain the near-infrared noise-reduced image, and to perform spatial filtering on the second image signal according to the joint spatial filtering strength corresponding to each pixel to obtain the visible-light noise-reduced image. The first local information and the second local information each include at least one of local gradient information, local brightness information, and local information entropy.
That is, the edge estimation unit 0221 can not only determine the first spatial filtering strength of each pixel of the first image signal through the implementations described above, but also determine the second spatial filtering strength of each pixel of the second image signal.
When determining the second spatial filtering strength of each pixel, the edge estimation unit 0221 may determine it according to the difference between that pixel of the second image signal and the other pixels in its neighborhood. The edge estimation unit 0221 may generate the second spatial filtering strengths of each pixel through the following formula (7).
(Formula (7), published as an image in the original document, determines the second spatial filtering strengths of each pixel from the pixel differences of the second image signal within Ω, using Gaussians with standard deviations δ1 and δ2.)
Here, Ω is the neighborhood centered on the pixel with coordinates (x, y), that is, the local image region centered on that pixel; (x+i, y+j) denotes pixel coordinates within this local region; img_vis(x, y) is the value of the pixel with coordinates (x, y) in the second image signal; δ1 and δ2 are Gaussian standard deviations; and β_vis(x+i, y+j) denotes the second spatial filtering strength determined for the pixel with coordinates (x, y) within the local region according to its difference from the pixel (x+i, y+j).
From formulas (6) and (7), for the local image region centered on the pixel with coordinates (x, y), the smaller the differences between that pixel and the pixels of the local region, the larger the pixel's spatial filtering strengths. That is, the magnitude of a pixel's spatial filtering strength is negatively correlated with the magnitude of the differences between that pixel and the pixels in the corresponding local region.
After the first spatial filtering strength and the second spatial filtering strength of each pixel are determined, the edge estimation unit 0221 may convolve the first image signal and the second image signal with the Sobel edge detection operator to obtain a first texture image and a second texture image, and use these as weights to weight the plurality of first spatial filtering strengths and the plurality of second spatial filtering strengths of each pixel, generating the plurality of joint spatial filtering strengths of each pixel within the local image region. The first texture image is the first local information, and the second texture image is the second local information.
Exemplarily, the Sobel edge detection operator is shown in formula (8) below. The edge estimation unit 0221 may generate the joint spatial filtering strengths through formula (9) below.
sobel_H = [ -1 0 1; -2 0 2; -1 0 1 ],  sobel_V = [ -1 -2 -1; 0 0 0; 1 2 1 ]    (8)

β_fus(x+i, y+j) = [ tex_nir(x, y) · β_nir(x+i, y+j) + tex_vis(x, y) · β_vis(x+i, y+j) ] / [ tex_nir(x, y) + tex_vis(x, y) ]    (9)
Here, sobel_H refers to the Sobel edge detection operator in the horizontal direction and sobel_V refers to the Sobel edge detection operator in the vertical direction; β_fus(x+i, y+j) refers to any joint spatial filtering strength of the pixel with coordinates (x, y) within its neighborhood Ω; tex_nir(x, y) refers to the texture information of the pixel with coordinates (x, y) in the first texture image; and tex_vis(x, y) refers to the texture information of the pixel with coordinates (x, y) in the second texture image.
It should be noted that, because the edge detection operator is applied when determining the joint spatial filtering strengths, the smaller the multiple joint spatial filtering strengths finally obtained for a pixel, the larger the difference between that pixel and the other pixels in the local image area. Therefore, in the embodiments of the present application, in image regions where the brightness difference between adjacent pixels is larger, the joint spatial filtering strength is smaller, while in regions where the brightness difference between adjacent pixels is smaller, the joint spatial filtering strength is relatively larger. That is, in the embodiments of the present application, spatial filtering uses a weaker filtering strength at edges and a stronger filtering strength in non-edge regions, thereby improving the noise reduction effect.
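The texture-weighted combination described above can be sketched in Python. The joint_strength blend below is one assumed reading of "use the texture images as weights"; the function names and the equal-split fallback for texture-free regions are illustrative, not from the patent:

```python
def sobel_texture(img, x, y):
    """Sum of absolute horizontal and vertical Sobel responses at (x, y),
    standing in for the per-pixel texture information of a texture image."""
    sobel_h = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    sobel_v = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    gh = gv = 0
    for j in range(-1, 2):
        for i in range(-1, 2):
            p = img[y + j][x + i]
            gh += sobel_h[j + 1][i + 1] * p
            gv += sobel_v[j + 1][i + 1] * p
    return abs(gh) + abs(gv)

def joint_strength(tex_nir, tex_vis, beta_nir, beta_vis):
    """Texture-weighted blend of the first and second spatial filtering
    strengths (an assumed form of the joint spatial filtering strength)."""
    total = tex_nir + tex_vis
    if total == 0:  # flat in both images: fall back to a plain average
        return 0.5 * (beta_nir + beta_vis)
    return (tex_nir * beta_nir + tex_vis * beta_vis) / total

img = [[10, 10, 10],
       [10, 10, 10],
       [90, 90, 90]]  # horizontal edge -> strong vertical Sobel response
tex = sobel_texture(img, 1, 1)
```

With this blend, whichever image shows stronger local texture dominates the joint strength, so the edge information of the higher-contrast image steers the filtering.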
After obtaining the joint spatial filtering strengths, the spatial filtering unit 0222 may perform spatial filtering on the first repaired image and the second repaired image respectively according to the joint spatial filtering strengths, thereby obtaining a near-infrared light noise-reduced image and a visible light noise-reduced image.
Alternatively, considering that the first image signal is a near-infrared image with a high signal-to-noise ratio, when the quality of the first image signal is significantly better than that of the second image signal, there is no need to use the edge information of the second image signal to assist the spatial filtering of the first image signal. In this case, the spatial filtering unit 0222 may perform spatial filtering on the first image signal according to the first spatial filtering strength of each pixel, and perform spatial filtering on the second image signal according to the joint spatial filtering strength of each pixel.
Exemplarily, the spatial filtering unit 0222 may perform spatial weighting on each pixel in the first image signal according to its first spatial filtering strength through formula (10) below, thereby obtaining the near-infrared light noise-reduced image, and may weight each pixel in the second image signal according to its joint spatial filtering strength through formula (11) below, thereby obtaining the visible light noise-reduced image.
img_nir(x, y) = Σ_{(x+i, y+j)∈Ω} I_nir(x+i, y+j) · β_nir(x+i, y+j) / Σ_{(x+i, y+j)∈Ω} β_nir(x+i, y+j)    (10)

img_vis(x, y) = Σ_{(x+i, y+j)∈Ω} I_vis(x+i, y+j) · β_fus(x+i, y+j) / Σ_{(x+i, y+j)∈Ω} β_fus(x+i, y+j)    (11)
其中,
Figure PCTCN2020092364-appb-000025
是指近红外光降噪图像中坐标为(x,y)的像素点,I nir(x+i,y+j)是指第一图像信号中坐标为(x,y)的像素点的邻域范围内的像素点,β nir(x+i,y+j)为坐标为(x,y)的像素点在该邻域范围内的第一空域滤波强度,Ω是指以坐标为(x,y)的像素点为中心的邻域范围,
Figure PCTCN2020092364-appb-000026
为可见光降噪图像中坐标为(x,y)的像素点,I vis(x+i,y+j)是指第二图像信号中坐标为(x,y)的像素点的邻域范围内的像素点,β fus(x+i,y+j)为坐标为(x,y)的像素点在该邻域范围内的联合空域滤波强度。
among them,
Figure PCTCN2020092364-appb-000025
Refers to the pixel with the coordinates (x, y) in the near-infrared light noise reduction image, I nir (x+i, y+j) refers to the neighbor of the pixel with the coordinates (x, y) in the first image signal Pixels in the domain, β nir (x+i, y+j) is the first spatial filtering strength of the pixel with coordinates (x, y) in the neighborhood range, Ω refers to the coordinate as (x ,y) is the center of the neighborhood,
Figure PCTCN2020092364-appb-000026
Is the pixel with coordinates (x,y) in the visible light denoising image, I vis (x+i,y+j) refers to the neighborhood of the pixel with coordinates (x,y) in the second image signal Β fus (x+i, y+j) is the joint spatial filtering strength of the pixel with coordinates (x, y) in the neighborhood.
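The weighted averaging of formulas (10) and (11) amounts to a normalized neighborhood sum. Since the formulas themselves render only as images in this publication, the normalization by the sum of weights in this Python sketch is an assumption (it is what keeps a flat region's brightness unchanged):

```python
def weighted_spatial_filter(img, weights, x, y, r=1):
    """Normalized weighted average over the (2r+1)x(2r+1) neighborhood Ω of
    (x, y): each neighbor is weighted by its spatial filtering strength,
    in the manner of formulas (10)/(11) (sketch)."""
    num = den = 0.0
    for j in range(-r, r + 1):
        for i in range(-r, r + 1):
            w = weights[j + r][i + r]
            num += img[y + j][x + i] * w
            den += w
    return num / den

img = [[10, 20, 30],
       [10, 20, 30],
       [10, 20, 30]]
uniform = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]       # equal strengths: plain mean
center_only = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]   # keeps the pixel unchanged
mean = weighted_spatial_filter(img, uniform, 1, 1)
kept = weighted_spatial_filter(img, center_only, 1, 1)
```

With the bilateral-style strengths of formulas (6)/(7), neighbors across an edge get near-zero weight, so the average is taken almost entirely from the pixel's own side of the edge.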
It is worth noting that, in the embodiments of the present application, the image noise reduction unit 02 may also include both the temporal noise reduction unit 021 and the spatial noise reduction unit 022 described above. In this case, with reference to the related implementations introduced above, the temporal noise reduction unit 021 may first perform temporal filtering on the first image signal and the second image signal to obtain a first temporal noise-reduced image and a second temporal noise-reduced image, after which the spatial noise reduction unit 022 performs spatial filtering on the first temporal noise-reduced image and the second temporal noise-reduced image, thereby obtaining the near-infrared light noise-reduced image and the visible light noise-reduced image. Alternatively, the spatial noise reduction unit 022 may first perform spatial filtering on the first image signal and the second image signal to obtain a first spatial noise-reduced image and a second spatial noise-reduced image, after which the temporal noise reduction unit 021 performs temporal filtering on the first spatial noise-reduced image and the second spatial noise-reduced image, thereby obtaining the near-infrared light noise-reduced image and the visible light noise-reduced image.
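The two-stage composition just described (temporal then spatial, or spatial then temporal) can be sketched with toy stand-in filters; the 50/50 temporal blend and 3-tap horizontal mean below are illustrative only, not the patent's actual units 021 and 022:

```python
def temporal_filter(frame, prev, alpha=0.5):
    """Toy temporal noise reduction: blend the current frame with the
    previous one (stand-in for unit 021)."""
    return [[alpha * c + (1 - alpha) * p for c, p in zip(rc, rp)]
            for rc, rp in zip(frame, prev)]

def spatial_filter(frame):
    """Toy spatial noise reduction: horizontal 3-tap mean with edge
    clamping (stand-in for unit 022)."""
    out = []
    for row in frame:
        n = len(row)
        out.append([(row[max(i - 1, 0)] + row[i] + row[min(i + 1, n - 1)]) / 3.0
                    for i in range(n)])
    return out

def denoise(frame, prev, temporal_first=True):
    """Unit 02 may chain units 021 and 022 in either order."""
    if temporal_first:
        return spatial_filter(temporal_filter(frame, prev))
    return temporal_filter(spatial_filter(frame), spatial_filter(prev))

frame = [[0.0, 4.0, 0.0]]
prev = [[4.0, 0.0, 4.0]]
a = denoise(frame, prev, temporal_first=True)
b = denoise(frame, prev, temporal_first=False)
```

For these linear toy filters the two orders agree up to rounding; with the real motion- and edge-adaptive filters, the ordering genuinely changes the result, which is why the embodiment spells out both variants.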
In summary, the image acquisition device generates multiple images with different spectral ranges through multiple exposures of the image sensor and stroboscopic supplementary light from the light supplement device, which extends the spectral range the image sensor can receive, extends the image acquisition capability of a single sensor, and improves image quality across different scenes.
An embodiment of the present application further provides an image fusion method, applied to the image fusion device provided in the embodiments shown in FIGS. 1-34. The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor. The image sensor is located on the light exit side of the filter assembly, the light supplement includes a first light supplement device, the filter assembly includes a first filter, and the processor includes a buffer unit and an image processing unit. Referring to FIG. 35, the method includes:
Step 3201: perform near-infrared supplementary light through the first light supplement device, where near-infrared supplementary light is performed at least during part of the exposure period of a first preset exposure and is not performed during the exposure period of a second preset exposure, the first preset exposure and the second preset exposure being two of the multiple exposures of the image sensor;
Step 3202: pass light in the visible light band and part of the near-infrared light through the first filter;
Step 3203: perform multiple exposures through the image sensor in a global exposure mode to generate and output a first image signal and a second image signal, where the first image signal is an image signal generated according to the first preset exposure, and the second image signal is an image signal generated according to the second preset exposure;
Step 3204: through the buffer unit, buffer the first target image signal when it is learned that the first target image signal currently output by the image sensor needs to be buffered, and, when it is learned that a buffered second target image signal needs to be output synchronously, synchronously output at least the buffered second target image signal to the image processing unit; where, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, and if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
Step 3205: through the image processing unit, receive at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and generate a color fusion image according to the first target image signal and the second target image signal.
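Steps 3204-3205 pair the sensor's alternating outputs: each incoming frame is buffered while a previously buffered frame of the other type is released alongside it. A minimal Python sketch of that buffering logic (the class and method names are illustrative, not from the patent):

```python
class FrameSyncBuffer:
    """Pairs the image sensor's alternating first/second image signals so
    the image processing unit always receives one frame of each type
    (sketch of the buffer/synchronization behavior of steps 3204-3205)."""

    def __init__(self):
        self.buffered = None  # (kind, frame) most recently cached

    def push(self, kind, frame):
        """Buffer the current frame; return a (first, second) pair when a
        previously buffered frame of the other kind is available."""
        pair = None
        if self.buffered is not None and self.buffered[0] != kind:
            other_kind, other = self.buffered
            pair = (frame, other) if kind == "first" else (other, frame)
        self.buffered = (kind, frame)
        return pair

buf = FrameSyncBuffer()
out1 = buf.push("first", "F1")    # nothing buffered yet
out2 = buf.push("second", "S1")   # paired with the cached F1
out3 = buf.push("first", "F2")    # paired with the cached S1
```

Each call buffers the current frame, so every exposure after the first yields a synchronized near-infrared/visible pair for fusion.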
In a possible implementation, when the synchronization unit determines that the first target image signal currently output by the image sensor needs to be buffered, it instructs the buffer unit to buffer the first target image signal, and when it determines from the buffered image signals that a second target image signal needs to be output synchronously, it instructs the buffer unit to synchronously output the second target image signal to the image processing unit.
In a possible implementation, the image processing unit generating the color fusion image according to the first target image signal and the second target image signal includes:
the image preprocessing unit preprocessing the first target image signal to generate a first target image, and preprocessing the second target image signal to generate a second target image;
the image fusion unit performing fusion processing on the first target image and the second target image to obtain the color fusion image.
In a possible implementation, the image preprocessing unit preprocessing the first target image signal to generate the first target image and preprocessing the second target image signal to generate the second target image includes:
the first preprocessing unit performing a first preprocessing operation on the first target image signal to obtain a preprocessed first target image;
the second preprocessing unit performing a second preprocessing operation on the second target image signal to obtain a second target image;
the joint noise reduction unit performing filtering on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, the noise-reduced first and second target images being used for fusion processing to obtain the color fusion image.
In a possible implementation, the synchronization unit determining that the first target image signal currently output by the image sensor needs to be buffered, instructing the buffer unit to buffer the first target image signal, and, upon determining from the buffered image signals that a second target image signal needs to be output synchronously, instructing the buffer unit to synchronously output the second target image signal to the image processing unit includes:
the synchronization unit determining that every frame of the first target image signal needs to be buffered and that the second target image signal needs to be output synchronously, the second target image signal being the image signal previously buffered by the buffer unit;
where, if the first target image signal is the second image signal, the buffer unit currently buffers the second image signal and determines the previously buffered first image signal as the second target image signal, outputting it to the image preprocessing unit;
and if the first target image signal is the first image signal, the buffer unit currently buffers the first image signal and determines the previously buffered second image signal as the second target image signal, outputting it to the image preprocessing unit.
In a possible implementation, the synchronization unit determining that the first target image signal currently output by the image sensor needs to be buffered, instructing the buffer unit to buffer the first target image signal, and, upon determining from the buffered image signals that a second target image signal needs to be output synchronously, instructing the buffer unit to synchronously output the second target image signal to the image processing unit includes:
the synchronization unit determining that the first target image signal needs to be buffered when it is the first image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the second image signal, the second target image signal being the most recently buffered first image signal among the image signals buffered by the buffer unit; where, if the first target image signal is the second image signal, the buffer unit determines the most recently buffered first image signal as the second target image signal and outputs it to the image preprocessing unit, and if the first target image signal is the first image signal, the buffer unit buffers the first image signal; or,
the synchronization unit determining that the first target image signal needs to be buffered when it is the second image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the first image signal, the second target image signal being the most recently buffered second image signal among the second image signals buffered by the buffer unit; where, if the first target image signal is the first image signal, the buffer unit determines the most recently buffered second image signal as the second target image signal and outputs it to the image preprocessing unit, and if the first target image signal is the second image signal, the buffer unit buffers the second image signal.
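The second strategy above buffers only one signal type and releases the latest cached frame whenever the other type arrives. A sketch of the variant that caches first image signals (names are illustrative):

```python
class LatestFirstCache:
    """Caches only first image signals; when a second image signal arrives,
    pairs it with the most recently cached first image signal (sketch of
    the second synchronization strategy above)."""

    def __init__(self):
        self.latest_first = None

    def push(self, kind, frame):
        if kind == "first":
            self.latest_first = frame  # buffer; nothing to output yet
            return None
        if self.latest_first is None:
            return None                # no first image signal cached yet
        return (self.latest_first, frame)

cache = LatestFirstCache()
r1 = cache.push("first", "F1")
r2 = cache.push("second", "S1")   # paired with the cached F1
r3 = cache.push("second", "S2")   # reuses the same cached F1
```

Unlike the buffer-everything strategy, a cached first image signal may be paired with several later second image signals, which tolerates exposure sequences where the two signal types do not strictly alternate.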
In a possible implementation, the image fusion unit performing fusion processing on the first target image and the second target image to obtain the color fusion image includes:
the color extraction unit extracting the color signal of the preprocessed image of the second image signal;
the brightness extraction unit extracting the brightness signal of the preprocessed image of the second image signal;
the fusion processing unit performing fusion processing on the preprocessed image of the first image signal and on the color signal and brightness signal of the preprocessed image of the second image signal to obtain the color fusion image.
In a possible implementation, the fusion processing unit performing fusion processing on the preprocessed image of the first image signal and on the color signal and brightness signal of the preprocessed image of the second image signal to obtain the color fusion image includes:
performing weighted fusion on the brightness information of the preprocessed image of the second image signal and the preprocessed image of the first image signal to obtain a fused brightness image;
performing fusion processing on the fused brightness image and the color signal of the preprocessed image of the second image signal to obtain the color fusion image.
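Per pixel, the fusion just described reduces to: blend the near-infrared image with the visible image's brightness, then recombine with the visible image's color signal. A Python sketch (the blending weight w and the luma/chroma representation are assumed for illustration):

```python
def fuse_pixel(nir_luma, vis_luma, vis_color, w=0.5):
    """Weighted brightness fusion followed by recombination with the
    visible image's color signal (sketch; w is an assumed weight)."""
    fused_luma = w * nir_luma + (1 - w) * vis_luma
    return {"luma": fused_luma, "chroma": vis_color}

# NIR supplies brightness detail; the visible image supplies color.
p = fuse_pixel(nir_luma=200.0, vis_luma=100.0, vis_color=(0.3, 0.4), w=0.5)
```

Keeping the chroma entirely from the visible image is what makes the fused output a color image even though the near-infrared image contributes only luminance.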
In a possible implementation, the joint noise reduction unit performing filtering on the first target image and the second target image to obtain the noise-reduced first target image and second target image includes:
performing joint filtering on the first target image and the second target image respectively according to the correlation between the first target image and the second target image, to obtain the noise-reduced first target image and second target image.
In a possible implementation, the joint noise reduction unit performing filtering on the first target image and the second target image to obtain the noise-reduced first target image and second target image includes:
the temporal noise reduction unit performing motion estimation according to the first target image and the second target image to obtain a motion estimation result, performing temporal filtering on the first target image according to the motion estimation result to obtain the noise-reduced first target image, and performing temporal filtering on the second target image according to the motion estimation result to obtain the noise-reduced second target image;
the spatial noise reduction unit performing edge estimation according to the first target image and the second target image to obtain an edge estimation result, performing spatial filtering on the first target image according to the edge estimation result to obtain the noise-reduced first target image, and performing spatial filtering on the second target image according to the edge estimation result to obtain the noise-reduced second target image.
In a possible implementation, the joint noise reduction unit performing filtering on the first target image and the second target image to obtain the noise-reduced first target image and second target image includes:
the temporal noise reduction unit performing motion estimation according to the first target image and the second target image to obtain a motion estimation result, performing temporal filtering on the first target image according to the motion estimation result to obtain a first temporal noise-reduced image, and performing temporal filtering on the second target image according to the motion estimation result to obtain a second temporal noise-reduced image;
the spatial noise reduction unit performing edge estimation according to the first temporal noise-reduced image and the second temporal noise-reduced image to obtain an edge estimation result, performing spatial filtering on the first temporal noise-reduced image according to the edge estimation result to obtain the noise-reduced first target image, and performing spatial filtering on the second temporal noise-reduced image according to the edge estimation result to obtain the noise-reduced second target image;
or,
the spatial noise reduction unit performing edge estimation according to the first target image and the second target image to obtain an edge estimation result, performing spatial filtering on the first target image according to the edge estimation result to obtain a first spatial noise-reduced image, and performing spatial filtering on the second target image according to the edge estimation result to obtain a second spatial noise-reduced image;
the temporal noise reduction unit performing motion estimation according to the first spatial noise-reduced image and the second spatial noise-reduced image to obtain a motion estimation result, performing temporal filtering on the first spatial noise-reduced image according to the motion estimation result to obtain the noise-reduced first target image, and performing temporal filtering on the second spatial noise-reduced image according to the motion estimation result to obtain the noise-reduced second target image.
In a possible implementation, the filter assembly may further include a second filter and a switching component, in which case the second filter may be switched to the light incident side of the image sensor through the switching component. After the second filter is switched to the light incident side of the image sensor, the second filter passes light in the visible light band and blocks light in the near-infrared band, and after the second filter passes light in the visible light band and blocks light in the near-infrared band, exposure is performed through the image sensor to generate and output a third image signal.
In a possible implementation, the light supplement may further include a second light supplement device; in this case, before the first filter included in the filter assembly passes light in the visible light band and part of the near-infrared light, visible supplementary light is also performed through the second light supplement device.
In a possible implementation, the intensity of the near-infrared light passing through the first filter when the first light supplement device performs near-infrared supplementary light is higher than the intensity of the near-infrared light passing through the first filter when the first light supplement device does not perform near-infrared supplementary light.
In a possible implementation, the wavelength range of the near-infrared light incident on the first filter is a first reference wavelength range, the first reference wavelength range being 650 nanometers to 1100 nanometers.
In a possible implementation, when the center wavelength of the near-infrared supplementary light performed by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the center wavelength and/or the band width of the near-infrared light passing through the first filter meets a constraint condition.
In a possible implementation, the center wavelength of the near-infrared supplementary light performed by the first light supplement device is any wavelength within the range of 750±10 nanometers; or
the center wavelength of the near-infrared supplementary light performed by the first light supplement device is any wavelength within the range of 780±10 nanometers; or
the center wavelength of the near-infrared supplementary light performed by the first light supplement device is any wavelength within the range of 940±10 nanometers.
In a possible implementation, the constraint condition includes:
the difference between the center wavelength of the near-infrared light passing through the first filter and the center wavelength of the near-infrared supplementary light performed by the first light supplement device lying within a wavelength fluctuation range, the wavelength fluctuation range being 0 to 20 nanometers.
In a possible implementation, the constraint condition includes:
the half bandwidth of the near-infrared light passing through the first filter being less than or equal to 50 nanometers.
In a possible implementation, the constraint condition includes:
a first band width being smaller than a second band width, where the first band width is the band width of the near-infrared light passing through the first filter, and the second band width is the band width of the near-infrared light blocked by the first filter.
In a possible implementation, the constraint condition is:
a third band width being smaller than a reference band width, where the third band width is the band width of the near-infrared light whose pass rate is greater than a set ratio, and the reference band width is any band width within the range of 50 nanometers to 150 nanometers.
In a possible implementation, the set ratio is any ratio within the range of 30% to 50%.
In a possible implementation, at least one exposure parameter of the first preset exposure and the second preset exposure is different, the at least one exposure parameter being one or more of exposure time, exposure gain, and aperture size, where the exposure gain includes analog gain and/or digital gain.
In a possible implementation, the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure.
In a possible implementation, at least one exposure parameter of the first preset exposure and the second preset exposure is the same, the at least one exposure parameter including one or more of exposure time, exposure gain, and aperture size, where the exposure gain includes analog gain and/or digital gain.
In a possible implementation, the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure.
In a possible implementation, the image sensor includes multiple photosensitive channels, each photosensitive channel being used to sense light in at least one visible light band and to sense light in the near-infrared band.
In a possible implementation, the multiple photosensitive channels are used to sense light in at least two different visible light bands.
In a possible implementation, the multiple photosensitive channels include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, a Y photosensitive channel, a W photosensitive channel, and a C photosensitive channel;
where the R photosensitive channel is used to sense light in the red band and the near-infrared band, the G photosensitive channel is used to sense light in the green band and the near-infrared band, the B photosensitive channel is used to sense light in the blue band and the near-infrared band, the Y photosensitive channel is used to sense light in the yellow band and the near-infrared band, the W photosensitive channel is used to sense light in the full band, and the C photosensitive channel is used to sense light in the full band.
In a possible implementation, the image sensor is an RGB sensor, an RGBW sensor, an RCCB sensor, or an RYYB sensor.
In a possible implementation, the second light supplement device is used to perform visible supplementary light in a constant-on mode; or
the second light supplement device is used to perform visible supplementary light in a stroboscopic mode, where visible supplementary light is present at least during part of the exposure period of the first preset exposure and is absent during the entire exposure period of the second preset exposure; or
the second light supplement device is used to perform visible supplementary light in a stroboscopic mode, where visible supplementary light is absent at least during the entire exposure period of the first preset exposure and is present during part of the exposure period of the second preset exposure.
In a possible implementation, the number of light supplements performed by the first light supplement device per unit time is lower than the number of exposures performed by the image sensor per unit time, wherein one or more exposures occur within the interval between every two adjacent light supplements.
In a possible implementation, the image sensor performs the multiple exposures in a global exposure manner. For any near-infrared light supplement, the time period of the near-infrared light supplement has no intersection with the exposure time period of the nearest second preset exposure, and the time period of the near-infrared light supplement is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared light supplement intersects the exposure time period of the first preset exposure, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared light supplement.
In a possible implementation, the image sensor performs the multiple exposures in a rolling-shutter exposure manner, and for any near-infrared light supplement, the time period of the near-infrared light supplement has no intersection with the exposure time period of the nearest second preset exposure;
the start time of the near-infrared light supplement is no earlier than the exposure start time of the last effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no later than the exposure end time of the first effective-image row in the first preset exposure;
or,
the start time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no earlier than the exposure start time of the last effective-image row in the first preset exposure and no later than the exposure start time of the first effective-image row of the nearest second preset exposure after the first preset exposure; or
the start time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row in the first preset exposure and no later than the exposure start time of the first effective-image row of the nearest second preset exposure after the first preset exposure.
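These fill-light timing rules all reduce to relations between time intervals. As a minimal sketch, the global-exposure rule described earlier (no intersection with the nearest second preset exposure, intersection with the first preset exposure) can be checked as follows; the half-open interval representation and all names are illustrative assumptions, not part of the disclosed implementation:

```python
def intervals_intersect(a, b):
    """True if half-open intervals a = (start, end) and b overlap."""
    return a[0] < b[1] and b[0] < a[1]

def fill_light_timing_ok(fill, first_exposure, second_exposure):
    """Check one near-infrared fill-light pulse against the two exposures.

    The pulse must not intersect the nearest second preset exposure,
    and must overlap (or be contained in / contain) the first preset exposure.
    """
    if intervals_intersect(fill, second_exposure):
        return False
    return intervals_intersect(fill, first_exposure)

# Fill light inside the first exposure, clear of the second preset exposure:
print(fill_light_timing_ok((10, 20), (5, 25), (30, 50)))   # True
# Fill light spilling into the second preset exposure:
print(fill_light_timing_ok((28, 35), (5, 25), (30, 50)))   # False
```

The rolling-shutter variants add row-level start/end bounds but follow the same interval-comparison pattern.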
In a possible implementation, the multiple exposures include odd-numbered exposures and even-numbered exposures;
the first preset exposure is one of the odd-numbered exposures, and the second preset exposure is one of the even-numbered exposures; or
the first preset exposure is one of the even-numbered exposures, and the second preset exposure is one of the odd-numbered exposures; or
the first preset exposure is one of specified odd-numbered exposures, and the second preset exposure is one of the exposures other than the specified odd-numbered exposures; or
the first preset exposure is one of specified even-numbered exposures, and the second preset exposure is one of the exposures other than the specified even-numbered exposures; or,
the first preset exposure is one exposure in a first exposure sequence, and the second preset exposure is one exposure in a second exposure sequence; or
the first preset exposure is one exposure in the second exposure sequence, and the second preset exposure is one exposure in the first exposure sequence;
wherein the multiple exposures include a plurality of exposure sequences, the first exposure sequence and the second exposure sequence are one exposure sequence or two exposure sequences among the plurality of exposure sequences, each exposure sequence includes N exposures, the N exposures include one first preset exposure and N-1 second preset exposures, or the N exposures include one second preset exposure and N-1 first preset exposures, and N is a positive integer greater than 2.
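The assignment options above amount to classifying each exposure index as a first or second preset exposure. A hedged sketch, where the rule names, the 1-based indexing, and the sequence layout (one first preset exposure at the start of each N-exposure sequence) are assumptions for illustration only:

```python
def exposure_kind(index, rule="odd_first", n=4):
    """Classify the index-th exposure (1-based) as 'first' or 'second' preset."""
    if rule == "odd_first":     # odd-numbered exposures -> first preset exposure
        return "first" if index % 2 == 1 else "second"
    if rule == "even_first":    # even-numbered exposures -> first preset exposure
        return "first" if index % 2 == 0 else "second"
    if rule == "sequence":      # one first preset exposure per N-exposure sequence
        return "first" if (index - 1) % n == 0 else "second"
    raise ValueError(rule)

print([exposure_kind(i) for i in range(1, 5)])
# ['first', 'second', 'first', 'second']
print([exposure_kind(i, rule="sequence", n=3) for i in range(1, 7)])
# ['first', 'second', 'second', 'first', 'second', 'second']
```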
An embodiment of the present application further provides an image fusion method, applied to the image fusion device provided in the embodiments shown in FIGS. 1-35. The image fusion device includes an image sensor, a light supplement, a filter assembly, and a processor; the image sensor is located on the light exit side of the filter assembly; the light supplement includes a first light supplement device; the filter assembly includes a first filter; and the processor includes a buffer unit and an image processing unit. Referring to FIG. 36, the method includes:
Step 3301: performing near-infrared light supplement by the first light supplement device, wherein near-infrared light supplement is performed at least during part of the exposure time period of a first preset exposure and is not performed during the exposure time period of a second preset exposure, the first preset exposure and the second preset exposure being two of the multiple exposures of the image sensor;
Step 3302: passing light in the visible-light waveband and part of the near-infrared light through the first filter;
Step 3303: performing multiple exposures by the image sensor in a global exposure manner to generate and output a first image signal and a second image signal, the first image signal being an image signal generated according to the first preset exposure, and the second image signal being an image signal generated according to the second preset exposure;
Step 3304: receiving, by the image processing unit, the first target image signal currently output by the image sensor, and preprocessing the first target image signal to obtain a first target image; when the first target image needs to be buffered, synchronously outputting at least the first target image to the buffer unit for buffering; and when the buffer unit needs to synchronously output a second target image it has buffered, receiving at least the second target image synchronously output by the buffer unit, and generating a color fusion image from the first target image and the second target image;
wherein, if the first target image signal is the first image signal, the first target image is an image generated by preprocessing the first image signal, the second target image is a buffered frame generated by preprocessing a second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is an image generated by preprocessing the second image signal, the second target image is a buffered frame generated by preprocessing the second target image signal, and the second target image signal is the first image signal;
Step 3305: when the buffer unit learns that the first target image needs to be buffered, buffering at least the first target image synchronously output by the image processing unit; and when the buffer unit learns that the buffered second target image needs to be synchronously output, synchronously outputting at least the buffered second target image to the image processing unit.
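Steps 3304-3305 describe a ping-pong flow: each incoming frame is preprocessed, cached, and fused with the previously cached frame of the other exposure. A minimal sketch, in which the preprocessing stand-in and the averaging fusion operator are illustrative assumptions rather than the disclosed algorithm:

```python
def preprocess(signal):
    # Stand-in for the first/second preprocessing operation (demosaicing,
    # gain correction, etc. in a real pipeline).
    return [2 * v for v in signal]

def fuse_stream(signals):
    """`signals` alternate first/second preset exposures; one fusion per adjacent pair."""
    buffered = None          # plays the role of the buffer unit
    fused = []
    for signal in signals:
        target = preprocess(signal)
        if buffered is not None:
            # Fuse the current target image with the previously buffered one.
            fused.append([(a + b) / 2 for a, b in zip(buffered, target)])
        buffered = target    # cache the current frame for the next iteration
    return fused

print(fuse_stream([[1, 1], [3, 3], [5, 5]]))  # [[4.0, 4.0], [8.0, 8.0]]
```

Note how every frame both consumes the previous cache and refills it, which is why the buffer and the image processing unit must run synchronously.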
In a possible implementation, the method further includes:
when the synchronization unit determines that the first target image generated by the preprocessing of the image processing unit needs to be buffered, instructing the buffer unit to buffer the first target image; and when the synchronization unit determines, from the buffered images, that a second target image needs to be synchronously output, instructing the buffer unit to synchronously output the second target image to the image processing unit.
In a possible implementation, the operation in which the image processing unit receives the first target image signal currently output by the image sensor, preprocesses the first target image signal to obtain the first target image, synchronously outputs the first target image to the buffer unit for buffering when the first target image needs to be buffered, receives the second target image synchronously output by the buffer unit when the buffer unit needs to synchronously output the buffered second target image, and generates the color fusion image from the first target image and the second target image includes:
preprocessing, by the image preprocessing unit, the first target image signal to generate the first target image, and preprocessing the second target image signal to generate the second target image;
performing, by the image fusion unit, fusion processing on the first target image and the second target image to obtain the color fusion image.
In a possible implementation, the operation in which the image preprocessing unit preprocesses the first target image signal to generate the first target image and preprocesses the second target image signal to generate the second target image includes:
performing, by the first preprocessing unit, a first preprocessing operation on the first target image signal to obtain the preprocessed first target image;
performing, by the second preprocessing unit, a second preprocessing operation on the second target image signal to obtain the second target image;
performing, by the joint noise reduction unit, filtering processing on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, the noise-reduced first target image and second target image being used for fusion processing to obtain the color fusion image.
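One common way to realize such joint filtering, offered here only as an illustrative sketch since the disclosure does not fix the filter, is a cross/joint bilateral filter: each image is smoothed with range weights computed from the other image, so edges visible in either exposure are preserved in both noise-reduced results. All function names and the 1-D formulation are assumptions for demonstration:

```python
import math

def joint_filter(target, guide, radius=1, sigma=1.0):
    """Smooth `target` in 1-D with range weights taken from `guide`
    (a joint/cross bilateral filter)."""
    out = []
    for i in range(len(target)):
        lo, hi = max(0, i - radius), min(len(target), i + radius + 1)
        weights = [math.exp(-((guide[j] - guide[i]) ** 2) / (2 * sigma ** 2))
                   for j in range(lo, hi)]
        total = sum(w * target[j] for w, j in zip(weights, range(lo, hi)))
        out.append(total / sum(weights))
    return out

def joint_denoise(first, second, radius=1, sigma=1.0):
    """Filter each image using the other as its guide."""
    return (joint_filter(first, second, radius, sigma),
            joint_filter(second, first, radius, sigma))

f, s = joint_denoise([0, 10, 0], [5, 5, 5])
print(f)  # guide is flat, so this reduces to a plain box mean
```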
In a possible implementation, the operation in which the synchronization unit instructs the buffer unit to buffer the first target image when determining that the first target image generated by the preprocessing of the image processing unit needs to be buffered, and instructs the buffer unit to synchronously output the second target image to the image processing unit when determining from the buffered images that the second target image needs to be synchronously output, includes:
determining, by the synchronization unit, that every frame of the first target image needs to be buffered and that the second target image, namely the image previously buffered by the buffer unit, needs to be synchronously output;
wherein, if the first target image is an image generated by preprocessing the second image signal, the buffer unit currently buffers the image generated by preprocessing the second image signal, and determines the previously buffered image generated by preprocessing the first image signal as the second target image and outputs it to the image preprocessing unit;
if the first target image is an image generated by preprocessing the first image signal, the buffer unit currently buffers the image generated by preprocessing the first image signal, and determines the previously buffered image generated by preprocessing the second image signal as the second target image and outputs it to the image preprocessing unit.
In a possible implementation, the operation in which the synchronization unit instructs the buffer unit to buffer the first target image when determining that the first target image generated by the preprocessing of the image processing unit needs to be buffered, and instructs the buffer unit to synchronously output the second target image to the image processing unit when determining from the buffered images that the second target image needs to be synchronously output, includes:
determining, by the synchronization unit, that the first target image needs to be buffered when it is an image preprocessed from the first image signal, and that the second target image needs to be synchronously output when the first target image is an image preprocessed from the second image signal, the second target image being the most recently buffered image preprocessed from the first image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the second image signal, the buffer unit determines the most recently buffered image preprocessed from the first image signal as the second target image and outputs it to the image preprocessing unit, and if the first target image is an image preprocessed from the first image signal, the buffer unit buffers the image preprocessed from the first image signal; or,
determining, by the synchronization unit, that the first target image needs to be buffered when it is an image preprocessed from the second image signal, and that the second target image needs to be synchronously output when the first target image is an image preprocessed from the first image signal, the second target image being the most recently buffered image preprocessed from the second image signal among the images buffered by the buffer unit; wherein, if the first target image is an image preprocessed from the first image signal, the buffer unit determines the most recently buffered image preprocessed from the second image signal as the second target image and outputs it to the image preprocessing unit, and if the first target image is an image preprocessed from the second image signal, the buffer unit buffers the image preprocessed from the second image signal.
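The second family of strategies above buffers only images of one kind and fuses when an image of the other kind arrives. A sketch of the variant that caches first-signal images; the frame representation and the averaging fusion are illustrative assumptions:

```python
def run_sync(frames):
    """frames: list of (kind, image) pairs with kind in {'first', 'second'}.

    'first' frames are buffered; each 'second' frame is fused with the most
    recently buffered 'first' frame, mirroring the synchronization unit's rule.
    """
    cached_first = None
    fused = []
    for kind, image in frames:
        if kind == "first":
            cached_first = image            # buffer; no fusion output this frame
        elif cached_first is not None:       # 'second' frame: fuse with the cache
            fused.append([(a + b) / 2 for a, b in zip(cached_first, image)])
    return fused

print(run_sync([("first", [2, 2]), ("second", [4, 4]),
                ("first", [6, 6]), ("second", [8, 8])]))
# [[3.0, 3.0], [7.0, 7.0]]
```

Compared with buffering every frame, this halves the buffer traffic at the cost of producing one fused frame per exposure pair rather than per exposure.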
In a possible implementation, the filter assembly may further include a second filter and a switching component, in which case the second filter may be switched to the light incident side of the image sensor by the switching component. After the second filter is switched to the light incident side of the image sensor, the second filter passes light in the visible-light waveband and blocks light in the near-infrared waveband, and after the second filter passes the visible-light waveband and blocks the near-infrared waveband, exposure is performed by the image sensor to generate and output a third image signal.
In a possible implementation, the light supplement may further include a second light supplement device, in which case visible-light supplement is also performed by the second light supplement device before the first filter included in the filter assembly passes light in the visible-light waveband and part of the near-infrared light.
In a possible implementation, the intensity of the near-infrared light passing through the first filter when the first light supplement device performs near-infrared light supplement is higher than the intensity of the near-infrared light passing through the first filter when the first light supplement device does not perform near-infrared light supplement.
In a possible implementation, the waveband range of the near-infrared light incident on the first filter is a first reference waveband range, and the first reference waveband range is 650 nm to 1100 nm.
In a possible implementation, when the center wavelength of the near-infrared light supplement performed by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the center wavelength and/or the waveband width of the near-infrared light passing through the first filter satisfies a constraint condition.
In a possible implementation, the center wavelength of the near-infrared light supplement performed by the first light supplement device is any wavelength within the wavelength range of 750±10 nm; or
the center wavelength of the near-infrared light supplement performed by the first light supplement device is any wavelength within the wavelength range of 780±10 nm; or
the center wavelength of the near-infrared light supplement performed by the first light supplement device is any wavelength within the wavelength range of 940±10 nm.
In a possible implementation, the constraint condition includes:
the difference between the center wavelength of the near-infrared light passing through the first filter and the center wavelength of the near-infrared light supplement performed by the first light supplement device lies within a wavelength fluctuation range, the wavelength fluctuation range being 0 to 20 nm.
In a possible implementation, the constraint condition includes:
the half bandwidth of the near-infrared light passing through the first filter is less than or equal to 50 nm.
In a possible implementation, the constraint condition includes:
a first waveband width is smaller than a second waveband width, wherein the first waveband width refers to the waveband width of the near-infrared light passing through the first filter, and the second waveband width refers to the waveband width of the near-infrared light blocked by the first filter.
In a possible implementation, the constraint condition is:
a third waveband width is smaller than a reference waveband width, wherein the third waveband width refers to the waveband width of the near-infrared light whose pass rate is greater than a set ratio, and the reference waveband width is any waveband width within the range of 50 nm to 150 nm.
In a possible implementation, the set ratio is any ratio within the range of 30% to 50%.
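The numeric constraint conditions above can be collected into a single check. The limits (20 nm fluctuation, 50 nm half bandwidth, passed width smaller than blocked width) come from the text; the function and parameter names are assumptions for illustration:

```python
def constraints_ok(center_passed_nm, center_fill_nm, half_bandwidth_nm,
                   passed_width_nm, blocked_width_nm):
    """Check a candidate first filter against the stated constraint conditions."""
    checks = [
        abs(center_passed_nm - center_fill_nm) <= 20,  # wavelength fluctuation 0-20 nm
        half_bandwidth_nm <= 50,                       # half bandwidth <= 50 nm
        passed_width_nm < blocked_width_nm,            # first width < second width
    ]
    return all(checks)

print(constraints_ok(945, 940, 40, 100, 300))  # True
print(constraints_ok(980, 940, 40, 100, 300))  # False: center offset is 40 nm
```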
In a possible implementation, the first preset exposure and the second preset exposure differ in at least one exposure parameter, the at least one exposure parameter being one or more of exposure time, exposure gain, and aperture size, and the exposure gain including an analog gain and/or a digital gain.
In a possible implementation, the exposure gain of the first preset exposure is smaller than the exposure gain of the second preset exposure.
In a possible implementation, the first preset exposure and the second preset exposure are the same in at least one exposure parameter, the at least one exposure parameter including one or more of exposure time, exposure gain, and aperture size, and the exposure gain including an analog gain and/or a digital gain.
In a possible implementation, the exposure time of the first preset exposure is equal to the exposure time of the second preset exposure.
In a possible implementation, the image sensor includes a plurality of photosensitive channels, and each photosensitive channel is configured to sense light in at least one visible-light waveband as well as light in the near-infrared waveband.
In a possible implementation, the plurality of photosensitive channels are configured to sense light in at least two different visible-light wavebands.
In a possible implementation, the plurality of photosensitive channels include at least two of an R photosensitive channel, a G photosensitive channel, a B photosensitive channel, a Y photosensitive channel, a W photosensitive channel, and a C photosensitive channel;
wherein the R photosensitive channel is configured to sense light in the red-light waveband and the near-infrared waveband, the G photosensitive channel is configured to sense light in the green-light waveband and the near-infrared waveband, the B photosensitive channel is configured to sense light in the blue-light waveband and the near-infrared waveband, the Y photosensitive channel is configured to sense light in the yellow-light waveband and the near-infrared waveband, the W photosensitive channel is configured to sense light across the full waveband, and the C photosensitive channel is configured to sense light across the full waveband.
In a possible implementation, the image sensor is an RGB sensor, an RGBW sensor, an RCCB sensor, or an RYYB sensor.
In a possible implementation, the second light supplement device is configured to perform visible-light supplement in a constant-on manner; or
the second light supplement device is configured to perform visible-light supplement in a stroboscopic manner, wherein visible-light supplement is present at least during part of the exposure time period of the first preset exposure and is absent during the entire exposure time period of the second preset exposure; or
the second light supplement device is configured to perform visible-light supplement in a stroboscopic manner, wherein visible-light supplement is absent at least during the entire exposure time period of the first preset exposure and is present during part of the exposure time period of the second preset exposure.
In a possible implementation, the number of light supplements performed by the first light supplement device per unit time is lower than the number of exposures performed by the image sensor per unit time, wherein one or more exposures occur within the interval between every two adjacent light supplements.
In a possible implementation, the image sensor performs the multiple exposures in a global exposure manner. For any near-infrared light supplement, the time period of the near-infrared light supplement has no intersection with the exposure time period of the nearest second preset exposure, and the time period of the near-infrared light supplement is a subset of the exposure time period of the first preset exposure, or the time period of the near-infrared light supplement intersects the exposure time period of the first preset exposure, or the exposure time period of the first preset exposure is a subset of the time period of the near-infrared light supplement.
In a possible implementation, the image sensor performs the multiple exposures in a rolling-shutter exposure manner, and for any near-infrared light supplement, the time period of the near-infrared light supplement has no intersection with the exposure time period of the nearest second preset exposure;
the start time of the near-infrared light supplement is no earlier than the exposure start time of the last effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no later than the exposure end time of the first effective-image row in the first preset exposure;
or,
the start time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row of the nearest second preset exposure before the first preset exposure and no later than the exposure end time of the first effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no earlier than the exposure start time of the last effective-image row in the first preset exposure and no later than the exposure start time of the first effective-image row of the nearest second preset exposure after the first preset exposure; or
the start time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row of the nearest second preset exposure before the first preset exposure and no later than the exposure start time of the first effective-image row in the first preset exposure, and the end time of the near-infrared light supplement is no earlier than the exposure end time of the last effective-image row in the first preset exposure and no later than the exposure start time of the first effective-image row of the nearest second preset exposure after the first preset exposure.
In a possible implementation, the multiple exposures include odd-numbered exposures and even-numbered exposures;
the first preset exposure is one of the odd-numbered exposures, and the second preset exposure is one of the even-numbered exposures; or
the first preset exposure is one of the even-numbered exposures, and the second preset exposure is one of the odd-numbered exposures; or
the first preset exposure is one of specified odd-numbered exposures, and the second preset exposure is one of the exposures other than the specified odd-numbered exposures; or
the first preset exposure is one of specified even-numbered exposures, and the second preset exposure is one of the exposures other than the specified even-numbered exposures; or,
the first preset exposure is one exposure in a first exposure sequence, and the second preset exposure is one exposure in a second exposure sequence; or
the first preset exposure is one exposure in the second exposure sequence, and the second preset exposure is one exposure in the first exposure sequence;
wherein the multiple exposures include a plurality of exposure sequences, the first exposure sequence and the second exposure sequence are one exposure sequence or two exposure sequences among the plurality of exposure sequences, each exposure sequence includes N exposures, the N exposures include one first preset exposure and N-1 second preset exposures, or the N exposures include one second preset exposure and N-1 first preset exposures, and N is a positive integer greater than 2.
It should be noted that, since this embodiment and the embodiments shown in Figs. 1-34 above can adopt the same inventive concept, for an explanation of the content of this embodiment, reference may be made to the explanation of the relevant content in the embodiments shown in Figs. 1-34, which is not repeated here.
Other embodiments of the present disclosure will readily occur to those skilled in the art upon consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the technical field not disclosed by the present disclosure. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (26)

  1. An image fusion device, characterized by comprising:
    a lens, a filter assembly, a single image sensor, a light supplementer, and a processor, wherein the image sensor is located on a light exit side of the filter assembly;
    the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
    the light supplementer includes a first light supplement device configured to perform near-infrared fill light, wherein near-infrared fill light is present during at least part of the exposure time period of the first preset exposure, and no near-infrared fill light is present during the exposure time period of the second preset exposure;
    the filter assembly includes a first filter configured to pass light in the visible band and part of the near-infrared light;
    the processor includes a buffer unit and an image processing unit;
    the buffer unit is configured to buffer a first target image signal currently output by the image sensor upon learning that the first target image signal needs to be buffered, and, upon learning that a buffered second target image signal needs to be output synchronously, to synchronously output at least the buffered second target image signal to the image processing unit; wherein, if the first target image signal is the first image signal, the second target image signal is a buffered frame of the second image signal, or, if the first target image signal is the second image signal, the second target image signal is a buffered frame of the first image signal;
    the image processing unit is configured to receive at least the first target image signal currently output by the image sensor and at least the second target image signal synchronously output by the buffer unit, and to generate a color fusion image according to the first target image signal and the second target image signal.
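The buffering-and-pairing behavior of claim 1 can be modeled by a small sketch: each incoming target image signal is buffered, and the most recently buffered signal of the other type is released so that the image processing unit always receives a (first, second) pair. The class name, method names, and `'first'`/`'second'` labels are illustrative assumptions, not terminology from the application.

```python
class BufferUnit:
    """Toy model of the buffer unit in claim 1. Frames of the near-IR
    assisted exposure ('first') and the unassisted exposure ('second')
    arrive alternately; each is buffered, and is paired with the most
    recently buffered frame of the other type for fusion."""

    def __init__(self):
        self.last = {"first": None, "second": None}

    def push(self, kind, frame):
        other = "second" if kind == "first" else "first"
        paired = self.last[other]   # most recently buffered frame of the other type
        self.last[kind] = frame     # buffer the current frame
        if paired is None:
            return None             # nothing to pair with yet
        # Always return in (first, second) order for the fusion step.
        return (frame, paired) if kind == "first" else (paired, frame)

buf = BufferUnit()
buf.push("first", "F0")          # buffered; no pair available yet
pair = buf.push("second", "S1")  # pairs the new frame with buffered "F0"
```

This mirrors the claim's symmetry: whichever signal type arrives, the complementary buffered frame is output synchronously alongside it.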
  2. The device according to claim 1, wherein the processor further includes a synchronization unit;
    the synchronization unit is configured to, when determining that the first target image signal currently output by the image sensor needs to be buffered, instruct the buffer unit to buffer the first target image signal, and, when determining from the buffered image signals that a second target image signal needs to be output synchronously, instruct the buffer unit to synchronously output the second target image signal to the image processing unit.
  3. The device according to claim 2, wherein:
    the synchronization unit is configured to determine that each frame of the first target image signal needs to be buffered and that the second target image signal needs to be output synchronously, the second target image signal being the image signal previously buffered by the buffer unit;
    wherein, if the first target image signal is the second image signal, the buffer unit currently buffers the second image signal and determines the previously buffered first image signal as the second target image signal to be output to the image preprocessing unit;
    if the first target image signal is the first image signal, the buffer unit currently buffers the first image signal and determines the previously buffered second image signal as the second target image signal to be output to the image preprocessing unit.
  4. The device according to claim 2, wherein:
    the synchronization unit is configured to determine that the first target image signal needs to be buffered when the first target image signal is the first image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the second image signal, the second target image signal being the most recently buffered first image signal among the image signals buffered by the buffer unit; wherein, if the first target image signal is the second image signal, the buffer unit determines the most recently buffered first image signal as the second target image signal to be output to the image preprocessing unit; and if the first target image signal is the first image signal, the buffer unit buffers the first image signal; or,
    the synchronization unit is configured to determine that the first target image signal needs to be buffered when the first target image signal is the second image signal, and that the second target image signal needs to be output synchronously when the first target image signal is the first image signal, the second target image signal being the most recently buffered second image signal among the second image signals buffered by the buffer unit; wherein, if the first target image signal is the first image signal, the buffer unit determines the most recently buffered second image signal as the second target image signal to be output to the image preprocessing unit; and if the first target image signal is the second image signal, the buffer unit buffers the second image signal.
  5. The device according to claim 2, wherein:
    the synchronization unit is configured to determine that each frame of the first target image signal needs to be buffered, and that the most recently buffered second target image signal and the most recently buffered first target image signal need to be output synchronously;
    wherein, if the first target image signal is the second image signal, the buffer unit currently buffers the second image signal and outputs the most recently buffered first image signal and the most recently buffered second image signal;
    if the first target image signal is the first image signal, the buffer unit currently buffers the first image signal and outputs the most recently buffered first image signal and the most recently buffered second image signal.
  6. The device according to any one of claims 1-5, wherein the image processing unit includes an image preprocessing unit and an image fusion unit;
    the image preprocessing unit is configured to preprocess the first target image signal to generate a first target image, and to preprocess the second target image signal to generate a second target image;
    the image fusion unit is configured to perform fusion processing on the first target image and the second target image to obtain the color fusion image.
  7. The device according to claim 6, wherein:
    the image preprocessing unit includes a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit;
    the first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain a preprocessed first target image;
    the second preprocessing unit is configured to perform a second preprocessing operation on the second target image signal to obtain a second target image;
    the joint noise reduction unit is configured to perform filtering processing on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, the noise-reduced first target image and second target image being used for fusion processing to obtain the color fusion image.
  8. The device according to claim 6, wherein the image fusion unit includes a color extraction unit, a brightness extraction unit, and a fusion processing unit connected to the color extraction unit and the brightness extraction unit respectively;
    wherein the color extraction unit is configured to extract a color signal of the image obtained by preprocessing the second image signal;
    the brightness extraction unit is configured to extract a brightness signal of the image obtained by preprocessing the second image signal;
    the fusion processing unit is configured to perform fusion processing on the image obtained by preprocessing the first image signal and on the color signal and brightness signal of the image obtained by preprocessing the second image signal, to obtain the color fusion image.
  9. The device according to claim 8, wherein the fusion processing unit is specifically configured to:
    perform weighted fusion processing on the brightness information of the image obtained by preprocessing the second image signal and the image obtained by preprocessing the first image signal, to obtain a fused brightness image;
    perform fusion processing on the fused brightness image and the color signal of the image obtained by preprocessing the second image signal, to obtain the color fusion image.
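The two-step fusion of claims 8-9 (weighted brightness fusion, then recombination with the color signal) can be sketched as follows. The fixed weight `w`, the function name `fuse`, and the YUV-style luma/chroma split are illustrative assumptions; the application does not prescribe a particular weighting scheme or color space.

```python
import numpy as np

def fuse(nir_luma, vis_luma, vis_chroma, w=0.5):
    """Sketch of claims 8-9: the near-infrared image (from the first image
    signal, brightness only) is weight-fused with the brightness of the
    visible-light image (from the second image signal), and the result is
    recombined with the visible-light color signal. `w` is a hypothetical
    fixed fusion weight."""
    fused_luma = w * nir_luma + (1.0 - w) * vis_luma  # fused brightness image
    # Stack fused luma with the two visible-light chroma planes -> H x W x 3.
    return np.dstack([fused_luma, vis_chroma])

h, wd = 4, 4
nir = np.full((h, wd), 200.0)     # bright, low-noise near-IR luminance
vis_y = np.full((h, wd), 100.0)   # dim visible-light luminance
vis_uv = np.zeros((h, wd, 2))     # visible-light chroma planes
out = fuse(nir, vis_y, vis_uv)
```

The design intent reflected here is that brightness detail comes from the fill-light-assisted near-IR frame while color comes entirely from the visible-light frame.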
  10. The device according to claim 7, wherein the joint noise reduction unit is specifically configured to:
    perform joint filtering processing on the first target image and the second target image respectively according to the correlation between the first target image and the second target image, to obtain the noise-reduced first target image and second target image.
  11. The device according to claim 7, wherein the joint noise reduction unit includes a temporal noise reduction unit or a spatial noise reduction unit;
    the temporal noise reduction unit is configured to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain the noise-reduced first target image, and perform temporal filtering on the second target image according to the motion estimation result to obtain the noise-reduced second target image;
    the spatial noise reduction unit is configured to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain the noise-reduced first target image, and perform spatial filtering on the second target image according to the edge estimation result to obtain the noise-reduced second target image.
  12. The device according to claim 7, wherein the joint noise reduction unit includes a temporal noise reduction unit and a spatial noise reduction unit;
    the temporal noise reduction unit is configured to perform motion estimation according to the first target image and the second target image to obtain a motion estimation result, perform temporal filtering on the first target image according to the motion estimation result to obtain a first temporally noise-reduced image, and perform temporal filtering on the second target image according to the motion estimation result to obtain a second temporally noise-reduced image;
    the spatial noise reduction unit is configured to perform edge estimation according to the first temporally noise-reduced image and the second temporally noise-reduced image to obtain an edge estimation result, perform spatial filtering on the first temporally noise-reduced image according to the edge estimation result to obtain the noise-reduced first target image, and perform spatial filtering on the second temporally noise-reduced image according to the edge estimation result to obtain the noise-reduced second target image;
    or,
    the spatial noise reduction unit is configured to perform edge estimation according to the first target image and the second target image to obtain an edge estimation result, perform spatial filtering on the first target image according to the edge estimation result to obtain a first spatially noise-reduced image, and perform spatial filtering on the second target image according to the edge estimation result to obtain a second spatially noise-reduced image;
    the temporal noise reduction unit is configured to perform motion estimation according to the first spatially noise-reduced image and the second spatially noise-reduced image to obtain a motion estimation result, perform temporal filtering on the first spatially noise-reduced image according to the motion estimation result to obtain the noise-reduced first target image, and perform temporal filtering on the second spatially noise-reduced image according to the motion estimation result to obtain the noise-reduced second target image.
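The first branch of claim 12 (temporal filtering driven by a shared motion estimate, followed by spatial filtering driven by a shared edge estimate) can be sketched as a pipeline. The difference-based motion and edge estimators below are deliberately naive placeholders; the application does not specify particular estimation or filtering algorithms, and all function names are illustrative.

```python
import numpy as np

def temporal_filter(cur, prev, motion):
    # motion in [0,1]: 0 = static pixel -> average with the previous frame,
    # 1 = moving pixel -> keep the current frame unchanged.
    return motion * cur + (1.0 - motion) * 0.5 * (cur + prev)

def box_blur(img):
    # 3x3 box blur with edge-replicated padding.
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    return sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def spatial_filter(img, edge):
    # edge in [0,1]: 1 = edge pixel -> preserve, 0 = flat region -> blur.
    return edge * img + (1.0 - edge) * box_blur(img)

def joint_denoise(first_img, second_img, prev_first, prev_second):
    """Claim 12, first branch: one motion estimate drives temporal filtering
    of BOTH target images, then one edge estimate (taken from the temporally
    filtered result) drives spatial filtering of both."""
    motion = np.clip(np.abs(first_img - prev_first) / 255.0, 0.0, 1.0)
    t1 = temporal_filter(first_img, prev_first, motion)
    t2 = temporal_filter(second_img, prev_second, motion)
    edge = np.clip(np.abs(t1 - box_blur(t1)) / 32.0, 0.0, 1.0)
    return spatial_filter(t1, edge), spatial_filter(t2, edge)
```

The key property illustrated is the "joint" aspect: motion and edge estimates are computed once and applied to both the near-IR and visible-light images, keeping the two noise-reduced outputs spatially consistent for the later fusion step.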
  13. A video image fusion device, characterized by comprising:
    a lens, a filter assembly, a single image sensor, a light supplementer, and a processor, wherein the image sensor is located on a light exit side of the filter assembly;
    the image sensor is configured to generate and output a first image signal and a second image signal through multiple exposures, wherein the first image signal is an image signal generated according to a first preset exposure, the second image signal is an image signal generated according to a second preset exposure, and the first preset exposure and the second preset exposure are two of the multiple exposures;
    the light supplementer includes a first light supplement device configured to perform near-infrared fill light, wherein near-infrared fill light is present during at least part of the exposure time period of the first preset exposure, and no near-infrared fill light is present during the exposure time period of the second preset exposure;
    the filter assembly includes a first filter configured to pass light in the visible band and part of the near-infrared light;
    the processor includes a buffer unit and an image processing unit;
    the image processing unit is configured to receive a first target image signal currently output by the image sensor and preprocess the first target image signal to obtain a first target image; when the first target image needs to be buffered, to synchronously output at least the first target image to the buffer unit for buffering; and, when the buffer unit needs to synchronously output a second target image that the buffer unit has buffered, to receive the second target image synchronously output by the buffer unit and generate a color fusion image according to the first target image and the second target image; wherein, if the first target image signal is the first image signal, the first target image is an image generated by preprocessing the first image signal, the second target image is a buffered frame of an image generated by preprocessing a second target image signal, and the second target image signal is the second image signal; if the first target image signal is the second image signal, the first target image is an image generated by preprocessing the second image signal, the second target image is a buffered frame of an image generated by preprocessing the second target image signal, and the second target image signal is the first image signal;
    the buffer unit is configured to, upon learning that the first target image needs to be buffered, buffer at least the first target image synchronously output by the image processing unit, and, upon learning that the buffered second target image needs to be output synchronously, synchronously output at least the buffered second target image to the image processing unit.
  14. The device according to claim 13, wherein the processor further includes a synchronization unit; the synchronization unit is configured to, when determining that the first target image generated by preprocessing in the image processing unit needs to be buffered, instruct the buffer unit to buffer the first target image, and, when determining from the buffered images that a second target image needs to be output synchronously, instruct the buffer unit to synchronously output the second target image to the image processing unit.
  15. The device according to claim 14, wherein the synchronization unit is configured to determine that each frame of the first target image needs to be buffered and that the second target image needs to be output synchronously, the second target image being the image previously buffered by the buffer unit;
    wherein, if the first target image is an image generated by preprocessing the second image signal, the buffer unit currently buffers the image generated by preprocessing the second image signal and determines the previously buffered image generated by preprocessing the first image signal as the second target image to be output to the image preprocessing unit;
    if the first target image is an image generated by preprocessing the first image signal, the buffer unit currently buffers the image generated by preprocessing the first image signal and determines the previously buffered image generated by preprocessing the second image signal as the second target image to be output to the image preprocessing unit.
  16. The device according to claim 14, wherein:
    the synchronization unit is configured to determine that the first target image needs to be buffered when the first target image is an image obtained by preprocessing the first image signal, and that the second target image needs to be output synchronously when the first target image is an image obtained by preprocessing the second image signal, the second target image being the most recently buffered image obtained by preprocessing the first image signal among the images buffered by the buffer unit; wherein, if the first target image is an image obtained by preprocessing the second image signal, the buffer unit determines the most recently buffered image obtained by preprocessing the first image signal as the second target image to be output to the image preprocessing unit; and if the first target image is an image obtained by preprocessing the first image signal, the buffer unit buffers the image obtained by preprocessing the first image signal; or,
    the synchronization unit is configured to determine that the first target image needs to be buffered when the first target image is an image obtained by preprocessing the second image signal, and that the second target image needs to be output synchronously when the first target image is an image obtained by preprocessing the first image signal, the second target image being the most recently buffered image obtained by preprocessing the second image signal among the images buffered by the buffer unit; wherein, if the first target image is an image obtained by preprocessing the first image signal, the buffer unit determines the most recently buffered image obtained by preprocessing the second image signal as the second target image to be output to the image preprocessing unit; and if the first target image is an image obtained by preprocessing the second image signal, the buffer unit buffers the image obtained by preprocessing the second image signal.
  17. The device according to claim 14, wherein:
    the synchronization unit is configured to determine that each frame of the first target image needs to be buffered, and that the most recently buffered second target image and the most recently buffered first target image need to be output synchronously;
    wherein, if the first target image is an image generated by preprocessing the second image signal, the buffer unit currently buffers the image generated by preprocessing the second image signal, and outputs the most recently buffered image generated by preprocessing the first image signal and the most recently buffered image generated by preprocessing the second image signal;
    if the first target image is an image generated by preprocessing the first image signal, the buffer unit currently buffers the image generated by preprocessing the first image signal, and outputs the most recently buffered image generated by preprocessing the second image signal and the most recently buffered image generated by preprocessing the first image signal.
  18. The device according to any one of claims 13-17, wherein the image processing unit includes an image preprocessing unit and an image fusion unit;
    the image preprocessing unit is configured to preprocess the first target image signal to generate a first target image, and to preprocess the second target image signal to generate a second target image;
    the image fusion unit is configured to perform fusion processing on the first target image and the second target image to obtain the color fusion image.
  19. The device according to claim 18, wherein the image preprocessing unit includes a first preprocessing unit, a second preprocessing unit, and a joint noise reduction unit;
    the first preprocessing unit is configured to perform a first preprocessing operation on the first target image signal to obtain a preprocessed first target image;
    the second preprocessing unit is configured to perform a second preprocessing operation on the second target image signal to obtain a second target image;
    the joint noise reduction unit is configured to perform filtering processing on the first target image and the second target image to obtain a noise-reduced first target image and a noise-reduced second target image, the noise-reduced first target image and second target image being used for fusion processing to obtain the color fusion image.
  20. The device according to any one of claims 1-19, wherein, when the center wavelength of the near-infrared fill light performed by the first light supplement device is a set characteristic wavelength or falls within a set characteristic wavelength range, the center wavelength and/or the band width of the near-infrared light passing through the first filter reach a constraint condition.
  21. The device according to any one of claims 1-19, wherein the image sensor includes multiple photosensitive channels, each photosensitive channel being configured to sense light in at least one visible band and to sense light in the near-infrared band.
  22. The device according to claim 21, wherein the multiple photosensitive channels are configured to sense light in at least two different visible bands.
  23. The device according to any one of claims 1-19, wherein the first preset exposure and the second preset exposure differ in at least one exposure parameter, the at least one exposure parameter being one or more of exposure time, exposure gain, and aperture size, the exposure gain including an analog gain and/or a digital gain.
  24. The device according to any one of claims 1-19, wherein at least one exposure parameter of the first preset exposure and the second preset exposure is the same, the at least one exposure parameter including one or more of exposure time, exposure gain, and aperture size, the exposure gain including an analog gain and/or a digital gain.
  25. An image fusion method, applied to an image fusion device, wherein the image fusion device comprises an image sensor, a fill-light unit, a filter assembly, and a processor, the image sensor is located on the light-exit side of the filter assembly, the fill-light unit comprises a first fill-light device, the filter assembly comprises a first filter, and the processor comprises a buffer unit and an image processing unit, the method comprising:
    performing near-infrared fill light via the first fill-light device, wherein the near-infrared fill light is performed at least during part of the exposure period of a first preset exposure and is not performed during the exposure period of a second preset exposure, the first preset exposure and the second preset exposure being two of multiple exposures of the image sensor;
    passing light in the visible band and part of the near-infrared light through the first filter;
    performing, by the image sensor, multiple exposures in a global exposure mode to generate and output a first image signal and a second image signal, the first image signal being an image signal generated from the first preset exposure, and the second image signal being an image signal generated from the second preset exposure;
    buffering, by the buffer unit, a first target image signal currently output by the image sensor upon learning that the first target image signal needs to be buffered, and synchronously outputting a buffered second target image signal to the image processing unit upon learning that the buffered second target image signal needs to be output synchronously; wherein if the first target image signal is a first image signal, the second target image signal is a buffered frame of the second image signal, or if the first target image signal is a second image signal, the second target image signal is a buffered frame of the first image signal;
    receiving, by the image processing unit, the first target image signal currently output by the image sensor and the second target image signal synchronously output by the buffer unit, and generating a color fusion image from the first target image signal and the second target image signal.
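The pairing logic of claim 25 — buffer the current frame, release the complementary buffered frame in sync with it, and hand both to the fusion step — can be sketched as follows. This is an assumed Python model, not the patent's implementation; `fuse`, `BufferUnit`, and the `"first"`/`"second"` labels are illustrative:

```python
def fuse(first_signal, second_signal):
    """Stand-in for the image processing unit's color-fusion step."""
    return ("color_fusion", first_signal, second_signal)

class BufferUnit:
    """Holds the latest frame of each kind so the image processing unit
    always receives a synchronized (first, second) pair."""

    def __init__(self):
        self._cached = {"first": None, "second": None}

    def process(self, kind, frame):
        """Buffer the frame currently output by the sensor; if the
        complementary frame is already cached, output both synchronously."""
        other = "second" if kind == "first" else "first"
        self._cached[kind] = frame
        if self._cached[other] is None:
            return None  # nothing to pair with yet
        # Order the pair as (first image signal, second image signal).
        if kind == "first":
            return fuse(frame, self._cached[other])
        return fuse(self._cached[other], frame)

buf = BufferUnit()
assert buf.process("first", "F0") is None   # first frame is only buffered
print(buf.process("second", "S0"))          # ('color_fusion', 'F0', 'S0')
```

Note how a frame of either kind can arrive first: whichever signal the sensor currently outputs plays the role of the "first target image signal", and the cached complementary frame is the "second target image signal".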
  26. An image fusion method, applied to an image fusion device, wherein the image fusion device comprises an image sensor, a fill-light unit, a filter assembly, and a processor, the image sensor is located on the light-exit side of the filter assembly, the fill-light unit comprises a first fill-light device, the filter assembly comprises a first filter, and the processor comprises a buffer unit and an image processing unit, the method comprising:
    performing near-infrared fill light via the first fill-light device, wherein the near-infrared fill light is performed at least during part of the exposure period of a first preset exposure and is not performed during the exposure period of a second preset exposure, the first preset exposure and the second preset exposure being two of multiple exposures of the image sensor;
    passing light in the visible band and part of the near-infrared light through the first filter;
    performing, by the image sensor, multiple exposures in a global exposure mode to generate and output a first image signal and a second image signal, the first image signal being an image signal generated from the first preset exposure, and the second image signal being an image signal generated from the second preset exposure;
    receiving, by the image processing unit, a first target image signal currently output by the image sensor, and preprocessing the first target image signal to obtain a first target image; when the first target image needs to be buffered, synchronously outputting at least the first target image to the buffer unit for buffering; and when the buffer unit needs to synchronously output a second target image it has buffered, receiving the second target image synchronously output by the buffer unit, and generating a color fusion image from the first target image and the second target image; wherein if the first target image signal is a first image signal, the first target image is an image generated by preprocessing the first image signal, the second target image is a buffered frame of an image generated by preprocessing a second target image signal, and the second target image signal is the second image signal; if the first target image signal is a second image signal, the first target image is an image generated by preprocessing the second image signal, the second target image is a buffered frame of an image generated by preprocessing the second target image signal, and the second target image signal is the first image signal;
    buffering, by the buffer unit, at least the first target image synchronously output by the image processing unit upon learning that the first target image needs to be buffered, and synchronously outputting at least the buffered second target image to the image processing unit upon learning that the buffered second target image needs to be output synchronously.
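Claim 26 differs from claim 25 in where the buffer sits: each raw image signal is preprocessed into a target image *before* buffering, so the buffer unit caches images rather than raw signals. A sketch under the same assumptions as before (all names illustrative, `preprocess` a stand-in for whatever demosaicing/denoising the image processing unit applies):

```python
def preprocess(signal):
    """Stand-in for preprocessing a raw image signal into a target image."""
    return f"img({signal})"

class ImagePipeline:
    """Claim-26 ordering: preprocess first, then buffer the resulting image."""

    def __init__(self):
        self._cache = {}  # buffer unit: kind -> preprocessed target image

    def on_signal(self, kind, raw_signal):
        image = preprocess(raw_signal)        # first target image
        self._cache[kind] = image             # buffer unit caches the image
        other = "second" if kind == "first" else "first"
        if other not in self._cache:
            return None                       # no complementary image yet
        second_image = self._cache[other]     # buffered second target image
        return ("color_fusion", image, second_image)

p = ImagePipeline()
assert p.on_signal("first", "S1") is None
print(p.on_signal("second", "S2"))  # ('color_fusion', 'img(S2)', 'img(S1)')
```

Buffering after preprocessing means the cached frame is already in the fusion-ready image domain, so the fusion step never touches raw sensor signals; that is the structural distinction between the two method claims.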
PCT/CN2020/092364 2019-05-31 2020-05-26 Image fusion device and method WO2020238905A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910473517.5 2019-05-31
CN201910473517.5A CN110505377B (en) 2019-05-31 2019-05-31 Image fusion apparatus and method

Publications (1)

Publication Number Publication Date
WO2020238905A1 true WO2020238905A1 (en) 2020-12-03

Family

ID=68585820

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/092364 WO2020238905A1 (en) 2019-05-31 2020-05-26 Image fusion device and method

Country Status (2)

Country Link
CN (1) CN110505377B (en)
WO (1) WO2020238905A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113538255A (en) * 2021-05-31 2021-10-22 浙江大华技术股份有限公司 Motion fusion noise reduction method and device and computer readable storage medium

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
CN110505377B (en) * 2019-05-31 2021-06-01 杭州海康威视数字技术股份有限公司 Image fusion apparatus and method
CN110490811B (en) * 2019-05-31 2022-09-09 杭州海康威视数字技术股份有限公司 Image noise reduction device and image noise reduction method
CN110493492B (en) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition device and image acquisition method
CN110493491B (en) * 2019-05-31 2021-02-26 杭州海康威视数字技术股份有限公司 Image acquisition device and camera shooting method
US11146727B2 (en) * 2020-03-16 2021-10-12 Ke.Com (Beijing) Technology Co., Ltd. Method and device for generating a panoramic image
CN114697584B (en) * 2020-12-31 2023-12-26 杭州海康威视数字技术股份有限公司 Image processing system and image processing method
CN113132080A (en) * 2021-04-19 2021-07-16 青岛冠成软件有限公司 Image processing method and device, electronic equipment and storage medium
CN113538926B (en) * 2021-05-31 2023-01-17 浙江大华技术股份有限公司 Face snapshot method, face snapshot system and computer-readable storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
US20020140822A1 (en) * 2001-03-28 2002-10-03 Kahn Richard Oliver Camera with visible and infra-red imaging
CN102901703A (en) * 2012-10-10 2013-01-30 彩虹集团公司 Three-dimensional (3D) image displaying method for security inspection equipment
CN107566747A (en) * 2017-09-22 2018-01-09 浙江大华技术股份有限公司 A kind of brightness of image Enhancement Method and device
CN108259880A (en) * 2018-03-22 2018-07-06 人加智能机器人技术(北京)有限公司 Multidirectional binocular vision cognitive method, apparatus and system
CN110505377A (en) * 2019-05-31 2019-11-26 杭州海康威视数字技术股份有限公司 Image co-registration device and method

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN101281342A (en) * 2006-05-08 2008-10-08 北京体运国际经济会议服务中心 Photography video camera
US20130162835A1 (en) * 2011-12-23 2013-06-27 Fluke Corporation Thermal imaging camera for infrared rephotography
JP2014216734A (en) * 2013-04-24 2014-11-17 日立マクセル株式会社 Imaging apparatus and imaging system
JP6319449B2 (en) * 2014-09-18 2018-05-09 株式会社島津製作所 Imaging device
JP2016096430A (en) * 2014-11-13 2016-05-26 パナソニックIpマネジメント株式会社 Imaging device and imaging method
CN106778518B (en) * 2016-11-24 2021-01-08 汉王科技股份有限公司 Face living body detection method and device
CN108289164B (en) * 2017-01-10 2020-07-03 杭州海康威视数字技术股份有限公司 Mode switching method and device of camera with infrared light supplement lamp
CN208190776U (en) * 2018-05-02 2018-12-04 杭州海康威视数字技术股份有限公司 A kind of Dynamic IR light-supplementing system and the video camera with it


Also Published As

Publication number Publication date
CN110505377B (en) 2021-06-01
CN110505377A (en) 2019-11-26

Similar Documents

Publication Publication Date Title
WO2020238905A1 (en) Image fusion device and method
WO2020238807A1 (en) Image fusion device and image fusion method
WO2020238970A1 (en) Image denoising device and image denoising method
WO2020238903A1 (en) Device and method for acquiring face images
CN107959778B (en) Imaging method and device based on dual camera
CN110519489B (en) Image acquisition method and device
JP6471953B2 (en) Imaging apparatus, imaging system, and imaging method
WO2020238806A1 (en) Image collection apparatus and photography method
US10634830B2 (en) Imaging device, image processing method and program for imaging device
WO2020238805A1 (en) Facial recognition apparatus and door access control device
CN110490187B (en) License plate recognition device and method
CN110706178B (en) Image fusion device, method, equipment and storage medium
CN110490044B (en) Face modeling device and face modeling method
CN110493535B (en) Image acquisition device and image acquisition method
JP5071198B2 (en) Signal recognition device, signal recognition method, and signal recognition program
US20180309940A1 (en) Image processing apparatus, image processing method, and imaging system
CN110493536B (en) Image acquisition device and image acquisition method
CN108712608A (en) Terminal device image pickup method and device
JPWO2006098358A1 (en) Image processing apparatus and method, program, and recording medium
CN110493495B (en) Image acquisition device and image acquisition method
CN110493493B (en) Panoramic detail camera and method for acquiring image signal
WO2020238804A1 (en) Image acquisition apparatus and image acquisition method
CN114338958A (en) Image processing method and related equipment
CN113126252A (en) Low-light-level imaging system
JP5222779B2 (en) License plate recognition device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20813676

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20813676

Country of ref document: EP

Kind code of ref document: A1
