WO2022073364A1 - Image acquisition method and apparatus, terminal, and computer-readable storage medium - Google Patents


Info

Publication number
WO2022073364A1
Authority
WO
WIPO (PCT)
Prior art keywords
filter
image
sub-filters
output mode
Prior art date
Application number
PCT/CN2021/105464
Other languages
English (en)
French (fr)
Inventor
唐城 (Tang Cheng)
李龙佳 (Li Longjia)
张弓 (Zhang Gong)
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202022245405.5U (publication CN213279832U)
Priority claimed from CN202011073863.3A (publication CN112118378A)
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. (Oppo广东移动通信有限公司)
Priority to EP21876886.9A (publication EP4216534A4)
Publication of WO2022073364A1
Priority to US18/193,134 (publication US20230254553A1)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/46Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by combining or binning pixels
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/133Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements including elements passing panchromatic light, e.g. filters passing white light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/135Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on four or more different wavelength filter elements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/12Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with one sensor only

Definitions

  • the present application relates to the technical field of image processing, and in particular, to an image acquisition method, an image acquisition device, a terminal, and a non-volatile computer-readable storage medium.
  • In the related art, the image output mode is generally fixed, so adaptability to different scenes is poor; image quality in different scenes can only be partially compensated for by adjusting the exposure parameters.
  • Embodiments of the present application provide an image acquisition method, an image acquisition apparatus, a terminal, and a non-volatile computer-readable storage medium.
  • The image acquisition method of the embodiment of the present application is applied to an image sensor. The image sensor includes a filter array and a pixel array; the filter array includes a minimum repeating unit, and the minimum repeating unit includes a plurality of filter groups.
  • Each filter group includes a color filter and a panchromatic filter, and the width of the wavelength band of the light transmitted by the color filter is smaller than the width of the wavelength band of the light transmitted by the panchromatic filter.
  • Both the color filter and the panchromatic filter include a plurality of sub-filters.
  • The pixel array includes a plurality of pixels; each pixel corresponds to one of the sub-filters of the filter array, and the pixels are configured to receive light passing through the corresponding sub-filters to generate electrical signals.
  • The image acquisition method includes outputting an image through at least one of multiple image output modes. The multiple image output modes include: a full-resolution output mode, in which a first image is obtained from a first pixel value read out from each pixel; a first combined output mode, in which a second image is obtained from a second pixel value read out by combining the pixels corresponding to the panchromatic filter and a third pixel value read out by combining the pixels corresponding to the color filter; and a second combined output mode, in which a third image is obtained from a fourth pixel value read out by combining the pixels corresponding to all the panchromatic filters in the filter group and a fifth pixel value read out by combining the pixels corresponding to all the color filters in the filter group.
  • the image acquisition apparatus of the embodiment of the present application includes an output module.
  • The output module is configured to output an image through at least one of multiple image output modes. The multiple image output modes include: a full-resolution output mode, in which the first image is obtained from the first pixel value read out from each pixel; a first combined output mode, in which the second image is obtained from the second pixel value read out by combining the pixels corresponding to the panchromatic filter and the third pixel value read out by combining the pixels corresponding to the color filter; and a second combined output mode, in which the third image is obtained from the fourth pixel value read out by combining the pixels corresponding to all the panchromatic filters in the filter group and the fifth pixel value read out by combining the pixels corresponding to all the color filters in the filter group.
  • the terminal includes an image sensor and a processor
  • the image sensor includes a filter array and a pixel array
  • the filter array includes a minimum repeating unit, and the minimum repeating unit includes a plurality of filter groups
  • the filter set includes a color filter and a panchromatic filter, and the width of the wavelength band of the light transmitted by the color filter is smaller than the width of the wavelength band of the light transmitted by the panchromatic filter, Both the color filter and the panchromatic filter include a plurality of sub-filters;
  • the pixel array includes a plurality of pixels, and each pixel corresponds to one of the sub-filters of the filter array
  • the pixel is configured to receive the light passing through the corresponding sub-filter to generate an electrical signal;
  • The processor is configured to output an image through at least one of multiple image output modes. The multiple image output modes include: a full-resolution output mode, in which a first image is obtained from a first pixel value read out from each pixel; a first combined output mode, in which a second image is obtained from a second pixel value read out by combining the pixels corresponding to the panchromatic filter and a third pixel value read out by combining the pixels corresponding to the color filter; and a second combined output mode, in which a third image is obtained from a fourth pixel value read out by combining the pixels corresponding to all the panchromatic filters in the filter group and a fifth pixel value read out by combining the pixels corresponding to all the color filters in the filter group.
  • The one or more non-volatile computer-readable storage media of the embodiments of the present application include a computer program that, when executed by one or more processors, causes the processors to execute the image acquisition method.
  • The image acquisition method is applied to an image sensor; the image sensor includes a filter array and a pixel array; the filter array includes a minimum repeating unit, and the minimum repeating unit includes a plurality of filter groups.
  • Each filter group includes a color filter and a panchromatic filter.
  • The width of the wavelength band of the light transmitted by the color filter is smaller than the width of the wavelength band of the light transmitted by the panchromatic filter.
  • Both the color filter and the panchromatic filter include a plurality of sub-filters; the pixel array includes a plurality of pixels, and each of the pixels corresponds to one of the sub-filters of the filter array.
  • the pixels are configured to receive light passing through the corresponding sub-filters to generate electrical signals;
  • The image acquisition method includes outputting an image through at least one of multiple image output modes. The multiple image output modes include: a full-resolution output mode, in which the first image is obtained from the first pixel value read out from each pixel; a first combined output mode, in which the second image is obtained from the second pixel value read out by combining the pixels corresponding to the panchromatic filter and the third pixel value read out by combining the pixels corresponding to the color filter; and a second combined output mode, in which the third image is obtained from the fourth pixel value read out by combining the pixels corresponding to all the panchromatic filters in the filter group and the fifth pixel value read out by combining the pixels corresponding to all the color filters.
  • FIG. 1 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 2 is a schematic block diagram of an image acquisition apparatus according to some embodiments of the present application.
  • FIG. 3 is a schematic structural diagram of a terminal according to some embodiments of the present application.
  • FIG. 4 is an exploded schematic view of an image sensor according to some embodiments of the present application.
  • FIG. 5 is a schematic diagram of the connection between a pixel array and a readout circuit according to some embodiments of the present application.
  • FIG. 6 is a schematic plan view of a filter array according to some embodiments of the present application.
  • FIG. 7a is a schematic plan view of the minimum repeating unit of the filter array of some embodiments of the present application.
  • FIG. 7b is a schematic plan view of the minimum repeating unit of the filter array of some embodiments of the present application.
  • FIG. 7c is a schematic plan view of the minimum repeating unit of the filter array of some embodiments of the present application.
  • FIG. 7d is a schematic plan view of the minimum repeating unit of the filter array of some embodiments of the present application.
  • FIG. 8 is a schematic plan view of a pixel array according to some embodiments of the present application.
  • FIG. 9 is a schematic plan view of a minimum repeating unit of a pixel array according to some embodiments of the present application.
  • FIG. 10 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 11 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 12 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 13 is a schematic flowchart of an image acquisition method according to some embodiments of the present application.
  • FIG. 14 is a schematic diagram of the principle of an image acquisition method according to some embodiments of the present application.
  • FIG. 15 is a schematic diagram of the principle of an image acquisition method according to some embodiments of the present application.
  • The image acquisition method of the embodiment of the present application is applied to an image sensor 21. The image sensor 21 includes a filter array 22 and a pixel array 23. The filter array 22 includes a minimum repeating unit 221, and the minimum repeating unit 221 includes multiple filter groups 222. Each filter group 222 includes a color filter 223 and a panchromatic filter 224, and the width of the wavelength band of the light transmitted by the color filter 223 is smaller than the width of the wavelength band of the light transmitted by the panchromatic filter 224. Both the color filter 223 and the panchromatic filter 224 include a plurality of sub-filters 225. The pixel array 23 includes a plurality of pixels 231; each pixel 231 corresponds to one sub-filter 225 of the filter array 22, and the pixels 231 are used to receive the light passing through the corresponding sub-filters 225 to generate electrical signals. The image acquisition method includes the following steps:
  • 011: output an image through at least one of multiple image output modes. The multiple image output modes include: a full-resolution output mode, in which the first image is obtained from the first pixel value read out from each pixel 231; a first combined output mode, in which the second image is obtained from the second pixel value read out by combining the multiple pixels 231 corresponding to the panchromatic filter 224 and the third pixel value read out by combining the multiple pixels 231 corresponding to the color filter 223; and a second combined output mode, in which the third image is obtained from the fourth pixel value read out by combining the pixels 231 corresponding to all the panchromatic filters 224 in the filter group 222 and the fifth pixel value read out by combining the pixels 231 corresponding to all the color filters 223.
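The three output modes can be illustrated with a small numerical sketch. This is not the patent's implementation: the 4x4 pixel group, the diagonal layout, the specific pixel values, and the choice of modelling binning as averaging are all assumptions made here for illustration.

```python
# Illustrative sketch of the three output modes applied to one 4x4
# pixel group. Layout assumption: two color filters on one diagonal,
# two panchromatic ("w") filters on the other, each filter covering a
# 2x2 block of pixels. Binning is modelled as averaging, which is one
# common readout choice (the patent does not specify the operation).

def split_blocks(group):
    """Split a 4x4 pixel group into four 2x2 filter blocks (row-major)."""
    blocks = []
    for br in (0, 2):
        for bc in (0, 2):
            blocks.append([group[br][bc], group[br][bc + 1],
                           group[br + 1][bc], group[br + 1][bc + 1]])
    return blocks

def full_resolution(group):
    """Full-resolution mode: every pixel's value is read out as-is."""
    return [v for row in group for v in row]

def first_combined(group):
    """First combined mode: the 4 pixels under each filter are binned."""
    return [sum(b) / 4 for b in split_blocks(group)]

def second_combined(group, kinds):
    """Second combined mode: all panchromatic pixels in the group are
    binned into one value, and all color pixels into another."""
    blocks = split_blocks(group)
    w = [v for b, k in zip(blocks, kinds) if k == "w" for v in b]
    c = [v for b, k in zip(blocks, kinds) if k != "w" for v in b]
    return sum(w) / len(w), sum(c) / len(c)

# One 4x4 group: color filter "g" top-left / bottom-right, "w" on the
# other diagonal (one of the arrangements the text describes).
kinds = ["g", "w", "w", "g"]
group = [[10, 12, 40, 42],
         [14, 16, 44, 46],
         [50, 52, 20, 22],
         [54, 56, 24, 26]]

print(len(full_resolution(group)))    # 16 values: one per pixel
print(first_combined(group))          # 4 values: one per filter
print(second_combined(group, kinds))  # 2 values: (panchromatic, color)
```

The three functions trade resolution against signal strength: 16 values at full resolution, 4 after per-filter binning, and 2 after per-group binning.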
  • the terminal 100 includes an image sensor 21 and a processor 30 .
  • the image sensor 21 includes a filter array 22 and a pixel array 23 , and the processor 30 is used to execute step 011 . That is, the processor 30 is configured to output images through at least one of a plurality of image output modes.
  • The terminal 100 may be a mobile phone, a tablet computer, a notebook computer, an ATM, an access-control gate, a smart watch, a head-mounted display device, etc.; it can be understood that the terminal 100 can also be any other device with image processing functions. Hereinafter, the terminal 100 is described as a mobile phone, but the terminal 100 is not limited to a mobile phone.
  • the terminal 100 includes a camera 20 , a processor 30 and a housing 40 .
  • The camera 20 and the processor 30 are both disposed in the housing 40. The housing 40 can also be used to mount functional modules of the terminal 100 such as a power supply device and a communication device, so that the housing 40 provides dust, drop, and water protection for these functional modules.
  • the camera 20 may be a front-facing camera, a rear-facing camera, a side-facing camera, an under-screen camera, etc., which is not limited herein.
  • the camera 20 includes a lens and an image sensor 21. When the camera 20 captures an image, light passes through the lens and reaches the image sensor 21.
  • the image sensor 21 is used to convert the light signal irradiated on the image sensor 21 into an electrical signal.
  • the image sensor 21 includes a microlens array 25 , a filter array 22 , a pixel array 23 and a readout circuit 24 .
  • Both the filter array 22 and the pixel array 23 may be formed by arranging a plurality of minimum repeating units.
  • The minimum repeating unit 221 of the filter array 22 is hereinafter referred to as the first minimum repeating unit, and the minimum repeating unit 232 of the pixel array 23 is hereinafter referred to as the second minimum repeating unit.
  • The first minimum repeating unit includes a plurality of filter groups 222, for example, 2, 3, 4, 5, or 6 filter groups 222. In the embodiment of the present application, the first minimum repeating unit includes 4 filter groups 222, and the 4 filter groups 222 are arranged in a matrix.
  • Each filter group 222 includes a color filter 223 (in FIG. 6, a rectangular portion composed of four sub-filters 225 with filling patterns) and a panchromatic filter 224 (also shown in FIG. 6).
  • the width of the wavelength band of the light transmitted by the color filter 223 is smaller than the width of the wavelength band of the light transmitted by the panchromatic filter 224, for example,
  • the wavelength band of the light transmitted by the color filter 223 may correspond to the wavelength band of red light, the wavelength band of green light, or the wavelength band of blue light
  • The wavelength band of the light transmitted by the panchromatic filter 224 covers the entire visible light band; that is, the color filter 223 allows only light of certain colors to pass through, while the panchromatic filter 224 allows light of all colors to pass.
  • the wavelength band of light transmitted by the color filter 223 may also correspond to wavelength bands of other colors, such as magenta light, violet light, cyan light, yellow light, etc., which are not limited herein.
  • The sum of the number of color filters 223 and the number of panchromatic filters 224 in the filter group 222 may be 4, 9, 16, 25, etc., arranged in a matrix. In this embodiment, the sum is four.
  • The ratio of the number of color filters 223 to the number of panchromatic filters 224 may be 1:3, 1:1, or 3:1. For example, if the ratio is 1:3, there is 1 color filter 223 and 3 panchromatic filters 224; since the number of panchromatic filters 224 is large, the imaging quality in dark light is better. If the ratio is 1:1, there are 2 color filters 223 and 2 panchromatic filters 224. If the ratio is 3:1, there are 3 color filters 223 and 1 panchromatic filter 224; this gives better color performance while still improving the imaging quality in dark light.
  • In this embodiment, there are 2 color filters 223 and 2 panchromatic filters 224, arranged in a 2x2 matrix. The two color filters 223 are located in the direction of the first diagonal D1 of the rectangle corresponding to the matrix (specifically, on the first diagonal D1), and the two panchromatic filters 224 are located in the direction of the second diagonal D2 of that rectangle (specifically, on the second diagonal D2). The direction of the first diagonal D1 is different from the direction of the second diagonal D2 (that is, they are not parallel). This arrangement takes into account both color performance and low-light imaging quality.
  • In other embodiments, one color filter 223 and one panchromatic filter 224 are located on the first diagonal D1, and the other color filter 223 and the other panchromatic filter 224 are located on the second diagonal D2.
  • The colors corresponding to the wavelength bands of the light transmitted by the color filters 223 of the filter groups 222 in the first minimum repeating unit include color a, color b, and/or color c. For example, they may include color a, color b, and color c; or only one of color a, color b, or color c; or color a and color b; or color b and color c; or color a and color c.
  • the color a is red, the color b is green, and the color c is blue, or, for example, the color a is magenta, the color b is cyan, and the color c is yellow, etc., which are not limited herein.
  • the colors corresponding to the wavelength bands of the transmitted light rays of the color filters 223 of the filter group 222 in the first minimum repeating unit include color a, color b and color c, and color a, color b and The colors c are green, red, and blue, respectively.
  • In this embodiment, the four filter groups 222 in the first minimum repeating unit are the first filter group 2221, the second filter group 2222, the third filter group 2223, and the fourth filter group 2224, and the colors corresponding to their color filters 223 are red, green, blue, and green, respectively, forming an arrangement similar to a Bayer array. Of course, the colors corresponding to the first filter group 2221, the second filter group 2222, the third filter group 2223, and the fourth filter group 2224 may also be green, red, green, and blue, or blue, green, red, and green, etc., which is not limited here.
  • Both the color filter 223 and the panchromatic filter 224 include a plurality of sub-filters 225, for example, 2, 3, 4, 5, or 6 sub-filters 225. In this embodiment, the color filter 223 includes 4 color sub-filters, and the panchromatic filter 224 includes 4 panchromatic sub-filters.
  • The wavelength bands of the light transmitted by the sub-filters 225 within the same color filter 223 (or the same panchromatic filter 224) are the same.
  • In this embodiment, the sum of the number of color filters 223 and panchromatic filters 224 in the filter group 222 is 4, and the ratio of the number of color filters 223 to the number of panchromatic filters 224 is 1:1, so the first minimum repeating unit is 8 rows and 8 columns, including 64 sub-filters 225, and the arrangement can be:
  • w represents a panchromatic sub-filter, and a, b, and c represent color sub-filters.
  • A panchromatic sub-filter is a sub-filter 225 that filters out all light outside the visible light band; that is, it passes the entire visible spectrum.
  • The color sub-filters include a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, and a yellow sub-filter.
  • The red sub-filter is the sub-filter 225 that filters out all light except red light; the green sub-filter filters out all light except green light; the blue sub-filter filters out all light except blue light; the magenta sub-filter filters out all light except magenta light; the cyan sub-filter filters out all light except cyan light; and the yellow sub-filter filters out all light except yellow light.
  • Each of a, b, and c can be a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter.
  • For example, b is a red sub-filter, a is a green sub-filter, and c is a blue sub-filter; alternatively, c is a red sub-filter, a is a green sub-filter, and b is a blue sub-filter; or a is a red sub-filter, b is a blue sub-filter, and c is a green sub-filter, etc., which is not limited here. As another example, b is a magenta sub-filter, a is a cyan sub-filter, and c is a yellow sub-filter, and so on.
  • the color filter may further include sub-filters of other colors, such as orange sub-filters, purple sub-filters, etc., which are not limited herein.
  • In another example, the sum of the number of color filters 223 and panchromatic filters 224 in the filter group 222 is 4 and the ratio of their numbers is 1:1; the first minimum repeating unit is again 8 rows and 8 columns, including 64 sub-filters 225, and the arrangement can also be:
  • In another example, the sum of the number of color filters 223 and panchromatic filters 224 in the filter group 222 is 9, the filters are arranged in a matrix, and the ratio of the number of color filters 223 to the number of panchromatic filters 224 is 4:5, i.e., 4 color filters 223 and 5 panchromatic filters 224.
  • In this case, the panchromatic filters 224 are located on the third diagonal D3 and the fourth diagonal D4 of the rectangle corresponding to the filter group, and the color filters 223 are located in the direction of the third diagonal D3 or the fourth diagonal D4 but not on them. The direction of the third diagonal D3 differs from that of the fourth diagonal D4 (they are not parallel). Specifically, the first minimum repeating unit is 12 rows and 12 columns, including 144 sub-filters 225, and the arrangement can be:
  • In yet another example, the sum of the number of color filters 223 and panchromatic filters 224 in the filter group 222 is 9, the filters are arranged in a matrix, and the ratio of the number of color filters 223 to the number of panchromatic filters 224 is 5:4, i.e., 5 color filters 223 and 4 panchromatic filters 224. Since the number of color filters 223 is larger, better color performance can be obtained while still improving the imaging quality in dark light. The color filters 223 are located on the fifth diagonal D5 and the sixth diagonal D6 of the rectangle corresponding to the filter group 222, and the panchromatic filters 224 are located in the direction of the fifth diagonal D5 or the sixth diagonal D6 but not on them. The direction of the fifth diagonal D5 differs from that of the sixth diagonal D6 (they are not parallel).
  • In this case, the first minimum repeating unit is 12 rows and 12 columns, including 144 sub-filters 225, and the arrangement can also be:
  • The image sensor 21, the camera 20, and the terminal 100 in this embodiment include a panchromatic filter 224. Compared with an image sensor containing only color filters, the image sensor 21 can receive more light, so the imaging quality in dark light can be improved without adjusting the shooting parameters and without affecting shooting stability; both stability and image quality are thus taken into account.
  • In addition, the panchromatic filter 224 and the color filter 223 are each composed of 4 sub-filters 225. The pixels 231 corresponding to the 4 sub-filters 225 can be combined and output, resulting in a high signal-to-noise ratio; alternatively, the pixel 231 corresponding to each sub-filter 225 can be output separately, so as to obtain an image with high definition and a high signal-to-noise ratio.
  • the pixel array 23 includes a plurality of pixels 231, each pixel 231 corresponds to a sub-filter 225, and the pixel 231 is used for receiving light passing through the corresponding sub-filter 225 to generate electrical signals,
  • the processor 30 processes the electrical signal to obtain the pixel value of the pixel 231 .
  • The second minimum repeating unit includes a plurality of pixel groups 233 corresponding to the filter groups 222 in the first minimum repeating unit. The second minimum repeating unit includes four pixel groups 233 arranged in a matrix, and each pixel group 233 corresponds to one filter group 222, as shown in FIG. 9.
  • The four pixel groups 233 include a first pixel group 2331, a second pixel group 2332, a third pixel group 2333, and a fourth pixel group 2334, which are set corresponding to the first filter group 2221, the second filter group 2222, the third filter group 2223, and the fourth filter group 2224, respectively.
  • the pixel group 233 includes a color pixel unit 234 and a panchromatic pixel unit 235 , and the color pixel unit 234 and the panchromatic pixel unit 235 are respectively arranged in a one-to-one correspondence with the color filters 223 and the panchromatic filters 224 .
  • The two color pixel units 234 and the two panchromatic pixel units 235 are arranged in a matrix; the two color pixel units 234 are located on the seventh diagonal D7 of the rectangle corresponding to the matrix, and the two panchromatic pixel units 235 are located on the eighth diagonal D8 of that rectangle.
  • The color pixel unit 234 includes color pixels 2341, and the panchromatic pixel unit 235 includes panchromatic pixels 2311.
  • The color pixels 2341 are arranged in one-to-one correspondence with the sub-filters 225 of the color filter 223 (hereinafter referred to as the color sub-filters), and the panchromatic pixels 2311 are arranged in one-to-one correspondence with the sub-filters 225 of the panchromatic filter 224 (hereinafter referred to as the panchromatic sub-filters). Since the color filter 223 and the panchromatic filter 224 include 4 color sub-filters and 4 panchromatic sub-filters respectively, the corresponding color pixel unit 234 and panchromatic pixel unit 235 likewise include 4 color pixels 2341 and 4 panchromatic pixels 2311, respectively.
• the color pixel 2341 can receive light of a specific color (such as red, green, or blue) transmitted by the corresponding color sub-filter to generate an electrical signal, and the panchromatic pixel 2311 can receive light of all colors transmitted by the corresponding panchromatic sub-filter to generate an electrical signal; the processor 30 can obtain the pixel values corresponding to the panchromatic pixels 2311 and the color pixels 2341 according to the electrical signals.
  • the colors included in the color pixels 2341 correspond to the wavelength bands of the light transmitted by the correspondingly arranged color sub-filters.
• like the first minimum repeating unit, the color pixels 2341 in the second minimum repeating unit also include color a, color b and color c.
• for example, the wavelength bands of the light transmitted by the color sub-filters include the wavelength band of red light, the wavelength band of green light and the wavelength band of blue light, and the color pixels 2341 include red, green and blue pixels.
• the color pixel units in the 4 pixel groups 233 are respectively red, green, blue and green, that is, color a is green, color b is red, and color c is blue. It can be understood that the color included in a color pixel 2341 is not the color of the color pixel 2341 itself, but the color corresponding to the wavelength band of the light transmitted by the color sub-filter corresponding to that color pixel 2341.
• the color of the panchromatic pixel 2311 in the second minimum repeating unit corresponds to the wavelength band of the light transmitted by the panchromatic sub-filter in the corresponding first minimum repeating unit; the wavelength band of the light transmitted by the panchromatic sub-filter is the visible light band, and the color W is white. It can be understood that the color included in the panchromatic pixel 2311 is not the color of the panchromatic pixel 2311 itself, but the color corresponding to the wavelength band of the light transmitted by the panchromatic sub-filter corresponding to the panchromatic pixel 2311.
  • the readout circuit 24 is electrically connected to the pixel array 23 for controlling exposure of the pixel array 23 and reading and outputting the pixel value of the pixel 231 .
  • the readout circuit 24 includes a vertical driving unit 241 , a control unit 242 , a column processing unit 243 and a horizontal driving unit 244 .
  • the vertical driving unit 241 includes a shift register and an address decoder.
  • the vertical driving unit 241 includes readout scan and reset scan functions. Readout scanning refers to sequentially scanning the pixels 231 row by row, and reading signals from these pixels 231 row by row. For example, the signal output by each pixel 231 in the selected and scanned pixel row is transmitted to the column processing unit 243 .
  • the reset scan is used to reset the charges, and the photocharges of the photoelectric conversion elements of the pixels 231 are discarded, so that the accumulation of new photocharges can be started.
• the signal processing performed by the column processing unit 243 is correlated double sampling (CDS) processing.
• in CDS processing, the reset level and the signal level output from each pixel 231 in the selected pixel row are taken out, and the level difference is calculated; in this way, the signals of the pixels 231 in one row are obtained.
  • the column processing unit 243 may have an analog-to-digital (A/D) conversion function for converting an analog pixel signal into a digital format.
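The CDS and A/D steps above can be sketched in a few lines. This is an illustrative model only: the function name, the 10-bit range, and the sample levels are our assumptions, not values from the description.

```python
# Correlated double sampling (CDS) sketch: for each pixel 231 in the selected
# row, the readout takes a reset level and a signal level; their difference
# cancels the per-pixel reset offset. The A/D step is modeled as a 10-bit
# clamp. All names and sample values here are illustrative.

def cds_row(reset_levels, signal_levels, adc_max=1023):
    """Return digitized pixel signals for one row via CDS plus A/D conversion."""
    out = []
    for reset, signal in zip(reset_levels, signal_levels):
        diff = signal - reset                          # level difference
        out.append(max(0, min(adc_max, round(diff))))  # simple A/D clamp
    return out

print(cds_row([12.0, 15.0, 11.0], [212.0, 95.0, 511.0]))  # [200, 80, 500]
```

Subtracting the reset level before quantization is what removes the per-pixel offset that would otherwise appear as fixed-pattern noise.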
  • the horizontal driving unit 244 includes a shift register and an address decoder.
• the horizontal driving unit 244 sequentially scans the pixel array 23 column by column. Through the selection scan operation performed by the horizontal driving unit 244, each pixel column is sequentially processed by the column processing unit 243 and sequentially output.
  • the control unit 242 configures timing signals according to the operation mode, and uses various timing signals to control the vertical driving unit 241 , the column processing unit 243 and the horizontal driving unit 244 to work together.
• the processor 30 may select at least one of multiple image output modes to output images for the current scene. For example, in order to obtain an image with the highest definition, the user can select the full-resolution output mode among the multiple image output modes. In the full-resolution output mode, each pixel 231 outputs a first pixel value, thereby generating an image with a resolution equal to the resolution of the image sensor 21; for example, if the resolution of the image sensor 21 is 48 million pixels, a 48-megapixel first image can be generated;
  • the user can select the first combined output mode among the multiple image output modes to output the image.
• in the first combined output mode, the electrical signals of the four panchromatic pixels 2311 in the panchromatic pixel unit 235 corresponding to each panchromatic filter 224 are combined and read out to obtain a second pixel value, and the electrical signals of the four color pixels 2341 in the color pixel unit 234 corresponding to each color filter 223 are combined and read out to obtain a third pixel value; an image with a resolution equal to 1/4 of the resolution of the image sensor 21 can then be generated according to all the second pixel values and third pixel values.
  • the user can select the second combined output mode among the multiple image output modes to output the image.
• in the second combined output mode, the electrical signals of the 8 panchromatic pixels 2311 in the panchromatic pixel units 235 corresponding to all the panchromatic filters 224 in each filter group 222 are combined and read out to obtain a fourth pixel value, and the electrical signals of the 8 color pixels 2341 in the color pixel units 234 corresponding to all the color filters 223 in each filter group 222 are combined and read out to obtain a fifth pixel value.
• all the fourth pixel values and all the fifth pixel values respectively generate an intermediate image.
• in this way, an image with a resolution equal to 1/16 of the resolution of the image sensor 21 can be generated; for example, if the resolution of the image sensor 21 is 48 megapixels, a 3-megapixel third image can be generated.
• the combined readout of the electrical signals may be performed by accumulating the electrical signals of the plurality of pixels 231 to obtain an accumulated electrical signal and then determining the corresponding pixel value from the accumulated signal; alternatively, the pixel value of each pixel 231 may first be read out, and the plurality of pixel values then accumulated to serve as the pixel value of one pixel.
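The two combined-readout granularities described above amount to block binning. The following is a minimal sketch under our own assumptions (plain Python lists standing in for the pixel array, using the read-out-then-accumulate variant):

```python
# Combined readout sketch: sum the values of an n x n block of pixels into one
# output value. In the first combined output mode 4 pixels merge into 1
# (n = 2); in the second combined output mode 16 pixels merge into 1 (n = 4).
# The 4x4 input below is illustrative.

def bin_pixels(pixels, n):
    h, w = len(pixels), len(pixels[0])
    return [
        [
            sum(pixels[r + dr][c + dc] for dr in range(n) for dc in range(n))
            for c in range(0, w, n)
        ]
        for r in range(0, h, n)
    ]

raw = [[4 * r + c for c in range(4)] for r in range(4)]
print(bin_pixels(raw, 2))  # [[10, 18], [42, 50]] -- 4-to-1 combination
print(bin_pixels(raw, 4))  # [[120]]              -- 16-to-1 combination
```

Each merge trades resolution for a larger combined signal, which is why the combined modes raise the signal-to-noise ratio at the cost of definition.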
  • the processor 30 may simultaneously select multiple of multiple image output modes to output the first image, the second image and/or the third image.
  • the processor 30 simultaneously outputs the first image and the second image, or the second image and the third image, or the first image and the third image, or the first image, the second image and the third image. The user can select a more satisfactory image from a variety of images output by a variety of image output modes.
• the image acquisition method, the image acquisition device, and the terminal 100 output images through at least one of multiple image output modes, can use different image output modes for different scenes, adapt well to different scenes, and achieve a better balance between definition and signal-to-noise ratio, thereby improving the imaging effect in different scenes.
  • the image acquisition method includes:
  • Step 012 Acquire shooting information, where the shooting information includes at least one of ambient brightness and shooting parameters;
• Step 013 Determine the image output mode adapted to the shooting information.
  • the image processing apparatus 10 further includes an acquisition module 12 and a determination module 13 .
  • the acquiring module 12 and the determining module 13 are used to execute step 012 and step 013, respectively. That is, the acquiring module 12 is configured to acquire shooting information; the determining module 13 is configured to determine the image output mode adapted to the shooting information.
• the processor 30 is further configured to perform step 012 and step 013. That is, the processor 30 is further configured to acquire shooting information and determine the image output mode adapted to the shooting information.
• the processor 30 first acquires shooting information, where the shooting information includes at least one of ambient brightness and shooting parameters; for example, the shooting information may include only the ambient brightness, only the shooting parameters, or both the ambient brightness and the shooting parameters. The shooting parameters may include the shooting mode, exposure parameters, and the like. This embodiment is described by taking the shooting information including both the ambient brightness and the shooting parameters (the shooting parameters including the shooting mode) as an example.
• the processor 30 can acquire the current shooting mode and the ambient light intensity signal collected by the light sensor 50 of the terminal 100 (shown in FIG. 3), and then determine the ambient brightness according to the light intensity signal; or the processor 30 can control the camera 20 to capture an image and determine the ambient brightness from the gray value distribution of the image; or, since exposure parameters (such as the aperture size and sensitivity) are generally adjusted automatically to achieve better results under different ambient brightness, there is a mapping relationship between ambient brightness and exposure parameters, and the processor 30 can determine the ambient brightness according to the exposure parameters used when capturing the image.
  • the processor 30 may determine an image output mode adapted to the ambient brightness and/or shooting parameters. For example, the processor 30 may determine an image output mode adapted to the shooting mode and ambient brightness.
• the processor 30 can preferentially determine the image output mode according to the shooting mode. For example, when the shooting mode is the full-resolution mode, the processor 30 determines that the adapted image output mode is the full-resolution output mode; for another example, if the shooting mode is the high-resolution mode, the processor 30 determines that the adapted image output mode is the first combined output mode; for another example, if the shooting mode is the low-resolution mode, the processor 30 determines that the adapted image output mode is the second combined output mode.
  • the processor 30 may determine an image output mode adapted to the ambient brightness.
• when the ambient brightness is high (eg, greater than the first ambient brightness threshold), the processor 30 may determine that the adapted image output mode is the full-resolution output mode; when the ambient brightness is normal (eg, less than the first ambient brightness threshold and greater than the second ambient brightness threshold, where the second ambient brightness threshold is less than the first ambient brightness threshold), the processor 30 may determine that the adapted image output mode is the first combined output mode; when the ambient brightness is low (eg, less than the second ambient brightness threshold), the processor 30 may determine that the adapted image output mode is the second combined output mode.
  • an appropriate image output mode is selected for different environmental brightness, and a good balance is achieved between the sharpness and the signal-to-noise ratio, so as to ensure that the sharpness and the signal-to-noise ratio are not too low, thereby improving the image quality.
  • the processor 30 can control the image sensor 21 to output the corresponding image according to the adapted image output mode.
  • the image output mode can be changed in real time.
  • the processor 30 acquires the shooting information in real time, and determines the image output mode every predetermined time, thereby ensuring real-time adaptation of the image output mode to the current shooting information.
  • the image sensor 21 includes a panchromatic filter 224, which can increase the amount of light entering the pixels and improve the imaging effect under dark light.
• the corresponding image output mode can be determined according to the shooting information, so that when dealing with scenes having different ambient brightness, shooting parameters and other shooting information, an appropriate image output mode can be selected to achieve a better balance between definition and signal-to-noise ratio; the ability to adapt to different scenes is strong, and the imaging effect in different scenes can be improved.
  • step 013 (specifically determining an image output mode adapted to ambient brightness) includes the following steps:
• 0131 When the ambient brightness is greater than the first ambient brightness threshold, determine that the image output mode is the full-resolution output mode;
• 0132 When the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determine that the image output mode is the first combined output mode; and
• 0133 When the ambient brightness is less than the second ambient brightness threshold, determine that the image output mode is the second combined output mode, where the first ambient brightness threshold is greater than the second ambient brightness threshold.
  • the determination module 13 is further configured to perform step 0131 , step 0132 and step 0133 . That is, the determining module 13 is further configured to determine that the image output mode is the full-resolution output mode when the ambient brightness is greater than the first ambient brightness threshold; when the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determine the image output mode The output mode is the first combined output mode; and when the ambient brightness is less than the second ambient brightness threshold, the image output mode is determined to be the second combined output mode.
  • the processor 30 is further configured to perform step 0131 , step 0132 and step 0133 . That is, the processor 30 is further configured to determine that the image output mode is the full resolution output mode when the ambient brightness is greater than the first ambient brightness threshold; when the ambient brightness is greater than the second ambient brightness threshold and less than the first ambient brightness threshold, determine the image output mode The output mode is the first combined output mode; and when the ambient brightness is less than the second ambient brightness threshold, the image output mode is determined to be the second combined output mode.
  • the shooting information acquired by the processor 30 may only include the ambient brightness.
  • an image output mode adapted to the ambient brightness is determined.
• the ambient brightness is relatively simple to obtain, so the image output mode can be determined simply and quickly.
  • the first ambient brightness threshold and the second ambient brightness threshold can be preset to decrease in sequence.
  • the first ambient brightness threshold and the second ambient brightness threshold can be determined according to empirical values, or obtained by testing the terminal 100.
• for example, the terminal 100 is placed in an environment with adjustable ambient brightness, and by adjusting the ambient brightness, the mapping relationship between the electrical signals of the pixels of the image sensor 21 (for example, the average value of those electrical signals) and the ambient brightness is obtained; when the pixel value corresponding to the average value is 200, the corresponding ambient brightness is taken as the first ambient brightness threshold, and when the pixel value corresponding to the average value is 150, the corresponding ambient brightness is taken as the second ambient brightness threshold.
• since the ambient brightness thresholds are obtained by testing the image sensor 21 of the terminal 100, they are better suited to the terminal 100 and more accurate.
• when the ambient brightness is greater than the first ambient brightness threshold (hereinafter referred to as the high-brightness environment), each pixel can obtain sufficient light, and the processor 30 can determine that the adapted image output mode is the full-resolution output mode, so as to obtain the first image with high definition and signal-to-noise ratio; when the ambient brightness is greater than the second ambient brightness threshold and less than or equal to the first ambient brightness threshold (hereinafter referred to as the medium-brightness environment), the ambient light is still sufficient, but compared with the high-brightness environment the amount of light each pixel can obtain is reduced, and the processor 30 can determine that the adapted image output mode is the first combined output mode, so as to obtain a second image with slightly reduced definition but an improved signal-to-noise ratio; when the ambient brightness is less than or equal to the second ambient brightness threshold, the processor 30 can determine that the adapted image output mode is the second combined output mode, so as to obtain a third image with reduced definition but a significantly improved signal-to-noise ratio. Therefore, an appropriate image output mode can be selected for different ambient brightness, and a good balance between definition and signal-to-noise ratio can be achieved to ensure that neither is too low, thereby improving the image quality.
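The three-way selection in steps 0131 to 0133 can be sketched as a simple function. The numeric thresholds (arbitrary brightness units) and the mode names are hypothetical placeholders; only the ordering of the first threshold above the second matters.

```python
# Sketch of steps 0131-0133: choose an image output mode from the ambient
# brightness. Threshold values and mode names are illustrative assumptions.

FIRST_AMBIENT_BRIGHTNESS_THRESHOLD = 200
SECOND_AMBIENT_BRIGHTNESS_THRESHOLD = 150

def select_output_mode(ambient_brightness):
    if ambient_brightness > FIRST_AMBIENT_BRIGHTNESS_THRESHOLD:
        return "full_resolution"   # bright scene: favor definition
    if ambient_brightness > SECOND_AMBIENT_BRIGHTNESS_THRESHOLD:
        return "first_combined"    # medium scene: 4-to-1 combination
    return "second_combined"       # dark scene: 16-to-1 combination

print(select_output_mode(250))  # full_resolution
print(select_output_mode(180))  # first_combined
print(select_output_mode(100))  # second_combined
```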
  • the shooting parameters include exposure parameters
  • step 013 specifically determining an image output mode adapted to the ambient brightness and shooting parameters also includes the following steps:
• 0134 Determine the light input amount according to the ambient brightness and the exposure parameters;
• 0135 When the light input amount is greater than the first light input amount threshold, determine that the image output mode is the full-resolution output mode;
• 0136 When the light input amount is greater than the second light input amount threshold and less than the first light input amount threshold, determine that the image output mode is the first combined output mode; and
• 0137 When the light input amount is less than the second light input amount threshold, determine that the image output mode is the second combined output mode.
• the determination module 13 is further configured to perform step 0134, step 0135, step 0136 and step 0137. That is, the determination module 13 is further configured to determine the light input amount according to the ambient brightness and the exposure parameters; when the light input amount is greater than the first light input amount threshold, determine that the image output mode is the full-resolution output mode; when the light input amount is greater than the second light input amount threshold and less than the first light input amount threshold, determine that the image output mode is the first combined output mode; and when the light input amount is less than the second light input amount threshold, determine that the image output mode is the second combined output mode.
• the processor 30 is further configured to perform step 0134, step 0135, step 0136 and step 0137. That is, the processor 30 is further configured to determine the light input amount according to the ambient brightness and the exposure parameters; when the light input amount is greater than the first light input amount threshold, determine that the image output mode is the full-resolution output mode; when the light input amount is greater than the second light input amount threshold and less than the first light input amount threshold, determine that the image output mode is the first combined output mode; and when the light input amount is less than the second light input amount threshold, determine that the image output mode is the second combined output mode.
• since the camera 20 can adjust exposure parameters such as the aperture size, shutter time and sensitivity during shooting, even under the same ambient brightness, the pixel values of the pixels under different exposure parameters may be significantly different.
• for example, when the ambient brightness is constant, the larger the aperture, the greater the amount of incoming light, the more light each pixel can obtain, and the larger the pixel value; likewise, the longer the shutter time, the greater the amount of incoming light, the more light each pixel can obtain, and the larger the pixel value; for another example, when the ambient brightness is constant, the greater the sensitivity, the larger the pixel value, even though the actual amount of incoming light does not change.
• the exposure parameters therefore also affect the image output mode. For example, taking exposure parameters including the aperture size, the light input amount with a small aperture in a high-brightness environment may be smaller than the light input amount with a large aperture in a medium-brightness environment; accordingly, the processor 30 can determine the light input amount according to the ambient brightness and the exposure parameters, and then determine the image output mode according to the light input amount.
• when the light input amount is greater than the first light input amount threshold, each pixel can obtain a large amount of light, and the processor 30 can determine that the adapted image output mode is the full-resolution output mode, so as to obtain the first image with high definition and signal-to-noise ratio; when the light input amount is greater than the second light input amount threshold and less than or equal to the first light input amount threshold, the processor 30 can determine that the adapted image output mode is the first combined output mode, so as to obtain a second image with slightly reduced definition but an improved signal-to-noise ratio; when the light input amount is less than or equal to the second light input amount threshold, the amount of light each pixel can obtain is small, and the processor 30 can determine that the adapted image output mode is the second combined output mode, so as to obtain a third image with reduced definition but a significantly improved signal-to-noise ratio.
  • an appropriate image output mode is selected for different environmental brightness and exposure parameters, and a good balance is achieved between the sharpness and the signal-to-noise ratio, ensuring that the sharpness and the signal-to-noise ratio are not too low, thereby improving the image quality.
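One way to fold ambient brightness and exposure parameters into a single light-input figure, as steps 0134 to 0137 describe, is sketched below. The proportional model (brightness × relative aperture area × shutter time) and all constants are our assumptions, not the patent's formula.

```python
import math

# Sketch of steps 0134-0137: estimate the light input amount from the ambient
# brightness and the exposure parameters, then select the output mode. The
# model and thresholds are illustrative assumptions.

def light_input_amount(brightness, f_number, shutter_s):
    aperture_area = math.pi * (1.0 / (2.0 * f_number)) ** 2  # relative area
    return brightness * aperture_area * shutter_s

def select_mode_by_light(amount, first_threshold=1.0, second_threshold=0.1):
    if amount > first_threshold:
        return "full_resolution"
    if amount > second_threshold:
        return "first_combined"
    return "second_combined"

# A small aperture (f/8) in a bright scene can admit less light than a wide
# aperture (f/2) in a medium-bright scene, as the description notes.
bright_small = light_input_amount(1000, f_number=8.0, shutter_s=0.01)
medium_wide = light_input_amount(400, f_number=2.0, shutter_s=0.01)
print(bright_small < medium_wide)  # True
```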
  • step 011 includes the following steps:
• 0111 Output the first image through the full-resolution output mode; and/or
• 0112 Output the second image through the first combined output mode; and/or
• 0113 Output the third image through the second combined output mode.
  • the determination module 13 is further configured to perform step 0111 , step 0112 and step 0113 . That is, the determining module 13 is further configured to output the first image through the full resolution output mode; and/or output the second image through the first combined output mode; and/or output the third image through the second combined output mode.
  • the processor 30 is further configured to perform step 0111 , step 0112 and step 0113 . That is, the processor 30 is configured to output the first image through the full resolution output mode; and/or output the second image through the first combined output mode; and/or output the third image through the second combined output mode.
  • the processor 30 controls the image sensor 21 to output the first image in the full resolution output mode; when the image output mode is the first combined output mode, the processor 30 controls the image The sensor 21 outputs the second image in the first combined output mode; when the image output mode is the second combined output mode, the processor 30 controls the image sensor 21 to output the third image in the second combined output mode.
• the processor 30 may also control the image sensor 21 to simultaneously output the first image and the second image in the full-resolution output mode and the first combined output mode, or control the image sensor 21 to simultaneously output the first image and the third image in the full-resolution output mode and the second combined output mode;
  • the processor 30 may control the image sensor 21 to simultaneously output the second image and the third image in the first combined output mode and the second combined output mode
• the processor 30 may control the image sensor 21 to simultaneously output the first image, the second image and the third image in the full-resolution output mode, the first combined output mode and the second combined output mode.
• when the image sensor 21 simultaneously outputs the first image and the second image, or the second image and the third image, or the first image and the third image, or the first image, the second image and the third image, the user can select a preferred target image to save.
• the image sensor 21 can output multiple images at the same time in either of two ways: the image sensor 21 may successively output images according to the different image output modes to obtain multiple images; or the image sensor 21 may output the pixel value of each pixel (that is, output the first image in the full-resolution mode), and the processor 30 then performs binning processing on the pixel values to output the first image, the second image and/or the third image, respectively.
  • the processor 30 can control the image sensor 21 to output the corresponding image through the adapted image output mode.
  • step 0111 includes the following steps:
  • 01111 Interpolate each first pixel value based on a predetermined first interpolation algorithm to obtain a first image arranged in a Bayer array.
  • the determining module 13 is further configured to perform step 01111. That is, the determining module 13 is further configured to perform interpolation on each first pixel value based on a predetermined first interpolation algorithm to obtain a first image arranged in a Bayer array.
  • the processor 30 is further configured to perform step 01111. That is, the processor 30 is further configured to perform interpolation on each first pixel value based on a predetermined first interpolation algorithm to obtain a first image arranged in a Bayer array.
• specifically, when it is determined that the image output mode is the full-resolution output mode, the image sensor 21 obtains the first pixel value of each pixel to generate the original image P0; the pixels P01 in the original image P0 correspond one-to-one to the pixels 231 in the pixel array 23 (shown in FIG. 14).
• the processor 30 performs interpolation on the first pixel value of each pixel P01 in the original image P0 based on a preset first interpolation algorithm, so that each first pixel value of the original image P0 is interpolated into the pixel value of the corresponding target pixel P11 in the first image P1; the pixels P11 of the first image P1 correspond one-to-one to the pixels P01 of the original image P0.
• the pixel in the first image P1 corresponding to the position of the pixel to be interpolated is the target pixel, as shown in FIG.
• the processor 30 may, according to the color of each pixel in the first image P1 of the Bayer array to be generated, convert the first pixel value of each pixel P01 in the original image P0 into the target pixel value of the color of the corresponding target pixel P11 in the first image P1. For example, if the first target pixel P11 in the upper left corner of the first image P1 (the target pixel of the pixel to be interpolated) is a red pixel, the processor 30 performs interpolation processing (such as averaging) on the pixel to be interpolated according to the first pixel value of the pixel to be interpolated and the first pixel values of the red pixels P01 around the pixel to be interpolated in the original image P0, thereby converting the first pixel value of the pixel to be interpolated into the target pixel value of the target pixel P11. In this way, each pixel P01 in the original image P0 can be interpolated into the corresponding target pixel P11 in the first image P1, so as to generate the first image P1 arranged in a Bayer array.
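A heavily simplified sketch of the idea behind this kind of interpolation follows: each raw pixel is converted into the color required at its position in an RGGB Bayer pattern by averaging nearby raw samples that already carry the target color. The 3x3 window, the RGGB layout, and the color maps are our illustrative assumptions, not the patent's actual first interpolation algorithm.

```python
# Simplified remosaic sketch: convert each raw pixel into the color required
# at its position in an RGGB Bayer pattern by averaging the raw samples of
# the target color inside a 3x3 window. All layouts are illustrative.

def bayer_color(r, c):
    """Target color at (r, c) in an RGGB Bayer array."""
    if r % 2 == 0:
        return "R" if c % 2 == 0 else "G"
    return "G" if c % 2 == 0 else "B"

def remosaic(raw, raw_colors):
    """raw: 2D pixel values; raw_colors: 2D color letters of the raw mosaic."""
    h, w = len(raw), len(raw[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            target = bayer_color(r, c)
            if raw_colors[r][c] == target:
                out[r][c] = raw[r][c]   # pixel already has the required color
                continue
            samples = [
                raw[rr][cc]
                for rr in range(max(0, r - 1), min(h, r + 2))
                for cc in range(max(0, c - 1), min(w, c + 2))
                if raw_colors[rr][cc] == target
            ]
            out[r][c] = sum(samples) / len(samples) if samples else raw[r][c]
    return out

# The raw pixel at (0, 1) is red but the Bayer target there is green, so it
# is replaced by the average of the neighboring green samples.
print(remosaic([[10, 20], [30, 40]], [["R", "R"], ["G", "B"]]))
# [[10, 30.0], [30, 40]]
```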
  • step 0112 includes:
  • 01121 Interpolate each of the second pixel value and the third pixel value based on a predetermined second interpolation algorithm to obtain a second image arranged in a Bayer array.
• the determining module 13 is further configured to perform step 01121. That is, the determining module 13 is further configured to perform interpolation on each of the second pixel values and the third pixel values based on a predetermined second interpolation algorithm to obtain a second image arranged in a Bayer array.
• the processor 30 is further configured to perform step 01121. That is, the processor 30 is further configured to perform interpolation on each of the second pixel values and the third pixel values based on a predetermined second interpolation algorithm to obtain a second image arranged in a Bayer array.
• specifically, when it is determined that the image output mode is the first combined output mode, the image sensor 21 combines and reads out the electrical signals of the four panchromatic pixels 2311 in the panchromatic pixel unit 235 corresponding to each panchromatic filter 224 to obtain a second pixel value, and combines and reads out the electrical signals of the four color pixels 2341 in the color pixel unit 234 corresponding to each color filter 223 to obtain a third pixel value; the image sensor 21 then outputs the original image P0' according to the second pixel values and the third pixel values, and the number of pixels of the original image P0' is 1/4 of that of the original image P0.
• the processor 30 interpolates the second pixel values and the third pixel values in the original image P0' based on a preset second interpolation algorithm, so as to obtain a second image P2 arranged in a Bayer array; the pixels P21 of the second image P2 correspond one-to-one to the pixels P01' of the original image P0', and the pixel in the second image P2 corresponding to the position of the pixel to be interpolated is the target pixel P21.
• the processor 30 may, according to the color of each pixel in the second image P2 of the Bayer array to be generated (color a is green, color b is red, and color c is blue), convert the second pixel value or the third pixel value of each pixel P01' in the original image P0' into the target pixel value of the color of the target pixel P21 in the second image P2.
  • the first pixel P21 in the upper left corner of the second image P2 is a red pixel (the target pixel of the pixel to be interpolated).
• the processor 30 performs interpolation processing on the pixel to be interpolated according to the second pixel value of the first pixel P01' in the upper left corner of the original image P0' (that is, the pixel to be interpolated) and the third pixel values of the surrounding red pixels P01', so as to convert the second pixel value of the pixel to be interpolated into the target pixel value of the target pixel P21. In this way, the pixels P01' in the original image P0' can be interpolated into the corresponding target pixels P21 in the second image P2, so as to generate the second image P2 arranged in a Bayer array.
  • step 0113 includes:
  • 01131 Interpolate each of the fourth pixel value and the fifth pixel value based on a predetermined third interpolation algorithm to obtain a third image arranged in a Bayer array.
• the determining module 13 is further configured to perform step 01131. That is, the determining module 13 is further configured to perform interpolation on each of the fourth pixel values and the fifth pixel values based on a predetermined third interpolation algorithm to obtain a third image arranged in a Bayer array.
• the processor 30 is further configured to perform step 01131. That is, the processor 30 is further configured to perform interpolation on each of the fourth pixel values and the fifth pixel values based on a predetermined third interpolation algorithm to obtain a third image arranged in a Bayer array.
  • the image sensor 21 combines and reads out the electrical signals of the 8 panchromatic pixels 2351 in the panchromatic pixel units 235 corresponding to all the panchromatic filters 224 in each filter group 222 to obtain a fourth pixel value, and combines and reads out the electrical signals of the 8 color pixels 2341 in the color pixel units 234 corresponding to all the color filters 223 in each filter group 222 to obtain a fifth pixel value.
  • the image sensor 21 outputs the first intermediate image B1 and the second intermediate image B2 according to the fourth pixel value and the fifth pixel value, respectively.
  • the processor 30 performs interpolation on the first intermediate image B1 and the second intermediate image B2 based on a preset third interpolation algorithm to obtain a third image P3 arranged in a Bayer array.
  • the pixel values of the pixels at corresponding positions in the first intermediate image B1 and the second intermediate image B2 may be weighted and summed (for example, with weights of 0.5 each) to serve as the target pixel value of the target pixel P31 at the corresponding position in the third image P3. For example, the fourth pixel value x1 of the first pixel B11 in the upper left corner of the first intermediate image B1 and the fifth pixel value x2 of the first pixel B21 in the upper left corner of the second intermediate image B2 are weighted and summed, so that the target pixel value of the first pixel P31 in the upper left corner of the third image P3 is 0.5x1+0.5x2. In this way, the third image P3 arranged in a Bayer array is obtained by interpolation from the first intermediate image B1 and the second intermediate image B2.
  • the pixels at corresponding positions in different images refer to the pixels with the same coordinates in different images, taking the first pixel in the upper left corner of each image as the origin of coordinates.
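The weighted-summation step above can be sketched in Python (an illustrative example only; the function name and the plain nested-list image layout are assumptions, and the 0.5/0.5 weights follow the example in the text):

```python
def merge_intermediate_images(b1, b2, w1=0.5, w2=0.5):
    """Weighted per-pixel merge of two same-sized intermediate images.

    Pixels at the same (row, col) coordinates -- taking the top-left
    pixel as the coordinate origin -- are weighted and summed to give
    the target pixel value of the third image, e.g. 0.5*x1 + 0.5*x2.
    """
    return [
        [w1 * p1 + w2 * p2 for p1, p2 in zip(row1, row2)]
        for row1, row2 in zip(b1, b2)
    ]


# The top-left target pixel is 0.5*100 + 0.5*60 = 80.
b1 = [[100, 120], [110, 130]]  # first intermediate image B1 (fourth pixel values)
b2 = [[60, 80], [70, 90]]      # second intermediate image B2 (fifth pixel values)
p3 = merge_intermediate_images(b1, b2)
```

With equal weights this reduces to a simple average; other weightings of the panchromatic and color planes fit the same scheme.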
  • an embodiment of the present application further provides a non-volatile computer-readable storage medium 200 .
  • the computer program 201 when executed by one or more processors 300, causes the processor 300 to perform the following steps:
  • 011 Output an image through at least one of multiple image output modes.
  • the processor 300 can also perform the following steps:
  • 0133 When the ambient brightness is less than the second ambient brightness threshold, determine that the image output mode is the second combined output mode, and the first ambient brightness threshold is greater than the second ambient brightness threshold.
  • the processor 30 in this embodiment of the present application may be an image processing circuit 80, and the image processing circuit 80 may be implemented by hardware and/or software components, including various processing units that define an ISP (Image Signal Processing) pipeline.
  • FIG. 18 is a schematic diagram of the image processing circuit 80 in one embodiment. As shown in FIG. 18, for the convenience of description, only the aspects of the image processing technology related to the embodiments of the present application are shown.
  • the image processing circuit 80 includes an ISP processor 81 and a control logic 82 .
  • Image data captured by camera 83 is first processed by ISP processor 81, which analyzes the image data to capture image statistics that can be used to determine one or more control parameters of camera 83.
  • the camera 83 (the camera 83 may be the camera 20 of the terminal 100 as shown in FIG. 3 ) may include one or more lenses 832 and an image sensor 834 (the image sensor 834 may be the image sensor 21 of the camera 20 as shown in FIG. 3 ) .
  • the image sensor 834 may include a color filter array (the color filter array may be the filter array 22 shown in FIG. 6).
  • the image sensor 834 may obtain the light intensity and wavelength information captured by each imaging pixel, and provide a set of raw image data that can be processed by the ISP processor 81.
  • the sensor 84 (e.g., a gyroscope) may provide image processing parameters (e.g., anti-shake parameters) to the ISP processor 81 based on the sensor 84 interface type.
  • the interface of the sensor 84 may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of the above interfaces.
  • the image sensor 834 may also send raw image data to the sensor 84, the sensor 84 may provide the raw image data to the ISP processor 81 based on the sensor 84 interface type, or the sensor 84 may store the raw image data in the image memory 85.
  • the ISP processor 81 processes raw image data pixel by pixel in various formats.
  • each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 81 may perform one or more image processing operations on the raw image data and collect statistical information about the image data. The image processing operations may be performed with the same or different bit depth precision.
  • the ISP processor 81 may perform one or more image processing operations, such as interpolation processing, median filtering, bilateral smoothing filtering, and the like.
  • the processed image data may be sent to image memory 85 for additional processing before being displayed.
  • the ISP processor 81 receives the processed data from the image memory 85 and performs image data processing in the original domain and in the RGB and YCbCr color spaces on the processed data.
  • the image data processed by the ISP processor 81 may be output to the display 87 (the display 87 may be the display screen 60 of the terminal 100 as shown in FIG. 3) for viewing by a user and/or for further processing by a graphics engine or a GPU (Graphics Processing Unit).
  • the output of the ISP processor 81 can also be sent to the image memory 85, and the display 87 can read the image data from the image memory 85.
  • image memory 85 may be configured to implement one or more frame buffers.
  • the output of ISP processor 81 may be sent to encoder/decoder 86 for encoding/decoding image data.
  • the encoded image data can be saved, and decompressed before being displayed on the display 87.
  • the encoder/decoder 86 may be implemented by a CPU or GPU or a co-processor.
  • Statistics determined by the ISP processor 81 may be sent to the control logic 82 unit.
  • the statistics may include image sensor 834 statistics such as image output mode, auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens 832 shading correction, and the like.
  • Control logic 82 may include a processing element and/or a microcontroller that executes one or more routines (e.g., firmware), and the one or more routines may determine control parameters for the camera 83 and control parameters for the ISP processor 81 based on the received statistics.
  • camera 83 control parameters may include sensor 84 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters), camera flash control parameters, lens 832 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters.
  • ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (eg, during RGB processing), and lens 832 shading correction parameters.
  • the following are the steps of using the image processing circuit 80 (specifically, the ISP processor 81) to realize the image acquisition method:
  • the image processing circuit 80 can also perform the following steps:
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (Read-Only Memory, ROM), and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

一种图像获取方法，包括：(011)通过多种图像输出模式中的至少一种输出图像，多种图像输出模式包括根据第一像素值以获取第一图像的全分辨率输出模式、根据第二像素值和第三像素值以获取第二图像的第一合并输出模式、及根据第四像素值和第五像素值以获取第三图像的第二合并输出模式。

Description

图像获取方法及装置、终端和计算机可读存储介质
优先权信息
本申请请求2020年10月09日向中国国家知识产权局提交的、专利申请号为202011073863.3的专利申请及专利申请号为202022245405.5的专利申请的优先权和权益，并且通过参照将其全文并入此处。
技术领域
本申请涉及图像处理技术领域,特别涉及一种图像获取方法、图像获取装置、终端和非易失性计算机可读存储介质。
背景技术
目前,相机在进行拍摄时,图像的输出模式一般是固定的,对不同场景的适应能力较差,只能通过调节曝光参数进行一些补偿以提高不同场景下的成像质量。
发明内容
本申请的实施例提供了一种图像获取方法、图像获取装置、终端和非易失性计算机可读存储介质。
本申请实施方式的图像获取方法应用于图像传感器,所述图像传感器包括滤光片阵列和像素阵列,所述滤光片阵列包括最小重复单元,所述最小重复单元包括多个滤光片组,所述滤光片组包括彩色滤光片和全色滤光片,所述彩色滤光片的透过的光线的波段的宽度小于所述全色滤光片透过的光线的波段的宽度,所述彩色滤光片和所述全色滤光片均包括多个子滤光片,所述像素阵列包括多个像素,每个所述像素对应所述滤光片阵列的一个所述子滤光片,所述像素用于接收穿过对应的所述子滤光片的光线以生成电信号;所述图像获取方法包括通过多种图像输出模式中的至少一种输出图像,所述多种图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式。
本申请实施方式的图像获取装置包括输出模块。所述输出模块用于通过多种图像输出模式中的至少一种输出图像,所述多种图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式。
本申请实施方式的终端包括图像传感器和处理器,所述图像传感器包括滤光片阵列和像素阵列,所述滤光片阵列包括最小重复单元,所述最小重复单元包括多个滤光片组,所述滤光片组包括彩色滤光片和全色滤光片,所述彩色滤光片的透过的光线的波段的宽度小于所述全色滤光片透过的光线的波段的宽度,所述彩色滤光片和所述全色滤光片均包括多个子滤光片;所述像素阵列包括多个像素,每个所述像素对应所述滤光片阵列的一个所述子滤光片,所述像素用于接收穿过对应的所述子滤光片的光线以生成电信号;所述处理器用于:通过多种图像输出模式中的至少一种输出图像,所述多种图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式。
本申请实施方式的一个或多个包含计算机程序的非易失性计算机可读存储介质，当所述计算机程序被一个或多个处理器执行时，使得所述处理器执行图像获取方法。所述图像获取方法应用于图像传感器，所述图像传感器包括滤光片阵列和像素阵列，所述滤光片阵列包括最小重复单元，所述最小重复单元包括多个滤光片组，所述滤光片组包括彩色滤光片和全色滤光片，所述彩色滤光片的透过的光线的波段的宽度小于所述全色滤光片透过的光线的波段的宽度，所述彩色滤光片和所述全色滤光片均包括多个子滤光片，所述像素阵列包括多个像素，每个所述像素对应所述滤光片阵列的一个所述子滤光片，所述像素用于接收穿过对应的所述子滤光片的光线以生成电信号；所述图像获取方法包括通过多种图像输出模式中的至少一种输出图像，所述多种图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式。
本申请的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请某些实施方式的图像获取方法的流程示意图。
图2是本申请某些实施方式的图像获取装置的模块示意图。
图3是本申请某些实施方式的终端的结构示意图。
图4是本申请某些实施方式的图像传感器的分解示意图。
图5是本申请某些实施方式的像素阵列和读出电路的连接示意图。
图6是本申请某些实施方式的滤光片阵列的平面示意图。
图7a是本申请某些实施方式的滤光片阵列的最小重复单元的平面示意图。
图7b是本申请某些实施方式的滤光片阵列的最小重复单元的平面示意图。
图7c是本申请某些实施方式的滤光片阵列的最小重复单元的平面示意图。
图7d是本申请某些实施方式的滤光片阵列的最小重复单元的平面示意图。
图8是本申请某些实施方式的像素阵列的平面示意图。
图9是本申请某些实施方式的像素阵列的最小重复单元的平面示意图。
图10是本申请某些实施方式的图像获取方法的流程示意图。
图11是本申请某些实施方式的图像获取方法的流程示意图。
图12是本申请某些实施方式的图像获取方法的流程示意图。
图13是本申请某些实施方式的图像获取方法的流程示意图。
图14是本申请某些实施方式的图像获取方法的原理示意图。
图15是本申请某些实施方式的图像获取方法的原理示意图。
图16是本申请某些实施方式的图像获取方法的原理示意图。
图17是本申请某些实施方式的可读存储介质与处理器的连接示意图。
图18是本申请某些实施方式的图像处理电路的模块示意图。
具体实施方式
为了使本申请的目的、技术方案及优点更加清楚明白,以下结合附图及实施例,对本申请进行进一步详细说明。应当理解,此处所描述的具体实施例仅仅用以解释本申请,并不用于限定本申请。
请参阅图1,本申请实施方式的图像获取方法应用于图像传感器21,图像传感器21包括滤光片阵列22和像素阵列23,滤光片阵列22包括最小重复单元221,最小重复单元221包括多个滤光片组222,滤光片组222包括彩色滤光片223和全色滤光片224,彩色滤光片223的透过的光线的波段的宽度小于全色滤光片224透过的光线的波段的宽度,彩色滤光片223和全色滤光片224均包括多个子滤光片225;像素阵列23包括多个像素231,每个像素231对应滤光片阵列22的一个子滤光片225,像素231用于接收穿过对应的子滤光片225的光线以生成电信号;图像获取方法包括以下步骤:
011:通过多种图像输出模式中的至少一种输出图像,多种图像输出模式包括根据每个像素231读出的第一像素值以获取第一图像的全分辨率输出模式、根据全色滤光片224对应的多个像素231合并读出的第二像素值和彩色滤光片223对应的多个像素231合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据滤光片组222中的所有全色滤光片224对应的多个像素231合并读出的第四像素值和所有彩色滤光片223对应的多个像素231合并读出的第五像素值以获取第三图像的第二合并输出模式。
请结合图2,本申请实施方式的图像获取装置10应用于图像传感器21,图像获取装置10包括输出模块11。输出模块11用于执行步骤011。即,输出模块11用于通过多种图像输出模式中的至少一种输出图像。
请结合图3至图5,本申请实施方式的终端100包括图像传感器21和处理器30。图像传感器21包括滤光片阵列22和像素阵列23,处理器30用于执行步骤011。即,处理器30用于通过多种图像输出模式中的至少一种输出图像。
终端100包括手机、平板电脑、笔记本电脑、柜员机、闸机、智能手表、头显设备等，可以理解，终端100还可以是其他任意具有图像处理功能的装置。下面以终端100为手机进行说明，但终端100不限于手机。终端100包括相机20、处理器30和壳体40。相机20和处理器30均设置在壳体40内，壳体40还可用于安装终端100的供电装置、通信装置等功能模块，以使壳体40为功能模块提供防尘、防摔、防水等保护。
相机20可以是前置相机、后置相机、侧置相机、屏下相机等,在此不做限制。相机20包括镜头及图像传感器21,相机20在拍摄图像时,光线穿过镜头并到达图像传感器21,图像传感器21用于将照射到图像传感器21上的光信号转化为电信号。
请参阅图4和图5,图像传感器21包括微透镜阵列25、滤光片阵列22、像素阵列23和读出电路24。
微透镜阵列25包括多个微透镜251，微透镜251、子滤光片225和像素231一一对应设置，微透镜251用于将入射的光线进行会聚，会聚的光线穿过对应的子滤光片225后被对应的像素231接收，像素231根据接收的光线生成电信号。
滤光片阵列22和像素阵列23均可以由多个最小重复单元221排列而成,为方便描述,滤光片阵列22的最小重复单元221下称第一最小重复单元,像素阵列23的最小重复单元232下称第二最小重复单元。
第一最小重复单元包括多个滤光片组222,例如第一最小重复单元包括2个滤光片组222、3个滤光片组222、4个滤光片组222、5个滤光片组222、6个滤光片组222等,本申请实施方式中,第一最小重复单元包括4个滤光片组222,4个滤光片组222呈矩阵排列。
请参阅图6,每个滤光片组222均包括彩色滤光片223(如图6中存在填充图案的4个子滤光片225组成的矩形部分)和全色滤光片224(如图6中不存在填充图案的4个子滤光片225组成的矩形部分),彩色滤光片223的透过的光线的波段的宽度小于全色滤光片224透过的光线的波段的宽度,例如,彩色滤光片223的透过的光线的波段可对应红光的波段、绿光的波段、或蓝光的波段,全色滤光片224透过的光线的波段为所有可见光的波段,也即是说,彩色滤光片223仅允许特定颜色光线透光,而全色滤光片224可通过所有颜色的光线。当然,彩色滤光片223的透过的光线的波段还可对应其他色光的波段,如品红色光、紫色光、青色光、黄色光等,在此不作限制。
滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为4、9、16、25等可排列成矩阵的数量。本实施方式中，滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为4。
彩色滤光片223的数量和全色滤光片224的数量的比例可以是1:3、1:1或3:1。例如,彩色滤光片223的数量和全色滤光片224的数量的比例为1:3,则彩色滤光片223的数量为1,全色滤光片224的数量为3,此时全色滤光片224数量较多,在暗光下的成像质量更好;或者,彩色滤光片223的数量和全色滤光片224的数量的比例为1:1,则彩色滤光片223的数量为2,全色滤光片224的数量为2,此时既可以获得较好的色彩表现的同时,暗光下的成像质量也较好;或者,彩色滤光片223的数量和全色滤光片224的数量的比例为3:1,则彩色滤光片223的数量为3,全色滤光片224的数量为1,此时可获得更好的色彩表现,且能提高暗光下的成像质量。本申请实施方式中,如图7a所示,彩色滤光片223的数量为2,全色滤光片224的数量为2,2个彩色滤光片223和2个全色滤光片224呈矩阵排列,2个彩色滤光片223位于该矩阵对应的矩形的第一对角线D1方向上(具体为第一对角线D1上),2个全色滤光片224位于该矩阵对应的矩形的第二对角线D2方向上(具体为第二对角线D2上),第一对角线D1方向和第二对角线D2方向不同(如第一对角线D1方向和第二对角线D2方向不平行),从而兼顾了色彩表现和暗光成像质量。在其他实施方式中,一个彩色滤光片223和一个全色滤光片224位于第一对角线D1,另一个彩色滤光片223和另一个全色滤光片224位于第二对角线D2。
第一最小重复单元中的滤光片组222的彩色滤光片223的透过的光线的波段对应的颜色包括颜色a、颜色b和/或颜色c,例如第一最小重复单元中的滤光片组222的彩色滤光片223的透过的光线的波段对应的颜色包括颜色a、颜色b和颜色c、或者颜色a、颜色b或颜色c、或者颜色a和颜色b、或者颜色b和颜色c、或者颜色a和颜色c。其中,颜色a为红色,颜色b为绿色,颜色c为蓝色,或者例如颜色a为品红色,颜色b为青色,颜色c为黄色等,在此不做限制。本申请实施方式中,第一最小重复单元中的滤光片组222的彩色滤光片223的透过的光线的波段对应的颜色包括颜色a、颜色b和颜色c,颜色a、颜色b和颜色c分别为绿色、红色和蓝色,具体的,第一最小重复单元中的4个滤光片组222(如图7所示,分别为第一滤光片组2221、第二滤光片组2222、第三滤光片组2223和第四滤光片组2224)的彩色滤光片223对应的颜色分别为红色、绿色、蓝色和绿色,以形成类似拜耳阵列的排布,当然,第一滤光片组2221、第二滤光片组2222、第三滤光片组2223和第四滤光片组2224对应的颜色还可以分别是绿色、红色、绿色和蓝色、或者蓝色、绿色、红色和绿色等,在此不做限制。
彩色滤光片223和全色滤光片224均包括多个子滤光片225，例如彩色滤光片223和全色滤光片224包括2个子滤光片225、3个子滤光片225、4个子滤光片225、5个子滤光片225、6个子滤光片225等，本申请实施方式中，为方便矩阵排布，彩色滤光片223包括4个彩色子滤光片，全色滤光片224包括4个全色子滤光片。同一彩色滤光片223（或同一全色滤光片224）中的子滤光片225透过的光线的波段相同。
请参阅图7a,在一个例子中,滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为4,彩色滤光片223的数量和全色滤光片224的数量的比例为1:1,则第一最小重复单元为8行8列,包含64个子滤光片225,排布方式可以是:
Figure PCTCN2021105464-appb-000001
其中，w表示全色子滤光片，a、b和c表示彩色子滤光片，全色子滤光片指的是可滤除可见光波段之外的所有光线的子滤光片225，彩色子滤光片包括红色子滤光片、绿色子滤光片、蓝色子滤光片、品红色子滤光片、青色子滤光片和黄色子滤光片。红色子滤光片为滤除红光之外的所有光线的子滤光片225，绿色子滤光片为滤除绿光之外的所有光线的子滤光片225，蓝色子滤光片为滤除蓝光之外的所有光线的子滤光片225，品红色子滤光片为滤除品红色光之外的所有光线的子滤光片225，青色子滤光片为滤除青光之外的所有光线的子滤光片225，黄色子滤光片为滤除黄光之外的所有光线的子滤光片225。
a可以是红色子滤光片、绿色子滤光片、蓝色子滤光片、品红色子滤光片、青色子滤光片或黄色子滤光片，b可以是红色子滤光片、绿色子滤光片、蓝色子滤光片、品红色子滤光片、青色子滤光片或黄色子滤光片，c可以是红色子滤光片、绿色子滤光片、蓝色子滤光片、品红色子滤光片、青色子滤光片或黄色子滤光片。例如，b为红色子滤光片、a为绿色子滤光片、c为蓝色子滤光片；或者，c为红色子滤光片、a为绿色子滤光片、b为蓝色子滤光片；或者，a为红色子滤光片、b为蓝色子滤光片、c为绿色子滤光片等，在此不作限制；再例如，b为品红色子滤光片、a为青色子滤光片、c为黄色子滤光片等。在其他实施方式中，彩色滤光片还可包括其他颜色的子滤光片，如橙色子滤光片、紫色子滤光片等，在此不作限制。
请参阅图7b,在另一个例子中,滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为4,彩色滤光片223的数量和全色滤光片224的数量的比例为1:1,则第一最小重复单元为8行8列,包含64个子滤光片225,排布方式还可以是:
Figure PCTCN2021105464-appb-000002
Figure PCTCN2021105464-appb-000003
请参阅图7c,在再一个例子中,滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为9,滤光片组222中彩色滤光片223和全色滤光片224呈矩阵排列,彩色滤光片223的数量和全色滤光片224的数量的比例为4:5,则彩色滤光片223的数量为4,全色滤光片224的数量为5,此时全色滤光片224数量较多,在暗光下的成像质量更好;全色滤光片224位于滤光片组222对应的矩形的第三对角线D3和第四对角线D4上,第三对角线D3和第四对角线D4为该矩形的对角线,彩色滤光片223位于第三对角线D3方向或第四对角线D4方向上且不位于第三对角线D3和第四对角线D4上,第三对角线D3方向和第四对角线D4方向不同(如第三对角线D3方向和第四对角线D4方向不平行),具体地,第一最小重复单元为12行12列,包含144个子滤光片225,排布方式可以是:
Figure PCTCN2021105464-appb-000004
请参阅图7d，在又一个例子中，滤光片组222中彩色滤光片223的数量和全色滤光片224的数量之和为9，滤光片组222中彩色滤光片223和全色滤光片224呈矩阵排列，彩色滤光片223的数量和全色滤光片224的数量的比例为5:4，则彩色滤光片223的数量为5，全色滤光片224的数量为4，此时彩色滤光片223数量较多，可获得更好的色彩表现，且能提高暗光下的成像质量；彩色滤光片223位于滤光片组222对应的矩形的第五对角线D5和第六对角线D6上，第五对角线D5和第六对角线D6为该矩形的对角线，全色滤光片224位于第五对角线D5方向或第六对角线D6方向上且不位于第五对角线D5和第六对角线D6上，第五对角线D5方向和第六对角线D6方向不同（如第五对角线D5方向和第六对角线D6方向不平行），具体地，第一最小重复单元为12行12列，包含144个子滤光片225，排布方式还可以是：
Figure PCTCN2021105464-appb-000005
本实施方式的图像传感器21、相机20和终端100包括全色滤光片224,针对调节曝光参数后的成像效果提升有限,成像效果依旧较差的问题,在拍摄时图像传感器10可获取到更多的光量,从而无需调节拍摄参数,在不影响拍摄的稳定性的情况下,提高暗光下的成像质量,暗光下成像时,可兼顾稳定性和质量,暗光下成像的稳定性和质量均较高。且全色滤光片224和彩色滤光片223均由4个子滤光片225组成,在暗光下成像时可将4个子滤光片225对应的像素231合并输出,得到信噪比较高的图像,而在光线较为充足的场景下,可将每个子滤光片225对应的像素231单独进行输出,从而得到清晰度和信噪比均较高的图像。
请结合图4和图8,像素阵列23包括多个像素231,每个像素231对应一个子滤光片225,像素231用于接收穿过对应的子滤光片225的光线以生成电信号,处理器30处理电信号以获取像素231的像素值。
第二最小重复单元包括多个像素组233，与第一最小重复单元中的滤光片组222对应，第二最小重复单元包括4个像素组233且呈矩阵排列，每个像素组233对应一个滤光片组222，如图9所示，4个像素组233包括第一像素组2331、第二像素组2332、第三像素组2333和第四像素组2334，第一像素组2331、第二像素组2332、第三像素组2333和第四像素组2334分别与第一滤光片组2221、第二滤光片组2222、第三滤光片组2223和第四滤光片组2224对应设置。
像素组233包括彩色像素单元234和全色像素单元235,彩色像素单元234和全色像素单元235分别与彩色滤光片223和全色滤光片224一一对应设置。本实施方式中,彩色像素单元234和全色像素单元235均为2个,2个彩色像素单元234和2个全色像素单元235呈矩阵排列,2个彩色像素单元234位于该矩阵对应的矩形的一个第七对角线D7上,2个全色像素单元235位于该矩阵对应的矩形的第八对角线D8上。
彩色像素单元234包括彩色像素2341,全色像素单元235包括全色像素2311。彩色像素2341与彩色滤光片223的 子滤光片225(下称彩色子滤光片)一一对应设置,全色像素2311与全色滤光片224的子滤光片225(下称全色子滤光片)一一对应设置,与彩色滤光片223和全色滤光片224分别包括4个彩色子滤光片和4个全色子滤光片对应,彩色像素单元234和全色像素单元235也分别包括4个彩色像素2341和4个全色像素2311。彩色像素2341能够接收对应的彩色子滤光片透过的特定颜色(如红色、绿色、或蓝色)的光线以生成电信号,全色像素2311能够接收对应的全色子滤光片透过的所有颜色的光线以生成电信号,处理器30根据电信号即可获取全色像素2311及彩色像素2341对应的像素值。
彩色像素2341包括的颜色与对应设置的彩色子滤光片透过的光线的波段对应,第二最小重复单元中的彩色像素2341同样包括颜色a、颜色b和颜色c,例如第一最小重复单元中的彩色子滤光片透过的光线的波段包括红光的波段、绿光的波段和蓝光的波段,则彩色像素2341包括红色、绿色和蓝色。与4个滤光片组222对应的颜色对应,4个像素组233(即,第一像素组2331、第二像素组2332、第三像素组2333和第四像素组2334)中的彩色像素单元234的彩色像素2341对应的颜色分别为红色、绿色、蓝色和绿色,即颜色a为绿色、颜色b为红色、颜色c为蓝色。可以理解,彩色像素2341包括的颜色并不是彩色像素2341本身的颜色,而是彩色像素2341对应的彩色子滤光片透过的光线的波段对应的颜色。
第二最小重复单元中的全色像素2311的颜色与对应设置的第一最小重复单元中的全色子滤光片透过的光线的波段对应,例如全色像素2311包括颜色W,全色子滤光片透过的光线的波段为可见光波段,则颜色W为白色。可以理解,全色像素2311包括的颜色并不是全色像素2311本身的颜色,而是全色像素2311对应的全色子滤光片透过的光线的波段对应的颜色。
请参阅图5,读出电路24与像素阵列23电连接,用于控制像素阵列23的曝光以及像素231的像素值的读取和输出。
读出电路24包括垂直驱动单元241、控制单元242、列处理单元243和水平驱动单元244。
垂直驱动单元241包括移位寄存器和地址译码器。垂直驱动单元241包括读出扫描和复位扫描功能。读出扫描是指顺序地逐行扫描像素231,从这些像素231逐行地读取信号。例如,被选择并被扫描的像素行中的每一像素231输出的信号被传输到列处理单元243。复位扫描用于复位电荷,像素231的光电转换元件的光电荷被丢弃,从而可以开始新的光电荷的积累。
由列处理单元243执行的信号处理是相关双采样(CDS)处理。在CDS处理中,取出从所选像素行中的每一像素231输出的复位电平和信号电平,并且计算电平差。因而,获得了一行中的像素231的信号。列处理单元243可以具有用于将模拟像素信号转换为数字格式的模数(A/D)转换功能。
水平驱动单元244包括移位寄存器和地址译码器。水平驱动单元244顺序逐列扫描像素阵列23。通过水平驱动单元244执行的选择扫描操作，每一像素列被列处理单元243顺序地处理，并且被顺序输出。
控制单元242根据操作模式配置时序信号,利用多种时序信号来控制垂直驱动单元241、列处理单元243和水平驱动单元244协同工作。
具体地,处理器30可针对当前场景,在多种图像输出模式中选取至少一种来输出图像。例如,用户为了实现最高清晰度的图像的获取,可选取多种图像输出模式中的全分辨率输出模式输出图像。全分辨率输出模式中,每个像素231均输出一个第一像素值,从而生成分辨率大小等于图像传感器21的分辨率的图像,例如图像传感器21的分辨率为4800万像素,则可生成一张4800万像素大小的第一图像;
再例如,当前环境亮度不是很充足,用户为了提高图像的信噪比,可选择多种图像输出模式中的第一合并输出模式输出图像。第一合并输出模式中,全色滤光片224对应的全色像素单元235内的4个全色像素2311的电信号会被合并读出,以得到一个第二像素值,彩色滤光片223对应的彩色像素单元234内4个彩色像素2341的电信号会被合并读出,以得到一个第三像素值,根据所有的第三像素值和第四像素值可生成一张分辨率大小等于图像传感器21的分辨率的1/4的图像,例如图像传感器21的分辨率为4800万像素,则可生成一张1200万像素大小的第二图像;
再例如,当前环境亮度严重不足时,用户为了最大化的提高图像的信噪比,可选择多种图像输出模式中的第二合并输出模式输出图像。第二合并输出模式中,每个滤光片组222中的所有全色滤光片224对应的全色像素单元235内的8个全色像素2311的电信号会被合并读出,以得到一个第四像素值,每个滤光片组222中的所有彩色滤光片223对应的彩色像素单元234内的8个彩色像素2341的电信号会被合并读出,以得到一个第五像素值,然后所有第四像素值和所有第五像素值分别生成一个中间图像,两个中间图像合成后可生成一张分辨率大小等于图像传感器21的分辨率的1/16的图像,例如图像传感器21的分辨率为4800万像素,则可生成一张300万像素大小的第三图像。
其中,电信号合并读出可以是将多个像素231积累的电信号进行累加以得到一个累加电信号,然后根据该累加电信号确定对应的像素值,或者,电信号合并读出还可以是将每个像素231的像素值读出后将多个像素值进行累加以作为一个像素的像素值。
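上述“合并读出”的第二种方式（先读出各像素的像素值、再将多个像素值累加为一个像素值）可用如下Python片段粗略示意；其中函数名bin_pixels与k×k的合并窗口均为说明性假设，并非本申请限定的实现：

```python
def bin_pixels(image, k=2):
    """将每 k×k 个像素的像素值累加合并为一个像素值（示意实现）。

    对应文中先读出各像素值再累加的合并读出方式；
    k=2 时输出像素数为原来的 1/4，k=4 时为 1/16。
    """
    rows, cols = len(image), len(image[0])
    return [
        [
            sum(
                image[r + dr][c + dc]
                for dr in range(k)
                for dc in range(k)
            )
            for c in range(0, cols, k)
        ]
        for r in range(0, rows, k)
    ]


# 4×4 像素值合并为 2×2：左上角 4 个像素 1+2+5+6 = 14
raw = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]
binned = bin_pixels(raw)
```

实际传感器中合并通常在电荷域或模拟域完成；此处仅示意合并后像素数与像素值的对应关系。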
当然,处理器30可同时选择多种图像输出模式中的多种,以输出第一图像、第二图像和/或第三图像。例如,处理器30同时输出第一图像和第二图像、或第二图像和第三图像、或第一图像和第三图像、或第一图像、第二图像和第三图像。用户可从多种图像输出模式输出的多种图像中选取较为满意的图像。
本申请实施方式的图像获取方法、图像获取装置和终端100通过多种图像输出模式中至少一种输出图像,可针对不同场景使用不同的图像输出模式,对不同场景的适应能力较强,可在清晰度和信噪比之间取得较佳的平衡,提高不同场景下的成像效果。
请再次参阅图1,在某些实施方式中,图像获取方法包括:
012:获取拍摄信息,拍摄信息包括环境亮度和拍摄参数中至少一个;
013:确定与所述拍摄信息适配的所述图像输出模式。
请再次参阅图2，在某些实施方式中，图像获取装置10还包括获取模块12和确定模块13。获取模块12和确定模块13分别用于执行步骤012和步骤013。即，获取模块12用于获取拍摄信息；确定模块13用于确定与所述拍摄信息适配的所述图像输出模式。
请再次参阅图3，在某些实施方式中，处理器30还用于执行步骤012和步骤013。即，处理器30还用于获取拍摄信息和确定与所述拍摄信息适配的所述图像输出模式。
具体地,处理器30首先获取拍摄信息,拍摄信息包括环境亮度和拍摄参数中至少一个,例如拍摄信息包括环境亮度、或者拍摄信息包括拍摄参数、或者拍摄信息包括环境亮度和拍摄参数,其中,拍摄参数可包括拍摄模式、曝光参数等。本实施方式以拍摄参数包括环境亮度和拍摄参数(拍摄参数包括拍摄模式)为例进行说明。
处理器30可获取当前拍摄模式和终端100的光传感器50(图3示)采集的环境光强度信号,然后根据光强度信号确定环境亮度;或者处理器30可控制相机20拍摄图像,然后根据拍摄图像的灰度值分布确定环境亮度;或者在拍摄图像时,为了在不同环境亮度下取得较好的拍摄效果,一般会自动调节曝光参数,如调节光圈大小、感光度等,环境亮度和曝光参数存在映射关系,处理器30根据拍摄图像时的曝光参数即可确定环境亮度。
在获取到环境亮度和拍摄参数后,处理器30可确定与环境亮度和/或拍摄参数适配的图像输出模式。例如,处理器30可确定与拍摄模式和环境亮度适配的图像输出模式。
由于拍摄模式一般需要用户主动选择,因此,处理器30可优先根据拍摄模式确定图像输出模式,例如拍摄模式为全分辨率模式时,处理器30确定适配的图像输出模式为全分辨率输出模式;再例如,拍摄模式为高分辨率模式,则处理器30确定适配的图像输出模式为第一合并输出模式;再例如,拍摄模式为低分辨率模式,则处理器30确定适配的图像输出模式为第二合并输出模式。
在未选择拍摄模式时,处理器30可确定与环境亮度适配的图像输出模式。
例如，在环境亮度较高时（如环境亮度高于第一环境亮度阈值），处理器30可确定适配的图像输出模式为全分辨率输出模式；在环境亮度正常时（如环境亮度高于第二环境亮度阈值且小于第一环境亮度阈值），处理器30可确定适配的图像输出模式为第一合并输出模式；在环境亮度较低时（如环境亮度小于第二环境亮度阈值），处理器30可确定适配的图像输出模式为第二合并输出模式。从而针对不同环境亮度选择适配的图像输出模式，在清晰度和信噪比之间取得较好的平衡，保证清晰度和信噪比不会过低，从而提高成像质量。
在确定图像输出模式后,处理器30即可控制图像传感器21按适配的图像输出模式输出对应的图像。随着拍摄信息的变化,图像输出模式可实时变化,处理器30实时获取拍摄信息,每隔预定时间确定一次图像输出模式,从而保证图像输出模式与当前拍摄信息的实时适配。且图像传感器21包括全色滤光片224,可提高像素的进光量,提升暗光下的成像效果。
能够根据拍摄信息确定对应的图像输出模式,从而在应对具有不同的环境亮度、拍摄参数等拍摄信息的场景时,选择合适的图像输出模式,在清晰度和信噪比之间取得较佳的平衡,对不同场景的适应能力较强,可提高不同场景下的成像效果。
请参阅图10,在某些实施方式中,步骤013(具体为确定与环境亮度适配的图像输出模式)包括以下步骤:
0131:在环境亮度大于第一环境亮度阈值时,确定图像输出模式为全分辨率输出模式;
0132：在环境亮度大于第二环境亮度阈值且小于第一环境亮度阈值时，确定图像输出模式为第一合并输出模式；及
0133：在环境亮度小于第二环境亮度阈值时，确定图像输出模式为第二合并输出模式，第一环境亮度阈值大于第二环境亮度阈值。
请再次参阅图2,在某些实施方式中,确定模块13还用于执行步骤0131、步骤0132和步骤0133。即,确定模块13还用于在环境亮度大于第一环境亮度阈值时,确定图像输出模式为全分辨率输出模式;在环境亮度大于第二环境亮度阈值且小于第一环境亮度阈值时,确定图像输出模式为第一合并输出模式;及在环境亮度小于第二环境亮度阈值时,确定图像输出模式为第二合并输出模式。
请再次参阅图3,在某些实施方式中,处理器30还用于执行步骤0131、步骤0132和步骤0133。即,处理器30还用于在环境亮度大于第一环境亮度阈值时,确定图像输出模式为全分辨率输出模式;在环境亮度大于第二环境亮度阈值且小于第一环境亮度阈值时,确定图像输出模式为第一合并输出模式;及在环境亮度小于第二环境亮度阈值时,确定图像输出模式为第二合并输出模式。
具体的,处理器30获取的拍摄信息可仅包括环境亮度,在确定与拍摄信息适配的图像输出模式时,确定与环境亮度适配的图像输出模式,环境亮度的获取较为简单,可简单快速地确定图像输出模式。
在终端100出厂时,可预设依次减小的第一环境亮度阈值和第二环境亮度阈值,第一环境亮度阈值和第二环境亮度阈值可根据经验值确定,或者通过对终端100进行测试得到,例如将终端100放置在一环境亮度可调的环境下,通过调节环境亮度,获取与环境亮度对应的图像传感器21的像素的电信号,例如建立图像传感器21的像素的电信号的平均值与环境亮度的映射关系,在该平均值对应的像素值为200时,认为该平均值对应的环境亮度为第一环境亮度阈值,在该平均值对应的像素值为150时,认为该平均值对应的环境亮度为第二环境亮度阈值。如此,环境亮度阈值根据对终端100的图像传感器21测试得到,环境亮度阈值与终端100更为适配,环境亮度阈值的准确性较高。
在环境亮度大于第一环境亮度阈值时（下称高亮环境），此时环境光充足，每个像素能够得到的光量较多，处理器30可确定适配的图像输出模式为全分辨率输出模式，以得到清晰度和信噪比均较高的第一图像；在环境亮度大于第二环境亮度阈值且小于或等于第一环境亮度阈值时（下称中亮环境），此时环境光仍较多，但相较于高亮环境，每个像素能够得到的光量降低了，处理器30可确定适配的图像输出模式为第一合并输出模式，以得到清晰度略微降低但信噪比提升的第二图像；在环境亮度小于或等于第二环境亮度阈值时（下称低亮环境），此时环境光较少，每个像素能够得到的光量也较少，处理器30可确定适配的图像输出模式为第二合并输出模式，以得到清晰度降低但信噪比显著提升的第三图像。从而针对不同环境亮度选择适配的图像输出模式，在清晰度和信噪比之间取得较好的平衡，保证清晰度和信噪比不会过低，从而提高成像质量。
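上述按环境亮度阈值选择图像输出模式的逻辑可用如下Python片段示意；其中函数名与示例阈值（像素均值200与150，沿用上文举例）均为说明性假设：

```python
FULL_RES = "全分辨率输出模式"
FIRST_MERGE = "第一合并输出模式"
SECOND_MERGE = "第二合并输出模式"


def select_output_mode(brightness, th1, th2):
    """根据环境亮度与两个环境亮度阈值（th1 > th2）确定图像输出模式。"""
    if brightness > th1:       # 高亮环境
        return FULL_RES
    if brightness > th2:       # 中亮环境（大于 th2 且小于或等于 th1）
        return FIRST_MERGE
    return SECOND_MERGE        # 低亮环境（小于或等于 th2）


# 示例：两个阈值分别对应像素均值 200 和 150
mode = select_output_mode(170, 200, 150)
```

处理器可每隔预定时间重新执行一次该判断，以保证图像输出模式与当前拍摄信息实时适配。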
请参阅图11,在某些实施方式中,拍摄参数包括曝光参数,步骤013(具体为确定与环境亮度和拍摄参数适配的图像输出模式)还包括以下步骤:
0134:根据环境亮度和曝光参数确定进光量;
0135:在进光量大于第一进光量阈值时,确定图像输出模式为全分辨率输出模式;
0136：在进光量大于第二进光量阈值且小于第一进光量阈值时，确定图像输出模式为第一合并输出模式；
0137：在进光量小于第二进光量阈值时，确定图像输出模式为第二合并输出模式。
请再次参阅图2，在某些实施方式中，确定模块13还用于执行步骤0134、步骤0135、步骤0136和步骤0137。即，确定模块13还用于根据环境亮度和曝光参数确定进光量；在进光量大于第一进光量阈值时，确定图像输出模式为全分辨率输出模式；在进光量大于第二进光量阈值且小于第一进光量阈值时，确定图像输出模式为第一合并输出模式；及在进光量小于第二进光量阈值时，确定图像输出模式为第二合并输出模式。
请再次参阅图3，在某些实施方式中，处理器30还用于执行步骤0134、步骤0135、步骤0136和步骤0137。即，处理器30还用于根据环境亮度和曝光参数确定进光量；在进光量大于第一进光量阈值时，确定图像输出模式为全分辨率输出模式；在进光量大于第二进光量阈值且小于第一进光量阈值时，确定图像输出模式为第一合并输出模式；及在进光量小于第二进光量阈值时，确定图像输出模式为第二合并输出模式。
具体的,由于相机20在拍摄时可调节曝光参数如光圈大小、快门时间、感光度等,而即使相同的环境亮度下,不同的曝光参数下像素的像素值也是存在明显差异的。例如,在环境亮度不变的情况下,光圈越大,则进光量越大,每个像素能够得到的光量就越多,像素值则越大;再例如,在环境亮度不变的情况下,快门时间越大,进光量也越大,每个像素能够得到的光量就越多,像素值则越大;再例如,在环境亮度不变的情况下,感光度越大,虽然实际进光量并不会发生变化,但是相同的进光量产生的电信号变大了,也可以等同于进光量变大了,像素值也会越大;因此,除了环境亮度外,曝光参数也影响着图像输出模式的选择,例如以曝光参数包括光圈大小为例,高亮环境下光圈大小较小时的进光量可能小于中亮环境下光圈较大时的进光量,因此,处理器30可先根据环境亮度和曝光参数确定进光量,然后根据进光量来确定图像输出模式。
具体为,在进光量大于第一进光量阈值时,每个像素能够得到的光量较多,处理器30可确定适配的图像输出模式为全分辨率输出模式,以得到清晰度和信噪比均较高的第一图像;在进光量大于第二进光量阈值且小于或等于第一进光量阈值时,每个像素能够得到的光量降低了,处理器30可确定适配的图像输出模式为第一合并输出模式,以得到清晰度略微降低但信噪比提升的第二图像;在进光量小于或等于第二进光量阈值时,每个像素能够得到的光量也较少,处理器30可确定适配的图像输出模式为第二合并输出模式,以得到清晰度降低但信噪比显著提升的第三图像。从而针对不同环境亮度和曝光参数选择适配的图像输出模式,在清晰度和信噪比之间取得较好的平衡,保证清晰度和信噪比不会过低,从而提高成像质量。
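先由环境亮度与曝光参数估算进光量、再按进光量阈值选择输出模式的流程可示意如下；其中进光量的计算式（亮度×光圈通光面积×快门时间×感光度增益）仅为假设的简化模型，并非文中给出的公式：

```python
def estimate_light_intake(brightness, aperture_area, shutter_time, iso_gain):
    """由环境亮度与曝光参数估算等效进光量（计算式为示意性假设）。"""
    return brightness * aperture_area * shutter_time * iso_gain


def mode_by_light_intake(intake, th1, th2):
    """根据进光量与两个进光量阈值（th1 > th2）确定图像输出模式。"""
    if intake > th1:
        return "全分辨率输出模式"
    if intake > th2:
        return "第一合并输出模式"
    return "第二合并输出模式"


# 高亮环境下的小光圈与中亮环境下的大光圈可得到相近的进光量
intake_a = estimate_light_intake(800, 0.25, 1 / 60, 1.0)  # 高亮、小光圈
intake_b = estimate_light_intake(400, 0.50, 1 / 60, 1.0)  # 中亮、大光圈
```

该示例也说明了文中的观点：仅凭环境亮度不足以确定输出模式，须结合曝光参数折算为进光量后再作判断。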
请参阅图12,在某些实施方式中,步骤011包括以下步骤:
0111:通过全分辨率输出模式输出第一图像;和/或
0112:通过第一合并输出模式输出第二图像;和/或
0113:通过第二合并输出模式输出第三图像。
请再次参阅图2,在某些实施方式中,确定模块13还用于执行步骤0111、步骤0112和步骤0113。即,确定模块13还用于通过全分辨率输出模式输出第一图像;和/或通过第一合并输出模式输出第二图像;和/或通过第二合并输出模式输出第三图像。
请再次参阅图3,在某些实施方式中,处理器30还用于执行步骤0111、步骤0112和步骤0113。即,处理器30用于通过全分辨率输出模式输出第一图像;和/或通过第一合并输出模式输出第二图像;和/或通过第二合并输出模式输出第三图像。
具体的,在图像输出模式为全分辨率输出模式时,处理器30控制图像传感器21按全分辨率输出模式输出第一图像;在图像输出模式为第一合并输出模式时,处理器30控制图像传感器21按第一合并输出模式输出第二图像;在图像输出模式为第二合并输出模式时,处理器30控制图像传感器21按第二合并输出模式输出第三图像。
处理器30还可控制图像传感器21同时按全分辨率输出模式和第一合并输出模式输出第一图像和第二图像、或者处理器30可控制图像传感器21同时按全分辨率输出模式和第二合并输出模式输出第一图像和第三图像、或者处理器30可控制图像传感器21同时按第一合并输出模式和第二合并输出模式输出第二图像和第三图像、或者处理器30可控制图像传感器21同时按全分辨率输出模式、第一合并输出模式和第二合并输出模式输出第一图像、第二图像和第三图像。
在图像传感器21同时输出第一图像和第二图像、或者第二图像和第三图像、或者第一图像和第三图像、或者第一图像、第二图像和第三图像后,用户可根据自身喜好选择目标图像进行保存。
可以理解,图像传感器21同时输出多张图像可以是:图像传感器21根据不同的图像输出模式快速进行多次输出以得到多张图像;还可以是:图像传感器21输出每个像素的像素值(即,以全分辨率模式输出第一图像),然后处理器30根据每个像素值进行合并处理以分别输出第一图像、第二图像和/或第三图像。
如此,处理器30可通过适配的图像输出模式控制图像传感器21输出对应的图像。
请参阅图13,在某些实施方式中,步骤0111包括以下步骤:
01111:基于预定的第一插值算法,对每个第一像素值进行插值以获取呈拜耳阵列排布的第一图像。
请再次参阅图2,在某些实施方式中,确定模块13还用于执行步骤01111。即,确定模块13还用于基于预定的第一插值算法,对每个第一像素值进行插值以获取呈拜耳阵列排布的第一图像。
请再次参阅图3,在某些实施方式中,处理器30还用于执行步骤01111。即,处理器30还用于基于预定的第一插值算法,对每个第一像素值进行插值以获取呈拜耳阵列排布的第一图像。
请参阅图14，具体地，在确定图像输出模式为全分辨率输出模式时，图像传感器21获取每个像素的第一像素值以生成原始图像P0，原始图像P0中的像素P01与像素阵列23中的像素231（图8示）一一对应，然后处理器30基于预设的第一插值算法，对原始图像P0中的每个像素P01的第一像素值进行插值，以使得原始图像P0中的每个第一像素值均被插值为第一图像P1中对应的目标像素P11的像素值，第一图像P1的像素P11和原始图像P0的像素P01一一对应，第一图像P1中与被插值的像素的位置对应的像素为目标像素。如图14所示，根据要生成的呈拜耳阵列的第一图像P1中每个像素的颜色（颜色a为绿色、颜色b为红色、颜色c为蓝色），将原始图像P0中的像素P01的第一像素值均转换为第一图像P1中目标像素P11的颜色的目标像素值，例如，第一图像P1的左上角的第一个目标像素P11（待插值像素的目标像素）为红色像素，则处理器30根据原始图像P0中，该待插值像素的第一像素值及该待插值像素周围的红色像素P01的第一像素值对该待插值像素进行插值处理（如取均值等），从而将待插值像素的第一像素值转换为目标像素P11的目标像素值。如此，可将原始图像P0中每个像素P01插值转换成第一图像P1中对应的目标像素P11，以生成呈拜耳阵列排布的第一图像P1。
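文中“如取均值等”的插值处理可用如下Python片段示意：将待插值像素的像素值与其周围同色像素的像素值取均值，作为目标像素的目标像素值；函数名与示例数值均为说明性假设：

```python
def interpolate_to_target(value, neighbor_values):
    """将待插值像素的像素值与周围同色像素的像素值取均值（示意实现）。"""
    values = [value] + list(neighbor_values)
    return sum(values) / len(values)


# 示例：待插值像素的第一像素值为 90，周围红色像素的第一像素值为 100 和 110
target = interpolate_to_target(90, [100, 110])  # (90 + 100 + 110) / 3
```

实际的第一插值算法还可按距离加权或按方向梯度选择邻域像素，取均值仅是其中最简单的一种。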
请再次参阅图13,在某些实施方式中,步骤0112包括:
01121:基于预定的第二插值算法,对每个第二像素值和第三像素值进行插值以获取呈拜耳阵列排布的第二图像。
请再次参阅图2，在某些实施方式中，确定模块13还用于执行步骤01121。即，确定模块13还用于基于预定的第二插值算法，对每个第二像素值和第三像素值进行插值以获取呈拜耳阵列排布的第二图像。
请再次参阅图3，在某些实施方式中，处理器30还用于执行步骤01121。即，处理器30还用于基于预定的第二插值算法，对每个第二像素值和第三像素值进行插值以获取呈拜耳阵列排布的第二图像。
请参阅图15，具体地，在确定图像输出模式为第一合并输出模式时，图像传感器21将全色滤光片224对应的全色像素单元235内的4个全色像素2351的电信号合并读出，以得到一个第二像素值，彩色滤光片223对应的彩色像素单元内的4个彩色像素2341的电信号合并读出，以得到一个第三像素值，然后图像传感器21根据第二像素值和第三像素值输出原始图像P0'，此时的原始图像P0'的像素数为原始图像P0的1/4。处理器30基于预设的第二插值算法对原始图像P0'中的第二像素值和第三像素值进行插值，以获取呈拜耳阵列排布的第二图像P2，第二图像P2的像素P21和原始图像P0'的像素P01'一一对应，第二图像P2中与被插值的像素的位置对应的像素为目标像素P21。处理器30可根据要生成的拜耳阵列的第二图像P2中每个像素的颜色（颜色a为绿色、颜色b为红色、颜色c为蓝色），将原始图像P0'中的像素P01'的第二像素值或第三像素值转换为第二图像P2中目标像素P21的颜色的目标像素值，例如，第二图像P2的左上角的第一个像素P21为红色像素（待插值像素的目标像素），则处理器30根据原始图像P0'左上角第一个像素P01'（即，待插值像素）的第二像素值及周围的红色像素P01'的第三像素值对该待插值像素进行插值处理，从而将该待插值像素的第二像素值转换为目标像素P21的目标像素值。如此，可将原始图像P0'中像素P01'插值转换成第二图像P2中对应的目标像素P21，以生成呈拜耳阵列排布的第二图像P2。
请再次参阅图13,在某些实施方式中,步骤0113包括:
01131:基于预定的第三插值算法,对每个第四像素值和第五像素值进行插值以获取呈拜耳阵列排布的第三图像。
请再次参阅图2，在某些实施方式中，确定模块13还用于执行步骤01131。即，确定模块13还用于基于预定的第三插值算法，对每个第四像素值和第五像素值进行插值以获取呈拜耳阵列排布的第三图像。
请再次参阅图3，在某些实施方式中，处理器30还用于执行步骤01131。即，处理器30还用于基于预定的第三插值算法，对每个第四像素值和第五像素值进行插值以获取呈拜耳阵列排布的第三图像。
具体地,在确定图像输出模式为第二合并输出模式时,图像传感器21将每个滤光片组222中的所有全色滤光片224对应的全色像素单元235内的8个全色像素2351的电信号合并读出,以得到一个第四像素值,将每个滤光片组222中的所有彩色滤光片223对应的彩色像素单元234内的8个彩色像素2341的电信号合并读出,以得到一个第五像素值,然后图像传感器21根据第四像素值和第五像素值分别输出第一中间图像B1和第二中间图像B2。处理器30基于预设的第三插值算法,对第一中间图像B1和第二中间图像B2进行插值,以获取呈拜耳阵列排布的第三图像P3。例如,可将第一中间图像B1和第二中间图像B2中位置对应的像素的像素值进行加权求和(例如权值均为0.5),以作为第三图像P3中对应位置的目标像素P31的目标像素值,例如,将第一中间图像B1左上角的第一个像素B11的第四像素值x1和第二中间图像B2左上角的第一个像素B21的第五像素值x2进行加权求和以得到第三图像P3左上角第一个像素P31的目标像素值为0.5x1+0.5x2,从而根据第一中间图像B1和第二中间图像B2插值得到呈拜尔阵列排布的第三图像P3。
可以理解的是,上述实施方式中,不同图像之间的对应位置的像素指的是,以图像左上角第一个像素为坐标原点,不同图像中坐标相同的像素即为位置对应的像素。
请结合图1和图17,本申请实施例还提供了一种非易失性计算机可读存储介质200。一个或多个包含计算机程序201的非易失性计算机可读存储介质200中,当计算机程序201被一个或多个处理器300执行时,使得处理器300执行以下步骤:
011:通过多种图像输出模式中的至少一种输出图像。
请结合图10,进一步地,当计算机程序201被一个或多个处理器300执行时,处理器300还可以执行以下步骤:
0131:在环境亮度大于第一环境亮度阈值时,确定图像输出模式为全分辨率输出模式;
0132:在环境亮度大于第二环境亮度阈值且小于第一环境亮度阈值时,确定图像输出模式为第一合并输出模式;及
0133:在环境亮度小于第二环境亮度阈值时,确定图像输出模式为第二合并输出模式,第一环境亮度阈值大于第二环境亮度阈值。
请参阅图18，本申请实施例处理器30可以是图像处理电路80，图像处理电路80可利用硬件和/或软件组件实现，包括定义ISP（Image Signal Processing，图像信号处理）管线的各种处理单元。图18为一个实施例中图像处理电路80的示意图。如图18所示，为便于说明，仅示出与本申请实施例相关的图像处理技术的各个方面。
如图18所示,图像处理电路80包括ISP处理器81和控制逻辑器82。相机83捕捉的图像数据首先由ISP处理器 81处理,ISP处理器81对图像数据进行分析以捕捉可用于确定相机83的一个或多个控制参数的图像统计信息。相机83(相机83可以是如图3所示的终端100的相机20)可包括一个或多个透镜832和图像传感器834(图像传感器834可以是如图3所示的相机20的图像传感器21)。图像传感器834可包括色彩滤镜阵列(色彩滤镜阵列可以是图6所示的滤光片阵列22),图像传感器834可获取每个成像像素捕捉的光强度和波长信息,并提供可由ISP处理器81处理的一组原始图像数据。传感器84(如陀螺仪)可基于传感器84接口类型把采集的图像处理的参数(如防抖参数)提供给ISP处理器81。传感器84接口可以为SMIA(Standard Mobile Imaging Architecture,标准移动成像架构)接口、其它串行或并行照相机接口或上述接口的组合。
此外,图像传感器834也可将原始图像数据发送给传感器84,传感器84可基于传感器84接口类型把原始图像数据提供给ISP处理器81,或者传感器84将原始图像数据存储到图像存储器85中。
ISP处理器81按多种格式逐个像素地处理原始图像数据。例如,每个图像像素可具有8、10、12或14比特的位深度,ISP处理器81可对原始图像数据进行一个或多个图像处理操作、收集关于图像数据的统计信息。其中,图像处理操作可按相同或不同的位深度精度进行。
ISP处理器81还可从图像存储器85接收图像数据。例如，传感器84接口将原始图像数据发送给图像存储器85，图像存储器85中的原始图像数据再提供给ISP处理器81以供处理。图像存储器85可为存储器53、存储器53的一部分、存储设备、或电子设备内的独立的专用存储器，并可包括DMA（Direct Memory Access，直接存储器存取）特征。
当接收到来自图像传感器834接口或来自传感器84接口或来自图像存储器85的原始图像数据时,ISP处理器81可进行一个或多个图像处理操作,如插值处理、中值滤波、双边平滑滤波等。处理后的图像数据可发送给图像存储器85,以便在被显示之前进行另外的处理。ISP处理器81从图像存储器85接收处理数据,并对处理数据进行原始域中以及RGB和YCbCr颜色空间中的图像数据处理。ISP处理器81处理后的图像数据可输出给显示器87(显示器87可以是如图3所示的终端100的显示屏60),以供用户观看和/或由图形引擎或GPU(Graphics Processing Unit,图形处理器)进一步处理。此外,ISP处理器81的输出还可发送给图像存储器85,且显示器87可从图像存储器85读取图像数据。在一个实施例中,图像存储器85可被配置为实现一个或多个帧缓冲器。此外,ISP处理器81的输出可发送给编码器/解码器86,以便编码/解码图像数据。编码的图像数据可被保存,并在显示于显示器87设备上之前解压缩。编码器/解码器86可由CPU或GPU或协处理器实现。
ISP处理器81确定的统计数据可发送给控制逻辑器82单元。例如,统计数据可包括图像输出模式、自动曝光、自动白平衡、自动聚焦、闪烁检测、黑电平补偿、透镜832阴影校正等图像传感器834统计信息。控制逻辑器82可包括执行一个或多个例程(如固件)的处理元件和/或微控制器,一个或多个例程可根据接收的统计数据,确定相机83的控制参数及ISP处理器81的控制参数。例如,相机83的控制参数可包括传感器84控制参数(例如增益、曝光控制的积分时间、防抖参数等)、照相机闪光控制参数、透镜832控制参数(例如聚焦或变焦用焦距)、或这些参数的组合。ISP控制参数可包括用于自动白平衡和颜色调整(例如,在RGB处理期间)的增益水平和色彩校正矩阵,以及透镜832阴影校正参数。
请结合图1,以下为运用图像处理电路80(具体为ISP处理器81)实现图像获取方法的步骤:
011:通过多种图像输出模式中的至少一种输出图像。
请结合图10,进一步地,运用图像处理电路80(具体为ISP处理器81)还可以执行以下步骤:
0131:在环境亮度大于第一环境亮度阈值时,确定图像输出模式为全分辨率输出模式;
0132:在环境亮度大于第二环境亮度阈值且小于第一环境亮度阈值时,确定图像输出模式为第一合并输出模式;及
0133：在环境亮度小于第二环境亮度阈值时，确定图像输出模式为第二合并输出模式，第一环境亮度阈值大于第二环境亮度阈值。本领域普通技术人员可以理解实现上述实施例方法中的全部或部分流程，是可以通过计算机程序来指令相关的硬件来完成，程序可存储于一非易失性计算机可读取存储介质中，该程序在执行时，可包括如上述各方法的实施例的流程。其中，存储介质可为磁碟、光盘、只读存储记忆体（Read-Only Memory，ROM）等。
以上实施例仅表达了本申请的几种实施方式,其描述较为具体和详细,但并不能因此而理解为对本申请专利范围的限制。应当指出的是,对于本领域的普通技术人员来说,在不脱离本申请构思的前提下,还可以做出若干变形和改进,这些都属于本申请的保护范围。因此,本申请专利的保护范围应以所附权利要求为准。

Claims (76)

  1. 一种图像获取方法,其特征在于,应用于图像传感器,所述图像传感器包括滤光片阵列和像素阵列,所述滤光片阵列包括最小重复单元,所述最小重复单元包括多个滤光片组,所述滤光片组包括彩色滤光片和全色滤光片,所述彩色滤光片的透过的光线的波段的宽度小于所述全色滤光片透过的光线的波段的宽度,所述彩色滤光片和所述全色滤光片均包括多个子滤光片,所述像素阵列包括多个像素,每个所述像素对应所述滤光片阵列的一个所述子滤光片,所述像素用于接收穿过对应的所述子滤光片的光线以生成电信号;所述图像获取方法包括:通过多种图像输出模式中的至少一种输出图像,所述多种图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式。
  2. 根据权利要求1所述的图像获取方法,其特征在于,还包括:获取拍摄信息,所述拍摄信息包括环境亮度和拍摄参数中至少一个;及确定与所述拍摄信息适配的所述图像输出模式。
  3. 根据权利要求2所述的图像获取方法,其特征在于,所述拍摄参数包括曝光参数,所述获取拍摄信息,包括:根据光传感器获取的环境光强度信号确定所述环境亮度;或根据所述曝光参数确定所述环境亮度。
  4. 根据权利要求2所述的图像获取方法,其特征在于,所述确定与所述拍摄信息适配的所述图像输出模式,包括:确定与所述环境亮度和/或所述拍摄参数适配的所述图像输出模式。
  5. 根据权利要求4所述的图像获取方法,其特征在于,所述确定与所述环境亮度适配的所述图像输出模式,包括:在所述环境亮度大于第一环境亮度阈值时,确定所述图像输出模式为所述全分辨率输出模式;在所述环境亮度大于第二环境亮度阈值且小于所述第一环境亮度阈值时,确定所述图像输出模式为所述第一合并输出模式;及在所述环境亮度小于所述第二环境亮度阈值时,确定所述图像输出模式为所述第二合并输出模式。
  6. 根据权利要求4所述的图像获取方法,其特征在于,所述拍摄参数包括曝光参数,所述确定与所述环境亮度和所述拍摄参数适配的所述图像输出模式,包括:根据所述环境亮度和所述曝光参数确定进光量;在所述进光量大于第一进光量阈值时,确定所述图像输出模式为所述全分辨率输出模式;在所述环境亮度大于第二进光量阈值且小于所述第一进光量阈值时,确定所述图像输出模式为所述第一合并输出模式;及在所述环境亮度小于所述第二进光量阈值时,确定所述图像输出模式为所述第二合并输出模式。
  7. 根据权利要求1所述的图像获取方法,其特征在于,所述通过预设的多种图像输出模式中的至少一种输出图像,包括:通过所述全分辨率输出模式输出所述第一图像;和/或通过所述第一合并输出模式输出所述第二图像;和/或通过所述第二合并输出模式输出所述第三图像。
  8. 根据权利要求7所述的图像获取方法,其特征在于,所述通过所述全分辨率输出模式输出所述第一图像,包括:基于预定的第一插值算法,对每个所述第一像素值进行插值以获取呈拜耳阵列排布的所述第一图像。
  9. 根据权利要求7所述的图像获取方法,其特征在于,所述通过所述第一合并输出模式输出所述第二图像,包括:基于预定的第二插值算法,对每个所述第二像素值和所述第三像素值进行插值以获取呈拜耳阵列排布的所述第二图像。
  10. 根据权利要求7所述的图像获取方法,其特征在于,所述通过所述第二合并输出模式输出第三图像,包括:基于预定的第三插值算法,对每个所述第四像素值和所述第五像素值进行插值以获取呈拜耳阵列排布的所述第三图像。
  11. 根据权利要求1所述的图像获取方法,其特征在于,所述滤光片组为4个,4个所述滤光片组呈矩阵排列,所述彩色滤光片和所述全色滤光片均包括4个所述子滤光片。
  12. 根据权利要求1所述的图像获取方法,其特征在于,所述滤光片组包括2个所述彩色滤光片和2个所述全色滤光片,2个所述彩色滤光片和2个所述全色滤光片呈矩阵排列,2个所述彩色滤光片位于第一对角线方向上,2个所述全色滤光片位于第二对角线方向上,所述第一对角线方向和所述第二对角线方向不同。
  13. 根据权利要求12所述的图像获取方法,其特征在于,所述最小重复单元为8行8列64个所述子滤光片,排布方式为:
    Figure PCTCN2021105464-appb-100001
    其中,w表示全色子滤光片,a、b和c均表示彩色子滤光片。
  14. 根据权利要求12所述的图像获取方法,其特征在于,所述最小重复单元为8行8列64个所述子滤光片,排布方式为:
    Figure PCTCN2021105464-appb-100002
    Figure PCTCN2021105464-appb-100003
    其中,w表示全色子滤光片,a、b和c均表示彩色子滤光片。
  15. 根据权利要求1所述的图像获取方法,其特征在于,在每个所述滤光片组中,所述全色滤光片设置在第三对角线及第四对角线上,所述彩色滤光片设置在所述第三对角线方向或所述第四对角线方向,所述第三对角线方向与所述第四对角线方向不同。
  16. 根据权利要求15所述的图像获取方法,其特征在于,所述最小重复单元为12行12列144个所述子滤光片,排布方式为:
    Figure PCTCN2021105464-appb-100004
    其中,w表示全色子滤光片,a、b和c均表示彩色子滤光片。
  17. 根据权利要求1所述的图像获取方法,其特征在于,在每个所述滤光片组中,所述彩色滤光片设置在第五对角线及第六对角线上,所述全色滤光片设置在所述第五对角线方向或所述第六对角线方向,所述第五对角线方向与所述第六对角线方向不同。
  18. 根据权利要求17所述的图像获取方法,其特征在于,所述最小重复单元为12行12列144个所述子滤光片,排布方式为:
    Figure PCTCN2021105464-appb-100005
    其中,w表示全色子滤光片,a、b和c均表示彩色子滤光片。
  19. 一种图像获取装置,其特征在于,应用于图像传感器,所述图像传感器包括滤光片阵列和像素阵列,所述滤光片阵列包括最小重复单元,所述最小重复单元包括多个滤光片组,所述滤光片组包括彩色滤光片和全色滤光片,所述彩色滤光片的透过的光线的波段的宽度小于所述全色滤光片透过的光线的波段的宽度,所述彩色滤光片和所述全色滤光片均包括多个子滤光片,所述像素阵列包括多个像素,每个所述像素对应所述滤光片阵列的一个所述子滤光片,所述像素用于接收穿过对应的所述子滤光片的光线以生成电信号;所述图像获取装置包括:获取模块,用于获取拍摄信息,所述拍摄信息包括环境亮度和拍摄参数中至少一个;第一确定模块,用于确定与所述拍摄信息适配的图像输出模式,所述图像输出模式包括根据每个像素读出的第一像素值以获取第一图像的全分辨率输出模式、根据所述全色滤光片对应的多个像素合并读出的第二像素值和所述彩色滤光片对应的多个像素合并读出的第三像素值以获取第二图像的第一合并输出模式、及根据所述滤光片组中的所有所述全色滤光片对应的多个像素合并读出的第四像素值和所有所述彩色滤光片对应的多个像素合并读出的第五像素值以获取第三图像的第二合并输出模式;及第二确定模块,用于根据适配的所述图像输出模式输出图像。
  20. A terminal, comprising an image sensor and a processor, the image sensor comprising a filter array, a pixel array, and a readout circuit, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a waveband of light transmitted by the color filter being narrower than a waveband of light transmitted by the panchromatic filter, each of the color filters and the panchromatic filters comprising a plurality of sub-filters; the pixel array comprising a plurality of pixels, each pixel corresponding to one sub-filter of the filter array and being configured to receive light passing through the corresponding sub-filter to generate an electrical signal; the processor being configured to: obtain shooting information, the shooting information comprising at least one of an ambient brightness and a shooting parameter; determine an image output mode adapted to the shooting information, the image output modes comprising a full-resolution output mode in which a first image is obtained from a first pixel value read out from each pixel, a first binning output mode in which a second image is obtained from second pixel values read out by binning the plurality of pixels corresponding to each panchromatic filter and third pixel values read out by binning the plurality of pixels corresponding to each color filter, and a second binning output mode in which a third image is obtained from fourth pixel values read out by binning the plurality of pixels corresponding to all the panchromatic filters in a filter set and fifth pixel values read out by binning the plurality of pixels corresponding to all the color filters in the filter set; and control the readout circuit to output an image according to the adapted image output mode.
  21. The terminal according to claim 20, wherein the processor is further configured to determine the image output mode adapted to the ambient brightness and/or the shooting parameter.
  22. The terminal according to claim 21, wherein the processor is further configured to: when the ambient brightness is greater than a first ambient brightness threshold, determine the image output mode to be the full-resolution output mode; when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determine the image output mode to be the first binning output mode; and when the ambient brightness is less than the second ambient brightness threshold, determine the image output mode to be the second binning output mode.
  23. The terminal according to claim 21, wherein the shooting parameter comprises an exposure parameter, and the processor is further configured to: determine a light influx according to the ambient brightness and the exposure parameter; when the light influx is greater than a first light-influx threshold, determine the image output mode to be the full-resolution output mode; when the light influx is greater than a second light-influx threshold and less than the first light-influx threshold, determine the image output mode to be the first binning output mode; and when the light influx is less than the second light-influx threshold, determine the image output mode to be the second binning output mode.
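The two-threshold mode selection of claims 22 and 23 can be sketched as follows. This is a minimal illustration: the threshold values and the light-influx formula are placeholders, not values or formulas taken from the patent.

```python
# Output-mode labels corresponding to the three readout modes in the claims.
FULL_RES = "full_resolution"
FIRST_BINNING = "first_binning"
SECOND_BINNING = "second_binning"

def select_mode_by_brightness(brightness, high=500.0, low=50.0):
    """Claim 22 style: pick an output mode from ambient brightness alone
    (thresholds `high`/`low` are illustrative placeholders)."""
    if brightness > high:
        return FULL_RES          # bright scene: full per-pixel readout
    if brightness > low:
        return FIRST_BINNING     # mid brightness: bin pixels within each filter
    return SECOND_BINNING        # dark scene: bin pixels across the filter set

def select_mode_by_light_influx(brightness, exposure_time, aperture_area,
                                high=1000.0, low=100.0):
    """Claim 23 style: derive a light influx from ambient brightness and
    exposure parameters, then threshold the influx (formula is a stand-in)."""
    influx = brightness * exposure_time * aperture_area
    if influx > high:
        return FULL_RES
    if influx > low:
        return FIRST_BINNING
    return SECOND_BINNING

print(select_mode_by_brightness(800))  # full_resolution
```

The key design point is that both selectors degrade gracefully: as available light falls, the sensor trades resolution for per-readout signal by binning progressively larger pixel groups.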
  24. The terminal according to claim 20, wherein the processor is further configured to: output the first image according to the full-resolution output mode; and/or output the second image according to the first binning output mode; and/or output the third image according to the second binning output mode.
  25. A non-transitory computer-readable storage medium containing a computer program which, when executed by one or more processors, causes the one or more processors to perform the image obtaining method according to any one of claims 1 to 18.
  26. An image sensor, comprising a filter array, a pixel array, and a readout circuit, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a waveband of light transmitted by the color filter being narrower than a waveband of light transmitted by the panchromatic filter, each of the color filters and the panchromatic filters comprising four sub-filters; the pixel array comprising pixels, the readout circuit being electrically connected to the pixel array and configured to control exposure of the pixel array and the reading and outputting of pixel values of the pixels.
  27. The image sensor according to claim 26, wherein there are four filter sets, and the four filter sets are arranged in a matrix.
  28. The image sensor according to claim 27, wherein in each filter set, the panchromatic filters are arranged in a first diagonal direction, the color filters are arranged in a second diagonal direction, and the first diagonal direction is different from the second diagonal direction.
  29. The image sensor according to claim 28, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100006
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  30. The image sensor according to claim 28, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100007
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  31. The image sensor according to claim 27, wherein in each filter set, the panchromatic filters are arranged on a third diagonal and a fourth diagonal, the color filters are arranged in the third diagonal direction or the fourth diagonal direction, and the third diagonal direction is different from the fourth diagonal direction.
  32. The image sensor according to claim 31, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100008
    Figure PCTCN2021105464-appb-100009
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  33. The image sensor according to claim 27, wherein in each filter set, the color filters are arranged on a fifth diagonal and a sixth diagonal, the panchromatic filters are arranged in the fifth diagonal direction or the sixth diagonal direction, and the fifth diagonal direction is different from the sixth diagonal direction.
  34. The image sensor according to claim 33, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100010
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  35. The image sensor according to any one of claims 29, 30, 32, and 34, wherein a is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter; and/or b is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter; and/or c is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter.
  36. The image sensor according to claim 35, wherein b is a red sub-filter, a is a green sub-filter, and c is a blue sub-filter; and/or c is a red sub-filter, a is a green sub-filter, and b is a blue sub-filter; and/or b is a magenta sub-filter, a is a cyan sub-filter, and c is a yellow sub-filter.
  37. An image obtaining method, applied to an image sensor, the image sensor comprising a filter array and a pixel array, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a spectral response wavelength range of the color filter being narrower than a spectral response wavelength range of the panchromatic filter, the color filter comprising N sub-filters and the panchromatic filter comprising N sub-filters, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the pixel array being configured to photoelectrically convert light, from a scene of a given set of subjects, that passes through the filter array, and to generate image data based on the light from the scene of the given set of subjects; the image obtaining method comprising: generating an image through at least one of a plurality of image output modes, the plurality of image output modes comprising: a first output mode in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, the first resolution of the first image being equal to the resolution of the pixel array; or a second output mode in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, the second resolution of the second image being lower than the first resolution of the first image and equal to 1/N of the first resolution, where N is a positive integer greater than or equal to 2.
  38. The image obtaining method according to claim 37, wherein one filter set comprises M color filters and panchromatic filters in total, and the plurality of image output modes further comprise a third output mode in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, the third resolution of the third image being lower than the second resolution of the second image and equal to 1/M of the second resolution, where M is a positive integer greater than or equal to 2.
  39. The image obtaining method according to claim 38, wherein the first output mode, in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, comprises obtaining the first image from a first pixel value read out from each pixel; the second output mode, in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, comprises obtaining the second image from second pixel values read out by binning the N pixels corresponding to each panchromatic filter and third pixel values read out by binning the N pixels corresponding to each color filter; and the third output mode, in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, comprises obtaining the third image from fourth pixel values read out by binning the plurality of pixels corresponding to all the panchromatic filters in a filter set and fifth pixel values read out by binning the plurality of pixels corresponding to all the color filters in the filter set.
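The three readouts of claim 39 can be illustrated on a toy pixel block. This sketch assumes a 4x4 block of pixels under one filter set (2x2 filters, each covering 2x2 pixels, i.e. N = 4 and M = 4), uses averaging as the binning operation, and uses synthetic pixel values; a real sensor bins charge or voltage in the readout circuit rather than averaging digital values.

```python
import numpy as np

# Synthetic 4x4 pixel block under one filter set.
pixels = np.arange(16, dtype=float).reshape(4, 4)

# First output mode: every pixel read out individually (full resolution).
first = pixels.copy()

# Second output mode: the N = 4 pixels under each filter are binned
# (averaged here), giving 1/N of the full resolution.
second = pixels.reshape(2, 2, 2, 2).mean(axis=(1, 3))

# Third output mode: pixels of all filters in the set are binned together,
# giving 1/(M*N) of the full resolution (one value per set in this toy case;
# the claim bins panchromatic and color pixels separately).
third = pixels.mean()

print(first.size, second.size)  # 16 4
```

The resolutions fall as 16 → 4 → 1 values, i.e. by 1/N and then a further 1/M, matching the ratios recited in claims 37 and 38.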
  40. The image obtaining method according to claim 38, further comprising, before the generating an image through at least one of the plurality of image output modes: determining the image output mode according to an ambient brightness.
  41. The image obtaining method according to claim 38, further comprising, before the generating an image through at least one of the plurality of image output modes: obtaining shooting information, the shooting information comprising at least one of an ambient brightness and a shooting parameter; and determining the image output mode adapted to the shooting information.
  42. The image obtaining method according to claim 41, wherein the shooting parameter comprises an exposure parameter, and the obtaining shooting information comprises: determining the ambient brightness from an ambient light intensity signal obtained by a light sensor; or determining the ambient brightness from the exposure parameter.
  43. The image obtaining method according to any one of claims 40 to 42, further comprising: when the ambient brightness is greater than a first ambient brightness threshold, determining the image output mode to be the first output mode; when the ambient brightness is greater than a second ambient brightness threshold and less than the first ambient brightness threshold, determining the image output mode to be the second output mode; and when the ambient brightness is less than the second ambient brightness threshold, determining the image output mode to be the third output mode.
  44. The image obtaining method according to claim 39, wherein the generating an image through at least one of the plurality of image output modes comprises the first output mode, in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, and the generating the first image of the first resolution comprises: interpolating each of the first pixel values based on a predetermined first interpolation algorithm to obtain the first image arranged in a Bayer array.
  45. The image obtaining method according to claim 39, wherein the generating an image through at least one of the plurality of image output modes comprises the second output mode, in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, and the generating the second image of the second resolution comprises: interpolating each of the second pixel values and each of the third pixel values based on a predetermined second interpolation algorithm to obtain the second image arranged in a Bayer array.
  46. The image obtaining method according to claim 39, wherein the generating an image through at least one of the plurality of image output modes comprises the third output mode, in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, and the generating the third image of the third resolution comprises: interpolating each of the fourth pixel values and each of the fifth pixel values based on a predetermined third interpolation algorithm to obtain the third image arranged in a Bayer array.
  47. The image obtaining method according to claim 37 or 38, wherein there are four filter sets, the four filter sets are arranged in a 2x2 matrix to form one minimal repeating unit, the color filter of each filter set comprises four sub-filters having the same spectral response wavelength range characteristic, and the panchromatic filter of each filter set comprises four sub-filters having the same spectral response wavelength range characteristic.
  48. The image obtaining method according to claim 37 or 38, wherein two color filters and two panchromatic filters arranged in a matrix form the filter set, the filter set is rectangular, the two color filters are located in a first diagonal direction of the rectangle corresponding to the matrix, the two panchromatic filters are located in a second diagonal direction of the rectangle corresponding to the matrix, and the first diagonal direction is different from the second diagonal direction.
  49. The image obtaining method according to claim 48, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100011
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  50. The image obtaining method according to claim 48, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100012
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  51. The image obtaining method according to claim 37 or 38, wherein in each filter set, the panchromatic filters are arranged on a third diagonal and a fourth diagonal, the color filters are arranged in the third diagonal direction or the fourth diagonal direction, and the third diagonal direction is different from the fourth diagonal direction.
  52. The image obtaining method according to claim 51, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100013
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  53. The image obtaining method according to claim 37 or 38, wherein in each filter set, the color filters are arranged on a fifth diagonal and a sixth diagonal, the panchromatic filters are arranged in the fifth diagonal direction or the sixth diagonal direction, and the fifth diagonal direction is different from the sixth diagonal direction.
  54. The image obtaining method according to claim 53, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100014
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  55. An image obtaining apparatus, applied to an image sensor, the image sensor comprising a filter array and a pixel array, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a spectral response wavelength range of the color filter being narrower than a spectral response wavelength range of the panchromatic filter, the color filter comprising N sub-filters and the panchromatic filter comprising N sub-filters, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the pixel array being configured to photoelectrically convert light, from a scene of a given set of subjects, that passes through the filter array, and to generate image data based on the light from the scene of the given set of subjects; the image obtaining apparatus comprising: a determining module configured to generate an image through at least one of a plurality of image output modes, the plurality of image output modes comprising: a first output mode in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, the first resolution of the first image being equal to the resolution of the pixel array; or a second output mode in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, the second resolution of the second image being lower than the first resolution of the first image and equal to 1/N of the first resolution, where N is a positive integer greater than or equal to 2.
  56. The image obtaining apparatus according to claim 55, wherein one filter set comprises M color filters and panchromatic filters in total, and the plurality of image output modes further comprise a third output mode in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, the third resolution of the third image being lower than the second resolution of the second image and equal to 1/M of the second resolution, where M is a positive integer greater than or equal to 2.
  57. A terminal, comprising an image sensor and a processor, the image sensor comprising a filter array and a pixel array, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a spectral response wavelength range of the color filter being narrower than a spectral response wavelength range of the panchromatic filter, the color filter comprising N sub-filters and the panchromatic filter comprising N sub-filters, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the pixel array being configured to photoelectrically convert light, from a scene of a given set of subjects, that passes through the filter array, and to generate image data based on the light from the scene of the given set of subjects; the processor being configured to: generate an image through at least one of a plurality of image output modes, the plurality of image output modes comprising: a first output mode in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, the first resolution of the first image being equal to the resolution of the pixel array; or a second output mode in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, the second resolution of the second image being lower than the first resolution of the first image and equal to 1/N of the first resolution, where N is a positive integer greater than or equal to 2.
  58. The terminal according to claim 57, wherein one filter set comprises M color filters and panchromatic filters in total, and the plurality of image output modes further comprise a third output mode in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, the third resolution of the third image being lower than the second resolution of the second image and equal to 1/M of the second resolution, where M is a positive integer greater than or equal to 2.
  59. A non-transitory computer-readable storage medium containing a computer program which, when executed by one or more processors, causes the one or more processors to perform the image obtaining method according to any one of claims 37 to 54.
  60. An image obtaining method, applied to an image sensor, the image sensor comprising a filter array and a pixel array, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, one filter set comprising M color filters and panchromatic filters in total, a spectral response wavelength range of the color filter being narrower than a spectral response wavelength range of the panchromatic filter, the color filter comprising N sub-filters and the panchromatic filter comprising N sub-filters, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the pixel array being configured to photoelectrically convert light, from a scene of a given set of subjects, that passes through the filter array, and to generate image data based on the light from the scene of the given set of subjects; the image obtaining method comprising: generating an image through at least one of a plurality of image output modes, the plurality of image output modes comprising: a first output mode in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, the first resolution of the first image being equal to the resolution of the pixel array; or a third output mode in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, the third resolution of the third image being lower than the first resolution of the first image and equal to 1/(M*N) of the first resolution, where M is a positive integer greater than or equal to 2 and N is a positive integer greater than or equal to 2.
  61. The image obtaining method according to claim 60, wherein the plurality of image output modes further comprise a second output mode in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, the second resolution of the second image being lower than the first resolution of the first image and equal to 1/N of the first resolution.
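The resolution bookkeeping in claims 60 and 61 reduces to simple arithmetic. The pixel count below is an illustrative figure, not a value from the patent; only the 1/N and 1/(M*N) ratios come from the claims.

```python
# Illustrative numbers: a 64 MP pixel array, N = 4 sub-filters per filter,
# and M = 4 filters per filter set.
pixel_array_resolution = 64_000_000
N, M = 4, 4

first = pixel_array_resolution             # first output mode: full resolution
second = pixel_array_resolution // N       # second output mode: 1/N
third = pixel_array_resolution // (M * N)  # third output mode: 1/(M*N)

print(first, second, third)  # 64000000 16000000 4000000
```

Note that the third mode's 1/(M*N) ratio is exactly the second mode's 1/N ratio reduced by a further factor of 1/M, which is how claim 38 expresses the same relation.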
  62. An image obtaining method, applied to an image sensor, the image sensor comprising a filter array and a pixel array, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, one filter set comprising M color filters and panchromatic filters in total, a spectral response wavelength range of the color filter being narrower than a spectral response wavelength range of the panchromatic filter, the color filter comprising N sub-filters and the panchromatic filter comprising N sub-filters, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the pixel array being configured to photoelectrically convert light, from a scene of a given set of subjects, that passes through the filter array, and to generate image data based on the light from the scene of the given set of subjects; the image obtaining method comprising: generating an image through at least one of a plurality of image output modes, the plurality of image output modes comprising: a second output mode in which a second image of a second resolution is generated from signals read out from all pixels of the pixel array, the second resolution of the second image being lower than the resolution of the pixel array and equal to 1/N of the resolution of the pixel array, where N is a positive integer greater than or equal to 2; or a third output mode in which a third image of a third resolution is generated from signals read out from all pixels of the pixel array, the third resolution of the third image being lower than the second resolution of the second image and equal to 1/M of the second resolution, where M is a positive integer greater than or equal to 2.
  63. The image obtaining method according to claim 62, wherein the plurality of image output modes further comprise a first output mode in which a first image of a first resolution is generated from signals read out from all pixels of the pixel array, the first resolution of the first image being equal to the resolution of the pixel array.
  64. An image sensor, comprising a filter array, a pixel array, and a readout circuit, the filter array comprising a minimal repeating unit, the minimal repeating unit comprising a plurality of filter sets, each filter set comprising color filters and panchromatic filters, a waveband of light transmitted by the color filter being narrower than a waveband of light transmitted by the panchromatic filter, the color filter comprising N first sub-filters having the same spectral response wavelength range characteristic, the panchromatic filter comprising N second sub-filters having the same spectral response wavelength range characteristic, the pixel array comprising a plurality of pixels arranged in correspondence with the sub-filters of the filter array, the readout circuit being electrically connected to the pixel array and configured to control exposure of the pixel array and the reading and outputting of pixel values of the pixels, where N is a positive integer greater than or equal to 2.
  65. The image sensor according to claim 64, wherein N is 4, 9, or 16.
  66. The image sensor according to claim 64, wherein in the filter set, the ratio of the color filters to the panchromatic filters is 1:1, 1:3, 3:1, 4:5, or 5:4.
  67. The image sensor according to claim 64, wherein a plurality of filter sets arranged in a matrix form one minimal repeating unit.
  68. The image sensor according to claim 67, wherein there are four filter sets, the four filter sets are arranged in a 2x2 matrix to form one minimal repeating unit, the filter set is rectangular, and in each filter set, the panchromatic filters are arranged in a first diagonal direction, the color filters are arranged in a second diagonal direction, and the first diagonal direction is different from the second diagonal direction.
  69. The image sensor according to claim 68, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100015
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  70. The image sensor according to claim 68, wherein the minimal repeating unit consists of 64 sub-filters in 8 rows and 8 columns, arranged as:
    Figure PCTCN2021105464-appb-100016
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  71. The image sensor according to claim 67, wherein there are four filter sets, the four filter sets are arranged in a 2x2 matrix to form one minimal repeating unit, the filter set is rectangular, and in each filter set, the panchromatic filters are arranged on a third diagonal and a fourth diagonal, the color filters are arranged in the third diagonal direction or the fourth diagonal direction, and the third diagonal direction is different from the fourth diagonal direction.
  72. The image sensor according to claim 71, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100017
    Figure PCTCN2021105464-appb-100018
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  73. The image sensor according to claim 67, wherein there are four filter sets, the four filter sets are arranged in a 2x2 matrix to form one minimal repeating unit, the filter set is rectangular, and in each filter set, the color filters are arranged on a fifth diagonal and a sixth diagonal, the panchromatic filters are arranged in the fifth diagonal direction or the sixth diagonal direction, and the fifth diagonal direction is different from the sixth diagonal direction.
  74. The image sensor according to claim 73, wherein the minimal repeating unit consists of 144 sub-filters in 12 rows and 12 columns, arranged as:
    Figure PCTCN2021105464-appb-100019
    where w denotes a panchromatic sub-filter, and a, b, and c each denote a color sub-filter.
  75. The image sensor according to any one of claims 69, 70, 72, and 74, wherein a is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter; b is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter; and c is any one of a red sub-filter, a green sub-filter, a blue sub-filter, a magenta sub-filter, a cyan sub-filter, or a yellow sub-filter.
  76. The image sensor according to claim 75, wherein b is a red sub-filter, a is a green sub-filter, and c is a blue sub-filter; or c is a red sub-filter, a is a green sub-filter, and b is a blue sub-filter; or b is a magenta sub-filter, a is a cyan sub-filter, and c is a yellow sub-filter.
PCT/CN2021/105464 2020-10-09 2021-07-09 Image obtaining method and apparatus, terminal, and computer-readable storage medium WO2022073364A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21876886.9A EP4216534A4 (en) 2020-10-09 2021-07-09 METHOD AND DEVICE FOR RECEIVING IMAGES, TERMINAL AND COMPUTER-READABLE STORAGE MEDIUM
US18/193,134 US20230254553A1 (en) 2020-10-09 2023-03-30 Image obtaining method and apparatus, terminal, and computer-readable storage medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202022245405.5U CN213279832U (zh) 2020-10-09 2020-10-09 Image sensor, camera, and terminal
CN202011073863.3 2020-10-09
CN202011073863.3A CN112118378A (zh) 2020-10-09 2020-10-09 Image obtaining method and apparatus, terminal, and computer-readable storage medium
CN202022245405.5 2020-10-09

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/193,134 Continuation US20230254553A1 (en) 2020-10-09 2023-03-30 Image obtaining method and apparatus, terminal, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2022073364A1 true WO2022073364A1 (zh) 2022-04-14

Family

ID=81126325

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/105464 WO2022073364A1 (zh) 2020-10-09 2021-07-09 图像获取方法及装置、终端和计算机可读存储介质

Country Status (3)

Country Link
US (1) US20230254553A1 (zh)
EP (1) EP4216534A4 (zh)
WO (1) WO2022073364A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024093518A1 (zh) * 2022-11-01 2024-05-10 荣耀终端有限公司 Image output mode switching method and related device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101233762A (zh) * 2005-07-28 2008-07-30 伊斯曼柯达公司 Image sensor with improved light sensitivity
CN102369721A (zh) * 2009-03-10 2012-03-07 美商豪威科技股份有限公司 Color filter array (CFA) image with synthesized panchromatic image
CN102461175A (zh) * 2009-06-09 2012-05-16 全视科技有限公司 Interpolation for four-channel color filter array
CN104280803A (zh) * 2013-07-01 2015-01-14 全视科技有限公司 Color filter array, color filter array device, and image sensor
US20160150199A1 (en) * 2014-11-25 2016-05-26 Omnivision Technologies, Inc. Rgbc color filter array patterns to minimize color aliasing
US20180122046A1 (en) * 2015-06-24 2018-05-03 Tripurari Singh Method and system for robust and flexible extraction of image information using color filter arrays
CN110876027A (zh) * 2018-08-29 2020-03-10 三星电子株式会社 Image sensor, electronic device including the same, and image scaling processing method
CN111586323A (zh) * 2020-05-07 2020-08-25 Oppo广东移动通信有限公司 Image sensor, control method, camera assembly, and mobile terminal
CN213279832U (zh) * 2020-10-09 2021-05-25 Oppo广东移动通信有限公司 Image sensor, camera, and terminal


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4216534A4 *


Also Published As

Publication number Publication date
EP4216534A4 (en) 2024-01-24
EP4216534A1 (en) 2023-07-26
US20230254553A1 (en) 2023-08-10

Similar Documents

Publication Publication Date Title
CN213279832U (zh) Image sensor, camera, and terminal
CN112118378A (zh) Image obtaining method and apparatus, terminal, and computer-readable storage medium
WO2021196554A1 (zh) Image sensor, processing system and method, electronic device, and storage medium
TWI504257B (zh) Exposing pixel groups in producing digital images
JP5330258B2 (ja) Processing of images having color pixels and panchromatic pixels
US6982756B2 (en) Digital camera, image signal processing method and recording medium for the same
WO2021208593A1 (zh) High-dynamic-range image processing system and method, electronic device, and storage medium
US8106976B2 (en) Peripheral light amount correction apparatus, peripheral light amount correction method, electronic information device, control program and readable recording medium
KR100580911B1 (ko) Image synthesis method and imaging apparatus
US8125543B2 (en) Solid-state imaging device and imaging apparatus with color correction based on light sensitivity detection
JP5345944B2 (ja) Generation of low-resolution images
JP2011010108A (ja) Imaging control apparatus, imaging apparatus, and imaging control method
US9160937B2 (en) Signal processing apparatus and signal processing method, solid-state imaging apparatus, electronic information device, signal processing program, and computer readable storage medium
CN111711755B (zh) Image processing method and apparatus, terminal, and computer-readable storage medium
WO2009025825A1 (en) Image sensor having a color filter array with panchromatic checkerboard pattern
CN108419022A (zh) Control method, control apparatus, computer-readable storage medium, and computer device
CN113170061B (zh) Image sensor, imaging apparatus, electronic device, image processing system, and signal processing method
CN113840067B (zh) Image sensor, image generation method, apparatus, and electronic device
JP2006157600A (ja) Digital camera
US8937680B2 (en) Image pickup unit and image processing unit for image blur correction
US20230254553A1 (en) Image obtaining method and apparatus, terminal, and computer-readable storage medium
CN115280766B (zh) Image sensor, imaging apparatus, electronic device, image processing system, and signal processing method
CN110602420B (zh) Camera, black level adjustment method and apparatus
JP2006121165A (ja) Imaging apparatus and image forming method
KR20240082675A (ko) Image signal processor and image sensing device thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21876886

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021876886

Country of ref document: EP

Effective date: 20230417

NENP Non-entry into the national phase

Ref country code: DE