CN111726549A - Image sensor, electronic device, and chip - Google Patents


Info

Publication number
CN111726549A
CN111726549A
Authority
CN
China
Prior art keywords
pixel
original
white
pixel data
image
Prior art date
Legal status
Granted
Application number
CN202010605102.1A
Other languages
Chinese (zh)
Other versions
CN111726549B (en)
Inventor
池文明
王炳文
李顺展
王磊
张玮
Current Assignee
Shenzhen Goodix Technology Co Ltd
Original Assignee
Shenzhen Goodix Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Goodix Technology Co Ltd
Priority to CN202010605102.1A
Publication of CN111726549A
Application granted
Publication of CN111726549B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/702: SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/71: Charge-coupled device [CCD] sensors; Charge-transfer registers specially adapted for CCD sensors
    • H04N25/75: Circuitry for providing, modifying or processing image signals from the pixel array

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Facsimile Heads (AREA)

Abstract

Embodiments of the present application provide an image sensor, an electronic device, and a chip. The image sensor comprises a pixel array and a readout circuit, wherein the pixel array comprises a plurality of sub-pixel arrays, each sub-pixel array comprises a plurality of pixel groups arranged in a matrix, each pixel group comprises a color pixel of one color and at least one white pixel, and each pixel corresponds to one photoelectric conversion region; the readout circuit is configured to acquire charges in the photoelectric conversion regions of pixels in the sub-pixel array, and to generate first original pixel data and second original pixel data from the acquired charges. The correspondingly arranged readout circuit thus generates two kinds of original pixel data containing different information, and using both kinds in later-stage image processing yields a low-illumination image with better quality.

Description

Image sensor, electronic device, and chip
Technical Field
Embodiments of the present application relate to the field of image processing, and in particular to an image sensor, an electronic device, and a chip.
Background
In recent years, as image sensors have become smaller while their pixel counts have increased, individual pixels have become increasingly fine, and image sensors with high-density pixels are essential for capturing high-resolution images. However, because the size of the image sensor is limited, the photosensitive area of each pixel is generally also limited, which restricts shooting performance in low-light environments.
For this reason, the related art proposes an image sensor in which each of the RGB colors is arranged in a quad-Bayer pattern, as shown in fig. 1. This image sensor can combine four adjacent same-color pixels into one pixel for use in low-illumination or high-frame-rate application scenarios, so as to effectively enlarge the photosensitive area of each pixel. However, in this image sensor, in order to accumulate light of a specific color wavelength individually for each pixel unit, a color filter is provided that allows only light in a specific wavelength range to pass through. This causes a loss in the amount of light entering each pixel unit of the image sensor and affects the imaging performance of the image sensor in low-light environments.
Disclosure of Invention
An object of the present invention is to provide an image sensor, an electronic device and a chip, which overcome all or part of the above-mentioned disadvantages.
In a first aspect, an embodiment of the present application provides an image sensor, which includes a pixel array and a readout circuit;
the pixel array comprises a plurality of sub-pixel arrays, each sub-pixel array comprises a plurality of pixel groups arranged in a matrix, each pixel group comprises a color pixel of one color and at least one white pixel, and each pixel corresponds to one photoelectric conversion region;
a readout circuit for acquiring charges in photoelectric conversion regions of pixels in the sub-pixel array, and generating at least one first original pixel data and at least one second original pixel data from the acquired charges, the first original pixel data including original color pixel data corresponding to at least one of the color pixels, and the second original pixel data including original white pixel data corresponding to at least one of the white pixels.
In a second aspect, embodiments of the present application provide an image sensor, which includes a pixel array and a readout circuit;
the pixel array comprises a plurality of sub-pixel arrays, wherein each sub-pixel array comprises a red pixel group, a green pixel group, a white pixel group and a blue pixel group which are arranged in a matrix, each pixel group comprises P × P pixels of the same color, each pixel corresponds to one photoelectric conversion region, and P is an integer greater than or equal to 2;
the readout circuit is configured to acquire charges in photoelectric conversion regions of pixels in the sub-pixel array, and generate at least one first original pixel data and at least one second original pixel data according to the readout charges, where the first original pixel data includes at least original red pixel data corresponding to the red pixel, original green pixel data corresponding to the green pixel, original white pixel data corresponding to the white pixel, or original blue pixel data corresponding to the blue pixel; the second original pixel data includes original white pixel data corresponding to the white pixel.
In a third aspect, embodiments provide an electronic device comprising an image signal processor and an image sensor according to any one of claims 1-8, the image signal processor being configured to:
respectively carrying out image processing on at least one first original pixel data and at least one second original pixel data output by the image sensor to obtain a color image and a white image;
and carrying out fusion processing on the color image and the white image to obtain a fused image.
In a fourth aspect, embodiments provide a chip comprising an image sensor according to any one of claims 1-8.
In the embodiments of the present application, in each pixel group of the sub-pixel array in the pixel array, at least one white pixel is arranged in addition to the color pixels, so as to increase the amount of light entering the image sensor as a whole, thereby improving the overall light-sensing capability of the image sensor and, in turn, its imaging performance in low-light scenes. Furthermore, with the correspondingly arranged readout circuit, the first original pixel data and the second original pixel data can be generated. Since the first original pixel data includes original color pixel data corresponding to at least one color pixel and the second original pixel data includes original white pixel data corresponding to at least one white pixel, using the first original pixel data and the second original pixel data, which contain two different kinds of information, together in later-stage image processing yields a low-light image with better quality; that is, the imaging performance of the image sensor in low-light scenes can be further improved.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 is a plan view of a subpixel array of a typical image sensor;
fig. 2 is a schematic block diagram of an image sensor provided in an embodiment of the present application;
fig. 3 is a schematic diagram of the relationship between the sub-pixel array of an image sensor and the data read out by the readout circuit in a first image readout mode according to an embodiment of the present application;
fig. 4 is a schematic diagram of the relationship between the sub-pixel array of another image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 5 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 6 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 7 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 8 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 9 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 10 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 11 is a schematic block diagram of another image sensor provided in an embodiment of the present application;
fig. 12 is a schematic diagram of the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit in the first image readout mode according to an embodiment of the present application;
fig. 13 is a schematic block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
With respect to the image sensor of the quad bayer arrangement proposed in the related art, in order to accumulate light rays of a specific color wavelength individually per pixel unit, a color filter is provided, which allows only light rays within a specific wavelength range to pass through, so that the amount of light entering a pixel unit in the image sensor is lost, resulting in poor imaging performance of the image sensor in a low light environment.
In the embodiments of the present application, in each pixel group of the sub-pixel array in the pixel array, at least one white pixel is arranged in addition to the color pixels, so as to increase the amount of light entering the image sensor as a whole, thereby improving the overall light-sensing capability of the image sensor and, in turn, its imaging performance in low-light scenes. Furthermore, with the correspondingly arranged readout circuit, the first original pixel data and the second original pixel data can be generated. Since the first original pixel data includes original color pixel data corresponding to at least one color pixel and the second original pixel data includes original white pixel data corresponding to at least one white pixel, using the first original pixel data and the second original pixel data, which contain two different kinds of information, together in later-stage image processing yields a low-light image with better quality; that is, the imaging performance of the image sensor in low-light scenes can be further improved.
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application.
Fig. 2 is a schematic block diagram of an image sensor provided in an embodiment of the present application. As shown in fig. 2, the image sensor includes a pixel array 20, and the pixel array 20 includes a plurality of sub-pixel arrays 201; the pixel array 20 is formed by periodically repeating the sub-pixel array 201, and the sub-pixel array 201 can thus tile the whole effective pixel area. The sub-pixel array 201 includes a plurality of pixel groups 201a arranged in a matrix. Each pixel group 201a includes a color pixel of one color and at least one white pixel, and each pixel corresponds to one photoelectric conversion region.
The color pixels may include red pixels, green pixels, or blue pixels. Accordingly, each pixel group may be composed of red pixels and white pixels, of green pixels and white pixels, or of blue pixels and white pixels. The red pixels receive red light in the visible light and accumulate charges in their respective photoelectric conversion regions, the green pixels receive green light in the visible light and accumulate charges in their respective photoelectric conversion regions, and the blue pixels receive blue light in the visible light and accumulate charges in their respective photoelectric conversion regions. The white pixels receive light in substantially all wavelength bands of visible light and accumulate charges in their respective photoelectric conversion regions. Because a white pixel receives light over a wider spectrum, under the same illumination conditions the number of photons it receives is larger than the number of photons received by a red, green, or blue pixel; arranging a white pixel in each pixel group therefore increases the overall amount of light entering the image sensor.
As shown with continued reference to fig. 2, the image sensor further includes a readout circuit 21, the readout circuit 21 being configured to acquire charges in photoelectric conversion regions of pixels in the sub-pixel array, and to generate at least one first raw pixel data and at least one second raw pixel data from the acquired charges. The first raw pixel data includes raw color pixel data corresponding to at least one color pixel, and the second raw pixel data includes raw white pixel data corresponding to at least one white pixel.
The at least one first original pixel data may include only original color pixel data corresponding to a color pixel, or may include original color pixel data corresponding to a color pixel and original white pixel data corresponding to a white pixel. One first original pixel data may correspond to one color pixel or one white pixel, and may also correspond to all color pixels of the same color in one pixel group. The second original pixel data includes only original white pixel data corresponding to a white pixel, and one second original pixel data may correspond to one white pixel or all white pixels in one sub-pixel array. Since the readout circuit 21 can generate the first original pixel data and the second original pixel data containing two different kinds of information, and the first original pixel data and the second original pixel data are used in combination for the post-image processing, a low-light image with better performance can be obtained.
In the embodiments of the present application, at least one white pixel is arranged in addition to the color pixels in each pixel group of the sub-pixel array in the pixel array, so as to increase the amount of light entering the image sensor as a whole, thereby improving the overall light-sensing capability of the image sensor and, in turn, its imaging performance in low-light scenes. Furthermore, with the correspondingly provided readout circuit, the first original pixel data and the second original pixel data can be generated. Since the first original pixel data includes original color pixel data corresponding to at least one color pixel and the second original pixel data includes original white pixel data corresponding to at least one white pixel, using the first original pixel data and the second original pixel data, which contain different information, together in later-stage image processing yields a low-light image with better quality; that is, the imaging performance of the image sensor in low-light scenes can be further improved.
Based on the image sensor shown in fig. 2, further, a specific implementation of the image sensor is provided. In the image sensor, a sub-pixel array periodically and repeatedly covering the whole effective pixel area includes a plurality of pixel groups arranged in a 2 × 2 matrix, each pixel group including N × N pixels including M white pixels and (N × N-M) color pixels of the same color, where N is an integer greater than or equal to 2 and M is an integer greater than or equal to 1.
Specifically, for convenience of description, a plurality of pixel groups arranged in a matrix of 2 × 2 may be respectively represented as a first pixel group, a second pixel group, a third pixel group, and a fourth pixel group, the first pixel group may include (N × N-M) red pixels and M white pixels, the second pixel group may include (N × N-M) green pixels and M white pixels, the third pixel group may be the same as the second pixel group, including (N × N-M) green pixels and M white pixels, and the fourth pixel group may include (N × N-M) blue pixels and M white pixels. Wherein the first pixel group and the fourth pixel group are arranged diagonally, and the second pixel group and the third pixel group may be respectively located at the right side and the lower side of the first pixel group. Alternatively, the first pixel group and the fourth pixel group may form a diagonal arrangement, the second pixel group and the third pixel group may be respectively located at the left side and the upper side of the first pixel group, or the first pixel group and the fourth pixel group may form a diagonal arrangement, the second pixel group and the third pixel group may be respectively located at the left side and the lower side of the first pixel group, or the first pixel group and the fourth pixel group may form a diagonal arrangement, and the second pixel group and the third pixel group may be respectively located at the upper side and the right side of the first pixel group.
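To make this arrangement concrete, the following Python sketch builds one such sub-pixel array as a grid of pixel labels. It assumes N = 2, M = 1 and a white pixel in the lower-right corner of every group, which is only one of the layouts the embodiments allow; the function name and defaults are illustrative assumptions, not a definitive implementation of the pixel array.

    import numpy as np

    def build_subpixel_array(n=2, white_positions=((1, 1),)):
        # Four N x N pixel groups arranged in a 2 x 2 matrix: R and B on one
        # diagonal, G on the other. In each group the listed positions are
        # replaced by white ('W') pixels. Defaults are illustrative assumptions.
        group_colors = [['R', 'G'],
                        ['G', 'B']]
        sub_array = np.empty((2 * n, 2 * n), dtype='<U1')
        for gr in range(2):          # row of the pixel-group matrix
            for gc in range(2):      # column of the pixel-group matrix
                block = np.full((n, n), group_colors[gr][gc])
                for (r, c) in white_positions:
                    block[r, c] = 'W'
                sub_array[gr * n:(gr + 1) * n, gc * n:(gc + 1) * n] = block
        return sub_array

    print(build_subpixel_array())
    # [['R' 'R' 'G' 'G']
    #  ['R' 'W' 'G' 'W']
    #  ['G' 'G' 'B' 'B']
    #  ['G' 'W' 'B' 'W']]

The printed 4 × 4 grid corresponds to the sub-pixel array of example one (fig. 3) described later.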
Further, a readout circuit provided in correspondence with the image sensor acquires accumulated charges in photoelectric conversion regions of (N × N-M) color pixels in each pixel group in the sub-pixel array in a first image readout mode, generates first original pixel data based on the accumulated charges, at least one of the first original pixel data being arranged in a 2 × 2 matrix; and acquiring accumulated charges in photoelectric conversion regions of 4 × M white pixels in the sub-pixel array, and generating second original pixel data from the accumulated charges.
Illustratively, in the first image readout mode, the readout circuit respectively reads out and accumulates the charges in the photoelectric conversion regions of the (N × N-M) red pixels in the first pixel group, and generates one first original pixel data from the accumulated charges, the first original pixel data including the original red pixel data R1; in a similar manner, the readout circuit respectively reads out and accumulates the charges in the photoelectric conversion regions of the (N × N-M) green pixels in the second pixel group, and generates one first original pixel data including the original green pixel data G1 from the accumulated charges; the readout circuit respectively reads out and accumulates the charges in the photoelectric conversion regions of the (N × N-M) green pixels in the third pixel group, and generates one first original pixel data including the original green pixel data G2 from the accumulated charges; and the readout circuit respectively reads out and accumulates the charges in the photoelectric conversion regions of the (N × N-M) blue pixels in the fourth pixel group, and generates one first original pixel data including the original blue pixel data B1 from the accumulated charges.
In the case where the first pixel group and the fourth pixel group form a diagonal arrangement and the second pixel group and the third pixel group are respectively located at the right side and the lower side of the first pixel group, the four first original pixel data are arranged in a 2 × 2 matrix, which may be expressed as
    [ R1  G1 ]
    [ G2  B1 ]
Further, the readout circuit reads and accumulates the charges in the photoelectric conversion regions of a total of 4 × M white pixels in the sub-pixel array, respectively, and generates one second original pixel data based on the accumulated charges, the second original pixel data including the original white pixel data W1 — the second original pixel data may be represented as W1.
In the present embodiment, since the readout circuit combines the electric charges in the photoelectric conversion regions of (N × N-M) color pixels of the same color in each pixel group to thereby output one first original pixel data, and combines the electric charges in the photoelectric conversion regions of 4 × M white pixels to thereby output one second original pixel data, the first image readout mode may also be referred to as a pixel combination mode. In the pixel combination mode, signals corresponding to a plurality of pixels are combined and output, so that the signal-to-noise ratio of an imaging image is improved, and the method is suitable for low-illumination scenes.
It should be noted that, in the above-described embodiment, the readout circuit first reads out the charges in the photoelectric conversion regions of the color pixels of the same color in each pixel group, respectively, and then accumulates, and generates the first original pixel data based on the accumulated charges, and reads out the charges in the photoelectric conversion regions of all the white pixels in the sub-pixel array, respectively, and then accumulates, and generates the second original pixel data based on the accumulated charges. However, in other embodiments, the readout circuit may first accumulate the charges in the photoelectric conversion regions of the color pixels of the same color in each pixel group, and then read out, and generate the first original pixel data according to the read out charges, and may accumulate the charges in the photoelectric conversion regions of all the white pixels in the sub-pixel array, and then read out, and generate the second original pixel data according to the read out charges, which is not limited in this embodiment.
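As a minimal sketch of the first image readout mode just described, the following Python function bins the color pixels of each group and all white pixels of the sub-pixel array. It assumes the per-pixel charges have already been digitized into an array and uses the labels produced by the build_subpixel_array() sketch above; the read-then-accumulate versus accumulate-then-read ordering is deliberately abstracted away.

    import numpy as np

    def binning_readout(charges, labels, n=2):
        # charges: (2N, 2N) array of per-pixel charge for one sub-pixel array
        # labels:  matching array of 'R'/'G'/'B'/'W' labels
        first = np.zeros((2, 2))
        for gr in range(2):
            for gc in range(2):
                block_c = charges[gr * n:(gr + 1) * n, gc * n:(gc + 1) * n]
                block_l = labels[gr * n:(gr + 1) * n, gc * n:(gc + 1) * n]
                # combine only the (N*N - M) same-color pixels of this group
                first[gr, gc] = block_c[block_l != 'W'].sum()
        # combine all 4*M white pixels of the sub-pixel array
        second = charges[labels == 'W'].sum()
        return first, second

Here first corresponds to the 2 × 2 matrix of R1, G1, G2 and B1 given above, and second to the single combined value W1.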
Further, a readout circuit provided in correspondence with the image sensor, in a second image readout mode, respectively acquires electric charges in a photoelectric conversion region of each pixel in the sub-pixel array, generates first original pixel data based on the acquired electric charges, at least one of the first original pixel data being arranged in a 2N × 2N matrix; and acquiring charges in the photoelectric conversion region of each white pixel in the sub-pixel array, and generating second original pixel data from the acquired charges.
Illustratively, in the second image readout mode, the readout circuit respectively reads the charges in the photoelectric conversion regions of the 2N × 2N pixels in the sub-pixel array, and respectively generates first original pixel data corresponding to each pixel from the charges read from each pixel, i.e., 2N × 2N first original pixel data can be obtained. The 2N × 2N first raw pixel data may be arranged in a 2N × 2N matrix corresponding to the pixels in the sub-pixel array. The sub-pixel array includes color pixels and white pixels, and accordingly, the 2N × 2N first original pixel data may include original color pixel data and original white pixel data. Since the first raw pixel data corresponds to each pixel in the sub-pixel array, the second image readout mode may be referred to as a full-resolution image readout mode. In the second image readout mode, the readout circuits further respectively read out charges of photoelectric conversion regions of 4 × M white pixels in the sub-pixel array, and respectively generate second original pixel data corresponding to each white pixel according to the charges read from each white pixel, that is, 4 × M second original pixel data can be obtained, and the 4 × M second original pixel data can be re-spliced as needed to be arranged in a matrix.
For example, when N is 2 and M is 1, in the case where the first pixel group and the fourth pixel group are diagonally arranged, the second pixel group and the third pixel group are respectively located at the right side and the lower side of the first pixel group, and the white pixel is located at the lower right corner of each pixel group, the 2N × 2N first original pixel data output by the readout circuit may be represented as

    [ R1   R2   Gr1  Gr2 ]
    [ R3   W1   Gr3  W2  ]
    [ Gb1  Gb2  B1   B2  ]
    [ Gb3  W3   B3   W4  ]

wherein R1 to R3 respectively represent the first original pixel data corresponding to each red pixel in the first pixel group (this first original pixel data includes original red pixel data); Gr1 to Gr3 and Gb1 to Gb3 respectively represent the first original pixel data corresponding to each green pixel in the second pixel group and the third pixel group (this first original pixel data includes original green pixel data); B1 to B3 respectively represent the first original pixel data corresponding to each blue pixel in the fourth pixel group (this first original pixel data includes original blue pixel data); and W1 to W4 respectively represent the first original pixel data corresponding to the white pixels in the first, second, third and fourth pixel groups (this first original pixel data may be original white pixel data). Furthermore, the 4 × M second original pixel data output by the readout circuit may be represented as

    [ W1  W2 ]
    [ W3  W4 ]
W1 to W4 respectively indicate second original pixel data corresponding to one white pixel in the first pixel group, the second pixel group, the third pixel group, and the fourth pixel group, and the second original pixel data corresponding to the white pixel is the same as the first original pixel data corresponding to the white pixel.
In this embodiment, the readout circuit outputs the first original pixel data corresponding to each pixel in the pixel array, that is, directly outputs the full-frame original pixel data without pixel combination, so that an image with a higher resolution can be obtained, and the method is suitable for a normal light intensity scene.
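Continuing the sketch above, the second image readout mode under the same assumed layout (N = 2, M = 1, white pixel in the lower-right corner of each group) can be illustrated as follows; the first original pixel data is simply the full 2N × 2N frame, and the second original pixel data re-collects the white samples into a 2 × 2 matrix.

    def full_resolution_readout(charges, labels):
        # One first original pixel datum per pixel: the full 2N x 2N frame.
        first = charges.copy()
        # The white samples W1..W4, re-spliced into a 2 x 2 matrix
        # (valid for M = 1; boolean indexing returns them in row-major order).
        second = charges[labels == 'W'].reshape(2, 2)
        return first, second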
Based on the image sensor provided in the foregoing embodiment, further, the readout circuit is further configured to receive a control signal, and operate in the first image readout mode or the second image readout mode according to the control signal.
The control signal is sent automatically by the electronic device in which the image sensor is located, according to a user setting or the ambient light intensity currently detected by the electronic device. For example, when the current scene is a low-light scene, control information indicating that the readout circuit is to work in the first image readout mode is output to the readout circuit, so that the readout circuit combines a plurality of pixels into one pixel and outputs the original pixel data in the first image readout mode, and an imaging image with a higher signal-to-noise ratio can be obtained. When the current scene is a normal light intensity scene, control information indicating that the readout circuit is to work in the second image readout mode is output to the readout circuit, so that the readout circuit works in the second image readout mode and directly outputs the full-frame original pixel data, and an imaging image with higher resolution is obtained.
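A hedged sketch of how such a control signal might be derived on the host side follows; the lux threshold and the mode identifiers below are illustrative assumptions and are not specified by the embodiments.

    LOW_LIGHT_LUX = 50  # illustrative threshold, not a value from the embodiments

    def select_readout_mode(ambient_lux, user_setting=None):
        # A user setting, if present, takes priority over the light measurement.
        if user_setting is not None:
            return user_setting
        # Low light: request the first (pixel combination) readout mode;
        # otherwise request the second (full resolution) readout mode.
        return ('first_image_readout_mode' if ambient_lux < LOW_LIGHT_LUX
                else 'second_image_readout_mode')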
For a better understanding of the present application, the relationship between the sub-pixel arrays of the plurality of image sensors and the data read out by the readout circuit is described below with reference to fig. 3 to 10. Since the readout circuit directly outputs full-frame original pixel data in the second image readout mode, and the pixels in the sub-pixel array correspond to the original pixel data read out by the readout circuit one to one, it is easy to understand that the following description is mainly made in detail with respect to the relationship between the pixels in the sub-pixel array and the data read out by the readout circuit in the first image readout mode.
Example one
Referring to fig. 3, the sub-pixel array includes four pixel groups arranged in a 2 × 2 matrix: a first pixel group 31 includes 3 red pixels 311 and 1 white pixel 312, a second pixel group 32 includes 3 green pixels 321 and 1 white pixel 322, a third pixel group 33 includes 3 green pixels 331 and 1 white pixel 332, and a fourth pixel group 34 includes 3 blue pixels 341 and 1 white pixel 342. The white pixels 312, 322, 332, 342 are located in the lower right corner of their respective pixel groups.
In the first image readout mode, the readout circuit acquires the charges in the photoelectric conversion regions of the 3 red pixels 311 in the first pixel group 31, and generates one original red pixel data from the total charge in these 3 photoelectric conversion regions. In a similar manner, the readout circuit generates two original green pixel data and one original blue pixel data. Further, the readout circuit acquires the charges in the photoelectric conversion regions of the total of 4 white pixels in the above four pixel groups, and generates one original white pixel data from the total charge in these 4 photoelectric conversion regions. In the first image readout mode, since every 3 color pixels are combined to output one original color pixel data and 4 white pixels are combined to output one original white pixel data, an image with a higher signal-to-noise ratio can be obtained, which is suitable for low-light scenes.
In the second image readout mode, the readout circuit acquires the electric charges in the photoelectric conversion region of each pixel in the sub-pixel array, respectively, generating first original pixel data arranged in a 4 × 4 matrix and one second original pixel data arranged in a 2 × 2 matrix. In the second image readout mode, since the original pixel data of the full frame is directly output, an image with higher resolution can be obtained, and the method is suitable for normal light intensity scenes.
Example two
Referring to fig. 4, the sub-pixel array is different from the sub-pixel array in the first example: the position of the white pixel is different. Specifically, in the first pixel group 41, the white pixel 412 is located at the lower left corner, in the second pixel group 42, the white pixel 422 is located at the lower right corner, in the third pixel group 43, the white pixel 432 is located at the upper left corner, and in the fourth pixel group 44, the white pixel 442 is located at the upper right corner.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. Referring to fig. 4, in the first image readout mode, since every 3 color pixels are combined to output one original color pixel data and 4 white pixels are combined to output one original white pixel data, an image with a higher signal-to-noise ratio can be obtained, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example three
Referring to fig. 5, the sub-pixel array includes four pixel groups arranged in a 2 × 2 matrix, each pixel group containing 3 × 3 pixels: a first pixel group 51 includes 8 red pixels 511 and 1 white pixel 512, a second pixel group 52 includes 8 green pixels 521 and 1 white pixel 522, a third pixel group 53 includes 8 green pixels 531 and 1 white pixel 532, and a fourth pixel group 54 includes 8 blue pixels 541 and 1 white pixel 542. The white pixel 512 is located at the lower left corner of the first pixel group 51, the white pixel 522 at the lower right corner of the second pixel group 52, the white pixel 532 at the upper left corner of the third pixel group 53, and the white pixel 542 at the upper right corner of the fourth pixel group 54.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. As can be seen from fig. 5, in the first image readout mode, since every 8 color pixels are combined to output one original color pixel data and 4 white pixels are combined to output one original white pixel data, a color image with a higher signal-to-noise ratio can be obtained compared with the sub-pixel arrays in example one and example two, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example four
Referring to fig. 6, the sub-pixel array differs from the sub-pixel array in example three only in that each pixel group includes 2 white pixels, and the positions of the white pixels are different. Specifically, the first pixel group 61 includes 7 red pixels 611 and 2 white pixels 612, the second pixel group 62 includes 7 green pixels 621 and 2 white pixels 622, the third pixel group 63 includes 7 green pixels 631 and 2 white pixels 632, and the fourth pixel group 64 includes 7 blue pixels 641 and 2 white pixels 642. The white pixels 612 are located at the lower left corner of the first pixel group 61, the white pixels 622 at the lower right corner of the second pixel group 62, the white pixels 632 at the upper left corner of the third pixel group 63, and the white pixels 642 at the upper right corner of the fourth pixel group 64.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. Because the proportion of white pixels is increased in this example, an image with a higher signal-to-noise ratio can be obtained compared with the sub-pixel array in example three. As can be seen from fig. 6, in the first image readout mode, since every 7 color pixels are combined to output one original color pixel data and 8 white pixels are combined to output one original white pixel data, a color image with a higher signal-to-noise ratio can be obtained compared with the sub-pixel arrays in the foregoing examples, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example five
Referring to fig. 7, the sub-pixel array is similar to the sub-pixel array in example three except that: each pixel group includes 4 white pixels, and in each pixel group, the 4 white pixels are located at the lower right corner of the respective pixel group.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. Because the number of white pixels is increased in this example, an image with a higher signal-to-noise ratio can be obtained compared with the sub-pixel array in example three. Referring to fig. 7, in the first image readout mode, since every 5 color pixels are combined to output one original color pixel data and 16 white pixels are combined to output one original white pixel data, a color image with a higher signal-to-noise ratio can be obtained compared with the sub-pixel arrays in the foregoing examples, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example six
Referring to fig. 8, the sub-pixel array includes four pixel groups arranged in a 2 × 2 matrix, each pixel group containing 4 × 4 pixels: a first pixel group 81 includes 12 red pixels 811 and 4 white pixels 812, a second pixel group 82 includes 12 green pixels 821 and 4 white pixels 822, a third pixel group 83 includes 12 green pixels 831 and 4 white pixels 832, and a fourth pixel group 84 includes 12 blue pixels 841 and 4 white pixels 842. In each pixel group, the 4 white pixels are located at the center of the pixel group.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. Referring to fig. 8, in the first image readout mode, since every 12 color pixels are combined to output one original color pixel data and 16 white pixels are combined to output one original white pixel data, a color image with a higher signal-to-noise ratio can be obtained, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example seven
Referring to fig. 9, the sub-pixel array is similar to the sub-pixel array in example six except that the white pixel is located differently. Specifically, 4 white pixels (912,922,932,942) are located at the four corners of each pixel group.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. As can be seen from fig. 9, since the proportion of white pixels is the same as in example six, in the first image readout mode, where every 12 color pixels (911, 921, 931, 941) are combined to output one original color pixel data and 16 white pixels (912, 922, 932, 942) are combined to output one original white pixel data, a color image with a signal-to-noise ratio similar to that in example six can be obtained, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
Example eight
Referring to fig. 10, the sub-pixel array differs from the sub-pixel array in example six in that the number of white pixels is different and the positions of the white pixels are different. Specifically, in each pixel group, 4 white pixels (1012, 1022, 1032, 1042) are located at the center of the pixel group and 4 white pixels (1012, 1022, 1032, 1042) are located at the four corners of the pixel group.
Since the operation of the readout circuit corresponding to this sub-pixel array in the first image readout mode and the second image readout mode is the same as in example one, the description is omitted here for brevity. As can be seen from fig. 10, since the proportion of white pixels is increased, an image with a higher signal-to-noise ratio can be output compared with example six and example seven. In the first image readout mode, since every 8 color pixels (1011, 1021, 1031, 1041) are combined to output one original color pixel data and 32 white pixels (1012, 1022, 1032, 1042) are combined to output one original white pixel data, a color image with a higher signal-to-noise ratio can be output compared with example six and example seven, which is suitable for low-light scenes. In the second image readout mode, since the full-frame original pixel data is output directly, an image with higher resolution can be obtained, which is suitable for normal light intensity scenes.
It should be noted that the above-mentioned examples one to eight are merely examples, and the present application is not limited thereto.
Referring to fig. 11, fig. 11 is a schematic block diagram of an image sensor provided in an embodiment of the present application. As shown in fig. 11, the image sensor includes a pixel array 110, and the pixel array 110 includes a plurality of sub-pixel arrays 1101; the pixel array 110 is formed by periodically repeating the sub-pixel array 1101, and the sub-pixel array 1101 can thus tile the entire effective pixel area. The sub-pixel array 1101 includes a red pixel group 1101a, a green pixel group 1101b, a white pixel group 1101c, and a blue pixel group 1101d arranged in a matrix, each pixel group (1101a, 1101b, 1101c, 1101d) including P × P pixels of the same color, each pixel corresponding to one photoelectric conversion region, where P is an integer greater than or equal to 2.
Specifically, the red pixel group 1101a includes P × P red pixels, the green pixel group 1101b includes P × P green pixels, the white pixel group 1101c includes P × P white pixels, and the blue pixel group 1101d includes P × P blue pixels. Since the red pixel receives red light in visible light and accumulates charges in its corresponding photoelectric conversion region, the green pixel receives green light in visible light and accumulates charges in its corresponding photoelectric conversion region, the blue pixel receives blue light in visible light and accumulates charges in its corresponding photoelectric conversion region, and the white pixel receives light of substantially all wavelength bands of visible light and accumulates charges in its corresponding photoelectric conversion region, the number of photons received by the white pixel is larger than the number of photons received by the red pixel, the green pixel, and the blue pixel under the same illumination condition, and therefore, in the sub-pixel array 1101 of the image sensor, the white pixel group 1101c is included in addition to the red pixel group 1101a, the green pixel group 1101b, and the blue pixel group 1101d, so that the overall light entering amount of the image sensor is increased and the light sensing capability is improved, and further, the image sensor has better imaging performance in a low-light scene.
It should be noted that, in the sub-pixel array 1101, the positions of the red pixel group 1101a, the green pixel group 1101b, the white pixel group 1101c and the blue pixel group 1101d may be interchanged; for example, the red pixel group 1101a and the blue pixel group 1101d are diagonally arranged and the green pixel group 1101b and the white pixel group 1101c are respectively located at the right side and the lower side of the red pixel group 1101a, or the green pixel group 1101b and the white pixel group 1101c are respectively located at the left side and the upper side of the red pixel group 1101a, and so on, which is not limited in this embodiment.
As shown with continued reference to fig. 11, the image sensor further includes a readout circuit 111, the readout circuit 111 is configured to acquire charges in photoelectric conversion regions of pixels in the sub-pixel array 1101, and generate at least one first original pixel data and at least one second original pixel data according to the read-out charges, the first original pixel data including original red pixel data corresponding to a red pixel, original green pixel data corresponding to a green pixel, original white pixel data corresponding to a white pixel, or original blue pixel data corresponding to a blue pixel; the second original pixel data includes original white pixel data corresponding to at least one white pixel. The second original pixel data may be acquired from the first original pixel data, for example, acquiring the first original pixel data corresponding to at least one white pixel as the second original pixel data.
Since the readout circuit 111 can generate the first original pixel data and the second original pixel data containing two different kinds of information, and the first original pixel data and the second original pixel data are used in combination for the post-image processing, a low-light image with better performance can be obtained.
In the embodiments of the present application, in the sub-pixel array of the image sensor, a white pixel group is arranged in addition to the red pixel group, the green pixel group and the blue pixel group, so as to increase the amount of light entering the image sensor as a whole, thereby improving the overall light-sensing capability of the image sensor and, in turn, its imaging performance in low-illumination scenes. Furthermore, with the correspondingly arranged readout circuit, the first original pixel data and the second original pixel data can be generated. Since the first original pixel data includes original color pixel data corresponding to at least one color pixel and the second original pixel data includes original white pixel data corresponding to at least one white pixel, using the first original pixel data and the second original pixel data, which contain different information, together in later-stage image processing yields a low-light image with better quality; that is, the imaging performance of the image sensor in low-light scenes can be further improved.
Based on the embodiment shown in fig. 11, further, a specific implementation of the image sensor is provided. In the image sensor, a sub-pixel array periodically repeating throughout the entire effective pixel area includes a red pixel group, a green pixel group, a white pixel group, and a blue pixel group arranged in a 2 × 2 matrix.
Further, a readout circuit corresponding to the image sensor acquires accumulated charges in photoelectric conversion regions of P × P pixels in each pixel group in the sub-pixel array in a first image readout mode, and generates first original pixel data based on the accumulated charges, the first original pixel data being arranged in a 2 × 2 matrix; and acquiring accumulated charges in photoelectric conversion regions of the P × P white pixels in the white pixel group in the sub-pixel array, and generating second original pixel data according to the accumulated charges.
Illustratively, in the first image readout mode, the readout circuit reads out and accumulates the charges in the photoelectric conversion regions of the P × P red pixels in the red pixel group, and generates one first original pixel data from the accumulated charges, the first original pixel data including the original red pixel data R1; in a similar manner, the readout circuit generates the other three first original pixel data, which include the original green pixel data G1, the original white pixel data W1, and the original blue pixel data B1, respectively.
In the case where the red pixel group is diagonally disposed from the blue pixel group, and the green pixel group and the white pixel group are respectively located at the right and lower sides of the red pixel group, a total of 4 first original pixel data corresponding to the sub-pixel array may be represented as
    [ R1  G1 ]
    [ W1  B1 ]
Further, the readout circuit additionally outputs the original white pixel data W1 as second original pixel data, which may be represented as W1.
In the present embodiment, since the readout circuit combines the charges in the photoelectric conversion regions of the P × P red pixels in the red pixel group to output one original red pixel data, combines the charges in the photoelectric conversion regions of the P × P green pixels in the green pixel group to output one original green pixel data, combines the charges in the photoelectric conversion regions of the P × P white pixels in the white pixel group to output one original white pixel data, and combines the charges in the photoelectric conversion regions of the P × P blue pixels in the blue pixel group to output one original blue pixel data, the first image readout mode may also be referred to as a pixel combination (binning) mode. In the pixel combination mode, RGBW data in which multiple pixels are combined into one is output. Meanwhile, the image sensor may additionally output the original white pixel data corresponding to the white pixel groups in the pixel array, that is, W data in which multiple pixels are combined into one. In the pixel combination mode, the signals corresponding to a plurality of pixels are combined and output, so that the resolution of the imaged image is reduced, but its signal-to-noise ratio is significantly improved, which is suitable for low-illumination scenes.
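A minimal sketch of this binning mode for the fig. 11 embodiment, assuming P = 2 and the group placement described above (red group top-left, green group to its right, white group below it, blue group on the diagonal); the white group's binned value is simply duplicated as the second original pixel data. This is an illustration under those assumptions, not a definitive implementation of the readout circuit.

    import numpy as np

    def rgbw_binning_readout(charges, p=2):
        # charges: (2P, 2P) array laid out as [[R-group, G-group],
        #                                      [W-group, B-group]]
        # Sum each P x P group to get the 2 x 2 first original pixel data.
        first = charges.reshape(2, p, 2, p).sum(axis=(1, 3))
        # The white group's binned value is additionally output
        # as the second original pixel data.
        second = first[1, 0]
        return first, second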
It should be noted that, in the above embodiment, the readout circuit first reads out the charges in the photoelectric conversion regions of all the pixels in each pixel group and then accumulates them, generating the first original pixel data based on the accumulated charges. However, in other embodiments, the readout circuit may also first accumulate the charges in the photoelectric conversion regions of all the pixels in each pixel group and then read them out, generating the first original pixel data according to the read-out charges, which is not limited in this embodiment.
Further, in a second image readout mode, a readout circuit provided in correspondence with the image sensor respectively acquires the charges in the photoelectric conversion region of each pixel in the sub-pixel array and generates first original pixel data from the acquired charges, at least one of the first original pixel data being arranged in a 2P × 2P matrix; and acquires the charges in the photoelectric conversion region of each white pixel in the sub-pixel array and generates second original pixel data from the acquired charges, the second original pixel data being arranged in a P × P matrix.
In this embodiment, the operation principle of the readout circuit in the second image readout mode is similar to that of the readout circuit in the second image readout mode in the embodiment shown in fig. 2, and is not described here again. Similar to the embodiment shown in fig. 2, in this embodiment, the readout circuit outputs the first original pixel data corresponding to each pixel in the pixel array in the second image readout mode, i.e. directly outputs the full-frame original pixel data without pixel combination, which can obtain a higher resolution image suitable for a normal light intensity scene.
Based on the image sensor provided in the foregoing embodiment, further, the readout circuit is further configured to receive a control signal, and operate in the first image readout mode or the second image readout mode according to the control signal.
The control signal is sent automatically by the electronic device in which the image sensor is located, according to a user setting or the ambient light intensity currently detected by the electronic device. For example, when the current scene is a low-light scene, control information indicating that the readout circuit is to work in the first image readout mode is output to the readout circuit, so that the readout circuit combines a plurality of pixels into one pixel and outputs the original pixel data in the first image readout mode, and an imaging image with a higher signal-to-noise ratio can be obtained. When the current scene is a normal light intensity scene, control information indicating that the readout circuit is to work in the second image readout mode is output to the readout circuit, so that the readout circuit works in the second image readout mode and directly outputs the full-frame original pixel data, and an imaging image with higher resolution is obtained.
For a better understanding of the present application, the relationship between the sub-pixel array of a further image sensor and the data read out by the readout circuit is described below with reference to fig. 12. Since the readout circuit directly outputs the full-frame original pixel data in the second image readout mode, and the pixels in the sub-pixel array correspond one to one to the original pixel data read out by the readout circuit, the following description focuses on the relationship between the pixels in the sub-pixel array and the data read out by the readout circuit in the first image readout mode.
Referring to fig. 12, the sub-pixel array includes a red pixel group 121, a green pixel group 122, a white pixel group 123 and a blue pixel group 124 arranged in a 2 × 2 matrix, the red pixel group 121 includes 2 × 2 red pixels, the green pixel group 122 includes 2 × 2 green pixels, the white pixel group 123 includes 2 × 2 white pixels, and the blue pixel group 124 includes 2 × 2 blue pixels.
In the first image readout mode, the readout circuit acquires the charges in the photoelectric conversion regions of the 4 red pixels in the red pixel group 121, and generates one original red pixel data from the total charge in these 4 photoelectric conversion regions; similarly, the readout circuit generates one original green pixel data, one original white pixel data, and one original blue pixel data, and these are arranged in a 2 × 2 matrix, constituting the first original pixel data. Since 4 pixels are combined into one, the resolution of the resulting image is reduced to 1/4, but the signal-to-noise ratio of the image is improved.
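As a worked illustration with an assumed sensor size (the figure is purely for arithmetic, not from the embodiments): a pixel array of 4000 × 3000 pixels read out in this mode yields a 2000 × 1500 image, since each 2 × 2 group collapses to one value, i.e. 1/4 of the original pixel count, while each output value integrates the charge of 4 photoelectric conversion regions.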
In the second image readout mode, the readout circuit acquires charges in photoelectric conversion regions of pixels in the sub-pixel array, respectively, generating first original pixel data arranged in a 4 × 4 matrix and one second original pixel data arranged in a 2 × 2 matrix. In the second image readout mode, since the original pixel data of the full frame is directly output, an image with higher resolution can be obtained, and the method is suitable for normal light intensity scenes.
Referring to fig. 13, which is a schematic block diagram of an electronic device provided in an embodiment of the present application, the electronic device 13 may include an image sensor 1301 and an image signal processor 1302, where the image sensor 1301 is the image sensor of any of the foregoing embodiments; for brevity, its details are not repeated here.
The image sensor 1301 may output first raw pixel data and second raw pixel data.
The image signal processor 1302 is configured to perform image processing on the first original pixel data and the second original pixel data output by the image sensor 1301 to obtain a color image and a white image, respectively, and to fuse the color image with the white image to obtain a fused image.
The image processing may include operations such as linear correction, color interpolation and correction, noise removal, white-balance processing, and automatic exposure control. Because the first original pixel data and the second original pixel data are acquired by the same chip, the color image obtained by processing the first original pixel data and the white image obtained by processing the second original pixel data are aligned with pixel-level accuracy, and fusing the color image with the white image yields an image with a higher signal-to-noise ratio.
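As one possible illustration of such fusion (not the specific algorithm executed by the image signal processor 1302), a luminance-weighted blend of a pixel-aligned white image into a color image could be sketched as follows; the weighting factor alpha and the crude luminance estimate are assumptions for this example.

    import numpy as np

    def fuse_color_white(color_rgb: np.ndarray, white: np.ndarray, alpha: float = 0.5) -> np.ndarray:
        """Blend the luminance of a pixel-aligned white image into a color image.

        color_rgb: H x W x 3 array in [0, 1]; white: H x W array in [0, 1].
        alpha controls how much of the white (luminance) image is mixed in.
        """
        luma = color_rgb.mean(axis=2, keepdims=True)                  # crude luminance estimate
        fused_luma = (1.0 - alpha) * luma + alpha * white[..., None]  # blend the two luminance signals
        # Rescale the color image around the new luminance, avoiding division by zero.
        return np.clip(color_rgb * (fused_luma / np.maximum(luma, 1e-6)), 0.0, 1.0)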
It should be noted that the electronic device may be, for example, a portable or mobile computing device such as a smart phone, a notebook computer, a tablet computer or a game device, or another electronic device such as an electronic database, an automobile or an automated teller machine (ATM); the embodiments of the present application are not limited in this respect.
The embodiment of the application also provides a chip, and the chip comprises the image sensor provided by the embodiment.
It should be understood that the specific examples in the embodiments of the present application are for the purpose of promoting a better understanding of the embodiments of the present application and are not intended to limit the scope of the embodiments of the present application.
It is to be understood that the terminology used in the embodiments of the present application and the appended claims is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the present application. For example, as used in the examples of this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one logical division, and other divisions may be used in practice; a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, may be embodied in the form of a software product that is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. An image sensor comprising a pixel array and a readout circuit,
the pixel array comprises a plurality of sub-pixel arrays, each sub-pixel array comprises a plurality of pixel groups arranged in a matrix, each pixel group comprises a color pixel of one color and at least one white pixel, and each pixel corresponds to one photoelectric conversion region;
the readout circuit is configured to acquire charges in photoelectric conversion regions of pixels in the sub-pixel array and to generate at least one first original pixel data and at least one second original pixel data from the acquired charges, the first original pixel data including original color pixel data corresponding to at least one of the color pixels, and the second original pixel data including original white pixel data corresponding to at least one of the white pixels.
2. The image sensor according to claim 1, wherein the plurality of pixel groups are arranged in a 2 x 2 matrix, each pixel group including N x N pixels including M white pixels and (N x N-M) same-color pixels, where N is an integer greater than or equal to 2 and M is an integer greater than or equal to 1 and less than N.
3. The image sensor of claim 2, wherein the readout circuit is to: acquiring accumulated charges in photoelectric conversion regions of (N × N-M) color pixels of the same color in each pixel group in the sub-pixel array in a first image readout mode, generating the first original pixel data from the accumulated charges, at least one of the first original pixel data being arranged in a 2 × 2 matrix; and acquiring accumulated charges in photoelectric conversion regions of 4 × M white pixels in the sub-pixel array, and generating the second original pixel data according to the accumulated charges.
4. The image sensor of claim 2, wherein the readout circuit is to: acquiring charges in a photoelectric conversion region of each pixel in the sub-pixel array in a second image readout mode, and generating the first original pixel data from the acquired charges, at least one of the first original pixel data being arranged in a 2N × 2N matrix; and acquiring charges in a photoelectric conversion region of each white pixel in the sub-pixel array, the second original pixel data being generated from the acquired charges.
5. The image sensor of claim 1, wherein the plurality of pixel groups comprises a first pixel group having a red pixel and at least one white pixel, a second pixel group having a green pixel and at least one white pixel, a third pixel group having a green pixel and at least one white pixel, and a fourth pixel group having a blue pixel and at least one white pixel, the number of the white pixels in each pixel group being the same.
6. An image sensor comprising a pixel array and a readout circuit,
the pixel array comprises a plurality of sub-pixel arrays, wherein each sub-pixel array comprises a red pixel group, a green pixel group, a white pixel group and a blue pixel group which are arranged in a matrix, each pixel group comprises P × P pixels of the same color, each pixel corresponds to one photoelectric conversion region, and P is an integer greater than or equal to 2;
the readout circuit is configured to acquire charges in photoelectric conversion regions of pixels in the sub-pixel array, and generate at least one first original pixel data and at least one second original pixel data according to the readout charges, where the first original pixel data includes original red pixel data corresponding to the red pixel, original green pixel data corresponding to the green pixel, original white pixel data corresponding to the white pixel, or original blue pixel data corresponding to the blue pixel; the second original pixel data includes original white pixel data corresponding to the white pixel.
7. The image sensor of claim 6, wherein the red pixel group, the green pixel group, the white pixel group, and the blue pixel group are arranged in a 2 x 2 matrix;
the readout circuit is to: in a first image readout mode, acquiring accumulated charges in photoelectric conversion regions of P × P pixels in each pixel group in the sub-pixel array, and generating first original pixel data according to the accumulated charges, at least one of the first original pixel data being arranged in a 2 × 2 matrix; and acquiring accumulated charges in photoelectric conversion regions of P × P white pixels in a white pixel group in the sub-pixel array, and generating the second original pixel data according to the accumulated charges.
8. The image sensor of claim 6, wherein the red pixel group, the green pixel group, the white pixel group, and the blue pixel group are arranged in a 2 x 2 matrix;
the readout circuit is to: acquiring charge corresponding to each pixel in the sub-pixel array in a second image reading mode, and generating first original pixel data according to the acquired charge, wherein at least one first original pixel data is arranged in a 2P multiplied by 2P matrix; and acquiring charges in a photoelectric conversion region of each white pixel in the sub-pixel array, and generating the second original pixel data according to the acquired charges, at least one of the second original pixel data being arranged in a P × P matrix.
9. The image sensor of any of claims 1-8, wherein the readout circuitry is further to: and receiving a control signal, and working in a first image reading mode or a second image reading mode according to the control signal.
10. An electronic device, comprising an image signal processor and an image sensor according to any one of claims 1-8, the image signal processor being configured to:
respectively carrying out image processing on at least one first original pixel data and at least one second original pixel data output by the image sensor to obtain a color image and a white image;
and carrying out fusion processing on the color image and the white image to obtain a fused image.
11. A chip characterized by comprising an image sensor according to any one of claims 1-8.
CN202010605102.1A 2020-06-29 2020-06-29 Image sensor, electronic device, and chip Active CN111726549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010605102.1A CN111726549B (en) 2020-06-29 2020-06-29 Image sensor, electronic device, and chip

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010605102.1A CN111726549B (en) 2020-06-29 2020-06-29 Image sensor, electronic device, and chip

Publications (2)

Publication Number Publication Date
CN111726549A true CN111726549A (en) 2020-09-29
CN111726549B CN111726549B (en) 2022-08-23

Family

ID=72569551

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010605102.1A Active CN111726549B (en) 2020-06-29 2020-06-29 Image sensor, electronic device, and chip

Country Status (1)

Country Link
CN (1) CN111726549B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101467444A (en) * 2006-06-14 2009-06-24 株式会社东芝 Solid-state image sensor
CN101854488A (en) * 2009-03-31 2010-10-06 索尼公司 The signal processing method of solid camera head, solid camera head and camera head
CN102857708A (en) * 2011-10-17 2013-01-02 北京瑞澜联合通信技术有限公司 Image sensor, photographing device and image data generation method
CN104170376A (en) * 2012-03-27 2014-11-26 索尼公司 Image processing device, image-capturing element, image processing method, and program
US20150350582A1 (en) * 2014-05-29 2015-12-03 Semiconductor Components Industries, Llc Systems and methods for operating image sensor pixels having different sensitivities and shared charge storage regions
US20180006078A1 (en) * 2014-12-22 2018-01-04 Teledyne E2V Semiconductors Sas Colour image sensor with white pixels and colour pixels
CN105611125A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Imaging method, imaging device and electronic device
CN105611257A (en) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 Imaging method, image sensor, imaging device and electronic device
US20170302866A1 (en) * 2016-04-14 2017-10-19 Qualcomm Incorporated Image sensors with dynamic pixel binning
CN110324546A (en) * 2018-03-28 2019-10-11 黑魔法设计私人有限公司 Image processing method and filter array
CN110649056A (en) * 2019-09-30 2020-01-03 Oppo广东移动通信有限公司 Image sensor, camera assembly and mobile terminal
CN111314592A (en) * 2020-03-17 2020-06-19 Oppo广东移动通信有限公司 Image processing method, camera assembly and mobile terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
He Chunliang et al.: "Research on Exposure Time Optimization Method for TDI CMOS Image Sensor", Acta Optica Sinica *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022111015A1 (en) * 2020-11-30 2022-06-02 华为技术有限公司 Image sensor and imaging apparatus
CN112822466A (en) * 2020-12-28 2021-05-18 维沃移动通信有限公司 Image sensor, camera module and electronic equipment
WO2022143280A1 (en) * 2020-12-28 2022-07-07 维沃移动通信有限公司 Image sensor, camera module, and electronic device
CN113676651A (en) * 2021-08-25 2021-11-19 维沃移动通信有限公司 Image sensor, control method, control device, electronic apparatus, and storage medium
CN113973181A (en) * 2021-11-30 2022-01-25 维沃移动通信有限公司 Image sensor, camera module and electronic equipment

Also Published As

Publication number Publication date
CN111726549B (en) 2022-08-23

Similar Documents

Publication Publication Date Title
CN111726549B (en) Image sensor, electronic device, and chip
TWI504257B (en) Exposing pixel groups in producing digital images
KR101340688B1 (en) Image capture using luminance and chrominance sensors
EP2087725B1 (en) Improved light sensitivity in image sensors
US8237831B2 (en) Four-channel color filter array interpolation
EP3035667B1 (en) Electronic device
US8203633B2 (en) Four-channel color filter array pattern
US8452082B2 (en) Pattern conversion for interpolation
US20080205792A1 (en) Colour binning of a digital image
US9159758B2 (en) Color imaging element and imaging device
CN109417613B (en) Image sensor method and apparatus having multiple successive infrared filtering units
WO2012057621A1 (en) System and method for imaging using multi aperture camera
CN113170061B (en) Image sensor, imaging device, electronic apparatus, image processing system, and signal processing method
WO2011063063A1 (en) Sparse color pixel array with pixel substitutes
US20150077597A1 (en) Imaging device, image processing device, and image processing method
CN113840067B (en) Image sensor, image generation method and device and electronic equipment
CN104471929B (en) Color image sensor and camera head
CN111741239B (en) Image sensor and electronic device
WO2023082766A1 (en) Image sensor, camera module, electronic device, and image generation method and apparatus
US20230007191A1 (en) Image sensor, imaging apparatus, electronic device, image processing system, and signal processing method
US20140307135A1 (en) Color imaging element
CN114125242A (en) Image sensor, camera module, electronic equipment, image generation method and device
US8976275B2 (en) Color imaging element
CN112019823A (en) Filter array and image sensor
US11988849B2 (en) Imaging device and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant