CN111787248A - Image sensor, terminal device, and imaging method - Google Patents

Image sensor, terminal device, and imaging method

Info

Publication number: CN111787248A (granted as CN111787248B)
Application number: CN202010675565.5A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 姚国峰, 沈健
Current and original assignee: Shenzhen Goodix Technology Co., Ltd.
Application filed by Shenzhen Goodix Technology Co., Ltd.
Legal status: Granted; active
Prior art keywords: array, column, image sensor, sub, pixel
(The legal status and assignee listed above are assumptions from automated analysis, not legal conclusions.)

Classifications

    • H04N 25/76 — Addressed sensors, e.g. MOS or CMOS sensors (under H04N 25/00, circuitry of solid-state image sensors [SSIS], and H04N 25/70, SSIS architectures)
    • H01L 27/14612 — Pixel elements with an integrated switching, control, storage or amplification transistor (under H01L 27/146, imager structures)
    • H01L 27/14643 — Photodiode arrays; MOS imagers (under H01L 27/146, imager structures)


Abstract

The application provides an image sensor, a terminal device, and an imaging method. The image sensor comprises a pixel array made up of a plurality of triangular pixel units, each of which contains a photoelectric conversion element and a readout circuit connected to it. Each triangular pixel unit is shaped as an isosceles triangle whose base is equal in length to the height drawn to that base, and the units are tiled in alternately upright and inverted orientation so that adjacent units interlock. The photoelectric conversion element receives an incident light signal and converts it into an electrical signal; the readout circuit reads and outputs the electrical signal, which is used to generate image data. With this image sensor, image distortion is reduced when it operates in a spatial multiplexing mode.

Description

Image sensor, terminal device, and imaging method
Technical Field
Embodiments of the present application relate to the technical field of image processing, and in particular to an image sensor, a terminal device, and an imaging method.
Background
The image sensor includes a pixel array and readout circuits connected to the pixel units in the pixel array. To improve the dynamic range of the image sensor and obtain a good shooting effect, the image sensor is usually operated in a spatial multiplexing mode: the pixel array is divided into two sub-arrays, different shooting parameters are configured for each sub-array, and the image signals obtained from the two sub-arrays are then fused to generate an image.
In the prior art, the pixel array is generally arranged as shown in fig. 1: a color filter array is disposed above the pixel array and arranged in the RGGB pattern of a Bayer array. When an image is acquired by spatial multiplexing, different shooting parameters are configured for the first sub-array 101 and the second sub-array 102, respectively.
However, with this prior-art arrangement, the offset distance between the two sub-arrays under spatial multiplexing is large, so the image signals obtained from the two sub-arrays differ substantially, and the image generated by fusing the two signals is severely distorted.
Disclosure of Invention
The embodiment of the application provides an image sensor, a terminal device and an imaging method, and when the image sensor works in a spatial multiplexing mode, the distortion of an image can be reduced.
In a first aspect, an embodiment of the present application provides an image sensor, including:
a pixel array including a plurality of triangular pixel units, each including a photoelectric conversion element and a readout circuit connected to the photoelectric conversion element;
each triangular pixel unit is shaped as an isosceles triangle whose base is equal in length to the height drawn to that base, and the triangular pixel units are tiled in alternately upright and inverted orientation so that adjacent units interlock;
the photoelectric conversion element is used for receiving an incident light signal and converting the incident light signal into an electric signal;
the readout circuit is configured to read and output the electrical signal, which is used to generate image data.
In one possible implementation, in the first direction, the waists of two adjacent triangular pixel units overlap;
in the second direction, the symmetry axes of two adjacent triangular pixel units lie on the same straight line, and the distance between their bases equals the height drawn to the base;
the first direction is orthogonal to the second direction.
In one possible implementation, in the first direction, the waists of two adjacent triangular pixel units overlap;
in the second direction, the symmetry axes of two adjacent triangular pixel units lie on the same straight line, and either their bases coincide or the apexes opposite their bases coincide;
the first direction is orthogonal to the second direction.
In a possible implementation, the image sensor further includes a color filter array disposed on the side of the pixel array facing the incident light signal. The color filter array includes a plurality of color filters, each corresponding to one triangular pixel unit and configured to filter the received incident light signal so as to pass light of a particular color to its triangular pixel unit.
In one possible implementation, the pixel array includes a first sub-array and a second sub-array; in the first direction, the triangular pixel units of the first sub-array and the triangular pixel units of the second sub-array are arranged in a staggered mode;
the color filters corresponding to the triangular pixel units of the first sub-array are arranged in the same pattern as the color filters corresponding to the triangular pixel units of the second sub-array.
In a possible implementation, a lens is disposed over each color filter of the color filter array, on the side facing the incident light signal.
In a possible implementation manner, the method further includes: the device comprises a control module and a processing module; the control module is electrically connected with the processing module, and the control module and the processing module are respectively electrically connected with the pixel array;
the control module is used for selecting the triangular pixel units in the corresponding row in the pixel array and transmitting the column signals output by the selected triangular pixel units to the processing module;
the control module is further configured to control the processing module to process the received column signals to generate image data.
In one possible implementation manner, the control module includes a control circuit and a row selection circuit; the processing module comprises a column signal processing circuit, a column selection circuit and a processing unit;
the control circuit is respectively electrically connected with a first end of the row selection circuit, a first end of the column selection circuit and a first end of the column signal processing circuit, a second end of the row selection circuit is electrically connected with the pixel array, a second end of the column signal processing circuit is electrically connected with the pixel array, a third end of the column signal processing circuit is electrically connected with a second end of the column selection circuit, and a fourth end of the column signal processing circuit is electrically connected with the processing unit;
the control circuit is used for controlling the row selection circuit to select the triangular pixel units in the corresponding row in the pixel array and transmitting the electric signals output by the selected triangular pixel units to the column signal processing circuit;
the control circuit is also used for acquiring shooting mode information;
the control circuit is further configured to control the column selection circuit to select a column signal belonging to the same pixel pair in the column signals and control the column signal processing circuit to pre-process the column signal belonging to the same pixel pair if the shooting mode information is in a first mode, where the pixel pair is formed by a triangular pixel unit in the first sub-array and a triangular pixel unit in a second sub-array adjacent to the triangular pixel unit in the first sub-array in a first direction;
the processing unit is used for processing the preprocessed column signals to generate image data.
In a possible implementation manner, the control circuit is further configured to control the column signal processing circuit to transmit the column signal to the processing unit if the shooting mode information is the second mode;
the processing unit is used for processing a first signal belonging to a first sub array in the column signals to generate first image data, and processing a second signal belonging to a second sub array in the column signals to generate second image data.
In one possible implementation, the image sensor is a back-illuminated CMOS image sensor or a stacked CMOS image sensor.
In a second aspect, the present application provides a terminal device, which includes a storage device, a processing device, and the image sensor described in the first aspect of the embodiments of the present application or any of its possible implementations.
In a third aspect, an embodiment of the present application provides an imaging method, which is applied to a control module in an image sensor according to the first aspect of the embodiment of the present application, where the method includes:
selecting triangular pixel units in corresponding rows in the pixel array, and transmitting column signals output by the selected triangular pixel units to a processing module;
and controlling the processing module to process the received column signals to generate image data.
In one possible implementation, the controlling the processing module to process the received column signals to generate image data includes:
acquiring shooting mode information;
if the shooting mode information is a first mode, controlling the processing module to add signals belonging to the same pixel pair in the column signals to generate a sum signal so that the processing module generates image data according to the sum signal, wherein the pixel pair is formed by a triangular pixel unit in the first sub-array and a triangular pixel unit in a second sub-array adjacent to the triangular pixel unit in the first sub-array in a first direction;
and if the shooting mode information is a second mode, controlling the processing module to process a first signal belonging to a first sub-array in the column signals to generate first image data, and processing a second signal belonging to a second sub-array in the column signals to generate second image data.
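The two shooting modes described above can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the assumption that even-indexed column signals belong to the first sub-array and odd-indexed ones to the second, as well as the function name, are choices made for the example.

```python
import numpy as np

def process_column_signals(signals, mode):
    """Sketch of the two shooting modes: `signals` is a 1-D array of
    column signals; even indices are assumed to come from the first
    sub-array and odd indices from the second (illustrative layout)."""
    first = signals[0::2]   # column signals of the first sub-array
    second = signals[1::2]  # column signals of the second sub-array
    if mode == "first":
        # First mode: add the signals of each pixel pair into a sum signal.
        return first + second
    elif mode == "second":
        # Second mode: keep the two sub-array images separate, e.g. when
        # they were captured with different shooting parameters.
        return first, second
    raise ValueError("unknown shooting mode")
```

A call with `mode="first"` yields one combined signal per pixel pair; with `mode="second"` it yields the raw material for two separate images.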
The application provides an image sensor, a terminal device, and an imaging method. The image sensor comprises a pixel array made up of a plurality of triangular pixel units, each of which comprises a photoelectric conversion element and a readout circuit connected to it. Each triangular pixel unit is shaped as an isosceles triangle whose base is equal in length to the height drawn to that base, and the units are tiled in alternately upright and inverted orientation so that adjacent units interlock. The photoelectric conversion element receives an incident light signal and converts it into an electrical signal; the readout circuit reads and outputs the electrical signal, which is used to generate image data. By changing the pixel units of the pixel array into triangular pixel units with this shape and arrangement, the image sensor can reduce image distortion when operating in a spatial multiplexing mode.
Drawings
To more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The following drawings show some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a diagram of a pixel array formed by square pixel units in the prior art;
Fig. 2 is a first schematic structural diagram of an image sensor according to an embodiment of the present application;
Fig. 3 is a schematic diagram of the shape of a triangular pixel unit according to an embodiment of the present application;
Fig. 4 is a schematic circuit diagram of a triangular pixel unit according to an embodiment of the present application;
Fig. 5 is a first schematic diagram of an arrangement of a pixel array according to an embodiment of the present application;
Fig. 6 is a second schematic diagram of an arrangement of a pixel array according to an embodiment of the present application;
Fig. 7 is a schematic diagram of an arrangement of a color filter array according to an embodiment of the present application;
Fig. 8 is a cross-sectional view of an image sensor according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of an image sensor according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of an image sensor according to an embodiment of the present application;
Fig. 11 is a schematic diagram of a pixel pair according to an embodiment of the present application;
Fig. 12 is a schematic flow chart of an imaging method according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from the given embodiments without creative effort fall within the protection scope of the present application.
With the development of multimedia systems, image sensors have attracted wide attention, especially solid-state image sensors: solid-state integrated components that convert optical images into digital signals, imaging mainly through the photoelectric conversion effect of semiconductor materials. Solid-state image sensors offer small size, light weight, high integration, high resolution, low power consumption, long service life, and low cost, and are now widely used in consumer electronics, security, automotive, and industrial applications.
At present, solid-state image sensors mainly comprise Charge-Coupled Devices (CCD) and Complementary Metal-Oxide-Semiconductor (CMOS) sensors. CCDs are high-end components used for photography and video capture; they offer good low-light performance, high signal-to-noise ratio, good transmittance, and good color reproduction, and are widely used in high-end fields such as transportation and medical imaging. CMOS sensors are used in products with lower image-quality requirements and are characterized by high integration, low power consumption, high speed, and low cost.
In general, the imaging principle of solid-state image sensors is based on the photoelectric conversion effect of semiconductor materials: a pixel array composed of a large number of pixel units, each including a photoelectric conversion element and a readout circuit, is disposed on the semiconductor substrate of the image sensor. When light is projected onto the pixel array, each pixel unit performs photoelectric conversion; the generated charge (an electrical signal) is read out by the readout circuit, converted into a digital signal by the image sensor's analog-to-digital converter (ADC), and processed by an Image Signal Processor (ISP) to output an image.
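The conversion chain just described (light → charge → voltage → digital code) can be illustrated with a toy model. The quantum efficiency, conversion gain, and full-scale values below are arbitrary illustrative numbers, not figures from the patent.

```python
def pixel_pipeline(photons, qe=0.6, conv_gain_uv_per_e=50.0,
                   full_scale_uv=1_000_000.0, bits=10):
    """Toy model of one pixel's signal chain (all constants illustrative).

    photons -> electrons (photoelectric conversion), electrons -> voltage
    (readout), voltage -> clamped digital code (ADC); the ISP step that
    turns codes into an image is omitted.
    """
    electrons = photons * qe                       # photoelectric conversion
    voltage_uv = electrons * conv_gain_uv_per_e    # readout circuit output
    code = int(voltage_uv / full_scale_uv * (2**bits - 1))  # ADC quantization
    return max(0, min(code, 2**bits - 1))          # clamp to valid code range
```

For example, 1000 incident photons map to digital code 30 under these constants, and very bright input saturates at the maximum 10-bit code, 1023.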
In order to improve the dynamic range of the image sensor to obtain a good shooting effect, the image sensor is usually operated in a spatial multiplexing mode, specifically, the pixel array is divided into two sub-arrays, different shooting parameters are respectively configured for the two sub-arrays, and then image signals obtained by the two sub-arrays are fused to generate an image.
In the prior art, a pixel array is usually formed from square pixel units. As shown in fig. 1, the square pixel units are tiled edge to edge, a color filter array is disposed above the pixel array, and the color filter array is arranged in the RGGB pattern of a Bayer array. When an image is acquired by spatial multiplexing, different shooting parameters are configured for the first sub-array 101 and the second sub-array 102, respectively. If the side length of a square pixel unit is the pixel size a, the first sub-array 101 and the second sub-array 102 are offset from each other by a distance of 2a. Thus, with square pixel units under spatial multiplexing, the offset between the two sub-arrays is large, so the image signals obtained from the two sub-arrays differ substantially, and the image generated by fusing them is severely distorted.
Based on the above, the present application provides an image sensor, an imaging method and a terminal device, which can reduce image distortion when the image sensor is operated in a spatial multiplexing mode.
In this embodiment, by improving the shape and arrangement of the pixel units in the image sensor, the staggered distance between two sub-arrays in the pixel array of the image sensor is reduced, so that when the image sensor operates in the spatial multiplexing mode, the distortion of the image can be reduced.
Fig. 2 is a schematic structural diagram of an image sensor according to an embodiment of the present disclosure, and as shown in fig. 2, an image sensor 20 according to an embodiment of the present disclosure includes a pixel array 200, where the pixel array 200 includes a plurality of triangular pixel units, and each triangular pixel unit includes a photoelectric conversion element 201 and a readout circuit 202 connected to the photoelectric conversion element 201. The photoelectric conversion element 201 is configured to receive an incident light signal and convert the incident light signal into an electrical signal. The readout circuit 202 is used to read and output an electric signal, which is used to generate image data.
Fig. 3 is a schematic diagram of the shape of the triangular pixel unit provided in the embodiments of the present application. As shown in fig. 3, each triangular pixel unit is shaped as an isosceles triangle: the base of the isosceles triangle is denoted 11, its two waists are denoted 12, and the height drawn to the base is denoted 13, with the length of the base 11 equal to the length of the height 13. Illustratively, the base 11 and the height 13 both have length a, where a denotes the pixel size.
In the embodiments of the present application, each triangular pixel unit can be abstracted as a point located at the centroid of its isosceles triangle, referred to as a "pixel point". That is, in practice the pixel point corresponding to each triangular pixel unit is represented by the centroid of the isosceles triangle occupied by that unit.
As shown in fig. 2, the triangular pixel units are tiled in alternately upright and inverted orientation. When the image sensor operates in a spatial multiplexing mode, alternate triangular pixel units along a first direction are selected as a first sub-array and a second sub-array, i.e., the triangular pixel units of the first sub-array and those of the second sub-array are interleaved. The offset of the two sub-arrays is the distance between a pixel point of the first sub-array and an adjacent pixel point of the second sub-array; if the base 11 and the height 13 both have length a, this offset is ((a/2)² + (a/3)²)^(1/2) ≈ 0.6a. Compared with a pixel array formed from square pixel units, the offset between the two sub-arrays is reduced, so the image signals output by the two sub-arrays are closer to each other and image distortion can be reduced.
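The offset quoted above follows directly from the centroid geometry: the centroid of a triangle with base a and height a sits a/3 above the base, so the centroid of the adjacent inverted unit is displaced a/2 horizontally and a/3 vertically. A short numerical check (the function name and coordinate convention are illustrative):

```python
import math

def stagger_distance(a):
    """Distance between the centroid ("pixel point") of an upright
    triangular pixel unit and that of the adjacent inverted unit.
    Base and height are both `a`; the horizontal offset of the two
    centroids is a/2 and the vertical offset is a/3."""
    return math.hypot(a / 2, a / 3)

# For a = 1 this evaluates to sqrt(13)/6, roughly 0.6a, which is
# smaller than the offset quoted for the prior-art square arrangement.
```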
The triangular pixel unit is provided with a photoelectric conversion element 201 and a readout circuit 202, and the photoelectric conversion element 201 may be a photodiode. An incident light signal is converted into an electric signal by the photoelectric conversion element 201, and the electric signal is read and output by the readout circuit 202.
Fig. 4 is a schematic circuit diagram of a triangular pixel unit according to an embodiment of the present application. As shown in fig. 4, the triangular pixel unit includes a photoelectric conversion element 201, a reset circuit 203, and a readout circuit comprising a transfer transistor 2021, a floating diffusion region 2022, a source follower transistor 2023, and a row drive transistor 2024. A first terminal of the photoelectric conversion element 201 is electrically connected to a ground terminal, and a second terminal of the photoelectric conversion element 201 is electrically connected to a first terminal of the transfer transistor 2021; a third terminal of the transfer transistor 2021 is electrically connected to a transfer signal. The floating diffusion region 2022 is electrically connected to the second terminal of the transfer transistor 2021, a first terminal of the reset circuit 203, and a third terminal of the source follower transistor 2023. A second terminal of the reset circuit 203 is electrically connected to a power supply voltage terminal, and a third terminal of the reset circuit 203 is electrically connected to a reset signal. A first terminal of the source follower transistor 2023 is electrically connected to a second terminal of the row drive transistor 2024, a second terminal of the source follower transistor 2023 is electrically connected to the power supply voltage terminal, a first terminal of the row drive transistor 2024 is electrically connected to the column signal output terminal, and a third terminal of the row drive transistor 2024 is electrically connected to a row drive signal. The photoelectric conversion element 201 receives an incident light signal and converts it into an electrical signal; the reset circuit 203 resets the photoelectric conversion element 201; the readout circuit reads the electrical signal and outputs it to the column signal output terminal.
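The operating sequence of the circuit of fig. 4 (reset, integrate, transfer, read) can be summarized with a behavioural sketch. This models signal flow only, in arbitrary units; the class and method names are illustrative, not terms from the patent.

```python
class TriangularPixel:
    """Behavioural sketch of the pixel of fig. 4: photodiode, transfer
    transistor, floating diffusion, source follower, and row driver.
    Charge values are in arbitrary units."""

    def __init__(self):
        self.pd_charge = 0.0   # charge held on the photodiode
        self.fd_charge = 0.0   # charge held on the floating diffusion

    def reset(self):
        # Reset circuit 203 clears the photodiode and floating diffusion.
        self.pd_charge = 0.0
        self.fd_charge = 0.0

    def integrate(self, light):
        # Photoelectric conversion: incident light accumulates as charge.
        self.pd_charge += light

    def transfer(self):
        # Transfer transistor 2021 moves the charge onto the
        # floating diffusion region 2022.
        self.fd_charge, self.pd_charge = self.pd_charge, 0.0

    def read(self):
        # Source follower 2023 and row driver 2024 place the signal
        # on the column output line when the row is selected.
        return self.fd_charge
```

One exposure is then `reset(); integrate(...); transfer(); read()`, mirroring the roles assigned to each element in the description above.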
In the embodiments of the present application, the pixel units of the pixel array are changed to triangular pixel units: each is an isosceles triangle whose base is equal in length to the height drawn to that base, and the units are tiled in alternately upright and inverted orientation. When the image sensor operates in a spatial multiplexing mode, image distortion can thereby be reduced.
In the embodiments of the present application, the pixel units forming the pixel array are triangular pixel units, namely isosceles triangles whose base equals the height drawn to it, tiled in alternately upright and inverted orientation. On this basis, the pixel array may be arranged as shown in the schematic diagrams of figs. 5 and 6.
Fig. 5 is a first schematic diagram of an arrangement of a pixel array according to an embodiment of the present application. As shown in fig. 5, in a first direction, a waist of each triangular pixel unit coincides with a waist of the adjacent unit; in a second direction, the symmetry axes of two adjacent triangular pixel units lie on the same straight line and the distance between their bases equals the height drawn to the base; the first direction is orthogonal to the second direction.
In the embodiments of the present application, when the first direction is horizontal, the second direction is vertical; when the first direction is vertical, the second direction is horizontal. The embodiments do not limit the specific choice of the first and second directions, as long as they are orthogonal. Fig. 5 takes the first direction as horizontal and the second direction as vertical.
As shown in fig. 5, in the horizontal direction, a waist of each triangular pixel unit coincides with a waist of its neighbor; in the vertical direction, the symmetry axes of adjacent triangular pixel units lie on the same line and their bases are separated by a. With this arrangement, a pixel array can be formed by tiling a plurality of triangular pixel units in sequence.
As shown in fig. 5, in the horizontal direction, taking the first row as an example, a waist of triangular pixel unit R11 coincides with a waist of triangular pixel unit R12, a waist of R12 coincides with a waist of R13, and so on. In the vertical direction, taking the first column as an example, the symmetry axes of triangular pixel units R11 and R21 lie on the same straight line with their bases a apart, and likewise for R21 and R31.
When the image sensor operates in a spatial multiplexing mode, alternate triangular pixel units along the horizontal direction are selected as a first sub-array 21 and a second sub-array 22, i.e., the pixel units of the first sub-array 21 and those of the second sub-array 22 are interleaved. The first sub-array 21 and the second sub-array 22 are each pixel-point arrays with a minimum repetition period of a. The offset of the two sub-arrays is the distance between a pixel point of the first sub-array 21 and an adjacent pixel point of the second sub-array 22, which is ((a/2)² + (a/3)²)^(1/2).
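The pixel-point lattice of the fig. 5 arrangement can be generated programmatically. The indexing convention below (even-numbered columns hold upright triangles, odd-numbered columns inverted ones, rows repeating with vertical pitch a) is an illustrative assumption chosen to match the figure, not a convention stated in the patent.

```python
import math

def centroids(rows, cols, a=1.0):
    """Centroid ("pixel point") coordinates for the fig. 5 arrangement.

    Adjacent columns share a waist, so centroids are spaced a/2 apart
    horizontally; upright centroids sit a/3 above the row's base line
    and inverted ones 2a/3 above it; rows repeat with pitch a.
    """
    pts = []
    for i in range(rows):
        for j in range(cols):
            x = (j + 1) * a / 2
            y = i * a + (a / 3 if j % 2 == 0 else 2 * a / 3)
            pts.append((x, y))
    return pts

# Horizontally adjacent pixel points belong to opposite sub-arrays;
# their separation reproduces the offset derived above.
p0, p1 = centroids(1, 2)
offset = math.hypot(p1[0] - p0[0], p1[1] - p0[1])  # ((a/2)**2 + (a/3)**2)**0.5
```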
Fig. 6 is a second schematic diagram of an arrangement of the pixel array according to an embodiment of the present application. As shown in fig. 6, in a first direction, a waist of each triangular pixel unit coincides with a waist of the adjacent unit; in a second direction, the symmetry axes of two adjacent triangular pixel units lie on the same straight line, and either their bases coincide or the apexes opposite their bases coincide; the first direction is orthogonal to the second direction.
The specific implementation of the first direction and the second direction is similar to that shown in fig. 5, and the detailed description thereof is omitted here.
As shown in fig. 6, in the horizontal direction, the waists of adjacent triangular pixel units coincide; in the vertical direction, the symmetry axes of adjacent triangular pixel units are on the same straight line, and either their bottom edges coincide or the vertexes corresponding to their bottom edges coincide. With this arrangement, a pixel array can be formed by sequentially arranging a plurality of triangular pixel units.
As shown in fig. 6, in the horizontal direction, for example in the first row, a waist of triangular pixel unit R11 coincides with a waist of triangular pixel unit R12, a waist of triangular pixel unit R12 coincides with a waist of triangular pixel unit R13, and so on. In the vertical direction, for example in the first column, the symmetry axes of triangular pixel unit R11 and triangular pixel unit R21 are on the same straight line and their bottom edges coincide, while the symmetry axes of triangular pixel unit R21 and triangular pixel unit R31 are on the same straight line and the vertexes corresponding to their bottom edges coincide.
When the image sensor works in a spatial multiplexing mode, triangular pixel units are selected at intervals in the horizontal direction to form a first sub-array 31 and a second sub-array 32; that is, the pixel units in the first sub-array 31 and the pixel units in the second sub-array 32 are arranged in a staggered manner. The first sub-array 31 and the second sub-array 32 are each pixel point arrays with a minimum repetition period of a. The stagger distance of the two sub-arrays is the distance between a pixel point in the first sub-array 31 and the adjacent pixel point in the second sub-array 32, which is ((a/2)² + (a/3)²)^(1/2).
As an embodiment of the present application, the image sensor further includes a color filter array. The color filter array is disposed on a side of the pixel array facing the incident light signal. The color filter array comprises a plurality of color filters, each color filter corresponds to one triangular pixel unit and is used for filtering received incident light signals so as to output light signals with different light colors to the triangular pixel units.
The color filter array may be arranged in an RGGB format using a Bayer Pattern array including red, green, and blue color filters. The basic unit is a 2 × 2 array, and each 2 × 2 array includes one red filter R, one blue filter B, and two green filters Ga and Gb. Through the color filter array, any one triangular pixel unit in the pixel array can only obtain information of one of the red, green, and blue colors, and the color of the image can be restored only by performing specific data processing on the color data (RGB information) output by the image sensor. This process is also known as "Demosaicing".
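The demosaicing step can be illustrated with a deliberately naive sketch (Python with NumPy). The RGGB sample layout and the bilinear averaging below are illustrative assumptions only; the patent does not prescribe a particular demosaicing algorithm. Each output channel keeps its sampled values and fills the missing positions with the average of the known neighbours in a 3 × 3 window.

```python
import numpy as np

def demosaic_bilinear(mosaic: np.ndarray) -> np.ndarray:
    """Naive bilinear demosaic of an RGGB Bayer mosaic (H x W floats)."""
    h, w = mosaic.shape
    out = np.zeros((h, w, 3))
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # R at even rows, even columns
    masks[0::2, 1::2, 1] = True   # Ga
    masks[1::2, 0::2, 1] = True   # Gb
    masks[1::2, 1::2, 2] = True   # B
    for c in range(3):
        out[..., c][masks[..., c]] = mosaic[masks[..., c]]
    # Fill each channel's missing samples with the mean of the known
    # samples inside a 3x3 neighbourhood.
    padded = np.pad(out, ((1, 1), (1, 1), (0, 0)))
    pmask = np.pad(masks, ((1, 1), (1, 1), (0, 0)))
    for c in range(3):
        num = sum(padded[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx, c]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        den = sum(pmask[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx, c]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        out[..., c] = np.where(masks[..., c], out[..., c],
                               num / np.maximum(den, 1))
    return out
```

A flat grey mosaic demosaics to a flat grey image, which is a quick sanity check for any interpolation scheme; real pipelines use edge-aware interpolation instead of this plain averaging.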
It should be understood that the color filter array may also adopt other arrangements, for example, RGBW format, RYYB format, etc., and the embodiments of the present application are not limited in particular.
The color filter is disposed on a side of the pixel array facing the incident light signal, e.g., the upper surface of the pixel array faces the incident light signal, and the color filter array is disposed above the pixel array.
Fig. 7 shows an arrangement of the color filter array according to the embodiment of the present application. As shown in fig. 7, the two green filters in the first sub-array are Ga1 and Gb1, the blue filter is B1, and the red filter is R1; the two green filters in the second sub-array are Ga2 and Gb2, the blue filter is B2, and the red filter is R2. The arrangement of the color filters corresponding to the triangular pixel units of the first sub-array is the same as the arrangement of the color filters corresponding to the triangular pixel units of the second sub-array.
As an embodiment of the present application, a lens is disposed on the side of each color filter in the color filter array that faces the incident light signal. For example, if the upper surface of the color filter faces the incident light signal, the lens is disposed above the color filter. The lens condenses the incident light signal, increasing the intensity of the incident light signal received by the pixel array.
Since the pixel unit of the image sensor in the embodiment of the present application is triangular, in order to increase a Fill Factor (Fill Factor) and increase a photosensitive area, the image sensor preferably adopts a back-illuminated structure. Fig. 8 describes in detail the structure of the back-illuminated CMOS image sensor.
Fig. 8 is a cross-sectional view of an image sensor provided in an embodiment of the present application. As shown in fig. 8, the image sensor includes a substrate 801, which is a semiconductor material of a first doping type, such as P-type silicon. The side of the substrate 801 facing the incident optical signal is the substrate back side 801b, and the side facing away from the incident optical signal is the substrate front side 801f. Defining the direction in which the substrate back side 801b faces the incident optical signal as above the substrate, and the direction in which the substrate front side 801f faces away from the incident optical signal as below the substrate, the color filter 802 is disposed above the substrate back side 801b, and the lens 803 is disposed above the color filter.
Triangular pixel units are formed in the substrate adjacent to the substrate front side 801f. Each triangular pixel unit includes a Photodiode 804 and a readout circuit (not shown), the photodiode region having a second doping type, e.g. N-type. A photodiode is a photosensitive device that generates charge in response to light of a specific wavelength band, thereby converting an optical signal into an electrical signal.
The dielectric layer 805 is located below the substrate front surface 801f, and includes a first metal wiring layer 8051 and a second metal wiring layer 8052 inside, and since the first metal wiring layer 8051 and the second metal wiring layer 8052 are located below the pixel array, the first metal wiring layer 8051 and the second metal wiring layer 8052 do not block incident light signals.
A second substrate 806 is disposed below the dielectric layer 805. The second substrate 806 may be a substrate without any circuit thereon, or may be a substrate including an image signal processing circuit. The substrate 801 and the second substrate 806 are bonded together by a Bonding Process. If the second substrate 806 is a substrate containing a circuit, the substrate 801 and the second substrate 806 are electrically connected through a through-silicon via interconnect structure or a Hybrid bonding (Hybrid bonding) interface.
The image sensor may also be a stacked CMOS image sensor, and a specific structure of the stacked CMOS image sensor is not an improvement of the embodiment of the present application, which is not described in detail herein.
Fig. 9 is a schematic structural diagram of the image sensor according to the second embodiment of the present disclosure, and as shown in fig. 9, the image sensor 20 according to the present embodiment further includes a control module 300 and a processing module 400. The control module 300 and the processing module 400 are electrically connected to the pixel array 200, respectively.
The control module 300 is used for selecting the triangular pixel units in a corresponding row of the pixel array 200 and transmitting the column signals output by the selected triangular pixel units to the processing module 400.
The control module 300 is further configured to control the processing module 400 to process the received column signals to generate image data.
In the embodiment of the present application, the column signals are the electrical signals output by the triangular pixel units in a selected row. For example, if the control module 300 selects the triangular pixel units in the first row of the pixel array 200, the column signals are the electrical signals output by the readout circuits of the triangular pixel units in the first row. The processing module 400 receives the column signals, and the control module 300 controls the processing module 400 to process the column signals to generate image data.
Fig. 10 is a schematic structural diagram of an image sensor according to an embodiment of the present application, and as shown in fig. 10, the control module 300 includes a control circuit 301 and a row selection circuit 302. The processing module 400 includes a column signal processing circuit 401, a column selection circuit 402, and a processing unit 403.
The control circuit 301 is electrically connected to a first terminal of the row selection circuit 302, a first terminal of the column selection circuit 402, and a first terminal of the column signal processing circuit 401, respectively; a second terminal of the row selection circuit 302 is electrically connected to the pixel array 200; a second terminal of the column signal processing circuit 401 is electrically connected to the pixel array 200; a third terminal of the column signal processing circuit 401 is electrically connected to a second terminal of the column selection circuit 402; and a fourth terminal of the column signal processing circuit 401 is electrically connected to the processing unit 403.
The control circuit 301 is used for controlling the row selection circuit 302 to select the triangular pixel units in the corresponding row of the pixel array 200, and transmitting the column signals output by the selected triangular pixel units to the column signal processing circuit 401.
The control circuit 301 is also used to acquire shooting mode information.
The control circuit 301 is further configured to, if the shooting mode information is the first mode, control the column selection circuit 402 to select column signals belonging to the same pixel pair from the column signals, and control the column signal processing circuit 401 to pre-process the column signals belonging to the same pixel pair; the pixel pair is formed by triangular pixel units in a first sub array and triangular pixel units in a second sub array adjacent to the triangular pixel units in the first sub array in the first direction. The processing unit 403 is configured to process the preprocessed column signals to generate image data.
The control circuit 301 is further configured to control the column signal processing circuit 401 to transmit the column signal to the processing unit 403 if the shooting mode information is the second mode; the processing unit 403 is configured to pre-process a first signal belonging to the first sub-array in the column signals to generate first image data, and pre-process a second signal belonging to the second sub-array in the column signals to generate second image data.
In the embodiment of the present application, as shown in fig. 5 and 7, the pixel array includes a first sub-array 21 and a second sub-array 22, and the arrangement manner of the color filters corresponding to the first sub-array 21 is the same as the arrangement manner of the color filters corresponding to the second sub-array 22, and both are bayer arrays. As shown in fig. 11, in the first direction, the triangular pixel cells in the first sub-array and the triangular pixel cells in the second sub-array adjacent to the first sub-array form a pixel pair, and the arrangement manner of the color filters corresponding to the pixel pair is still a bayer array. For example, the triangular pixel unit R11 and the triangular pixel unit R12 form a pixel pair, the color filters corresponding to the triangular pixel unit R11 and the triangular pixel unit R12 are both green filters, and the color filter corresponding to the pixel pair formed by the triangular pixel unit R11 and the triangular pixel unit R12 is still a green filter.
After the column signal processing circuit 401 receives the column signals output by the readout circuits, the control circuit 301 controls, based on the shooting mode information, whether the column signal processing circuit 401 preprocesses the column signals. If the shooting mode information is the first mode, the control circuit 301 controls the column selection circuit 402 to select the column signals belonging to the same pixel pair, controls the column signal processing circuit 401 to preprocess those column signals, and transmits the preprocessed column signals to the processing unit 403, which processes them to generate image data. If the shooting mode information is the second mode, the control circuit 301 controls the column signal processing circuit 401 to transmit the column signals to the processing unit 403, which processes them to generate image data: specifically, the processing unit 403 preprocesses a first signal belonging to the first sub-array in the column signals to generate first image data, and preprocesses a second signal belonging to the second sub-array in the column signals to generate second image data. The column signal processing circuit 401 may preprocess the column signals by, for example, adding or averaging them.
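The two readout paths can be summarized in a small sketch (Python; the list-based signal representation and the function name are illustrative, not from the patent): in the first mode the two column signals of each pixel pair are summed into one value, and in the second mode the columns are split into the two sub-array images.

```python
def preprocess_row(column_signals, mode):
    """Mode-dependent handling of one selected row's column signals.

    column_signals are ordered left to right, so columns (1, 2), (3, 4), ...
    form the pixel pairs; with zero-based indexing these are (0, 1), (2, 3), ...
    mode 1 (synthesis): sum each pixel pair into a single value.
    mode 2 (spatial multiplexing): split the row into the first-sub-array
    and second-sub-array signals.
    """
    if mode == 1:
        return [column_signals[j] + column_signals[j + 1]
                for j in range(0, len(column_signals) - 1, 2)]
    first = column_signals[0::2]   # signals belonging to the first sub-array
    second = column_signals[1::2]  # signals belonging to the second sub-array
    return first, second
```

Repeating this for every selected row yields either one half-resolution binned image (mode 1) or two full sub-array images (mode 2), matching the two processing branches described above.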
As shown in fig. 5, the control circuit 301 controls the row selection circuit 302 to select the triangular pixel units in the ith row; the corresponding readout circuits read the electrical signals to obtain column signals and output them to the column signal processing circuit 401; and the control circuit 301 controls the column selection circuit 402 to select the electrical signals of the jth and (j+1)th columns, so that the column signal processing circuit 401 preprocesses the electrical signals of the jth and (j+1)th columns of the ith row. It should be understood that the column selection circuit may select one column or a plurality of columns of the column signals, and may select the column signals one by one or alternately.
For example, when the shooting mode information is the first mode, the control circuit 301 controls the row selection circuit 302 to select the triangular pixel units of the 1st row; the corresponding readout circuits read the corresponding electrical signals to obtain column signals and output them to the column signal processing circuit 401. The control circuit 301 controls the column selection circuit 402 to select the electrical signals of the 1st and 2nd columns, and the column signal processing circuit 401 adds the electrical signal of the 1st row, 1st column to that of the 1st row, 2nd column. The control circuit 301 then controls the column selection circuit to select the electrical signals of the 3rd and 4th columns, and so on, until the column signal processing circuit 401 has processed the electrical signals of all columns. The control circuit 301 then controls the row selection circuit 302 to select the triangular pixel units of the 2nd row, and so on, until the column signal processing circuit 401 has processed the electrical signals of all rows and columns.

The processing unit 403 further comprises an analog-to-digital converter 4031, an image signal preprocessor (Pre-ISP) 4032, and a terminal Image Signal Processor (ISP) 4033 connected in series. The analog-to-digital converter 4031 is electrically connected to the column signal processing circuit 401 and performs analog-to-digital conversion on the preprocessed column signals to obtain digital signals, and the image signal preprocessor 4032 processes the digital signals to obtain image data. After the processing unit 403 has processed the electrical signals output by all the triangular pixel units, the terminal image signal processor performs demosaicing on the obtained image data to obtain an image.
The terminal image signal processor may be an application processor (AP) on a mobile terminal such as a mobile phone or a computer, an Artificial Intelligence (AI) processor, or another processor dedicated to image processing.
For another example, when the shooting mode information is the second mode, the control circuit 301 controls the row selection circuit 302 to select the triangular pixel units of the 1st row; the corresponding readout circuits read the corresponding electrical signals to obtain column signals and output them to the column signal processing circuit 401, which the control circuit 301 controls to transmit the column signals to the processing unit 403. The control circuit 301 then controls the row selection circuit 302 to select the triangular pixel units of the 2nd row, and so on, until the column signals of all rows have been transmitted to the processing unit 403. The analog-to-digital converter 4031 performs analog-to-digital conversion on the first signal belonging to the first sub-array and the second signal belonging to the second sub-array to obtain a first digital signal and a second digital signal, respectively, and the image signal preprocessor 4032 processes the two digital signals to obtain image data. Once the processing unit 403 has processed the electrical signals of all rows, first image data corresponding to the first sub-array and second image data corresponding to the second sub-array are obtained, and the terminal image signal processor performs demosaicing and fusion on the first image data and the second image data to obtain an image.
The first mode may be a synthesis mode and the second mode may be a spatial multiplexing mode. When the image sensor works in the synthesis mode, the shooting parameters of the first subarray are the same as the shooting parameters of the second subarray. When the image sensor works in the spatial multiplexing mode, the shooting parameters of the first subarray are different from those of the second subarray, for example, the exposure time of the first subarray is 0.1ms, and the exposure time of the second subarray is 33 ms.
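As an illustration of why different shooting parameters per sub-array are useful, the sketch below (Python; the function name, the 12-bit saturation code, and the per-pixel list representation are assumptions, not taken from the patent) fuses a short-exposure and a long-exposure sub-array image in the simplest possible way: use the long-exposure sample unless it is clipped, otherwise substitute the short-exposure sample scaled by the exposure ratio.

```python
def fuse_exposures(short_img, long_img, t_short=0.1, t_long=33.0, sat=4095.0):
    """Toy radiance-domain fusion of the two sub-array images.

    short_img / long_img: per-pixel raw values captured with exposure
    times t_short / t_long (ms, matching the example in the text).
    Where the long exposure is saturated (>= sat, assuming 12-bit raw
    codes), the short-exposure value is scaled up by the exposure ratio
    so that all fused pixels share a common radiometric scale.
    """
    gain = t_long / t_short
    return [l if l < sat else s * gain for s, l in zip(short_img, long_img)]
```

Because the two sub-arrays are interleaved on one sensor and captured in one frame, this kind of fusion extends dynamic range without the motion artifacts of sequential multi-exposure capture; the small stagger distance keeps the residual parallax between the two images negligible.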
When the image sensor works in the spatial multiplexing mode, the stagger distance between the first sub-array and the second sub-array is small, so fusing the first image data and the second image data introduces little image distortion.
In a dark environment, outputting images in the pixel synthesis mode increases the signal-to-noise ratio of the image and improves imaging quality. In addition, when the image sensor captures video, outputting video in the pixel synthesis mode reduces the data amount and the computational load on the terminal processing device.
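The SNR benefit of summing a pixel pair can be quantified with a simple noise model (Python). The shot-plus-read-noise model and the assumption that the pair is summed in the analog domain before a single readout are illustrative assumptions, not statements from the patent:

```python
import math

def snr_single(signal_e, read_noise_e):
    """SNR of one pixel: shot noise sqrt(S) plus read noise, in electrons."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

def snr_pair_binned(signal_e, read_noise_e):
    """SNR of a two-pixel pair summed before readout: the signal doubles
    and shot noise adds in quadrature, but only one read-noise
    contribution is incurred (analog-binning assumption)."""
    return 2 * signal_e / math.sqrt(2 * signal_e + read_noise_e ** 2)

# In the shot-noise-limited (bright) regime the gain approaches sqrt(2);
# in the read-noise-limited (dark) regime it approaches 2, which is why
# binning helps most in low light.  Electron counts are illustrative.
gain_bright = snr_pair_binned(10000, 2) / snr_single(10000, 2)
gain_dark = snr_pair_binned(0.1, 2) / snr_single(0.1, 2)
```

If instead the two column signals were read out separately and summed digitally, two read-noise contributions would add in quadrature and the dark-scene gain would fall back to sqrt(2), which is one reason to preprocess in the column signal processing circuit before conversion.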
Fig. 12 is a schematic flowchart of an imaging method provided in an embodiment of the present application, applied to a control module in the image sensor in the embodiment shown in fig. 9 or fig. 10, where the method includes the following steps:
Step S1201: select the triangular pixel units in a corresponding row of the pixel array, and transmit the column signals output by the selected triangular pixel units to the processing module.
Step S1202: control the processing module to process the received column signals to generate image data.
In the embodiment of the present application, the shooting parameters include, but are not limited to, exposure time, gain factor, operation timing, and the like. The shooting parameters of the triangular pixel units may be the same or different, and the embodiment of the present application is not particularly limited.
Referring to fig. 9, the control module 300 selects triangular pixel units and controls the readout circuits in the selected triangular pixel units to read and output the corresponding electrical signals, obtaining column signals that are transmitted to the processing module 400; the control module 300 then controls the processing module 400 to process the column signals.
As an embodiment of the present application, one possible implementation manner of step S1202 is:
acquiring shooting mode information;
if the shooting mode information is a first mode, controlling the processing module to add signals belonging to the same pixel pair in the column signals to generate a sum signal so that the processing module generates image data according to the sum signal, wherein the pixel pair is formed by a triangular pixel unit in a first sub array and a triangular pixel unit in a second sub array adjacent to the triangular pixel unit in the first sub array in the first direction;
if the shooting mode information is the second mode, the control processing module processes a first signal belonging to the first subarray in the column signals to generate first image data, and processes a second signal belonging to the second subarray in the column signals to generate second image data.
The imaging method in the embodiment of the present application may be an imaging method in which the image sensor operates in a spatial multiplexing mode, or an imaging method in which the image sensor operates in a synthesis mode. The first mode may be a synthesis mode and the second mode may be a spatial multiplexing mode. When the imaging method employs the image sensor shown in fig. 10, it is determined whether or not the column signal is summed by the column signal processing circuit 401, based on the shooting mode information. If the shooting mode information is the first mode, the control circuit 301 controls the column signal processing circuit 401 to add the column signals; if the shooting mode information is the second mode, the control circuit 301 controls the column signal processing circuit 401 to transmit the column signal to the processing unit 403.
The specific implementation manner of step S1202 may refer to the description of the embodiment shown in fig. 10, and this embodiment is not described again.
Fig. 13 is a schematic structural diagram of a terminal Device according to an embodiment of the present disclosure, and as shown in fig. 13, the terminal Device may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a vehicle-mounted terminal (e.g., a car navigation terminal), and a fixed terminal such as a digital TV, a desktop computer, and the like. The terminal device shown in fig. 13 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 13, the terminal device may include a processing device (e.g., a central processing device, a graphics processing device, etc.) 1301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1302 or a program loaded from a storage device 1308 into a Random Access Memory (RAM) 1303. In the RAM 1303, various programs and data necessary for the operation of the terminal device are also stored. The processing device 1301, the ROM 1302, and the RAM 1303 are connected to each other via a bus 1304. An input/output (I/O) interface 1305 is also connected to bus 1304.
Generally, the following devices may be connected to the I/O interface 1305: input devices 1306 including, for example, touch screens, touch pads, keyboards, mice, cameras, microphones, accelerometers, gyroscopes, and the like; a display panel 1307 including, for example, a Liquid Crystal Display (LCD), an Organic Light Emitting Display (OLED), and the like; storage devices 1308 including, for example, magnetic tape, hard disks, and the like; a communication device 1309; and an image sensor 1310. The communication device 1309 may allow the terminal device to communicate wirelessly or by wire with other devices to exchange data. While fig. 13 illustrates a terminal device having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided; more or fewer devices may alternatively be implemented or provided.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An image sensor, comprising:
a pixel array including a plurality of triangular pixel units, each triangular pixel unit including a photoelectric conversion element and a readout circuit connected to the photoelectric conversion element;
the shape of each triangular pixel unit is an isosceles triangle, the length of the bottom edge of the isosceles triangle is equal to the height on the bottom edge, and the triangular pixel units are arranged alternately upright and inverted in a staggered, contiguous manner;
the photoelectric conversion element is used for receiving an incident light signal and converting the incident light signal into an electric signal;
the readout circuit is configured to read and output the electrical signal, which is used to generate image data.
2. The image sensor of claim 1,
in the first direction, the waists of two adjacent triangular pixel units are overlapped;
in the second direction, the symmetry axes of two adjacent triangular pixel units are on the same straight line, and the distance between the bottom edges of the two adjacent triangular pixel units is equal to the length of the height on the bottom edge;
the first direction is orthogonal to the second direction.
3. The image sensor of claim 1,
in the first direction, the waists of two adjacent triangular pixel units are overlapped;
in the second direction, the symmetry axes of two adjacent triangular pixel units are on the same straight line, and the bottom sides of the two adjacent triangular pixel units are overlapped, or the vertexes corresponding to the bottom sides of the two adjacent triangular pixel units are overlapped;
the first direction is orthogonal to the second direction.
4. The image sensor according to any one of claims 1 to 3,
further comprising: the color filter array is arranged on one side, facing the incident light signal, of the pixel array; the color filter array comprises a plurality of color filters, each color filter corresponds to one triangular pixel unit and is used for filtering the received incident light signal so as to output light signals with different light colors to the triangular pixel units.
5. The image sensor of claim 4,
the pixel array comprises a first sub-array and a second sub-array; in the first direction, the triangular pixel units of the first sub-array and the triangular pixel units of the second sub-array are arranged in a staggered mode;
the arrangement mode of the color filter corresponding to the triangular pixel unit of the first sub-array is the same as that of the color filter array corresponding to the triangular pixel unit of the second sub-array.
6. The image sensor of claim 4, wherein a lens is disposed on each color filter of the color filter array facing the incident light signal.
7. The image sensor of claim 5, further comprising: the device comprises a control module and a processing module; the control module is electrically connected with the processing module, and the control module and the processing module are respectively electrically connected with the pixel array;
the control module is used for selecting the triangular pixel units in the corresponding row in the pixel array and transmitting the column signals output by the selected triangular pixel units to the processing module;
the control module is further configured to control the processing module to process the received column signals to generate image data.
8. The image sensor of claim 7, wherein the control module comprises a control circuit, a row select circuit; the processing module comprises a column signal processing circuit, a column selection circuit and a processing unit;
the control circuit is respectively electrically connected with a first end of the row selection circuit, a first end of the column selection circuit and a first end of the column signal processing circuit, a second end of the row selection circuit is electrically connected with the pixel array, a second end of the column signal processing circuit is electrically connected with the pixel array, a third end of the column signal processing circuit is electrically connected with a second end of the column selection circuit, and a fourth end of the column signal processing circuit is electrically connected with the processing unit;
the control circuit is used for controlling the row selection circuit to select the triangular pixel units in the corresponding row in the pixel array and transmitting the column signals output by the selected triangular pixel units to the column signal processing circuit;
the control circuit is also used for acquiring shooting mode information;
the control circuit is further configured to control the column selection circuit to select a column signal belonging to the same pixel pair in the column signals and control the column signal processing circuit to pre-process the column signal belonging to the same pixel pair if the shooting mode information is in a first mode, where the pixel pair is formed by a triangular pixel unit in the first sub-array and a triangular pixel unit in a second sub-array adjacent to the triangular pixel unit in the first sub-array in a first direction;
the processing unit is used for processing the preprocessed column signals to generate image data.
9. The image sensor of claim 8,
the control circuit is further configured to control the column signal processing circuit to transmit the column signal to the processing unit if the shooting mode information is a second mode;
the processing unit is used for processing a first signal belonging to a first sub array in the column signals to generate first image data, and processing a second signal belonging to a second sub array in the column signals to generate second image data.
10. The image sensor according to any one of claims 1 to 3 or 5 to 9, wherein the image sensor is a back-illuminated CMOS image sensor or a stacked CMOS image sensor.
11. A terminal device, characterized in that it comprises storage means, processing means and an image sensor according to any one of claims 1 to 10.
12. An imaging method, applied to the control module of the image sensor according to any one of claims 7 to 9, the method comprising:
selecting the triangular pixel units in a corresponding row of the pixel array, and transmitting the column signals output by the selected triangular pixel units to a processing module; and
controlling the processing module to process the received column signals to generate image data.
13. The method according to claim 12, wherein controlling the processing module to process the received column signals to generate the image data comprises:
acquiring shooting mode information;
if the shooting mode information indicates a first mode, controlling the processing module to add the column signals belonging to a same pixel pair among the column signals to generate a summed signal, so that the processing module generates the image data according to the summed signal, wherein the pixel pair is formed by a triangular pixel unit in the first sub-array and a triangular pixel unit in the second sub-array that is adjacent to it in a first direction; and
if the shooting mode information indicates a second mode, controlling the processing module to process first signals, belonging to the first sub-array, among the column signals to generate first image data, and to process second signals, belonging to the second sub-array, among the column signals to generate second image data.
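The two readout modes of claims 12 and 13 can be sketched as follows. This is a simplified numerical model, not the patent's circuit implementation: the array shapes, the mode labels `"first"`/`"second"`, and the assumption that each first-sub-array signal is paired element-wise with the adjacent second-sub-array signal along the first direction are all illustrative assumptions.

```python
import numpy as np

def read_out(signals_first_sub, signals_second_sub, mode):
    """Model the mode-dependent processing of the column signals.

    signals_first_sub:  column signals from triangular pixel units of
                        the first sub-array.
    signals_second_sub: column signals from the adjacent triangular pixel
                        units of the second sub-array, aligned so that
                        element (i, j) of each array forms one pixel pair.
    """
    if mode == "first":
        # Binning mode: sum the two signals of each pixel pair into one
        # output value, producing a single image at pair resolution.
        return {"image": signals_first_sub + signals_second_sub}
    elif mode == "second":
        # Full-resolution mode: each sub-array yields its own image.
        return {"first_image": signals_first_sub,
                "second_image": signals_second_sub}
    raise ValueError(f"unknown shooting mode: {mode}")

# Usage: a 2x4 block of pixel pairs with synthetic signal values.
a = np.arange(8).reshape(2, 4)
b = np.ones((2, 4), dtype=int)
binned = read_out(a, b, "first")["image"]   # element-wise pair sums
full = read_out(a, b, "second")             # two separate sub-array images
```

In the first mode the pair summing trades spatial resolution for signal level (each output carries the charge of two photosensitive units), while the second mode preserves both sub-array images separately, which matches the first-mode/second-mode split in the claims.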
CN202010675565.5A 2020-07-14 2020-07-14 Image sensor, terminal device and imaging method Active CN111787248B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010675565.5A CN111787248B (en) 2020-07-14 2020-07-14 Image sensor, terminal device and imaging method

Publications (2)

Publication Number Publication Date
CN111787248A true CN111787248A (en) 2020-10-16
CN111787248B CN111787248B (en) 2023-05-02

Family

ID=72768619

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010675565.5A Active CN111787248B (en) 2020-07-14 2020-07-14 Image sensor, terminal device and imaging method

Country Status (1)

Country Link
CN (1) CN111787248B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023056585A1 (en) * 2021-10-08 2023-04-13 Huawei Technologies Co., Ltd. Detection system, terminal device, control detection method, and control apparatus
WO2023098639A1 (en) * 2021-11-30 2023-06-08 Vivo Mobile Communication Co., Ltd. Image sensor, camera module and electronic device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040109099A1 (en) * 2002-12-03 2004-06-10 Lee Chang-Hun Thin film transistor array panel for liquid crystal display
US20110242374A1 (en) * 2010-04-06 2011-10-06 Omnivision Technologies, Inc. Imager with variable area color filter array and pixel elements
CN110113546A (en) * 2018-06-06 2019-08-09 SmartSens Technology (Cayman) Co., Ltd. Imaging system and combining and readout method for adjacent pixel units in a pixel array
CN110379824A (en) * 2019-07-08 2019-10-25 Guangdong Oppo Mobile Telecommunications Corp., Ltd. CMOS image sensor, image processing method, and storage medium
CN210073853U (en) * 2019-07-15 2020-02-14 Yungu (Gu'an) Technology Co., Ltd. Pixel arrangement structure, display panel and display device

Also Published As

Publication number Publication date
CN111787248B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
JP6651478B2 (en) Pixel binning in image sensors
CN212752389U (en) Image sensor and electronic device
TWI543614B (en) Image sensor with flexible pixel summing
US20170013217A1 (en) Image sensor having wide dynamic range, pixel circuit of the image sensor, and operating method of the image sensor
US7663685B2 (en) Hybrid solid-state image pickup element and image pickup apparatus using the same
KR20180128802A (en) Pixel circuit and image sensor including thereof
US20090027530A1 (en) Solid-state image pick-up device
CN108462841A (en) Pel array and imaging sensor
CN111741242A (en) Image sensor and method of operating the same
CN109075179A (en) Solid-state imaging element and electronic equipment
CN111787248B (en) Image sensor, terminal device and imaging method
US20150029355A1 (en) Image sensors and imaging devices including the same
CN113286067B (en) Image sensor, image pickup apparatus, electronic device, and imaging method
US20220336508A1 (en) Image sensor, camera assembly and mobile terminal
CN111741239B (en) Image sensor and electronic device
CN113747022A (en) Image sensor, camera assembly and mobile terminal
JPWO2018062303A1 (en) Image sensor and electronic camera
CN107251544B (en) Solid-state imaging device, driving method, and electronic apparatus
CN111818283A (en) Image sensor, electronic device and imaging method of triangular pixels
CN217765246U (en) Short wave infrared detector
KR100585118B1 (en) Solid state image sensing device providing sub-sampling mode improving dynamic range and driving method thereof
CN111835971B (en) Image processing method, image processing system, electronic device, and readable storage medium
CN216852142U (en) Pixel array and image sensor
TWI795895B (en) Solid-state imaging device, solid-state imaging device manufacturing method, and electronic apparatus
US20230217119A1 (en) Image sensor and image processing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant