WO2023234069A1 - Imaging device and electronic apparatus - Google Patents

Imaging device and electronic apparatus

Info

Publication number
WO2023234069A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
imaging device
unit
section
semiconductor substrate
Prior art date
Application number
PCT/JP2023/018728
Other languages
English (en)
Japanese (ja)
Inventor
水輝 西田
Original Assignee
Sony Semiconductor Solutions Corporation (ソニーセミコンダクタソリューションズ株式会社)
Priority date
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2023234069A1

Classifications

    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01L: SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00: Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14: Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144: Devices controlled by radiation
    • H01L 27/146: Imager structures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70: SSIS architectures; Circuits associated therewith
    • H04N 25/703: SSIS architectures incorporating pixels for producing signals other than image signals

Definitions

  • The present disclosure relates to, for example, an imaging device capable of acquiring imaging information and parallax information, and an electronic device equipped with the same.
  • Patent Document 1 discloses a solid-state imaging device in which, in a pixel array section where a plurality of pixels, including pixels in which a plurality of photoelectric conversion elements are formed for one on-chip lens, are arranged two-dimensionally, at least a part of the inter-pixel separation section and the inter-pixel light-shielding section formed between the pixels is made to protrude toward the center of the pixel, thereby improving the accuracy of phase difference detection while suppressing deterioration of captured images.
  • An imaging device according to an embodiment of the present disclosure includes a pixel array section in which a plurality of unit pixels are arranged in a matrix, and a first pixel and a second pixel arranged at equal distances from the optical center of the pixel array section in a first direction and in a second direction orthogonal to the first direction, respectively. When the first direction is the phase difference acquisition direction, the pupil correction amount A in the first pixel and the pupil correction amount B in the second pixel have the relationship A > B.
  • An electronic device as an embodiment of the present disclosure includes the imaging device of the embodiment of the present disclosure.
  • In the imaging device and the electronic device according to the embodiments of the present disclosure, a first pixel and a second pixel are arranged at equal distances from the optical center of a pixel array section, in which a plurality of unit pixels are arranged in a matrix, in a first direction and in a second direction orthogonal to the first direction, respectively. With the first direction as the phase difference acquisition direction, the relationship A > B is established between the pupil correction amount A in the first pixel and the pupil correction amount B in the second pixel. This reduces color mixture in the second direction, which is not the phase difference acquisition direction.
  • FIG. 1 is a schematic diagram illustrating the pupil correction amount in a pixel array section of an imaging device according to an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing the overall configuration of the imaging device shown in FIG. 1.
  • FIG. 3 is an equivalent circuit diagram of the unit pixel shown in FIG. 1.
  • FIG. 4 is a schematic cross-sectional view showing an example of the configuration of an imaging device according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic plan view showing an example of the configuration of the imaging device shown in FIG. 4.
  • FIG. 6A is a schematic cross-sectional view showing an example of the configuration of the unit pixel PA shown in FIG. 1.
  • FIG. 6B is a schematic cross-sectional view showing an example of the configuration of the unit pixel PB shown in FIG. 1.
  • FIG. 7A is a schematic plan view showing a light condensing position in the unit pixel P1 shown in FIG. 1.
  • FIG. 7B is a schematic plan view showing a light condensing position in the unit pixel P2 shown in FIG. 1.
  • FIG. 7C is a schematic plan view showing a light condensing position in the unit pixel P3 shown in FIG. 1.
  • FIG. 7D is a schematic plan view showing a light condensing position in the unit pixel P4 shown in FIG. 1.
  • FIG. 8 is a characteristic diagram showing the relationship between the principal ray incident angle and the outputs of the right pixel and the left pixel when pupil correction is performed.
  • FIG. 9 is a schematic plan view showing an example of the configuration of a unit pixel of an imaging device according to Modification 1 of the present disclosure.
  • FIG. 10 is a schematic cross-sectional view showing an example of the configuration of an imaging device according to Modification 2 of the present disclosure.
  • FIG. 11 is a schematic cross-sectional view showing another example of the configuration of an imaging device according to Modification 2 of the present disclosure.
  • FIG. 12 is a schematic cross-sectional view showing another example of the configuration of an imaging device according to Modification 2 of the present disclosure.
  • FIG. 13 is a schematic diagram illustrating the pupil correction amount in the pixel array section of an imaging device according to Modification 3 of the present disclosure.
  • FIG. 14 is a schematic plan view showing an example of the configuration of the imaging device shown in FIG. 13.
  • FIG. 15 is a schematic plan view showing an example of the configuration of a unit pixel of an imaging device according to Modification 4 of the present disclosure.
  • FIG. 16 is a block diagram showing a configuration example of an electronic device including the imaging device shown in FIG. 1 and the like.
  • FIG. 17 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 18 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • FIG. 19 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
  • FIG. 20 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
  • FIG. 1 illustrates the amount of pupil correction in a pixel array section 100A of an imaging device (imaging device 1) according to an embodiment of the present disclosure.
  • FIG. 2 shows an example of the overall configuration of the imaging device 1 shown in FIG. 1.
  • The imaging device 1 is, for example, a CMOS (Complementary Metal Oxide Semiconductor) image sensor used in electronic devices such as digital still cameras and video cameras, and has, as an imaging area, a pixel array section 100A in which a plurality of pixels are arranged two-dimensionally in a matrix.
  • the imaging device 1 is, for example, a so-called back-illuminated imaging device, such as a CMOS image sensor.
  • the imaging device 1 includes a pixel array section 100A in which a plurality of pixels (unit pixels P) capable of simultaneously acquiring imaging information and parallax information are arranged in a matrix.
  • In the imaging device 1, a pixel PA and a pixel PB are arranged at equal distances from the optical center O of the pixel array section 100A in the X-axis direction and the Y-axis direction, which are perpendicular to each other. When the X-axis direction is the phase difference acquisition direction, the pupil correction amount A at the pixel PA and the pupil correction amount B at the pixel PB have the relationship A > B.
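  • As an illustrative aside (a minimal sketch under assumed names and coefficients, not taken from this disclosure), the direction-dependent pupil correction described above can be modeled by scaling the correction independently along the two axes, which turns the equal-correction contours into non-concentric contours like C2 in FIG. 1:

```python
import math

def pupil_correction_amount(x, y, a_coeff, b_coeff):
    """Hypothetical model of a direction-dependent pupil correction.

    (x, y): pixel position relative to the optical center O.
    a_coeff: correction per unit distance along the phase difference
             acquisition direction (X-axis direction).
    b_coeff: correction per unit distance along the Y-axis direction,
             chosen smaller than a_coeff so that A > B holds for pixels
             at equal distances from O.
    Returns the shift of the on-chip lens / color filter center toward O.
    """
    # Scaling each axis independently makes the equal-correction contours
    # elliptical (like contour C2), unlike the concentric incident angle
    # contours (contour C1).
    return math.hypot(a_coeff * x, b_coeff * y)

# A pixel PA at distance d along the X axis receives a larger correction
# than a pixel PB at the same distance d along the Y axis:
d = 1000.0  # distance from the optical center, in arbitrary units
A = pupil_correction_amount(d, 0.0, a_coeff=0.02, b_coeff=0.01)
B = pupil_correction_amount(0.0, d, a_coeff=0.02, b_coeff=0.01)
assert A > B
```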
  • The imaging device 1 takes in incident light (image light) from a subject through an optical lens system (not shown), converts the amount of incident light formed into an image on the imaging surface into an electrical signal for each unit pixel P, and outputs it as a pixel signal.
  • The imaging device 1 has the pixel array section 100A as an imaging area on a semiconductor substrate 11 and includes, in the peripheral area of the pixel array section 100A, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and an input/output terminal 116.
  • a plurality of unit pixels P are two-dimensionally arranged in a matrix.
  • Each of the plurality of unit pixels P serves as an imaging pixel and an image plane phase difference pixel.
  • the imaging pixel photoelectrically converts a subject image formed by an imaging lens in a photodiode PD to generate a signal for image generation.
  • the image plane phase difference pixel divides the pupil region of the imaging lens, photoelectrically converts a subject image from the divided pupil region, and generates a signal for phase difference detection.
  • a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
  • the pixel drive line Lread transmits a drive signal for reading signals from pixels.
  • One end of the pixel drive line Lread is connected to an output end corresponding to each row of the vertical drive circuit 111.
  • the vertical drive circuit 111 is composed of a shift register, an address decoder, etc., and is a pixel drive section that drives each unit pixel P of the pixel array section 100A, for example, row by row. Signals output from each unit pixel P in the pixel row selectively scanned by the vertical drive circuit 111 are supplied to the column signal processing circuit 112 through each vertical signal line Lsig.
  • the column signal processing circuit 112 includes an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
  • The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives the horizontal selection switches of the column signal processing circuit 112 while scanning them. Through this selective scanning by the horizontal drive circuit 113, the signals of each pixel transmitted through the vertical signal lines Lsig are sequentially output to a horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 11 through the horizontal signal line 121.
  • the output circuit 114 performs signal processing on the signals sequentially supplied from each of the column signal processing circuits 112 via the horizontal signal line 121 and outputs the processed signals.
  • the output circuit 114 may perform only buffering, or may perform black level adjustment, column variation correction, various digital signal processing, and the like.
  • The circuit portion consisting of the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 11 or may be arranged in an external control IC. Those circuit portions may also be formed on another board connected by a cable or the like.
  • the control circuit 115 receives a clock applied from outside the semiconductor substrate 11, data instructing an operation mode, etc., and outputs data such as internal information of the imaging device 1.
  • the control circuit 115 further includes a timing generator that generates various timing signals, and controls the vertical drive circuit 111, column signal processing circuit 112, horizontal drive circuit 113, etc. based on the various timing signals generated by the timing generator. Performs drive control of peripheral circuits.
  • the input/output terminal 116 is for exchanging signals with the outside.
  • FIG. 3 shows an example of a readout circuit for the unit pixel P of the imaging device 1 shown in FIG. 2.
  • The unit pixel P includes two photoelectric conversion sections 12A and 12B, transfer transistors TR1 and TR2, a floating diffusion FD, a reset transistor RST, an amplification transistor AMP, and a selection transistor SEL.
  • the photoelectric conversion units 12A and 12B are each photodiodes (PD).
  • the photoelectric conversion unit 12A has an anode connected to a ground voltage line and a cathode connected to the source of the transfer transistor TR1.
  • the photoelectric conversion section 12B has an anode connected to the ground voltage line and a cathode connected to the source of the transfer transistor TR2.
  • the transfer transistor TR1 is connected between the photoelectric conversion section 12A and the floating diffusion FD.
  • Transfer transistor TR2 is connected between photoelectric conversion section 12B and floating diffusion FD.
  • A drive signal TRsig is applied to the gate electrodes of the transfer transistors TR1 and TR2.
  • When the drive signal TRsig becomes active, the transfer gates of the transfer transistors TR1 and TR2 become conductive, and the signal charges accumulated in the photoelectric conversion sections 12A and 12B are transferred to the floating diffusion FD through the transfer transistors TR1 and TR2.
  • the floating diffusion FD is connected between the transfer transistors TR1, TR2 and the amplification transistor AMP.
  • The floating diffusion FD converts the signal charges transferred by the transfer transistors TR1 and TR2 into a voltage signal and outputs the voltage signal to the amplification transistor AMP.
  • the reset transistor RST is connected between the floating diffusion FD and the power supply section.
  • a drive signal RSTsig is applied to the gate electrode of the reset transistor RST.
  • When this drive signal RSTsig becomes active, the reset gate of the reset transistor RST becomes conductive, and the potential of the floating diffusion FD is reset to the level of the power supply section.
  • the amplification transistor AMP has its gate electrode connected to the floating diffusion FD and its drain electrode connected to the power supply section, and serves as an input section of a so-called source follower circuit, which is a readout circuit for the voltage signal held by the floating diffusion FD. That is, the amplification transistor AMP has its source electrode connected to the vertical signal line Lsig via the selection transistor SEL, thereby forming a source follower circuit with a constant current source connected to one end of the vertical signal line Lsig.
  • the selection transistor SEL is connected between the source electrode of the amplification transistor AMP and the vertical signal line Lsig.
  • a drive signal SELsig is applied to the gate electrode of the selection transistor SEL.
  • When the drive signal SELsig becomes active, the selection transistor SEL becomes conductive, and the unit pixel P enters a selected state.
  • the read signal (pixel signal) output from the amplification transistor AMP is output to the vertical signal line Lsig via the selection transistor SEL.
  • For phase difference detection, the signal charges generated in the photoelectric conversion section 12A and the signal charges generated in the photoelectric conversion section 12B are read out separately.
  • For imaging, the signal charges read from the photoelectric conversion section 12A and the photoelectric conversion section 12B are added together in the floating diffusion FD and output to, for example, an imaging block of an external signal processing section, so that a pixel signal based on the total charge of the photoelectric conversion sections 12A and 12B can be obtained.
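  • As an illustrative aside (the class and method names below are invented; this is a toy model, not the circuit of this disclosure), the two readout modes can be summarized as follows: reading the two photodiodes separately yields a signal pair for phase difference detection, while summing their charges on the floating diffusion yields one imaging signal:

```python
class TwoPDPixel:
    """Toy model of a unit pixel P whose photoelectric conversion
    sections 12A and 12B share one floating diffusion FD."""

    def __init__(self, charge_a, charge_b, conversion_gain=1.0):
        self.charge_a = charge_a     # signal charge accumulated in 12A
        self.charge_b = charge_b     # signal charge accumulated in 12B
        self.gain = conversion_gain  # FD charge-to-voltage conversion gain

    def read_phase_pair(self):
        # Activating TR1 and TR2 at different times converts each charge
        # packet on the FD separately: a left/right pair for phase
        # difference detection (parallax information).
        return self.gain * self.charge_a, self.gain * self.charge_b

    def read_imaging_signal(self):
        # Activating TR1 and TR2 together adds both charge packets on the
        # FD: one pixel signal based on the total charge (imaging
        # information).
        return self.gain * (self.charge_a + self.charge_b)

pixel = TwoPDPixel(charge_a=120.0, charge_b=100.0)
left, right = pixel.read_phase_pair()
total = pixel.read_imaging_signal()
```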
  • FIG. 4 schematically shows an example of a cross-sectional configuration of the imaging device 1.
  • FIG. 5 schematically shows an example of the planar configuration of the imaging device 1, and FIG. 4 shows a cross section corresponding to the line I-I shown in FIG. 5.
  • the imaging device 1 is, for example, a back-illuminated imaging device.
  • Each of the plurality of unit pixels P two-dimensionally arranged in a matrix in the pixel array section 100A has a structure in which, for example, a light receiving section 10, a light collecting section 20 provided on the light incident side S1 of the light receiving section 10, and a multilayer wiring layer 30 provided on the side of the light receiving section 10 opposite to the light incident side S1 are stacked.
  • the light receiving section 10 includes a semiconductor substrate 11 having a first surface 11S1 and a second surface 11S2 facing each other, and a plurality of photoelectric conversion sections 12 embedded in the semiconductor substrate 11.
  • the semiconductor substrate 11 is made of, for example, a silicon substrate.
  • the photoelectric conversion unit 12 is, for example, a PIN (Positive Intrinsic Negative) type photodiode (PD), and has a pn junction in a predetermined region of the semiconductor substrate 11.
  • A plurality of photoelectric conversion sections 12, for example two (photoelectric conversion sections 12A and 12B), are embedded in each unit pixel P.
  • the light receiving section 10 further includes an inter-pixel separation section 13 and an intra-pixel separation section 14.
  • the inter-pixel separation section 13 is provided between adjacent unit pixels P.
  • The inter-pixel separation section 13 is provided around the unit pixel P in the in-plane direction (XY plane direction) of the semiconductor substrate 11 and is arranged in a grid pattern in the pixel array section 100A, for example, as shown in FIG. 5.
  • the inter-pixel separation section 13 is for electrically and optically separating adjacent unit pixels P.
  • The inter-pixel separation section 13 extends, for example, from the first surface 11S1 side of the semiconductor substrate 11 toward the second surface 11S2 side, and has an FTI (Full Trench Isolation) structure that penetrates between the first surface 11S1 and the second surface 11S2.
  • The inter-pixel separation section 13 can be formed using, for example, an oxide film such as a silicon oxide (SiOx) film.
  • In the inter-pixel separation section 13, a conductive light-shielding film may be embedded in the oxide film, or an air gap may be formed.
  • Examples of the light-shielding film include a single-layer film or a laminated film of tungsten (W), silver (Ag), copper (Cu), aluminum (Al), or an alloy of Al and Cu.
  • the intra-pixel separation section 14 is provided between adjacent photoelectric conversion sections 12 (for example, the photoelectric conversion section 12A and the photoelectric conversion section 12B) within the unit pixel P.
  • the intra-pixel separation section 14 is for electrically separating adjacent photoelectric conversion sections 12.
  • As shown in FIG. 5, for example, the intra-pixel separation section 14 is provided in the in-plane direction (XY plane direction) of the semiconductor substrate 11 so as to divide the unit pixel P in the phase difference acquisition direction (X-axis direction); it extends toward the center of the unit pixel P from each of a pair of sides of the inter-pixel separation section 13 surrounding the unit pixel P that face each other in the Y-axis direction, with a gap between the two extending portions.
  • In the thickness direction, the intra-pixel separation section 14 extends, for example, from the second surface 11S2 of the semiconductor substrate 11 toward the first surface 11S1 and, like the inter-pixel separation section 13, has an FTI structure penetrating between the first surface 11S1 and the second surface 11S2 of the semiconductor substrate 11.
  • The intra-pixel separation section 14 can likewise be formed using an oxide film such as a silicon oxide (SiOx) film, for example.
  • As in the inter-pixel separation section 13, a conductive light-shielding film may be embedded in the oxide film, or an air gap may be formed.
  • Examples of the light-shielding film include a single-layer film or a laminated film of tungsten (W), silver (Ag), copper (Cu), aluminum (Al), or an alloy of Al and Cu.
  • A fixed charge layer 15, which also serves to prevent reflection at the first surface 11S1 of the semiconductor substrate 11, is further provided on the first surface 11S1.
  • the fixed charge layer 15 may be a film having a positive fixed charge or a film having a negative fixed charge.
  • Examples of the constituent material of the fixed charge layer 15 include a semiconductor material or a conductive material having a band gap wider than that of the semiconductor substrate 11.
  • The light collecting section 20 is provided on the light incident side S1 of the light receiving section 10 and has a color filter 21 that selectively transmits, for example, red light (R), green light (G), or blue light (B) for each unit pixel P, a light shielding section 22 provided between the unit pixels P of the color filter 21, and a lens layer 23, which are stacked in this order from the light receiving section 10 side.
  • In the color filter 21, for example, for four unit pixels P arranged in 2 rows x 2 columns, two color filters 21G that selectively transmit green light (G) are arranged on one diagonal, and a color filter 21R that selectively transmits red light (R) and a color filter 21B that selectively transmits blue light (B) are arranged one each on the orthogonal diagonal (see, for example, the lower part of FIG. 5).
  • In the unit pixels P provided with the color filters 21R, 21G, and 21B, the corresponding colored light is detected in each photoelectric conversion section 12. That is, in the pixel array section 100A, unit pixels P that detect red light (R), green light (G), and blue light (B) are arranged in a Bayer pattern, as sketched below.
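  • For reference, a minimal sketch of the 2 x 2 Bayer unit described above (pure illustration; the array simply records which color filter covers which unit pixel):

```python
# One 2 x 2 Bayer unit: two G filters on one diagonal, R and B on the
# orthogonal diagonal.
BAYER_UNIT = [
    ["G", "R"],
    ["B", "G"],
]

def filter_color(row, col):
    """Color filter (21R / 21G / 21B) over the unit pixel P at (row, col)."""
    return BAYER_UNIT[row % 2][col % 2]
```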
  • The light shielding section 22 prevents light obliquely incident on the color filter 21 from leaking into adjacent unit pixels P, and is provided between the unit pixels P of the color filter 21 as described above.
  • the light shielding sections 22 are provided in a grid pattern in the pixel array section 100A.
  • Examples of the material constituting the light shielding section 22 include conductive materials having light-shielding properties, specifically tungsten (W), silver (Ag), copper (Cu), aluminum (Al), and an alloy of Al and Cu.
  • the lens layer 23 is provided so as to cover the entire surface of the pixel array section 100A, and has, for example, a plurality of on-chip lenses 23L provided in a gapless manner on its surface.
  • the on-chip lens 23L is for condensing light incident from above onto the photoelectric conversion unit 12, and is provided for each unit pixel P, for example, as shown in FIGS. 4 and 5. That is, the on-chip lens 23L is provided across the plurality of photoelectric conversion sections 12 within the unit pixel P.
  • The lens layer 23 is made of, for example, an inorganic material such as silicon oxide (SiOx) or silicon nitride (SiNx).
  • the lens layer 23 may be formed using an organic material with a high refractive index such as an episulfide resin, a thietane compound, or a resin thereof.
  • the shape of the on-chip lens 23L is not particularly limited, and various lens shapes such as a hemispherical shape and a semicylindrical shape can be adopted.
  • A planarization layer may be provided between the color filter 21 and light shielding section 22 and the lens layer 23 to flatten the surface on the light incident side S1 formed by the color filter 21 and the light shielding section 22.
  • The planarization layer can be formed using, for example, silicon oxide (SiOx), silicon nitride (SiNx), silicon oxynitride (SiOxNy), or the like.
  • the multilayer wiring layer 30 is provided on the side opposite to the light incident side S1 of the light receiving section 10, specifically, on the second surface 11S2 side of the semiconductor substrate 11.
  • the multilayer wiring layer 30 has, for example, a structure in which a plurality of wiring layers 31, 32, and 33 are laminated with an interlayer insulating layer 34 interposed therebetween.
  • In the multilayer wiring layer 30, for example, the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the output circuit 114, the control circuit 115, the input/output terminal 116, and the like are formed.
  • the wiring layers 31, 32, and 33 are formed using, for example, aluminum (Al), copper (Cu), or tungsten (W).
  • the wiring layers 31, 32, and 33 may be formed using polysilicon (Poly-Si).
  • The interlayer insulating layer 34 is formed of, for example, a single-layer film made of one of silicon oxide (SiOx), TEOS, silicon nitride (SiNx), and silicon oxynitride (SiOxNy), or a laminated film made of two or more of these.
  • As described above, in the imaging device 1, a plurality of unit pixels P capable of simultaneously acquiring imaging information and parallax information are arranged in a matrix in the pixel array section 100A, with the X-axis direction configured as the phase difference acquisition direction.
  • In the imaging device 1, pupil correction is performed because the principal ray of the incident light from the optical lens is incident at a predetermined angle that depends on the design of the lens.
  • In the plurality of unit pixels P arranged in a matrix in the pixel array section 100A, even when the distance from the optical center O of the pixel array section 100A is the same (the incident light angle contour line C1 (concentric circles) shown in FIG. 1), the pupil correction amount differs depending on the direction (the pupil correction amount contour line C2 (non-concentric) shown in FIG. 1). This will be explained in detail below.
  • FIG. 6A shows an example of the cross-sectional configuration of the unit pixel PA, which is arranged at a distance d1 in the phase difference acquisition direction (X-axis direction) from the optical center O of the pixel array section 100A shown in FIG. 1.
  • FIG. 6B shows an example of the cross-sectional configuration of the unit pixel PB, which is arranged at a distance d2 in the Y-axis direction, perpendicular to the phase difference acquisition direction (X-axis direction), from the optical center O of the pixel array section 100A shown in FIG. 1.
  • FIGS. 7A to 7D show the light condensing positions F in the unit pixels P1, P2, P3, and P4 shown in FIG. 1, respectively.
  • Pupil correction can be performed, for example, using a member constituting the light condensing section 20 provided on the light incident side S1 with respect to the light receiving section 10. That is, as shown in FIGS. 6A and 6B, the centers of the color filter 21 and the on-chip lens 23L are shifted from the center of the unit pixel P toward the optical center O of the pixel array section 100A.
  • the amount of deviation (pupil correction amount) between the center position of the color filter 21 or on-chip lens 23L and the center position of the unit pixel P differs depending on the direction from the optical center O of the pixel array section 100A.
  • Specifically, when the pupil correction amount of the unit pixel PA placed at a distance d1 in the phase difference acquisition direction (X-axis direction) from the optical center O of the pixel array section 100A is the pupil correction amount A, and the pupil correction amount of the unit pixel PB placed at a distance d2 from the optical center O in the Y-axis direction perpendicular to the phase difference acquisition direction (X-axis direction) is the pupil correction amount B, the imaging device is configured so that A > B.
  • Both the pupil correction amount A in the phase difference acquisition direction (X-axis direction) and the pupil correction amount B in the Y-axis direction orthogonal to it increase with the distance from the optical center O of the pixel array section 100A, that is, toward the outer periphery of the pixel array section 100A.
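  • As a rough geometric aside (an illustrative assumption, not stated in this disclosure), the shift $s$ of the on-chip lens center used for pupil correction can be related to the chief ray angle $\theta(r)$ at image height $r$ and the height $h$ from the lens to the photodiode surface by $s \approx h \tan\theta(r)$; since $\theta(r)$ grows with $r$, this is consistent with the correction amounts A and B increasing toward the outer periphery, with the present design additionally scaling the shift differently along the X-axis and Y-axis directions.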
  • At the optical center O of the pixel array section 100A, the angle of incidence of the principal ray of the incident light from the optical lens is 0 degrees, so pupil correction is not necessary, and the center of the unit pixel P coincides with the centers of the color filter 21 and the on-chip lens 23L.
  • In this way, in the imaging device 1, a pixel PA and a pixel PB are arranged equidistantly from the optical center O of the pixel array section 100A in the X-axis direction and the Y-axis direction, which are perpendicular to each other, and the pupil correction amount A at the pixel PA and the pupil correction amount B at the pixel PB are set to have the relationship A > B. The reason is explained below.
  • In an imaging device of this type, each pixel has multiple photodiodes, and by having these multiple photodiodes share one on-chip lens, it is possible to simultaneously acquire imaging information and parallax information.
  • In general, for two pixels (photodiodes) that share an on-chip lens, pupil correction is performed so as to match the X-point, that is, the principal ray incident angle at which their respective outputs are equal.
  • In a general imaging device, pupil correction tailored to this X-point is performed in both the H direction and the V direction.
  • As a result, excessive correction is applied to pixels in a direction (e.g., the V direction) that is not the phase difference acquisition direction (e.g., the H direction), and there is a concern that color mixture may increase.
  • An increase in color mixture causes, for example, deterioration of characteristics in the V direction and HV direction and enlargement of GrGb level difference within the chip surface.
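  • As an illustrative aside (the response curves and helper function below are invented for illustration; they are not from this disclosure), the X-point can be located numerically as the crossing of the two output-versus-incident-angle curves:

```python
import numpy as np

def find_x_point(angles, left_out, right_out):
    """Estimate the principal ray incident angle at which the left and
    right photodiode outputs are equal (the "X-point")."""
    diff = np.asarray(left_out) - np.asarray(right_out)
    # First sample interval where the output difference changes sign.
    i = np.where(np.signbit(diff[:-1]) != np.signbit(diff[1:]))[0][0]
    # Linear interpolation inside that interval.
    t = diff[i] / (diff[i] - diff[i + 1])
    return angles[i] + t * (angles[i + 1] - angles[i])

angles = np.linspace(-20.0, 20.0, 41)                # incident angles (deg)
left = np.exp(-0.5 * ((angles - 4.0) / 10.0) ** 2)   # toy left-pixel output
right = np.exp(-0.5 * ((angles + 4.0) / 10.0) ** 2)  # toy right-pixel output
x_point = find_x_point(angles, left, right)          # ~0 degrees here
```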
  • In contrast, in this embodiment, the pupil correction amount of the unit pixel PA arranged at a distance d1 in the phase difference acquisition direction (X-axis direction) from the optical center O of the pixel array section 100A is defined as the pupil correction amount A, the pupil correction amount of the unit pixel PB arranged at a distance d2 in the Y-axis direction perpendicular to the phase difference acquisition direction (X-axis direction) from the optical center O is defined as the pupil correction amount B, and the relationship A > B is established.
  • Thereby, compared to a general imaging device that performs equivalent pupil correction in the H direction (X-axis direction) and the V direction (Y-axis direction) as described above, the imaging device 1 of this embodiment can improve characteristics such as the sensitivity difference between pixels of the same color and the GrGb level difference caused by color mixture.
  • Further, since the imaging device 1 of this embodiment is robust against manufacturing variations, it is possible to improve manufacturing yield.
  • FIG. 9 schematically represents an example of a planar configuration of a unit pixel P that constitutes an imaging device according to Modification 1 of the present disclosure.
  • In the above embodiment, the intra-pixel separation section 14 extends, in the in-plane direction (XY plane direction) of the semiconductor substrate 11, from each of a pair of sides of the inter-pixel separation section 13 surrounding the unit pixel P that face each other in the Y-axis direction.
  • However, the intra-pixel separation section 14 may instead be continuous between the pair of sides facing each other in the Y-axis direction so as to completely divide the unit pixel P in the phase difference acquisition direction (X-axis direction) in the in-plane direction (XY plane direction) of the semiconductor substrate 11.
  • In the above embodiment, the intra-pixel separation section 14 is formed using an insulating film, but the present disclosure is not limited to this.
  • the intra-pixel isolation section 14 can also be formed by, for example, an impurity diffusion layer in which p-type impurities are diffused.
  • the intra-pixel isolation section 14 may be formed using a non-doped polysilicon (Poly-Si) film or an amorphous silicon film.
  • In this modification, the intra-pixel separation section 14 is made continuous between a pair of sides facing each other in the Y-axis direction so as to completely divide the unit pixel P in the phase difference acquisition direction (X-axis direction), and it may be formed of a material other than an oxide film, such as an impurity diffusion layer. Even in such a configuration, the same effects as in the above embodiment can be obtained.
  • FIG. 10 schematically represents an example of a cross-sectional configuration of an imaging device (imaging device 1A) according to Modification 2 of the present disclosure.
  • FIG. 11 schematically shows another example of the cross-sectional configuration of an imaging device 1A according to Modification 2 of the present disclosure.
  • FIG. 12 schematically represents another example of the cross-sectional configuration of an imaging device 1A according to Modification 2 of the present disclosure.
  • the imaging device 1A is, for example, a CMOS image sensor used in electronic equipment such as a digital still camera or a video camera, and is, for example, a so-called back-illuminated imaging device, as in the above embodiment.
  • In the above embodiment, the inter-pixel separation section 13 and the intra-pixel separation section 14 both have an FTI structure penetrating between the first surface 11S1 and the second surface 11S2 of the semiconductor substrate 11, but the present disclosure is not limited to this.
  • For example, the inter-pixel separation section 13 and the intra-pixel separation section 14 may have an STI (Shallow Trench Isolation) structure that extends from the first surface 11S1 or the second surface 11S2 of the semiconductor substrate 11 toward the opposing surface, with its bottom terminating inside the semiconductor substrate 11.
  • Specifically, as shown in FIG. 10, for example, the inter-pixel separation section 13 and the intra-pixel separation section 14 may have an RDTI (Rear Deep Trench Isolation) structure extending from the first surface 11S1 side of the semiconductor substrate 11 toward the second surface 11S2.
  • Alternatively, the inter-pixel separation section 13 and the intra-pixel separation section 14 may have an FDTI (Front Deep Trench Isolation) structure extending from the second surface 11S2 side of the semiconductor substrate 11 toward the first surface 11S1.
  • Furthermore, the FTI structure and the STI structure may be combined; for example, the inter-pixel separation section 13 may have an FTI structure while the intra-pixel separation section 14 has an FDTI structure extending from the second surface 11S2 side toward the first surface 11S1.
  • In this modification, the inter-pixel separation section 13 and the intra-pixel separation section 14 have an STI structure such as an FDTI structure or an RDTI structure, or the FTI structure and the STI structure are combined. Even in such a configuration, the same effects as in the above embodiment can be obtained.
  • FIG. 13 explains the amount of pupil correction in the pixel array section 100A of the imaging device (imaging device 2) according to Modification 3 of the present disclosure.
  • FIG. 14 schematically shows an example of the planar configuration of the imaging device 2 shown in FIG. 13.
  • the imaging device 2 is, for example, a CMOS image sensor used in electronic equipment such as a digital still camera or a video camera, and is, for example, a so-called back-illuminated imaging device, as in the above embodiment.
  • In the imaging device 2, the Y-axis direction may be the phase difference acquisition direction. In that case, as shown in FIG. 13, the pupil correction amount of the unit pixel PA arranged at a distance d1 in the phase difference acquisition direction (Y-axis direction) from the optical center O of the pixel array section 100A is defined as the pupil correction amount A, the pupil correction amount of the unit pixel PB arranged at a distance d2 from the optical center O in the X-axis direction orthogonal to the phase difference acquisition direction (Y-axis direction) is defined as the pupil correction amount B, and the configuration is such that A > B.
  • As shown in FIG. 14, for example, the intra-pixel separation section 44 is provided in the in-plane direction (XY plane direction) of the semiconductor substrate 11 so as to divide the unit pixel P in the phase difference acquisition direction (Y-axis direction); it extends toward the center of the unit pixel P from each of a pair of sides of the inter-pixel separation section 13 surrounding the unit pixel P that face each other in the X-axis direction, with a gap therebetween.
  • The intra-pixel separation section 44 may instead be provided continuously between the pair of sides facing each other in the X-axis direction so as to completely divide the unit pixel P in the phase difference acquisition direction (Y-axis direction).
  • Further, the intra-pixel separation section 44 can be formed using, for example, an impurity diffusion layer in which p-type impurities are diffused, a non-doped polysilicon (Poly-Si) film, or an amorphous silicon film.
  • the structure is not limited to the FTI structure, but may be an STI structure such as an FDTI structure or an RDTI structure.
  • In this modification, the phase difference acquisition direction is set to the Y-axis direction. Even in such a configuration, the same effects as in the above embodiment can be obtained.
  • FIG. 15 schematically shows an example of the planar configuration of a unit pixel P constituting an imaging device according to Modification 4 of the present disclosure.
  • In the above embodiment, the on-chip lens 23L is arranged for each unit pixel P so as to straddle the two photoelectric conversion sections 12A and 12B arranged in the phase difference acquisition direction, but the present disclosure is not limited to this.
  • For example, as shown in FIG. 15, an elliptical on-chip lens 23L may be arranged for every two unit pixels P arranged side by side in the phase difference acquisition direction, and parallax information may be acquired from those two unit pixels P.
  • the imaging device 1 and the like can be applied to any type of electronic device having an imaging function, such as a camera system such as a digital still camera or a video camera, or a mobile phone having an imaging function.
  • FIG. 16 shows a schematic configuration of electronic device 1000.
  • The electronic device 1000 includes, for example, a lens group 1001, the imaging device 1, a DSP (Digital Signal Processor) circuit 1002, a frame memory 1003, a display section 1004, a recording section 1005, an operation section 1006, and a power supply section 1007, which are interconnected via a bus line 1008.
  • the lens group 1001 takes in incident light (image light) from a subject and forms an image on the imaging surface of the imaging device 1.
  • the imaging device 1 converts the amount of incident light focused on the imaging surface by the lens group 1001 into an electrical signal for each pixel, and supplies the electrical signal to the DSP circuit 1002 as a pixel signal.
  • the DSP circuit 1002 is a signal processing circuit that processes signals supplied from the imaging device 1.
  • the DSP circuit 1002 processes signals from the imaging device 1 and outputs image data obtained.
  • the frame memory 1003 temporarily stores image data processed by the DSP circuit 1002 in units of frames.
  • The display section 1004 is composed of a panel-type display device such as a liquid crystal panel or an organic EL (Electro Luminescence) panel, and displays moving images or still images captured by the imaging device 1.
  • The recording section 1005 records image data of moving images or still images captured by the imaging device 1 on a recording medium such as a semiconductor memory or a hard disk.
  • the operation unit 1006 outputs operation signals regarding various functions owned by the electronic device 1000 in accordance with user operations.
  • the power supply unit 1007 appropriately supplies various kinds of power to serve as operating power for the DSP circuit 1002, frame memory 1003, display unit 1004, recording unit 1005, and operation unit 1006 to these supply targets.
  • the technology according to the present disclosure (this technology) can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 17 is a block diagram illustrating a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
  • In FIG. 17, as the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generation device, such as an internal combustion engine or a drive motor, that generates driving force for the vehicle, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a braking device that generates braking force for the vehicle.
  • the body system control unit 12020 controls the operations of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp.
  • radio waves transmitted from a portable device that replaces a key or signals from various switches may be input to the body control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
  • the external information detection unit 12030 detects information external to the vehicle in which the vehicle control system 12000 is mounted.
  • an imaging section 12031 is connected to the outside-vehicle information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing for detecting a person, a car, an obstacle, a sign, characters on the road surface, or the like, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electrical signal as an image or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • a driver condition detection section 12041 that detects the condition of the driver is connected to the in-vehicle information detection unit 12040.
  • The driver condition detection section 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver condition detection section 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, following driving based on the inter-vehicle distance, vehicle speed maintenance, vehicle collision warning, and vehicle lane departure warning.
  • Further, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the outside information detection unit 12030.
  • For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030, and can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam.
  • the audio and image output unit 12052 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 18 is a diagram showing an example of the installation position of the imaging section 12031.
  • the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle 12100.
  • An imaging unit 12101 provided in the front nose and an imaging unit 12105 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 12100.
  • Imaging units 12102 and 12103 provided in the side mirrors mainly capture images of the sides of the vehicle 12100.
  • An imaging unit 12104 provided in the rear bumper or back door mainly captures images of the rear of the vehicle 12100.
  • the images of the front acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 18 shows an example of the imaging range of the imaging units 12101 to 12104.
  • An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
  • For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.
  • Based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in approximately the same direction as the vehicle 12100.
  • Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be secured from the preceding vehicle and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data about three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk is at or above a set value and there is a possibility of a collision, it can provide driving support for collision avoidance by outputting a warning to the driver via the audio speaker 12061 and the display section 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • The microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104.
  • Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not it is a pedestrian (see the sketch after this list).
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output section 12052 controls the display section 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
  • The audio/image output section 12052 may also control the display section 12062 to display an icon or the like indicating a pedestrian at a desired position.
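  • As a loose illustration of the feature extraction and pattern matching steps described above (this sketch uses OpenCV's stock HOG pedestrian detector as a stand-in; it is not the method of this disclosure, and the input file name is hypothetical):

```python
import cv2

# Stock HOG descriptor plus the linear SVM people detector shipped with
# OpenCV, standing in for the feature extraction / pattern matching steps.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_and_outline(frame):
    """Detect pedestrians and draw emphasizing rectangular contour lines."""
    rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in rects:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    return frame

# frame = cv2.imread("camera_frame.png")  # hypothetical captured image
# annotated = detect_and_outline(frame)
```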
  • Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031.
  • Specifically, the imaging device 1 and the like can be applied to the imaging unit 12031.
  • the technology according to the present disclosure (this technology) can be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 19 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (present technology) can be applied.
  • FIG. 19 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11153 using the endoscopic surgery system 11000.
  • The endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 is composed of a lens barrel 11101 whose distal end is inserted into a body cavity of a patient 11132 over a predetermined length, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel 11101 by a light guide extending inside the lens barrel, and is irradiated through the objective lens toward an observation target within the body cavity of the patient 11132.
  • Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from an observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
  • The CCU 11201 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs, on the image signal, various types of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under control from the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100 via the input device 11204.
  • a treatment tool control device 11205 controls driving of an energy treatment tool 11112 for cauterizing tissue, incising, sealing blood vessels, or the like.
  • the pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity and thereby secure a field of view for the endoscope 11100 and a working space for the operator.
  • the recorder 11207 is a device that can record various information regarding surgery.
  • the printer 11208 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies irradiation light to the endoscope 11100 when photographing the surgical site can be configured, for example, from a white light source configured by an LED, a laser light source, or a combination thereof.
  • when the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203.
  • in this case, the laser light from each of the RGB laser light sources is irradiated onto the observation target in a time-division manner, and the drive of the image sensor of the camera head 11102 is controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the image sensor.
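  • To make the time-division capture concrete, here is a minimal sketch, assuming `set_laser` and `read_sensor` are callbacks exposed by the light source and camera head drivers (both names are hypothetical); one monochrome readout per laser flash yields the three planes of a color frame without any on-sensor color filter.

```python
import numpy as np

def capture_frame_sequential_color(set_laser, read_sensor) -> np.ndarray:
    """One color frame via frame-sequential (time-division) capture: fire the
    R, G and B lasers one at a time and read a monochrome frame in sync with
    each flash, then stack the three exposures into an H x W x 3 image."""
    planes = []
    for color in ("R", "G", "B"):
        set_laser(color)              # switch the active laser (assumed driver call)
        planes.append(read_sensor())  # sensor readout synchronized to this flash
    return np.dstack(planes)
```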
  • the driving of the light source device 11203 may be controlled so that the intensity of the light it outputs is changed at predetermined time intervals.
  • by controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of the changes in light intensity, acquiring images in a time-division manner, and compositing those images, an image with a high dynamic range can be generated.
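  • A hedged sketch of that compositing step: frames captured under different illumination intensities are scaled back by their relative light level and merged with weights favoring well-exposed pixels. The 8-bit input, the hat-shaped weighting, and the function name are assumptions for illustration.

```python
import numpy as np

def composite_hdr(frames, intensities) -> np.ndarray:
    """Merge 8-bit frames captured under different illumination intensities.
    Each frame is scaled back by its relative light level, then averaged with
    a hat-shaped weight that trusts mid-range pixels over dark/saturated ones."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    wsum = np.zeros_like(acc)
    for frame, level in zip(frames, intensities):
        f = frame.astype(np.float64) / 255.0
        w = np.exp(-4.0 * (f - 0.5) ** 2)   # peak weight at mid-gray
        acc += w * (f / level)              # per-exposure radiance estimate
        wsum += w
    return acc / np.maximum(wsum, 1e-12)

# e.g. composite_hdr([dim_frame, bright_frame], intensities=[0.25, 1.0])
```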
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band compatible with special light observation.
  • in special light observation, for example, the wavelength dependence of light absorption in body tissue is exploited: by irradiating light in a narrower band than the irradiation light used during normal observation (that is, white light), so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is photographed with high contrast.
  • fluorescence observation may be performed in which an image is obtained using fluorescence generated by irradiating excitation light.
  • in fluorescence observation, body tissue may be irradiated with excitation light and the fluorescence from that tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into the body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 may be configured to be able to supply narrowband light and/or excitation light compatible with such special light observation.
  • FIG. 20 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 19.
  • the camera head 11102 includes a lens unit 11401, an imaging section 11402, a driving section 11403, a communication section 11404, and a camera head control section 11405.
  • the CCU 11201 includes a communication section 11411, an image processing section 11412, and a control section 11413. Camera head 11102 and CCU 11201 are communicably connected to each other by transmission cable 11400.
  • the lens unit 11401 is an optical system provided at the connection part with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 is composed of an image sensor.
  • the imaging unit 11402 may include one image sensor (so-called single-plate type) or a plurality of image sensors (so-called multi-plate type).
  • in the case of the multi-plate type, for example, image signals corresponding to each of R, G, and B are generated by the respective image sensors, and a color image may be obtained by combining them.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue at the surgical site.
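  • As a toy illustration of why a pair of image sensors conveys depth, the standard stereo relation Z = f·B/d links disparity to distance; the calibration values below are hypothetical, not taken from this system.

```python
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth Z (mm) of a point seen by a calibrated left/right sensor pair:
    Z = f * B / d, with focal length f in pixels and baseline B in mm."""
    return focal_px * baseline_mm / max(disparity_px, 1e-6)

# e.g. f = 800 px, B = 4 mm, d = 16 px  ->  Z = 200 mm
```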
  • a plurality of lens units 11401 may be provided corresponding to each imaging element.
  • the imaging unit 11402 does not necessarily have to be provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is constituted by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405. Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 to the CCU 11201 via the transmission cable 11400 as RAW data.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405.
  • the control signal includes information regarding imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
  • the above imaging conditions, such as the frame rate, exposure value, magnification, and focus, may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • in the latter case, the endoscope 11100 is equipped with a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function.
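  • The shape of such a control signal, and one automatic-setting step, might look like the following sketch; every field name and the 18%-gray AE target are assumptions for illustration, not the actual CCU protocol.

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class ControlSignal:
    """Imaging-condition fields a CCU-to-camera-head control signal could
    carry; None means 'keep the current setting'."""
    frame_rate_fps: Optional[float] = None
    exposure_value: Optional[float] = None
    magnification: Optional[float] = None
    focus_position: Optional[float] = None

def auto_exposure_step(mean_luma: float, current_ev: float, target: float = 0.18) -> ControlSignal:
    """One AE iteration: nudge the exposure value so the mean scene luminance
    (0..1) moves toward an assumed 18%-gray target."""
    delta_ev = math.log2(target / max(mean_luma, 1e-6))
    return ControlSignal(exposure_value=current_ev + delta_ev)
```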
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • the image signal and control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal, which is RAW data, transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site and the like by the endoscope 11100 and to the display of the captured images obtained by that imaging. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a captured image showing the surgical site, etc., based on the image signal subjected to image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape, color, and other features of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and so on.
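  • A minimal OpenCV-style sketch of the kind of edge/shape/color screening described above, assuming metallic instruments appear as long, weakly colored contours; all thresholds are illustrative guesses, not values from this disclosure.

```python
import cv2

def find_tool_candidates(bgr):
    """Screen for metallic instruments: weakly colored (low saturation)
    regions intersected with long edge contours. Thresholds are illustrative."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    low_sat = cv2.inRange(hsv[:, :, 1], 0, 60)                  # metal is nearly gray
    edges = cv2.Canny(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY), 50, 150)
    mask = cv2.bitwise_and(edges, low_sat)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.arcLength(c, False) > 200]  # keep long edges
```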
  • the control unit 11413 may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By presenting the superimposed surgical support information to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
  • the transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
  • communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be suitably applied to the imaging unit 11402 provided in the camera head 11102 of the endoscope 11100 among the configurations described above.
  • by applying the technology according to the present disclosure to the imaging unit 11402, the imaging unit 11402 can be made smaller or given higher definition, so a smaller or higher-definition endoscope 11100 can be provided.
  • in the above embodiment, pupil correction is performed by shifting the color filter 21 and the on-chip lens 23L toward the optical center.
  • the present technology has been described using a back-illuminated imaging device (for example, the imaging device 1) as an example, but the present technology can also be applied to a front-illuminated imaging device.
  • furthermore, in the present technology, pupil correction may be performed by shifting not only the color filter 21 and the on-chip lens 23L, but also the pixel transistors provided on the light incident surface (for example, the first surface 11S1) of the semiconductor substrate 11 and the wiring layer provided between the first surface 11S1 and the color filter 21, from the center of the unit pixel P toward the optical center O of the pixel array section 100A.
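  • To make the anisotropic pupil correction concrete, the sketch below shifts a pixel's optical elements toward the optical center O by an amount proportional to image height, with a larger coefficient along the phase difference acquisition direction so that A > B holds; the linear model and the coefficient values are assumptions for illustration, not the method of this disclosure.

```python
def pupil_correction_shift(px_um: float, py_um: float,
                           ox_um: float, oy_um: float,
                           alpha_x: float = 0.02, alpha_y: float = 0.01):
    """Shift (dx, dy), in micrometers, applied to a pixel's on-chip lens /
    color filter / wiring opening, pointing from the pixel center (px, py)
    toward the optical center (ox, oy). Choosing alpha_x > alpha_y gives the
    phase difference acquisition (first) direction the larger correction."""
    return (alpha_x * (ox_um - px_um), alpha_y * (oy_um - py_um))
```

  For example, with alpha_x = 0.02 and alpha_y = 0.01, a pixel 1000 µm from O along the first direction receives a 20 µm shift (A) while a pixel 1000 µm from O along the second direction receives a 10 µm shift (B), satisfying A > B.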
  • note that the present disclosure can also have the following configurations.
  • (1) An imaging device including a first pixel and a second pixel arranged at equal distances from the optical center of a pixel array section, in which a plurality of unit pixels are arranged in a matrix, in a first direction and in a second direction orthogonal to the first direction, respectively, wherein, when the first direction is the phase difference acquisition direction, the pupil correction amount A in the first pixel and the pupil correction amount B in the second pixel have the relationship A > B.
  • (2) The imaging device according to (1), wherein the plurality of unit pixels are provided on a semiconductor substrate having a first surface serving as a light incident surface and a second surface opposite to the first surface, and the semiconductor substrate includes an inter-pixel separation section provided between adjacent ones of the plurality of unit pixels to electrically and optically separate them, and an intra-pixel separation section provided in each of the plurality of unit pixels to separate each unit pixel in the phase difference acquisition direction.
  • (3) The imaging device according to (2), wherein the inter-pixel separation section and the intra-pixel separation section each have either a Full Trench Isolation structure penetrating from the first surface to the second surface of the semiconductor substrate, or a Deep Trench Isolation structure extending from the first surface or the second surface toward the other surface and having a bottom within the semiconductor substrate.
  • (4) The imaging device according to (2) or (3), wherein the inter-pixel separation section is formed of an oxide film.
  • (5) The imaging device according to any one of (2) to (4), wherein, in plan view, the intra-pixel separation section extends from each of a pair of sides of the inter-pixel separation section facing each other in the second direction toward the center of the unit pixel, with a gap therebetween.
  • (7) The imaging device according to any one of (2) to (6), wherein the intra-pixel separation section is formed of an oxide film, a polysilicon film, or an impurity diffusion layer.
  • (8) The imaging device according to any one of (2) to (7), further comprising a plurality of on-chip lenses arranged for each one or more of the unit pixels on the first surface side of the semiconductor substrate, wherein pupil correction is performed by shifting the plurality of on-chip lenses toward the optical center.
  • (9) The imaging device according to any one of (2) to (8), further comprising a plurality of color filters arranged for each one or more of the unit pixels on the first surface side of the semiconductor substrate and a light-shielding section provided between adjacent color filters, wherein pupil correction is performed by shifting the plurality of color filters toward the optical center.
  • (10) The imaging device according to any one of (2) to (9), further comprising a plurality of pixel transistors and a wiring layer arranged for each one or more of the unit pixels on the first surface side of the semiconductor substrate, wherein pupil correction is performed by shifting the plurality of pixel transistors and the wiring layer toward the optical center.
  • (12) An electronic device provided with an imaging device, the imaging device including a first pixel and a second pixel arranged at equal distances from the optical center of a pixel array section, in which a plurality of unit pixels are arranged in a matrix, in a first direction and in a second direction orthogonal to the first direction, respectively, wherein, when the first direction is the phase difference acquisition direction, the pupil correction amount A in the first pixel and the pupil correction amount B in the second pixel have the relationship A > B.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

An imaging device of an embodiment of the present disclosure includes a first pixel and a second pixel respectively arranged equidistantly, in a first direction and in a second direction orthogonal to the first direction, from the optical center of a pixel array unit in which a plurality of unit pixels are arranged in a matrix. When the first direction is a phase difference acquisition direction, a pupil correction amount A of the first pixel and a pupil correction amount B of the second pixel have the relationship A > B.
PCT/JP2023/018728 2022-05-30 2023-05-19 Dispositif d'imagerie et appareil électronique WO2023234069A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022087671 2022-05-30
JP2022-087671 2022-05-30

Publications (1)

Publication Number Publication Date
WO2023234069A1 (fr)

Family

ID=89026524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/018728 WO2023234069A1 (fr) 2022-05-30 2023-05-19 Dispositif d'imagerie et appareil électronique

Country Status (1)

Country Link
WO (1) WO2023234069A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003273342A (ja) * 2002-03-13 2003-09-26 Sony Corp 固体撮像素子及びその製造方法
JP2007317870A (ja) * 2006-05-25 2007-12-06 Sony Corp 固体撮像装置とその製造方法、及びカメラモジュール
WO2016114154A1 (fr) * 2015-01-13 2016-07-21 ソニー株式会社 Élément d'imagerie à semi-conducteur, son procédé de fabrication et dispositif électronique
US20180091790A1 (en) * 2016-09-23 2018-03-29 Samsung Electronics Co., Ltd. Method and electronic device for detecting wavelength spectrum of incident light
WO2018221443A1 (fr) * 2017-05-29 2018-12-06 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'imagerie à semi-conducteurs et dispositif électronique
WO2021193254A1 (fr) * 2020-03-27 2021-09-30 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'imagerie et appareil électronique
WO2021193915A1 (fr) * 2020-03-27 2021-09-30 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'imagerie et appareil électronique

Similar Documents

Publication Publication Date Title
US11563923B2 (en) Solid-state imaging device and electronic apparatus
US20220165770A1 (en) Solid-state imaging device and electronic device
JP7270616B2 (ja) 固体撮像素子および固体撮像装置
JPWO2018043654A1 (ja) 固体撮像装置およびその製造方法、並びに電子機器
JP7284171B2 (ja) 固体撮像装置
US20230224602A1 (en) Solid-state imaging device
US20230103730A1 (en) Solid-state imaging device
US20240006443A1 (en) Solid-state imaging device, imaging device, and electronic apparatus
WO2019239754A1 (fr) Élément d'imagerie à semi-conducteur, procédé de fabrication d'élément d'imagerie à semi-conducteur et dispositif électronique
TW202040807A (zh) 攝像裝置及攝像系統
WO2022220084A1 (fr) Dispositif d'imagerie
US20230387166A1 (en) Imaging device
WO2022172711A1 (fr) Élément de conversion photoélectrique et dispositif électronique
WO2023234069A1 (fr) Dispositif d'imagerie et appareil électronique
US20240204014A1 (en) Imaging device
WO2024057814A1 (fr) Dispositif de détection de lumière et instrument électronique
JP7437957B2 (ja) 受光素子、固体撮像装置及び電子機器
WO2024127853A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2023188899A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2024116302A1 (fr) Élément photodétecteur
WO2024053299A1 (fr) Dispositif de détection de lumière et appareil électronique
WO2023058352A1 (fr) Dispositif d'imagerie à semi-conducteurs
WO2024014326A1 (fr) Appareil de détection de lumière
WO2023248925A1 (fr) Élément d'imagerie et dispositif électronique
WO2023079835A1 (fr) Convertisseur photoélectrique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815828

Country of ref document: EP

Kind code of ref document: A1