WO2023176551A1 - Photoelectric conversion element and optical detection device


Info

Publication number: WO2023176551A1
Authority: WO (WIPO (PCT))
Prior art keywords: layer, photoelectric conversion, electrode, light, semiconductor
Application number: PCT/JP2023/008332
Other languages: French (fr), Japanese (ja)
Inventors: 涼介 鈴木, 巖 八木, 晋太郎 平田, 正大 定榮
Original Assignee: ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Application filed by ソニーセミコンダクタソリューションズ株式会社 (Sony Semiconductor Solutions Corporation)
Publication of WO2023176551A1

Classifications

    • H: ELECTRICITY
    • H10: SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10K: ORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K30/00: Organic devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation
    • H10K30/60: Organic devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation in which radiation controls flow of current through the devices, e.g. photoresistors
    • H10K30/80: Constructional details
    • H10K30/81: Electrodes
    • H10K39/00: Integrated devices, or assemblies of multiple devices, comprising at least one organic radiation-sensitive element covered by group H10K30/00
    • H10K39/30: Devices controlled by radiation
    • H10K39/32: Organic image sensors

Definitions

  • The present disclosure relates to a photoelectric conversion element using, for example, an organic material, and to a photodetection device equipped with the same.
  • Patent Document 1 discloses an image sensor in which, in a photoelectric conversion unit in which a first electrode, a photoelectric conversion layer, and a second electrode are stacked, photoresponsiveness is improved by providing a composite oxide layer made of an indium-gallium-zinc composite oxide (IGZO) between the first electrode and the photoelectric conversion layer.
  • A photoelectric conversion element according to an embodiment of the present disclosure includes a first electrode and a second electrode arranged in parallel, a third electrode arranged to face the first electrode and the second electrode, a photoelectric conversion layer provided between the first and second electrodes and the third electrode, and a semiconductor layer provided between the first and second electrodes and the photoelectric conversion layer. The semiconductor layer includes a first layer and a second layer stacked in this order from the first and second electrode side, the first layer having a thickness smaller than that of the second layer and in the range of 3 nm or more and 5 nm or less.
  • A photodetection device according to an embodiment of the present disclosure includes a plurality of pixels each provided with one or more photoelectric conversion elements, and the photoelectric conversion element according to the above embodiment of the present disclosure is used as that photoelectric conversion element.
  • In the photoelectric conversion element and the photodetection device according to the embodiments of the present disclosure, a semiconductor layer including a first layer and a second layer stacked in this order from the electrode side is provided between the first and second electrodes arranged in parallel and the photoelectric conversion layer, with the thickness of the first layer being smaller than the thickness of the second layer and in the range of 3 nm or more and 5 nm or less. This reduces the influence of fixed charges on the surface of the semiconductor layer while maintaining carrier conduction within the semiconductor layer.
  • FIG. 1 is a schematic cross-sectional view showing an example of the configuration of an image sensor according to an embodiment of the present disclosure.
  • FIG. 2 is a schematic plan view showing an example of a pixel configuration of an imaging device having the image sensor shown in FIG. 1.
  • FIG. 3 is a schematic cross-sectional view showing an example of the configuration of the photoelectric conversion section shown in FIG. 1.
  • FIG. 4 is a characteristic diagram showing the relationship between the thickness of the first layer of the semiconductor layer shown in FIG. 3 and the amount of delayed charge.
  • FIG. 5 is a characteristic diagram showing the relationship between the thickness of the first layer of the semiconductor layer shown in FIG. 3 and the drain current.
  • FIG. 6 is a characteristic diagram showing the relationship between the thickness of the second layer of the semiconductor layer shown in FIG. 3 and the amount of variation in threshold voltage.
  • FIG. 7A is a characteristic diagram showing a simulation result of the relationship between the distance from the interface between the insulating layer and the semiconductor layer shown in FIG. 3 and the carrier concentration.
  • FIG. 7B is an enlarged view of a portion of FIG. 7A.
  • FIG. 8 is an equivalent circuit diagram of the image sensor shown in FIG. 1.
  • FIG. 9 is a schematic diagram showing the arrangement of the lower electrode and the transistors forming the control section of the image sensor shown in FIG. 1.
  • FIG. 10 is a cross-sectional view for explaining a method of manufacturing the image sensor shown in FIG. 1.
  • FIG. 11 is a cross-sectional view showing a step following FIG. 10.
  • FIG. 12 is a cross-sectional view showing a step following FIG. 11.
  • FIG. 13 is a cross-sectional view showing a step following FIG. 12.
  • FIG. 14 is a cross-sectional view showing a step following FIG. 13.
  • FIG. 15 is a cross-sectional view showing a step following FIG. 14.
  • FIG. 16 is a timing chart showing an example of the operation of the image sensor shown in FIG. 1.
  • FIG. 17 is a schematic cross-sectional view showing the configuration of a photoelectric conversion section according to Modification 1 of the present disclosure.
  • FIG. 18 is a schematic cross-sectional view showing the configuration of a photoelectric conversion section according to Modification 2 of the present disclosure.
  • FIG. 19A is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 3 of the present disclosure.
  • FIG. 19B is a schematic plan view showing an example of a pixel configuration of an imaging device having the image sensor shown in FIG. 19A.
  • FIG. 20A is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 4 of the present disclosure.
  • FIG. 20B is a schematic plan view showing an example of a pixel configuration of an imaging device having the image sensor shown in FIG. 20A.
  • FIG. 21 is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 5 of the present disclosure.
  • FIG. 22 is a block diagram showing the configuration of an imaging device using the image sensor shown in FIG. 1 etc. as a pixel.
  • FIG. 23 is a functional block diagram showing an example of an electronic device (camera) using the imaging device shown in FIG. 22.
  • FIG. 24A is a schematic diagram showing an example of the overall configuration of a photodetection system using the imaging device shown in FIG. 22.
  • FIG. 24B is a diagram showing an example of a circuit configuration of the photodetection system shown in FIG. 24A.
  • FIG. 25 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
  • FIG. 26 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
  • FIG. 27 is a block diagram showing an example of a schematic configuration of a vehicle control system.
  • FIG. 28 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
  • Embodiment (example of an image sensor having a semiconductor layer consisting of two layers (a first layer and a second layer) with a predetermined film thickness ratio between a lower electrode and a photoelectric conversion layer)
  • Modification 1 (another example of the configuration of the photoelectric conversion section)
  • Modification 2 (another example of the configuration of the photoelectric conversion section)
  • Modification 3 (example of an image sensor that performs spectroscopy using color filters)
  • Modification 4 (another example of an image sensor that performs spectroscopy using color filters)
  • Modification 5 (example of an image sensor in which a plurality of photoelectric conversion units are stacked)
  • FIG. 1 shows a cross-sectional configuration of an image sensor (image sensor 10) according to an embodiment of the present disclosure.
  • FIG. 2 schematically shows an example of the planar configuration of the image sensor 10 shown in FIG. 1.
  • FIG. 1 shows a cross section taken along the line II shown in FIG. 2.
  • FIG. 3 schematically shows an enlarged example of the cross-sectional configuration of the main part (photoelectric conversion section 20) of the image sensor 10 shown in FIG. 1.
  • the image sensor 10 is arranged in an array in a pixel section 1A of an image pickup device (for example, the image pickup device 1, see FIG. 22) such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor used in electronic devices such as digital still cameras and video cameras.
  • In the pixel section 1A, a pixel unit 1a consisting of four unit pixels P arranged in, for example, two rows and two columns serves as a repeating unit, and the pixel units 1a are repeatedly arranged in an array in the row direction and the column direction.
  • In the photoelectric conversion section 20 provided on the semiconductor substrate 30, a semiconductor layer 23 consisting of a plurality of layers is provided between the lower electrode 21, which consists of the readout electrode 21A and the storage electrode 21B, and the photoelectric conversion layer 24.
  • The semiconductor layer 23 has a structure in which, for example, a first layer 23A and a second layer 23B are laminated in this order from the lower electrode 21 side, and the thickness of the first layer 23A is smaller than the thickness of the second layer 23B and is 3 nm or more and 5 nm or less.
  • the readout electrode 21A corresponds to a specific example of the "second electrode” of the present disclosure
  • the storage electrode 21B corresponds to a specific example of the "first electrode” of the present disclosure.
  • the first layer 23A corresponds to a specific example of the "first layer” of the present disclosure
  • the second layer 23B corresponds to a specific example of the "second layer” of the present disclosure.
  • the image sensor 10 is, for example, a so-called vertical spectroscopy type device in which one photoelectric conversion section 20 and two photoelectric conversion regions 32B and 32R are stacked vertically.
  • the photoelectric conversion unit 20 is provided on the back surface (first surface 30A) side of the semiconductor substrate 30.
  • the photoelectric conversion regions 32B and 32R are embedded in the semiconductor substrate 30 and are stacked in the thickness direction of the semiconductor substrate 30.
  • the photoelectric conversion unit 20 and the photoelectric conversion regions 32B and 32R selectively detect light in different wavelength ranges and perform photoelectric conversion. For example, the photoelectric conversion unit 20 acquires a green (G) color signal.
  • the photoelectric conversion regions 32B and 32R obtain blue (B) and red (R) color signals, respectively, due to differences in absorption coefficients. This allows the image sensor 10 to acquire multiple types of color signals in one pixel without using a color filter.
  • a multilayer wiring layer 40 is further provided on the second surface 30B of the semiconductor substrate 30 with a gate insulating layer 33 interposed therebetween.
  • the multilayer wiring layer 40 has, for example, a structure in which wiring layers 41, 42, and 43 are laminated within an insulating layer 44.
  • A vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, input/output terminals 116, and the like are provided in the peripheral area of the semiconductor substrate 30, that is, in the peripheral area 1B around the pixel section 1A.
  • the first surface 30A side of the semiconductor substrate 30 is represented as a light incident side S1
  • the second surface 30B side is represented as a wiring layer side S2.
  • a semiconductor layer 23 and a photoelectric conversion layer 24 formed using an organic material are stacked in this order from the lower electrode 21 side between a lower electrode 21 and an upper electrode 25 that are arranged opposite to each other.
  • the semiconductor layer 23 includes the first layer 23A and the second layer 23B stacked in this order from the lower electrode 21 side.
  • the photoelectric conversion layer 24 includes a p-type semiconductor and an n-type semiconductor, and has a bulk heterojunction structure within the layer.
  • a bulk heterojunction structure is a p/n junction formed by mixing a p-type semiconductor and an n-type semiconductor.
  • the photoelectric conversion section 20 further includes an insulating layer 22 between the lower electrode 21 and the semiconductor layer 23.
  • the insulating layer 22 is provided, for example, over the entire surface of the pixel portion 1A, and has an opening 22H above the readout electrode 21A that constitutes the lower electrode 21.
  • the readout electrode 21A is electrically connected to the semiconductor layer 23 via this opening 22H.
  • Although FIG. 1 shows an example in which the semiconductor layer 23, the photoelectric conversion layer 24, and the upper electrode 25 are formed separately for each image sensor 10, they may, for example, be provided as continuous layers common to a plurality of image sensors 10.
  • an insulating layer 26 and an interlayer insulating layer 27 are stacked between the first surface 30A of the semiconductor substrate 30 and the lower electrode 21.
  • the insulating layer 26 includes a layer having a fixed charge (fixed charge layer) 26A and a dielectric layer 26B having an insulating property, which are stacked in this order from the semiconductor substrate 30 side.
  • The photoelectric conversion regions 32B and 32R make it possible to separate light in the vertical direction by utilizing the fact that the wavelength of light absorbed in the semiconductor substrate 30, which is made of a silicon substrate, differs depending on the depth of incidence of the light, and each has a pn junction in a predetermined region of the semiconductor substrate 30.
  • a through electrode 34 is provided between the first surface 30A and the second surface 30B of the semiconductor substrate 30.
  • The through electrode 34 is electrically connected to the readout electrode 21A, and the photoelectric conversion unit 20 is connected, via the through electrode 34, to the gate Gamp of the amplifier transistor AMP and to one source/drain region 36B of the reset transistor RST (reset transistor Tr1rst), which also serves as the floating diffusion FD1.
  • This makes it possible to favorably transfer carriers (here, electrons) generated in the photoelectric conversion unit 20 provided on the first surface 30A side of the semiconductor substrate 30 to the second surface 30B side of the semiconductor substrate 30 via the through electrode 34, improving the characteristics.
  • The lower end of the through electrode 34 is connected to the wiring (connection part 41A) in the wiring layer 41, and the connection part 41A and the gate Gamp of the amplifier transistor AMP are connected via the lower first contact 45.
  • the connecting portion 41A and the floating diffusion FD1 (region 36B) are connected, for example, via the lower second contact 46.
  • the upper end of the through electrode 34 is connected to the read electrode 21A via, for example, a pad portion 39A and an upper first contact 39C.
  • a protective layer 51 is provided above the photoelectric conversion section 20.
  • In the protective layer 51, a wiring 52 that electrically connects the upper electrode 25 to the peripheral circuit section 130 around the pixel section 1A and a light shielding film 53 are provided.
  • optical members such as a flattening layer (not shown) and an on-chip lens 54 are provided above the protective layer 51.
  • In the image sensor 10 of this embodiment, light that enters the photoelectric conversion unit 20 from the light incident side S1 is absorbed by the photoelectric conversion layer 24.
  • The excitons generated thereby move to the interface between the electron donor and the electron acceptor that constitute the photoelectric conversion layer 24, where exciton separation occurs, that is, the excitons dissociate into electrons and holes.
  • The carriers (electrons and holes) generated here are carried to the respective electrodes by diffusion due to the difference in carrier concentration and by the internal electric field due to the difference in work function between the anode (for example, the upper electrode 25) and the cathode (for example, the lower electrode 21), and are detected as a photocurrent. The transport direction of electrons and holes can also be controlled by applying a potential between the lower electrode 21 and the upper electrode 25.
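  • A rough way to express the internal electric field mentioned above (a standard device-physics relation, not a formula stated in this document) is in terms of the work function difference of the two electrodes: for an anode with work function W_anode, a cathode with work function W_cathode, and an active-layer thickness d, the built-in potential and the corresponding field are approximately

    $$V_{bi} \approx \frac{\lvert W_{\mathrm{anode}} - W_{\mathrm{cathode}} \rvert}{e}, \qquad E_{bi} \approx \frac{V_{bi}}{d}.$$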
  • The photoelectric conversion unit 20 is an organic photoelectric conversion element that absorbs, for example, green light corresponding to part or all of a selective wavelength range (for example, from 450 nm to 650 nm) and generates excitons.
  • The lower electrode 21 is composed of, for example, a readout electrode 21A and a storage electrode 21B arranged in parallel on the interlayer insulating layer 27.
  • The readout electrode 21A is for transferring carriers generated in the photoelectric conversion layer 24 to the floating diffusion FD1, and is provided, for example, one for each pixel unit 1a consisting of four pixels arranged in 2 rows x 2 columns.
  • The readout electrode 21A is connected to the floating diffusion FD1 via, for example, the upper first contact 39C, the pad portion 39A, the through electrode 34, the connecting portion 41A, and the lower second contact 46.
  • The storage electrode 21B is for storing, in the semiconductor layer 23, signal charges (for example, electrons) among the carriers generated in the photoelectric conversion layer 24, and is provided for each pixel.
  • the storage electrode 21B is provided for each unit pixel P in a region that directly faces the light receiving surfaces of the photoelectric conversion regions 32B and 32R formed in the semiconductor substrate 30 and covers these light receiving surfaces.
  • the storage electrode 21B is preferably larger than the readout electrode 21A, so that more carriers can be stored.
  • the lower electrode 21 is made of a light-transmitting conductive film, and is made of, for example, ITO (indium tin oxide).
  • The constituent material of the lower electrode 21 may also be a tin oxide (SnO 2 )-based material to which a dopant is added, or a zinc oxide-based material made by adding a dopant to zinc oxide (ZnO).
  • Examples of zinc oxide-based materials include aluminum zinc oxide (AZO) to which aluminum (Al) is added as a dopant, gallium zinc oxide (GZO) to which gallium (Ga) is added, and indium zinc oxide (IZO) to which indium (In) is added.
  • In addition, IGZO, ITZO, CuI, InSbO 4 , ZnMgO, CuInO 2 , MgIn 2 O 4 , CdO, ZnSnO 3 , and the like may be used.
  • the insulating layer 22 is for electrically separating the storage electrode 21B and the semiconductor layer 23.
  • the insulating layer 22 is provided, for example, on the interlayer insulating layer 27 so as to cover the lower electrode 21.
  • an opening 22H is provided above the readout electrode 21A of the lower electrode 21, and the readout electrode 21A and the semiconductor layer 23 are electrically connected through this opening 22H.
  • The insulating layer 22 is composed of, for example, a single-layer film made of one of silicon oxide (SiO x ), silicon nitride (SiN x ), silicon oxynitride (SiON), etc., or a laminated film made of two or more of these.
  • the thickness of the insulating layer 22 is, for example, 20 nm to 500 nm.
  • the semiconductor layer 23 is for accumulating carriers (electrons) generated in the photoelectric conversion layer 24.
  • The semiconductor layer 23 is provided between the lower electrode 21 and the photoelectric conversion layer 24, and has a laminated structure in which the first layer 23A and the second layer 23B are laminated in this order from the lower electrode 21 side. Electrons generated in the photoelectric conversion layer 24 are accumulated in the entire semiconductor layer 23 from near the interface between the insulating layer 22 and the first layer 23A above the storage electrode 21B, and are transferred to the readout electrode 21A via the first layer 23A.
  • The electrons accumulated above the storage electrode 21B are transferred to the readout electrode 21A by controlling the potential of the storage electrode 21B to generate a potential gradient, and are then transferred from the readout electrode 21A to the floating diffusion FD1.
  • The first layer 23A and the second layer 23B each have a predetermined thickness. Specifically, the thickness (t1) of the first layer 23A is smaller than the thickness (t2) of the second layer 23B; for example, the ratio (t1/t2) of the thickness (t1) of the first layer 23A to the thickness (t2) of the second layer 23B is 0.16 or less.
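  • As a quick numerical illustration of this ratio condition (the specific values here are chosen purely for illustration and are not stated as a worked example in this document), a first-layer thickness of t1 = 4 nm requires a second-layer thickness of at least

    $$t_2 \ge \frac{t_1}{0.16} = \frac{4\ \mathrm{nm}}{0.16} = 25\ \mathrm{nm}.$$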
  • the semiconductor layer 23 (first layer 23A and second layer 23B) can be formed using, for example, the following materials.
  • the semiconductor layer 23 can be formed using an n-type oxide semiconductor material.
  • Examples include IGZO (In-Ga-Zn-O-based oxide semiconductor), ITZO (In-Sn-Zn-O-based oxide semiconductor), ZTO (Zn-Sn-O-based oxide semiconductor), IGZTO (In-Ga-Zn-Sn-O-based oxide semiconductor), GTO (Ga-Sn-O-based oxide semiconductor), and IGO (In-Ga-O-based oxide semiconductor).
  • Alternatively, a material containing AlZnO, GaZnO, InZnO, etc., obtained by adding aluminum (Al), gallium (Ga), indium (In), etc. as a dopant to the above oxide semiconductors, or containing CuI, InSbO 4 , ZnMgO, CuInO 2 , MgIn 2 O 4 , CdO, etc. can be used.
  • The first layer 23A and the second layer 23B are preferably made of at least one of the above-mentioned oxide semiconductor materials, and an indium-containing oxide such as IGZO is particularly preferably used.
  • For example, the first layer 23A is formed using an indium oxide having a higher indium content than the indium oxide forming the second layer 23B.
  • the first layer 23A and the second layer 23B both have crystallinity or amorphous property, for example.
  • one of the first layer 23A and the second layer 23B may have crystallinity and the other may have amorphous property.
  • the first layer 23A may have a laminated structure of an amorphous layer and a crystal layer.
  • a part of the first layer 23A (an initial layer with a thickness of several nm when forming the first layer 23A) may be an amorphous layer.
  • When the first layer 23A serves as a seed crystal for the second layer 23B, the second layer 23B can be formed with good film quality. Therefore, the defect level at the interface between the first layer 23A and the second layer 23B can be reduced.
  • In addition, compared to the case where the second layer 23B is formed directly on the insulating layer 22, impurities in the layer are reduced, so defect levels caused by impurities can be reduced. Further, since inhibition of crystal growth caused by impurities is reduced, crystallinity can be improved.
  • Furthermore, impurities such as silicon in the layer are reduced, so the defect level can be reduced.
  • FIG. 4 shows the relationship between the thickness of the first layer 23A and the amount of delayed charge.
  • The amount of delayed charge is the amount of charge that is transferred only after a certain period of time has passed from the start of transfer when the charges accumulated in the semiconductor layer 23 are transferred to the floating diffusion FD1; the smaller the value of the amount of delayed charge, the better the carrier conductivity.
  • FIG. 5 shows the relationship between the film thickness of the first layer 23A and the drain current.
  • the drain current is a current that flows through the drain of a transistor when the semiconductor layer 23 is used as a channel layer, and the smaller the value of the drain current, the higher the reliability.
  • FIG. 6 shows the relationship between the thickness of the second layer 23B and the amount of variation in threshold voltage.
  • FIG. 7A shows a simulation result of the relationship between the distance from the interface between the insulating layer 22 and the semiconductor layer 23 and the carrier concentration.
  • FIG. 7B is an enlarged view of the range of film thickness from 0 nm to 10 nm shown in FIG. 7A, with the vertical axis normalized.
  • The electrons accumulated in the semiconductor layer 23 decrease as the distance from the interface between the insulating layer 22 and the first layer 23A increases, with the carrier concentration decreasing by about one order of magnitude.
  • In this embodiment, the thickness of the first layer 23A is set to be 3 nm or more and 5 nm or less.
  • From FIG. 7A, it can be seen that the carrier (electron) concentration decreases as the distance from the interface between the insulating layer 22 and the semiconductor layer 23 increases. However, if there are many electrons at the surface of the semiconductor layer 23, more electrons are captured by fixed charges, and fluctuations in the threshold voltage (Vth) increase. Therefore, if, with electrons accumulated in the semiconductor layer 23, the number of electrons at the surface is less than the carrier concentration in a state where no voltage is applied (for example, 1.0E+16 cm -3 ), the amount of fluctuation in the threshold voltage (Vth) can be considered to be within the specification. From FIG. 7A, this corresponds to a thickness of the semiconductor layer 23 of 35 nm or more. Since the thickness of the first layer 23A is set to 3 nm or more and 5 nm or less as described above, the thickness of the second layer 23B is 32 nm or more.
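  • Making the arithmetic behind these ranges explicit (this only restates the figures above): with the semiconductor layer 23 at least 35 nm thick and the first layer 23A as thin as 3 nm, the second layer 23B must supply the remainder, and the upper end of the first-layer range taken against this minimum second-layer thickness reproduces the ratio bound noted earlier:

    $$t_2 \ge 35\ \mathrm{nm} - 3\ \mathrm{nm} = 32\ \mathrm{nm}, \qquad \frac{t_1}{t_2} \le \frac{5\ \mathrm{nm}}{32\ \mathrm{nm}} \approx 0.156 \approx 0.16.$$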
  • the photoelectric conversion layer 24 converts light energy into electrical energy.
  • the photoelectric conversion layer 24 is configured to include, for example, two or more types of organic materials (p-type semiconductor material or n-type semiconductor material) each functioning as a p-type semiconductor or an n-type semiconductor.
  • the photoelectric conversion layer 24 has a junction surface (p/n junction surface) between a p-type semiconductor material and an n-type semiconductor material within the layer.
  • a p-type semiconductor relatively functions as an electron donor, and an n-type semiconductor relatively functions as an electron acceptor.
  • The photoelectric conversion layer 24 provides a field in which excitons generated upon absorbing light are separated into electrons and holes; specifically, at the interface (p/n junction surface) between the electron donor and the electron acceptor, the excitons separate into electrons and holes.
  • In addition to a p-type semiconductor material and an n-type semiconductor material, the photoelectric conversion layer 24 may include an organic material that photoelectrically converts light in a predetermined wavelength range while transmitting light in other wavelength ranges, a so-called dye material.
  • When the photoelectric conversion layer 24 is formed using three types of organic materials, namely a p-type semiconductor material, an n-type semiconductor material, and a dye material, it is preferable that the p-type semiconductor material and the n-type semiconductor material are transparent to light in the visible region (for example, from 450 nm to 800 nm).
  • the thickness of the photoelectric conversion layer 24 is, for example, 50 nm to 500 nm.
  • Examples of the organic material constituting the photoelectric conversion layer 24 include quinacridone derivatives, naphthalene derivatives, anthracene derivatives, phenanthrene derivatives, tetracene derivatives, pyrene derivatives, perylene derivatives, and fluoranthene derivatives.
  • the photoelectric conversion layer 24 is composed of a combination of two or more of the above organic materials.
  • the above organic materials function as a p-type semiconductor or an n-type semiconductor depending on the combination.
  • the organic material constituting the photoelectric conversion layer 24 is not particularly limited.
  • polymers such as phenylene vinylene, fluorene, carbazole, indole, pyrene, pyrrole, picoline, thiophene, acetylene, and diacetylene, or derivatives thereof can be used.
  • Alternatively, metal complex dyes, cyanine dyes, merocyanine dyes, phenylxanthene dyes, triphenylmethane dyes, rhodacyanine dyes, xanthene dyes, macrocyclic azaannulene dyes, azulene dyes, naphthoquinone dyes, anthraquinone dyes, fused polycyclic aromatics such as pyrene and chain compounds in which aromatic rings or heterocyclic compounds are condensed, dyes in which two nitrogen-containing heterocycles such as quinoline, benzothiazole, or benzoxazole are bonded via a squarylium group or a croconic methine group serving as a bonding chain, or cyanine-like dyes bonded by a squarylium group or a croconic methine group can be used.
  • the metal complex dyes include dithiol metal complex dyes, metal phthalocyanine dyes, metal porphyrin dyes, and ruthenium complex dyes. Among these, ruthenium complex dyes are particularly preferred, but are not limited to the above.
  • the upper electrode 25, like the lower electrode 21, is made of a light-transmitting conductive film, and is made of, for example, ITO (indium tin oxide).
  • the upper electrode 25 may be made of a tin oxide (SnO 2 )-based material to which a dopant has been added, or a zinc oxide-based material made by adding a dopant to zinc oxide (ZnO).
  • zinc oxide-based materials include aluminum zinc oxide (AZO) to which aluminum (Al) is added as a dopant, gallium zinc oxide (GZO) to which gallium (Ga) is added, and indium zinc to which indium (In) is added.
  • examples include oxides (IZO).
  • the upper electrode 25 may be separated for each pixel, or may be formed as a common electrode for each pixel.
  • the thickness of the upper electrode 25 is, for example, 10 nm to 200 nm.
  • Other layers may be provided in the photoelectric conversion unit 20 between the lower electrode 21 and the photoelectric conversion layer 24 (for example, between the semiconductor layer 23 and the photoelectric conversion layer 24) and between the photoelectric conversion layer 24 and the upper electrode 25.
  • For example, the photoelectric conversion unit 20 may include, in order from the lower electrode 21 side, the semiconductor layer 23, a buffer layer that also serves as an electron blocking film, the photoelectric conversion layer 24, a buffer layer that also serves as a hole blocking film, a work function adjustment layer, and the like.
  • the photoelectric conversion layer 24 may have, for example, a pin bulk heterostructure in which a p-type blocking layer, a layer (i-layer) containing a p-type semiconductor and an n-type semiconductor, and an n-type blocking layer are stacked.
  • the insulating layer 26 covers the first surface 30A of the semiconductor substrate 30 to reduce the interface level with the semiconductor substrate 30 and suppress the generation of dark current from the interface with the semiconductor substrate 30. Further, the insulating layer 26 extends from the first surface 30A of the semiconductor substrate 30 to the side surface of the opening 34H (see FIG. 11) in which the through electrode 34 penetrating the semiconductor substrate 30 is formed.
  • the insulating layer 26 has, for example, a stacked structure of a fixed charge layer 26A and a dielectric layer 26B.
  • the fixed charge layer 26A may be a film having a positive fixed charge or a film having a negative fixed charge.
  • As a constituent material of the fixed charge layer 26A, it is preferable to use a semiconductor material or a conductive material having a wider band gap than the semiconductor substrate 30. Thereby, generation of dark current at the interface of the semiconductor substrate 30 can be suppressed.
  • Examples of the constituent materials of the fixed charge layer 26A include hafnium oxide (HfO x ), aluminum oxide (AlO x ), zirconium oxide (ZrO x ), tantalum oxide (TaO x ), titanium oxide (TiO x ), lanthanum oxide (LaO x ), praseodymium oxide (PrO x ), cerium oxide (CeO x ), neodymium oxide (NdO x ), promethium oxide (PmO x ), samarium oxide (SmO x ), europium oxide (EuO x ), gadolinium oxide (GdO x ), terbium oxide (TbO x ), dysprosium oxide (DyO x ), holmium oxide (HoO x ), thulium oxide (TmO x ), ytterbium oxide (YbO x ), and lutetium oxide (LuO x ).
  • the dielectric layer 26B is for preventing the reflection of light caused by the difference in refractive index between the semiconductor substrate 30 and the interlayer insulating layer 27.
  • the constituent material of the dielectric layer 26B is preferably a material having a refractive index between the refractive index of the semiconductor substrate 30 and the refractive index of the interlayer insulating layer 27.
  • Examples of the constituent material of the dielectric layer 26B include silicon oxide, TEOS, silicon nitride, and silicon oxynitride (SiON).
  • the interlayer insulating layer 27 is composed of, for example, a single layer film made of one of silicon oxide, silicon nitride, silicon oxynitride, etc., or a laminated film made of two or more of these.
  • a shield electrode 28 is provided on the interlayer insulating layer 27 together with the lower electrode 21 .
  • the shield electrode 28 is for preventing capacitive coupling between adjacent pixel units 1a.
  • The shield electrode 28 is provided around the pixel unit 1a consisting of four pixels arranged in 2 rows x 2 columns, and a fixed potential is applied to it.
  • the shield electrode 28 further extends between adjacent pixels in the row direction (Z-axis direction) and column direction (X-axis direction) within the pixel unit 1a.
  • the semiconductor substrate 30 is made of, for example, an n-type silicon (Si) substrate, and has a p-well 31 in a predetermined region.
  • the photoelectric conversion regions 32B and 32R are each composed of a photodiode (PD) having a pn junction in a predetermined region of the semiconductor substrate 30, and the wavelength of the light absorbed in the Si substrate differs depending on the depth of incidence of the light. This makes it possible to split light vertically.
  • the photoelectric conversion region 32B for example, selectively detects blue light and accumulates signal charges corresponding to blue, and is installed at a depth where blue light can be efficiently photoelectrically converted.
  • the photoelectric conversion region 32R for example, selectively detects red light and accumulates signal charges corresponding to red, and is installed at a depth that allows efficient photoelectric conversion of red light.
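  • The depth dependence underlying this arrangement can be expressed with the usual exponential absorption law (a standard relation, not one stated in this document): light of wavelength λ entering silicon decays as

    $$I(z) = I_0\, e^{-\alpha(\lambda)\, z},$$

    where α(λ) is the absorption coefficient of silicon. Because α is larger for blue light than for red light, blue light is absorbed nearer the surface while red light penetrates deeper, so a region intended to detect blue light lies closer to the light-incident surface than one intended to detect red light.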
  • blue (B) is a color that corresponds to a wavelength range of, for example, 450 nm to 495 nm
  • red (R) is a color that corresponds to a wavelength range of, for example, 620 nm to 750 nm.
  • Each of the photoelectric conversion regions 32B and 32R only needs to be capable of detecting light in some or all of the wavelength ranges.
  • the photoelectric conversion region 32B is configured to include, for example, a p+ region that becomes a hole storage layer and an n region that becomes an electron storage layer.
  • the photoelectric conversion region 32R has, for example, a p+ region serving as a hole storage layer and an n region serving as an electron storage layer (has a pnp stacked structure).
  • the n region of the photoelectric conversion region 32B is connected to the vertical transfer transistor Tr2.
  • the p+ region of the photoelectric conversion region 32B is bent along the transfer transistor Tr2 and connected to the p+ region of the photoelectric conversion region 32R.
  • the gate insulating layer 33 is composed of, for example, a single layer film made of one of silicon oxide, silicon nitride, silicon oxynitride, etc., or a laminated film made of two or more of these.
  • the through electrode 34 is provided between the first surface 30A and the second surface 30B of the semiconductor substrate 30, and has a function as a connector between the photoelectric conversion section 20 and the gate Gamp of the amplifier transistor AMP and the floating diffusion FD1. This serves as a transmission path for carriers generated in the photoelectric conversion section 20.
  • a reset gate Grst of the reset transistor RST is arranged next to the floating diffusion FD1 (one source/drain region 36B of the reset transistor RST). This allows the carriers accumulated in the floating diffusion FD1 to be reset by the reset transistor RST.
  • the pad portions 39A, 39B, the upper first contact 39C, the upper second contact 39D, the lower first contact 45, the lower second contact 46, and the wiring 52 are made of, for example, a doped silicon material such as PDAS (Phosphorus Doped Amorphous Silicon). Alternatively, it can be formed using a metal material such as aluminum (Al), tungsten (W), titanium (Ti), cobalt (Co), hafnium (Hf), and tantalum (Ta).
  • The protective layer 51 and the on-chip lens 54 are made of a light-transmitting material, for example, a single-layer film made of silicon oxide, silicon nitride, silicon oxynitride, etc., or a laminated film made of two or more of these.
  • the thickness of this protective layer 51 is, for example, 100 nm to 30,000 nm.
  • the light shielding film 53 is provided in the protective layer 51 together with the wiring 52 so as to cover at least the region of the readout electrode 21A that is in direct contact with the semiconductor layer 23 without covering the storage electrode 21B.
  • the light shielding film 53 can be formed using, for example, tungsten (W), aluminum (Al), an alloy of Al and copper (Cu), or the like.
  • FIG. 8 is an equivalent circuit diagram of the image sensor 10 shown in FIG. 1.
  • FIG. 9 schematically shows the arrangement of the lower electrode 21 of the image sensor 10 shown in FIG. 1 and the transistors forming the control section.
  • the reset transistor RST (reset transistor TR1rst) is for resetting the carriers transferred from the photoelectric conversion unit 20 to the floating diffusion FD1, and is formed of, for example, a MOS transistor.
  • the reset transistor TR1rst includes a reset gate Grst, a channel formation region 36A, and source/drain regions 36B and 36C.
  • the reset gate Grst is connected to the reset line RST1, and one source/drain region 36B of the reset transistor TR1rst also serves as the floating diffusion FD1.
  • the other source/drain region 36C constituting the reset transistor TR1rst is connected to the power supply line VDD.
  • the amplifier transistor AMP (amplifier transistor TR1amp) is a modulation element that modulates the amount of charge generated in the photoelectric conversion section 20 into a voltage, and is composed of, for example, a MOS transistor.
  • the amplifier transistor AMP includes a gate Gamp, a channel formation region 35A, and source/drain regions 35B and 35C.
  • the gate Gamp is connected to the readout electrode 21A and one source/drain region 36B (floating diffusion FD1) of the reset transistor TR1rst via the lower first contact 45, the connecting portion 41A, the lower second contact 46, the through electrode 34, etc. has been done.
  • one source/drain region 35B shares a region with the other source/drain region 36C forming the reset transistor TR1rst, and is connected to the power supply line VDD.
  • The selection transistor SEL (selection transistor TR1sel) is composed of a gate Gsel, a channel formation region 34A, and source/drain regions 34B and 34C.
  • the gate Gsel is connected to the selection line SEL1.
  • One source/drain region 34B shares a region with the other source/drain region 35C forming the amplifier transistor AMP, and the other source/drain region 34C is connected to the signal line (data output line) VSL1. has been done.
  • the transfer transistor TR2 (transfer transistor TR2trs) is for transferring the signal charge corresponding to the blue color generated and accumulated in the photoelectric conversion region 32B to the floating diffusion FD2. Since the photoelectric conversion region 32B is formed at a deep position from the second surface 30B of the semiconductor substrate 30, it is preferable that the transfer transistor TR2trs of the photoelectric conversion region 32B is constituted by a vertical transistor. Transfer transistor TR2trs is connected to transfer gate line TG2. A floating diffusion FD2 is provided in a region 37C near the gate Gtrs2 of the transfer transistor TR2trs. The carriers accumulated in the photoelectric conversion region 32B are read out to the floating diffusion FD2 via a transfer channel formed along the gate Gtrs2.
  • the transfer transistor TR3 (transfer transistor TR3trs) is for transferring the signal charge corresponding to the red color generated and accumulated in the photoelectric conversion region 32R to the floating diffusion FD3, and is formed of, for example, a MOS transistor. Transfer transistor TR3trs is connected to transfer gate line TG3. A floating diffusion FD3 is provided in a region 38C near the gate Gtrs3 of the transfer transistor TR3trs. The carriers accumulated in the photoelectric conversion region 32R are read out to the floating diffusion FD3 via a transfer channel formed along the gate Gtrs3.
  • Further provided on the second surface 30B side of the semiconductor substrate 30 are a reset transistor TR2rst, an amplifier transistor TR2amp, and a selection transistor TR2sel, which constitute a control section of the photoelectric conversion region 32B, as well as a reset transistor TR3rst, an amplifier transistor TR3amp, and a selection transistor TR3sel, which constitute a control section of the photoelectric conversion region 32R.
  • the reset transistor TR2rst is composed of a gate, a channel formation region, and a source/drain region.
  • the gate of the reset transistor TR2rst is connected to the reset line RST2, and one source/drain region of the reset transistor TR2rst is connected to the power supply line VDD.
  • the other source/drain region of the reset transistor TR2rst also serves as the floating diffusion FD2.
  • Amplifier transistor TR2amp is composed of a gate, a channel formation region, and a source/drain region.
  • the gate is connected to the other source/drain region (floating diffusion FD2) of the reset transistor TR2rst.
  • One source/drain region forming the amplifier transistor TR2amp shares a region with one source/drain region forming the reset transistor TR2rst, and is connected to the power supply line VDD.
  • the selection transistor TR2sel is composed of a gate, a channel formation region, and a source/drain region.
  • the gate is connected to selection line SEL2.
  • One source/drain region forming the selection transistor TR2sel shares a region with the other source/drain region forming the amplifier transistor TR2amp.
  • the other source/drain region constituting the selection transistor TR2sel is connected to the signal line (data output line) VSL2.
  • the reset transistor TR3rst is composed of a gate, a channel formation region, and a source/drain region.
  • the gate of the reset transistor TR3rst is connected to the reset line RST3, and one source/drain region forming the reset transistor TR3rst is connected to the power supply line VDD.
  • the other source/drain region forming the reset transistor TR3rst also serves as the floating diffusion FD3.
  • the amplifier transistor TR3amp is composed of a gate, a channel formation region, and a source/drain region.
  • the gate is connected to the other source/drain region (floating diffusion FD3) forming the reset transistor TR3rst.
  • One source/drain region forming the amplifier transistor TR3amp shares a region with one source/drain region forming the reset transistor TR3rst, and is connected to the power supply line VDD.
  • the selection transistor TR3sel is composed of a gate, a channel formation region, and a source/drain region.
  • the gate is connected to selection line SEL3.
  • One source/drain region forming the selection transistor TR3sel shares a region with the other source/drain region forming the amplifier transistor TR3amp.
  • the other source/drain region constituting the selection transistor TR3sel is connected to a signal line (data output line) VSL3.
  • the reset lines RST1, RST2, RST3, selection lines SEL1, SEL2, SEL3, and transfer gate lines TG2, TG3 are each connected to a vertical drive circuit that constitutes a drive circuit.
  • Signal lines (data output lines) VSL1, VSL2, and VSL3 are connected to a column signal processing circuit 112 that constitutes a drive circuit.
  • the image sensor 10 of this embodiment can be manufactured, for example, as follows.
  • a p-well 31 is formed in a semiconductor substrate 30, and in this p-well 31, for example, n-type photoelectric conversion regions 32B and 32R are formed.
  • a p+ region is formed near the first surface 30A of the semiconductor substrate 30.
  • As also shown in FIG. 10, on the second surface 30B of the semiconductor substrate 30, after forming n+ regions that will become, for example, the floating diffusions FD1 to FD3, a gate insulating layer 33 and a gate wiring layer 47 including the gates of the transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplifier transistor AMP, and the reset transistor RST are formed.
  • Thereby, the transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplifier transistor AMP, and the reset transistor RST are formed.
  • a multilayer wiring layer 40 is formed, which is made up of wiring layers 41 to 43 including a lower first contact 45, a lower second contact 46, and a connecting portion 41A, and an insulating layer 44.
  • an SOI (Silicon on Insulator) substrate in which the semiconductor substrate 30, a buried oxide film (not shown), and a holding substrate (not shown) are stacked is used as the base of the semiconductor substrate 30, for example.
  • the buried oxide film and the holding substrate are bonded to the first surface 30A of the semiconductor substrate 30.
  • annealing treatment is performed.
  • a support substrate (not shown) or another semiconductor substrate or the like is bonded onto the multilayer wiring layer 40 provided on the second surface 30B side of the semiconductor substrate 30, and the semiconductor substrate 30 is turned upside down. Subsequently, the semiconductor substrate 30 is separated from the buried oxide film of the SOI substrate and the holding substrate, and the first surface 30A of the semiconductor substrate 30 is exposed.
  • the above steps can be performed using techniques used in normal CMOS processes, such as ion implantation and CVD (Chemical Vapor Deposition).
  • the semiconductor substrate 30 is processed from the first surface 30A side by, for example, dry etching to form, for example, an annular opening 34H.
  • the depth of the opening 34H penetrates from the first surface 30A to the second surface 30B of the semiconductor substrate 30, and reaches, for example, the connection portion 41A.
  • a fixed charge layer 26A and a dielectric layer 26B are sequentially formed on the first surface 30A of the semiconductor substrate 30 and the side surface of the opening 34H.
  • the fixed charge layer 26A can be formed, for example, by forming a hafnium oxide film or an aluminum oxide film using an atomic layer deposition method (ALD method).
  • the dielectric layer 26B can be formed, for example, by forming a silicon oxide film using a plasma CVD method.
  • Subsequently, pad portions 39A and 39B, in which a barrier metal made of a laminated film of titanium and titanium nitride (Ti/TiN film) and a tungsten film are laminated, are formed at predetermined positions on the dielectric layer 26B.
  • the pad portions 39A and 39B can be used as a light shielding film.
  • an interlayer insulating layer 27 is formed on the dielectric layer 26B and the pad portions 39A, 39B, and the surface of the interlayer insulating layer 27 is planarized using a CMP (Chemical Mechanical Polishing) method.
  • A conductive material such as Al is filled into the openings 27H1 and 27H2 to form the upper first contact 39C and the upper second contact 39D.
  • a conductive film 21x is formed on the interlayer insulating layer 27 using, for example, a sputtering method, and then patterned using a photolithography technique. Specifically, after forming a photoresist PR at a predetermined position of the conductive film 21x, the conductive film 21x is processed using dry etching or wet etching. Thereafter, by removing the photoresist PR, the readout electrode 21A and the storage electrode 21B are formed as shown in FIG. 14.
  • a semiconductor layer 23 consisting of an insulating layer 22, a first layer 23A, and a second layer 23B, a photoelectric conversion layer 24, and an upper electrode 25 are formed.
  • the insulating layer 22 is formed by forming a silicon oxide film using, for example, an ALD method, and then planarizing the surface of the insulating layer 22 using a CMP method. After that, an opening 22H is formed on the readout electrode 21A using, for example, wet etching.
  • the semiconductor layer 23 (first layer 23A and second layer 23B) can be formed using, for example, a sputtering method.
  • the photoelectric conversion layer 24 is formed using, for example, a vacuum deposition method.
  • the upper electrode 25 is formed using, for example, a sputtering method. Finally, the protective layer 51 including the wiring 52 and the light shielding film 53 and the on-chip lens 54 are provided on the upper electrode 25. Through the above steps, the image sensor 10 shown in FIG. 1 is completed.
  • When forming, between the semiconductor layer 23 and the photoelectric conversion layer 24 and between the photoelectric conversion layer 24 and the upper electrode 25, other layers containing organic materials such as a buffer layer that also serves as an electron blocking film, a buffer layer that also serves as a hole blocking film, or a work function adjustment layer, it is desirable to form each layer continuously in a vacuum process (in an integrated vacuum process).
  • the method for forming the photoelectric conversion layer 24 is not necessarily limited to a method using a vacuum evaporation method, and for example, a spin coating technique, a printing technique, or the like may be used.
  • In addition to the sputtering method, a vacuum evaporation method, a reactive evaporation method, or an electron beam evaporation method may be used, depending on the material constituting the transparent electrode.
  • Other examples include physical vapor deposition (PVD) methods, ion plating methods, pyrosol methods, methods of thermally decomposing organometallic compounds, spray methods, dip methods, various CVD methods including MOCVD methods, electroless plating methods, and electrolytic plating methods.
  • green light is first selectively detected (absorbed) in the photoelectric conversion unit 20 and photoelectrically converted.
  • The photoelectric conversion section 20 is connected to the gate Gamp of the amplifier transistor TR1amp and to the floating diffusion FD1 via the through electrode 34. Therefore, the electrons of the electron-hole pairs generated in the photoelectric conversion unit 20 are extracted from the lower electrode 21 side, transferred to the second surface 30B side of the semiconductor substrate 30 via the through electrode 34, and accumulated in the floating diffusion FD1. At the same time, the amount of charge generated in the photoelectric conversion section 20 is modulated into a voltage by the amplifier transistor TR1amp.
  • The reset gate Grst of the reset transistor TR1rst is arranged next to the floating diffusion FD1. Thereby, the carriers accumulated in the floating diffusion FD1 are reset by the reset transistor TR1rst.
  • Since the photoelectric conversion unit 20 is connected not only to the amplifier transistor TR1amp but also to the floating diffusion FD1 via the through electrode 34, the carriers accumulated in the floating diffusion FD1 can be easily reset by the reset transistor TR1rst.
  • FIG. 16 shows an example of the operation of the image sensor 10.
  • In FIG. 16, (A) shows the potential at the storage electrode 21B, (B) shows the potential at the floating diffusion FD1 (readout electrode 21A), and (C) shows the potential at the gate (Grst) of the reset transistor TR1rst.
  • voltages are individually applied to the readout electrode 21A and the storage electrode 21B.
  • a potential V1 is applied from the drive circuit to the readout electrode 21A, and a potential V2 is applied to the storage electrode 21B.
  • the potentials V1 and V2 are assumed to be V2>V1.
  • carriers (signal charges; electrons) generated by photoelectric conversion are attracted to the storage electrode 21B and are stored in the region of the semiconductor layer 23 facing the storage electrode 21B (storage period).
  • the potential of the region of the semiconductor layer 23 facing the storage electrode 21B becomes a more negative value as time elapses during photoelectric conversion. Note that the holes are sent out from the upper electrode 25 to the drive circuit.
  • a reset operation is performed in the latter half of the accumulation period. Specifically, at timing t1, the scanning section changes the voltage of the reset signal RST from a low level to a high level. As a result, in the unit pixel P, the reset transistor TR1rst is turned on, and as a result, the voltage of the floating diffusion FD1 is set to the power supply voltage, and the voltage of the floating diffusion FD1 is reset (reset period).
  • carrier reading is performed. Specifically, at timing t2, a potential V3 is applied from the drive circuit to the readout electrode 21A, and a potential V4 is applied to the storage electrode 21B.
  • The potentials V3 and V4 are assumed to satisfy V3 > V4.
  • the carriers accumulated in the region corresponding to the storage electrode 21B are read out from the readout electrode 21A to the floating diffusion FD1. That is, the carriers accumulated in the semiconductor layer 23 are read out by the control section (transfer period).
  • The drive circuit then applies the potential V1 to the readout electrode 21A again, and the potential V2 to the storage electrode 21B. Thereby, carriers generated by photoelectric conversion are attracted to the storage electrode 21B and accumulated in the region of the photoelectric conversion layer 24 facing the storage electrode 21B (accumulation period).
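  • The accumulation, reset, and transfer operations described above can be summarized in the following sketch. This is not code from the patent: the DriveCircuit class and its method names are hypothetical and exist only to restate the drive sequence of FIG. 16 (accumulation with V2 > V1, reset of the floating diffusion FD1, transfer with V3 > V4) in compact form.

```python
class DriveCircuit:
    """Hypothetical stand-in for the drive circuit; it only records the applied signals."""

    def apply(self, readout_v, storage_v):
        # Potentials applied to the readout electrode 21A and the storage electrode 21B.
        print(f"readout electrode 21A = {readout_v} V, storage electrode 21B = {storage_v} V")

    def set_reset(self, high):
        # Reset signal driving the gate of the reset transistor TR1rst.
        print("reset signal:", "high (TR1rst on, FD1 set to supply voltage)" if high else "low")


def drive_unit_pixel(dc, v1, v2, v3, v4):
    # The relations below follow the text: V2 > V1 during accumulation, V3 > V4 during transfer.
    assert v2 > v1 and v3 > v4

    # Accumulation period: electrons generated by photoelectric conversion are
    # attracted to the storage electrode 21B and stored in the facing region.
    dc.apply(readout_v=v1, storage_v=v2)

    # Reset period (timing t1): the floating diffusion FD1 is reset to the
    # power-supply voltage through the reset transistor TR1rst.
    dc.set_reset(high=True)
    dc.set_reset(high=False)

    # Transfer period (timing t2): the potential gradient is reversed, so the
    # stored electrons are read out through the readout electrode 21A to FD1.
    dc.apply(readout_v=v3, storage_v=v4)

    # Return to the accumulation condition for the next accumulation period.
    dc.apply(readout_v=v1, storage_v=v2)


# Example values chosen only to satisfy V2 > V1 and V3 > V4.
drive_unit_pixel(DriveCircuit(), v1=0.0, v2=2.0, v3=2.0, v4=0.0)
```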
  • As described above, in the image sensor 10 of the present embodiment, the semiconductor layer 23, in which the first layer 23A and the second layer 23B are stacked in this order from the lower electrode 21 side, is arranged between the photoelectric conversion layer 24 and the lower electrode 21 consisting of the readout electrode 21A and the storage electrode 21B, and the thickness of the first layer 23A is smaller than the thickness of the second layer 23B and is 3 nm or more and 5 nm or less. This reduces the influence of fixed charges on the surface of the semiconductor layer 23 while maintaining carrier conduction within the semiconductor layer 23. This will be explained below.
  • In recent years, the development of stacked image sensors in which a plurality of photoelectric conversion units are stacked in the vertical direction has been progressing as image sensors constituting CCD image sensors, CMOS image sensors, and the like.
  • A stacked image sensor has, for example, a configuration in which two photoelectric conversion regions each consisting of a photodiode (PD) are stacked in a silicon (Si) substrate and a photoelectric conversion section having a photoelectric conversion layer containing an organic material is provided above the Si substrate.
  • a stacked image sensor requires a structure that accumulates and transfers signal charges generated in each photoelectric conversion section.
  • In the photoelectric conversion section, the electrode on the photoelectric conversion region side of the pair of electrodes facing each other across the photoelectric conversion layer is configured from two electrodes, a first electrode and a charge storage electrode, so that the signal charges generated in the photoelectric conversion layer can be accumulated.
  • signal charges are once accumulated above the charge storage electrode and then transferred to the floating diffusion FD in the Si substrate. This makes it possible to completely deplete the charge storage section and erase carriers at the start of exposure. As a result, it is possible to suppress the occurrence of phenomena such as an increase in kTC noise, deterioration of random noise, and deterioration of captured image quality.
  • In the image sensor described in Patent Document 1 mentioned above, the photoresponsiveness is improved by providing a composite oxide layer made of an indium-gallium-zinc composite oxide (IGZO) between the first electrode including the charge storage electrode and the photoelectric conversion layer.
  • This composite oxide layer has a two-layer structure and is provided for the purpose of preventing stagnation of carriers from the photoelectric conversion layer to the composite oxide layer.
  • However, this composite oxide layer may cause deterioration in reliability. This deterioration in reliability is presumed to be due to the large number of oxygen defects in the oxide semiconductor layer (lower layer) provided on the first electrode side and to the fixed charges on the surface of the oxide semiconductor layer (upper layer) provided on the photoelectric conversion layer side.
  • In contrast, in the present embodiment, the semiconductor layer 23, in which the first layer 23A and the second layer 23B are stacked in this order from the lower electrode 21 side, is provided between the photoelectric conversion layer 24 and the lower electrode 21 consisting of the readout electrode 21A and the storage electrode 21B, and the thickness of the first layer 23A is made smaller than the thickness of the second layer 23B and is set to 3 nm or more and 5 nm or less. As a result, oxygen defects in the first layer 23A are reduced, and carrier conduction within the semiconductor layer 23 can be maintained.
  • In addition, since the second layer 23B is made thicker than the first layer 23A (for example, the ratio (t1/t2) of the thickness (t1) of the first layer 23A to the thickness (t2) of the second layer 23B is 0.16 or less), the influence of fixed charges on the surface of the semiconductor layer 23 is reduced, and the amount of variation in the threshold voltage (Vth) is reduced.
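As a plain numerical illustration of the dimensional relationships stated above (the first layer 23A of 3 nm or more and 5 nm or less, thinner than the second layer 23B, with a ratio t1/t2 of 0.16 or less), the sketch below checks a candidate pair of thicknesses. The function name and the example values are hypothetical.

```python
# Hypothetical design-rule check for the semiconductor layer 23 (first layer 23A / second layer 23B).
def layer_thicknesses_ok(t1_nm: float, t2_nm: float) -> bool:
    """Return True if the thicknesses satisfy the relationships described above."""
    in_range = 3.0 <= t1_nm <= 5.0        # first layer 23A: 3 nm or more and 5 nm or less
    thinner = t1_nm < t2_nm               # first layer thinner than second layer 23B
    ratio_ok = (t1_nm / t2_nm) <= 0.16    # example ratio t1/t2 of 0.16 or less
    return in_range and thinner and ratio_ok

if __name__ == "__main__":
    print(layer_thicknesses_ok(5.0, 31.25))  # True: 5 / 31.25 = 0.16
    print(layer_thicknesses_ok(5.0, 20.0))   # False: ratio 0.25 exceeds 0.16
```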
  • FIG. 17 schematically represents a cross-sectional configuration of a main part (photoelectric conversion unit 20A) of an image sensor according to Modification 1 of the present disclosure.
  • the photoelectric conversion unit 20A of this modification differs from the above embodiment in that a protective layer 29 is provided between the semiconductor layer 23 and the photoelectric conversion layer 24.
  • the protective layer 29 is for preventing oxygen from being desorbed from the oxide semiconductor material forming the semiconductor layer 23 .
  • Examples of the material constituting the protective layer 29 include TiO2, titanium silicon oxide (TiSiO), niobium oxide (Nb2O5), and TaOx.
  • The protective layer 29 is effective even at a thickness of, for example, one atomic layer, and its thickness is preferably, for example, 0.5 nm or more and 10 nm or less.
  • In this modification, the protective layer 29 is provided between the semiconductor layer 23 and the photoelectric conversion layer 24, so that desorption of oxygen from the surface of the semiconductor layer 23 can be reduced.
  • FIG. 18 schematically represents a cross-sectional configuration of a main part (photoelectric conversion unit 20B) of an image sensor according to Modification 2 of the present disclosure.
  • the photoelectric conversion section 20B of this modification has the structure of the photoelectric conversion section 20A in the above-mentioned modification 1, and further includes a third layer 23C on the second layer 23B.
  • a first layer 23A, a second layer 23B, and a third layer 23C are laminated in this order from the lower electrode 21 side.
  • The first layer 23A and the second layer 23B have the same configurations as in the above embodiment.
  • the third layer 23C is for suppressing oxygen vacancies in the semiconductor layer 23 and has amorphous properties.
  • Similarly to the first layer 23A and the second layer 23B, the third layer 23C can be formed using an indium oxide semiconductor; specific examples include IGZO and IGO. Since the bond between zinc (Zn) and oxygen (O) is weaker than that of indium (In), oxygen vacancies in the semiconductor layer 23 can be further suppressed by forming the third layer 23C using IGO, which does not contain Zn.
  • the thickness of the third layer 23C is, for example, 1 nm or more and 10 nm or less. This third layer 23C corresponds to a specific example of the "third layer" of the present disclosure.
  • In this modification, the semiconductor layer 23 has a three-layer structure consisting of the first layer 23A, the second layer 23B, and the third layer 23C, and a protective layer 29 is further provided between the semiconductor layer 23 and the photoelectric conversion layer 24.
  • the present technology can also be applied to an image sensor having the following configuration.
  • FIG. 19A schematically represents a cross-sectional configuration of an image sensor 10A according to Modification 3 of the present disclosure.
  • FIG. 19B schematically shows an example of the planar configuration of the image sensor 10A shown in FIG. 19A.
  • FIG. 19A shows a cross section taken along the line III-III shown in FIG. 19B.
  • The image sensor 10A is, for example, a stacked image sensor in which a photoelectric conversion region 32 and a photoelectric conversion section 60 are stacked, and constitutes one pixel (unit pixel P) in the pixel section 1A of an imaging device (for example, the imaging device 1) equipped with this image sensor 10A. In the pixel section 1A, a pixel unit 1a consisting of four pixels arranged in 2 rows x 2 columns serves as a repeating unit, and the pixel units are repeatedly arranged in an array in the row direction and the column direction.
  • In this modification, color filters 55 that selectively transmit red light (R), green light (G), or blue light (B) are provided above the photoelectric conversion unit 60 (on the light incident side S1), one for each unit pixel P.
  • In each pixel unit 1a consisting of four pixels arranged in 2 rows x 2 columns, two color filters that selectively transmit green light (G) are arranged on one diagonal, and color filters that selectively transmit red light (R) and blue light (B) are arranged one each on the orthogonal diagonal.
  • In each unit pixel (Pr, Pg, Pb) provided with the corresponding color filter, the corresponding colored light is detected in the photoelectric conversion unit 60, for example. That is, in the pixel section 1A, pixels (Pr, Pg, Pb) that respectively detect red light (R), green light (G), and blue light (B) are arranged in a Bayer pattern.
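The 2 x 2 color arrangement described above (green on one diagonal, red and blue on the orthogonal diagonal) can be written out as a small array, as in the following illustrative sketch; the tiling function and names are hypothetical.

```python
# Hypothetical sketch of the 2 x 2 pixel unit 1a and its repetition over the pixel section 1A.
BAYER_UNIT = [["G", "R"],
              ["B", "G"]]  # green on one diagonal, red and blue on the orthogonal diagonal

def tile_pixel_section(rows: int, cols: int):
    """Repeat the 2 x 2 unit in the row and column directions."""
    return [[BAYER_UNIT[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

if __name__ == "__main__":
    for row in tile_pixel_section(4, 4):
        print(" ".join(row))
```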
  • The photoelectric conversion unit 60 includes, for example, a lower electrode 61, an insulating layer 62, a semiconductor layer 63, a photoelectric conversion layer 64, and an upper electrode 65, each of which has the same configuration as the corresponding component of the photoelectric conversion section 20 in the above embodiment.
  • the photoelectric conversion region 32 detects light in a different wavelength range from that of the photoelectric conversion section 60.
  • In the image sensor 10A, out of the light transmitted through the color filters 55, light in the visible light region (red light (R), green light (G), and blue light (B)) is absorbed by the photoelectric conversion section 60 of each unit pixel (Pr, Pg, Pb), and other light, for example light in the infrared region (for example, 700 nm or more and 1000 nm or less) (infrared light (IR)), passes through the photoelectric conversion section 60. The infrared light (IR) transmitted through the photoelectric conversion section 60 is detected in the photoelectric conversion region 32 of each unit pixel Pr, Pg, Pb, and a signal charge corresponding to the infrared light (IR) is generated in each unit pixel Pr, Pg, Pb. That is, the imaging device 1 equipped with the image sensor 10A can simultaneously generate both a visible light image and an infrared light image.
  • FIG. 20A schematically shows a cross-sectional configuration of an image sensor 10B according to Modification 4 of the present disclosure.
  • FIG. 20B schematically shows an example of the planar configuration of the image sensor 10B shown in FIG. 20A.
  • FIG. 20A shows a cross section taken along the line IV-IV shown in FIG. 20B.
  • In the above Modification 3, the color filters 55 that selectively transmit red light (R), green light (G), or blue light (B) are provided above the photoelectric conversion unit 60 (on the light incident side S1), but the color filters 55 may be provided between the photoelectric conversion region 32 and the photoelectric conversion section 60, for example, as shown in FIG. 20A.
  • In this modification, the color filters 55 include, in each pixel unit 1a, a color filter (color filter 55R) that selectively transmits at least red light (R) and a color filter (color filter 55B) that selectively transmits at least blue light (B), and these are arranged diagonally to each other.
  • The photoelectric conversion unit 60 (photoelectric conversion layer 64) is configured to selectively absorb a wavelength range corresponding to green light, for example, similarly to the above embodiment. This makes it possible to acquire signals corresponding to R, G, and B in the photoelectric conversion unit 60 and in the photoelectric conversion regions (photoelectric conversion regions 32R, 32B) arranged below the color filters 55R and 55B, respectively.
  • the area of each of the RGB photoelectric conversion parts can be increased compared to an image sensor having a general Bayer array, so it is possible to improve the S/N ratio.
  • FIG. 21 schematically shows a cross-sectional configuration of an image sensor 10C according to Modification 5 of the present disclosure.
  • the image sensor 10C of this modification has two photoelectric conversion sections 20 and 80 and one photoelectric conversion region 32 stacked in the vertical direction.
  • the photoelectric conversion units 20 and 80 and the photoelectric conversion region 32 selectively detect light in different wavelength ranges and perform photoelectric conversion.
  • the photoelectric conversion unit 20 acquires a green (G) color signal.
  • the photoelectric conversion unit 80 acquires a blue (B) color signal.
  • the photoelectric conversion region 32 acquires a red (R) color signal.
  • The photoelectric conversion unit 80 is stacked, for example, above the photoelectric conversion unit 20 and, like the photoelectric conversion unit 20, has a configuration in which a lower electrode 81, a semiconductor layer 83 including, for example, a first semiconductor layer 83A and a second semiconductor layer 83B, a photoelectric conversion layer 84, and an upper electrode 85 are stacked in this order from the first surface 30A side of the semiconductor substrate 30.
  • the lower electrode 81 is composed of a readout electrode 81A and a storage electrode 81B, which are electrically separated by an insulating layer 82.
  • the insulating layer 82 is provided with an opening 82H above the readout electrode 81A.
  • An interlayer insulating layer 87 is provided between the photoelectric conversion section 80 and the photoelectric conversion section 20.
  • A through electrode 88 that penetrates the interlayer insulating layer 87 and the photoelectric conversion section 20 and is electrically connected to the readout electrode 21A of the photoelectric conversion section 20 is connected to the readout electrode 81A. Further, the readout electrode 81A is electrically connected to the floating diffusion FD provided in the semiconductor substrate 30 via the through electrodes 34 and 88, and can temporarily accumulate carriers generated in the photoelectric conversion layer 84. Furthermore, the readout electrode 81A is electrically connected to the amplifier transistor AMP provided on the semiconductor substrate 30 via the through electrodes 34 and 88.
  • FIG. 22 shows an example of the overall configuration of an imaging device (imaging device 1) including the image sensor (for example, the image sensor 10) shown in FIG. 1 and the like.
  • The imaging device 1 is, for example, a CMOS image sensor, which takes in incident light (image light) from a subject through an optical lens system (not shown), converts the amount of incident light imaged on the imaging surface into an electrical signal for each pixel, and outputs it as a pixel signal.
  • The imaging device 1 has a pixel section 1A as an imaging area on the semiconductor substrate 30, and includes, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and an input/output terminal 116 in the peripheral region of the pixel section 1A.
  • the pixel portion 1A includes, for example, a plurality of unit pixels P arranged two-dimensionally in a matrix.
  • a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
  • the pixel drive line Lread transmits a drive signal for reading signals from pixels.
  • One end of the pixel drive line Lread is connected to an output end corresponding to each row of the vertical drive circuit 111.
  • the vertical drive circuit 111 is a pixel drive section that is composed of a shift register, an address decoder, etc., and drives each unit pixel P of the pixel section 1A, for example, row by row. Signals output from each unit pixel P in the pixel row selectively scanned by the vertical drive circuit 111 are supplied to the column signal processing circuit 112 through each vertical signal line Lsig.
  • the column signal processing circuit 112 includes an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
  • The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like, and sequentially drives each horizontal selection switch of the column signal processing circuit 112 while scanning them. By this selective scanning by the horizontal drive circuit 113, the signals of each pixel transmitted through each of the vertical signal lines Lsig are sequentially output to the horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 30 through the horizontal signal line 121.
  • the output circuit 114 performs signal processing on the signals sequentially supplied from each of the column signal processing circuits 112 via the horizontal signal line 121 and outputs the processed signals.
  • the output circuit 114 may perform only buffering, or may perform black level adjustment, column variation correction, various digital signal processing, and the like.
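The row-by-row readout flow described above (row selection by the vertical drive circuit 111, per-column processing by the column signal processing circuit 112, horizontal scanning to the horizontal signal line 121, and output through the output circuit 114) can be outlined as a purely behavioral sketch, given below with hypothetical function names; it is not a description of the actual circuits.

```python
# Behavioral sketch of the readout chain: vertical drive -> column processing -> horizontal scan -> output.
from typing import List

def read_frame(pixel_array: List[List[float]]) -> List[List[float]]:
    frame = []
    for row in pixel_array:                                      # vertical drive circuit 111: select rows one by one
        column_signals = [cds_and_amplify(v) for v in row]       # column signal processing circuit 112
        line = [horizontal_select(s) for s in column_signals]    # horizontal drive circuit 113 scans the columns
        frame.append([buffer_output(s) for s in line])           # output circuit 114 (buffering etc.)
    return frame

def cds_and_amplify(value: float) -> float:
    return value * 1.0     # placeholder for noise removal / amplification

def horizontal_select(value: float) -> float:
    return value           # placeholder for sequential output to the horizontal signal line 121

def buffer_output(value: float) -> float:
    return value           # placeholder for buffering or other signal processing

if __name__ == "__main__":
    print(read_frame([[0.1, 0.2], [0.3, 0.4]]))
```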
  • The circuit portion consisting of the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 30, or may be provided in an external control IC. Moreover, these circuit portions may be formed on another substrate connected by a cable or the like.
  • the control circuit 115 receives a clock applied from outside the semiconductor substrate 30, data instructing an operation mode, etc., and also outputs data such as internal information of the imaging device 1.
  • the control circuit 115 further includes a timing generator that generates various timing signals, and controls the vertical drive circuit 111, column signal processing circuit 112, horizontal drive circuit 113, etc. based on the various timing signals generated by the timing generator. Performs drive control of peripheral circuits.
  • the input/output terminal 116 is for exchanging signals with the outside.
  • The imaging device 1 described above can be applied to various types of electronic devices, for example, imaging systems such as digital still cameras and digital video cameras, mobile phones having an imaging function, and other equipment having an imaging function.
  • FIG. 23 is a block diagram showing an example of the configuration of electronic device 1000.
  • The electronic device 1000 includes an optical system 1001, the imaging device 1, a DSP (Digital Signal Processor) 1002, a memory 1003, a display device 1004, a recording device 1005, an operation system 1006, and a power supply system 1007, which are connected to one another, and can capture still images and moving images.
  • the optical system 1001 is configured with one or more lenses, and captures incident light (image light) from a subject and forms an image on the imaging surface of the imaging device 1.
  • the imaging device 1 converts the amount of incident light focused on the imaging surface by the optical system 1001 into an electrical signal for each pixel, and supplies the electrical signal to the DSP 1002 as a pixel signal.
  • the DSP 1002 performs various signal processing on the signal from the imaging device 1 to obtain an image, and temporarily stores the data of the image in the memory 1003.
  • the image data stored in the memory 1003 is recorded on a recording device 1005 or supplied to a display device 1004 to display the image.
  • the operation system 1006 receives various operations by the user and supplies operation signals to each block of the electronic device 1000, and the power supply system 1007 supplies power necessary for driving each block of the electronic device 1000.
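The signal flow among the blocks just described (optical system 1001, imaging device 1, DSP 1002, memory 1003, display device 1004, and recording device 1005) might be modeled as in the following sketch; all function names are hypothetical and simply mirror the description.

```python
# Hypothetical sketch of the signal flow in the electronic device 1000.
def capture_still_image():
    light = focus_with_optical_system_1001()
    pixel_signals = imaging_device_1_convert(light)   # per-pixel electrical signals
    image = dsp_1002_process(pixel_signals)           # various signal processing
    stored = image                                    # temporarily stored in the memory 1003
    display_device_1004_show(stored)
    recording_device_1005_save(stored)

def focus_with_optical_system_1001():
    return "incident light"

def imaging_device_1_convert(light):
    return [0.1, 0.2, 0.3]

def dsp_1002_process(signals):
    return {"pixels": signals}

def display_device_1004_show(image):
    print("display:", image)

def recording_device_1005_save(image):
    print("record:", image)

if __name__ == "__main__":
    capture_still_image()
```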
  • FIG. 24A schematically shows an example of the overall configuration of a photodetection system 2000 including the imaging device 1.
  • FIG. 24B shows an example of the circuit configuration of the photodetection system 2000.
  • the photodetection system 2000 includes a light emitting device 2001 as a light source section that emits infrared light L2, and a photodetection device 2002 as a light receiving section having a photoelectric conversion element.
  • As the photodetection device 2002, the above-described imaging device 1 can be used.
  • the light detection system 2000 may further include a system control section 2003, a light source drive section 2004, a sensor control section 2005, a light source side optical system 2006, and a camera side optical system 2007.
  • the light detection device 2002 can detect light L1 and light L2.
  • The light L1 is ambient light from the outside that has been reflected by the subject (measurement object) 2100 (FIG. 24A).
  • Light L2 is light that is emitted by the light emitting device 2001 and then reflected by the subject 2100.
  • the light L1 is, for example, visible light
  • the light L2 is, for example, infrared light.
  • Light L1 can be detected in a photoelectric conversion section in photodetection device 2002, and light L2 can be detected in a photoelectric conversion region in photodetection device 2002.
  • Image information of the subject 2100 can be obtained from the light L1, and distance information between the subject 2100 and the light detection system 2000 can be obtained from the light L2.
  • the photodetection system 2000 can be installed in, for example, an electronic device such as a smartphone or a mobile object such as a car.
  • the light emitting device 2001 can be configured with, for example, a semiconductor laser, a surface emitting semiconductor laser, or a vertical cavity surface emitting laser (VCSEL).
  • As a method by which the photodetection device 2002 detects the light L2 emitted from the light emitting device 2001, for example, an iTOF (indirect time-of-flight) method can be adopted, but the method is not limited thereto. In the iTOF method, the photoelectric conversion unit can measure the distance to the subject 2100 by, for example, time-of-flight (TOF).
  • As a method by which the photodetection device 2002 detects the light L2 emitted from the light emitting device 2001, a structured light method or a stereo vision method can also be adopted.
  • In the structured light method, the distance between the light detection system 2000 and the subject 2100 can be measured by projecting light of a predetermined pattern onto the subject 2100 and analyzing the degree of distortion of the pattern.
  • In the stereo vision method, the distance between the light detection system 2000 and the subject 2100 can be measured by, for example, using two or more cameras and acquiring two or more images of the subject 2100 viewed from two or more different viewpoints.
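As a numerical illustration of the time-of-flight relationship underlying the distance measurement mentioned above, the sketch below converts a round-trip delay into a distance (distance = speed of light x round-trip time / 2). It is a simplified direct-TOF example under assumed values and does not represent the iTOF demodulation referred to in the text.

```python
# Simple time-of-flight sketch: distance = (speed of light x round-trip time) / 2.
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(delta_t_seconds: float) -> float:
    return C * delta_t_seconds / 2.0

if __name__ == "__main__":
    # A 10 ns round trip corresponds to roughly 1.5 m between the system and the subject 2100.
    print(distance_from_round_trip(10e-9))
```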
  • the light emitting device 2001 and the photodetecting device 2002 can be synchronously controlled by the system control unit 2003.
  • FIG. 25 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (present technology) can be applied.
  • FIG. 25 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000.
  • the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, and a support arm device 11120 that supports the endoscope 11100. , and a cart 11200 loaded with various devices for endoscopic surgery.
  • the endoscope 11100 is composed of a lens barrel 11101 whose distal end is inserted into a body cavity of a patient 11132 over a predetermined length, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
  • An opening into which an objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward an observation target in the body cavity of the patient 11132 through the objective lens.
  • the endoscope 11100 may be a direct-viewing mirror, a diagonal-viewing mirror, or a side-viewing mirror.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from an observation target is focused on the image sensor by the optical system.
  • the observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
  • the CCU 11201 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102, and performs various image processing on the image signal, such as development processing (demosaic processing), for displaying an image based on the image signal.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under control from the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • the user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, and the like) of the endoscope 11100.
  • a treatment tool control device 11205 controls driving of an energy treatment tool 11112 for cauterizing tissue, incising, sealing blood vessels, or the like.
  • The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing a field of view for the endoscope 11100 and a working space for the operator.
  • the recorder 11207 is a device that can record various information regarding surgery.
  • the printer 11208 is a device that can print various types of information regarding surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies irradiation light to the endoscope 11100 when photographing the surgical site can be configured, for example, from a white light source configured by an LED, a laser light source, or a combination thereof.
  • When the white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203.
  • In this case, the laser light from each of the RGB laser light sources is irradiated onto the observation target in a time-division manner, and the drive of the image sensor of the camera head 11102 is controlled in synchronization with the irradiation timing, so that images corresponding to each of R, G, and B can be captured in a time-division manner. According to this method, a color image can be obtained without providing a color filter in the image sensor.
  • the driving of the light source device 11203 may be controlled so that the intensity of the light it outputs is changed at predetermined time intervals.
  • By controlling the drive of the image sensor of the camera head 11102 in synchronization with the timing of changes in the light intensity to acquire images in a time-division manner and compositing those images, it is possible to generate an image with a high dynamic range.
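A minimal sketch of the time-division high-dynamic-range composition described above is given below, assuming two exposures acquired at different light intensities; the saturation threshold and the fallback rule are hypothetical simplifications.

```python
# Hypothetical HDR composition of two time-divided exposures (pixel values normalized to 0..1).
def compose_hdr(low_exposure, high_exposure, saturation=0.95):
    """Use the high-intensity frame unless it is saturated, in which case fall back to the low one."""
    composed = []
    for lo, hi in zip(low_exposure, high_exposure):
        composed.append(hi if hi < saturation else lo)
    return composed

if __name__ == "__main__":
    print(compose_hdr([0.10, 0.40, 0.50], [0.60, 0.97, 0.99]))
```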
  • the light source device 11203 may be configured to be able to supply light in a predetermined wavelength range compatible with special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which, by utilizing the wavelength dependence of light absorption in body tissue and irradiating light in a narrower band than the irradiation light used in normal observation (that is, white light), a specific tissue such as a blood vessel in the mucosal surface layer is photographed with high contrast.
  • fluorescence observation may be performed in which an image is obtained using fluorescence generated by irradiating excitation light.
  • In fluorescence observation, it is possible to observe fluorescence from body tissue by irradiating the body tissue with excitation light (autofluorescence observation), or to obtain a fluorescence image by locally injecting a reagent such as indocyanine green (ICG) into the body tissue and irradiating the body tissue with excitation light corresponding to the fluorescence wavelength of the reagent.
  • the light source device 11203 may be configured to be able to supply narrowband light and/or excitation light compatible with such special light observation.
  • FIG. 26 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 25.
  • the camera head 11102 includes a lens unit 11401, an imaging section 11402, a driving section 11403, a communication section 11404, and a camera head control section 11405.
  • the CCU 11201 includes a communication section 11411, an image processing section 11412, and a control section 11413. Camera head 11102 and CCU 11201 are communicably connected to each other by transmission cable 11400.
  • the lens unit 11401 is an optical system provided at the connection part with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging element configuring the imaging unit 11402 may be one (so-called single-plate type) or multiple (so-called multi-plate type).
  • In the case of the multi-plate type, for example, image signals corresponding to R, G, and B are generated by the respective imaging elements, and a color image may be obtained by combining them.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue at the surgical site.
  • a plurality of lens units 11401 may be provided corresponding to each imaging element.
  • the imaging unit 11402 does not necessarily have to be provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the drive unit 11403 is constituted by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405. Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 to the CCU 11201 via the transmission cable 11400 as RAW data.
  • the communication unit 11404 receives a control signal for controlling the drive of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405.
  • The control signal may include, for example, information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image, that is, information regarding the imaging conditions.
  • The above imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
  • In the latter case, the endoscope 11100 is equipped with a so-called AE (Auto Exposure) function, an AF (Auto Focus) function, and an AWB (Auto White Balance) function.
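The imaging conditions carried by the control signal (frame rate, exposure value, magnification, focus) and the AE/AF/AWB flags could be represented as a simple record, as in the hypothetical sketch below; the field names are illustrative and are not taken from the CCU 11201 specification.

```python
# Hypothetical representation of the imaging conditions carried by the control signal from the CCU 11201.
from dataclasses import dataclass

@dataclass
class ImagingConditions:
    frame_rate_fps: float
    exposure_value: float
    magnification: float
    focus_position: float
    auto_exposure: bool = True       # AE
    auto_focus: bool = True          # AF
    auto_white_balance: bool = True  # AWB

if __name__ == "__main__":
    print(ImagingConditions(frame_rate_fps=60.0, exposure_value=0.0,
                            magnification=1.0, focus_position=0.5))
```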
  • the camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102.
  • the image signal and control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various image processing on the image signal, which is RAW data, transmitted from the camera head 11102.
  • the control unit 11413 performs various controls related to the imaging of the surgical site etc. by the endoscope 11100 and the display of the captured image obtained by imaging the surgical site etc. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a captured image showing the surgical site, etc., based on the image signal subjected to image processing by the image processing unit 11412.
  • The control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shape of the edge, the color, and the like of an object included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like.
  • the control unit 11413 may use the recognition result to superimpose and display various types of surgical support information on the image of the surgical site. By displaying the surgical support information in a superimposed manner and presenting it to the surgeon 11131, it becomes possible to reduce the burden on the surgeon 11131 and allow the surgeon 11131 to proceed with the surgery reliably.
  • the transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
  • communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above. By applying the technology according to the present disclosure to the imaging unit 11402, detection accuracy is improved.
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized, for example, as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility vehicle, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
  • FIG. 27 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
  • As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
  • The drive system control unit 12010 functions as a control device for a driving force generation device such as an internal combustion engine or a drive motor that generates driving force for the vehicle, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates braking force for the vehicle, and the like.
  • the body system control unit 12020 controls the operations of various devices installed in the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp.
  • radio waves transmitted from a portable device that replaces a key or signals from various switches may be input to the body control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
  • the external information detection unit 12030 detects information external to the vehicle in which the vehicle control system 12000 is mounted.
  • an imaging section 12031 is connected to the outside-vehicle information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle, and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing for detecting a person, a car, an obstacle, a sign, characters on a road surface, or the like, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light.
  • the imaging unit 12031 can output the electrical signal as an image or as distance measurement information.
  • the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
  • the in-vehicle information detection unit 12040 detects in-vehicle information.
  • a driver condition detection section 12041 that detects the condition of the driver is connected to the in-vehicle information detection unit 12040.
  • The driver condition detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver condition detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or the degree of concentration of the driver, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate a control target value for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, following travel based on the inter-vehicle distance, vehicle speed maintenance travel, vehicle collision warning, vehicle lane departure warning, and the like.
  • In addition, the microcomputer 12051 can perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on the information outside the vehicle acquired by the outside information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from high beam to low beam, by controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030.
  • the audio and image output unit 12052 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
  • FIG. 28 is a diagram showing an example of the installation position of the imaging section 12031.
  • the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle 12100.
  • An imaging unit 12101 provided in the front nose and an imaging unit 12105 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 12100.
  • Imaging units 12102 and 12103 provided in the side mirrors mainly capture images of the sides of the vehicle 12100.
  • An imaging unit 12104 provided in the rear bumper or back door mainly captures images of the rear of the vehicle 12100.
  • the imaging unit 12105 provided above the windshield inside the vehicle is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
  • FIG. 28 shows an example of the imaging range of the imaging units 12101 to 12104.
  • An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above can be obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, in particular the closest three-dimensional object that is on the traveling path of the vehicle 12100 and that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
  • Furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, it is possible to perform cooperative control for the purpose of automatic driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
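A behavioral sketch of the preceding-vehicle extraction and following control described above is shown below; the selection criteria, thresholds, and function names are hypothetical simplifications of the description.

```python
# Hypothetical sketch: pick the nearest in-path object moving in roughly the same direction,
# then keep a preset inter-vehicle distance.
def extract_preceding_vehicle(objects, min_speed_kmh=0.0):
    candidates = [o for o in objects
                  if o["in_path"]
                  and o["relative_heading_deg"] < 15.0   # "substantially the same direction" (assumed threshold)
                  and o["speed_kmh"] >= min_speed_kmh]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

def follow_control(preceding, target_gap_m=30.0):
    if preceding is None:
        return "cruise"
    return "brake" if preceding["distance_m"] < target_gap_m else "maintain"

if __name__ == "__main__":
    objs = [{"in_path": True, "relative_heading_deg": 5.0, "speed_kmh": 40.0, "distance_m": 25.0},
            {"in_path": False, "relative_heading_deg": 80.0, "speed_kmh": 0.0, "distance_m": 10.0}]
    lead = extract_preceding_vehicle(objs)
    print(lead, follow_control(lead))
```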
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract the data, and use it for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and when the collision risk exceeds a set value and there is a possibility of a collision, the microcomputer 12051 can provide driving support for collision avoidance by outputting a warning to the driver via the audio speaker 12061 and the display unit 12062, and by performing forced deceleration and avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • The microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian.
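The two-step procedure just described (feature point extraction followed by pattern matching on the contour) can be outlined as below; the feature extractor and the matcher are placeholders rather than the actual algorithms.

```python
# Outline of the pedestrian recognition steps: feature extraction, then contour pattern matching.
def recognize_pedestrians(infrared_images):
    detections = []
    for image in infrared_images:
        features = extract_feature_points(image)        # step 1: feature points in the captured image
        if matches_pedestrian_contour(features):        # step 2: pattern matching on the series of feature points
            detections.append(bounding_box(features))   # used later for the emphasizing rectangular outline
    return detections

def extract_feature_points(image):
    return [(1, 1), (2, 3)]      # placeholder

def matches_pedestrian_contour(features):
    return len(features) >= 2    # placeholder

def bounding_box(features):
    xs = [p[0] for p in features]
    ys = [p[1] for p in features]
    return (min(xs), min(ys), max(xs), max(ys))

if __name__ == "__main__":
    print(recognize_pedestrians([["dummy image"]]))
```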
  • When the pedestrian is recognized, the audio image output unit 12052 controls the display unit 12062 so as to superimpose and display a rectangular outline for emphasis on the recognized pedestrian.
  • the audio image output unit 12052 may control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
  • In the above embodiment, the image sensor has a structure in which the photoelectric conversion unit 20 that detects green light and the photoelectric conversion regions 32B and 32R that detect blue light and red light, respectively, are stacked, but the content of the present disclosure is not limited to this structure.
  • the photoelectric conversion section may detect red light or blue light, or the photoelectric conversion region may detect green light.
  • Further, the number and the ratio of these photoelectric conversion sections and photoelectric conversion regions are not limited; two or more photoelectric conversion sections may be provided, or color signals of a plurality of colors may be obtained with the photoelectric conversion sections alone.
  • Furthermore, in the above embodiment, the lower electrode 21 is composed of two electrodes, the readout electrode 21A and the storage electrode 21B, but three or more electrodes may be provided.
  • the present technology can also have the following configuration.
  • According to the present technology having the following configuration, the semiconductor layer including the first layer and the second layer stacked in order from the first electrode and second electrode side is provided between the photoelectric conversion layer and the first electrode and the second electrode arranged in parallel, and the thickness of the first layer is smaller than the thickness of the second layer and is 3 nm or more and 5 nm or less. This reduces the influence of fixed charges on the surface of the semiconductor layer while maintaining carrier conduction within the semiconductor layer. Therefore, it is possible to improve reliability.
  • Both the first layer and the second layer are formed using an indium oxide, and the content ratio of indium contained in the first indium oxide constituting the first layer is higher than the content ratio of indium contained in the second indium oxide constituting the second layer.
  • The photoelectric conversion element according to any one of (1) to (6) above, wherein the semiconductor layer further includes a third layer having an amorphous property between the photoelectric conversion layer and the second layer.
  • (9) The photoelectric conversion element according to any one of (1) to (8), further comprising an insulating layer provided between the first electrode and the second electrode and the semiconductor layer, the insulating layer having an opening above the first electrode, wherein the second electrode and the semiconductor layer are electrically connected through the opening.
  • The photoelectric conversion element, wherein the first electrode and the second electrode are arranged on a side opposite to a light incident surface with respect to the photoelectric conversion layer. (12) The photoelectric conversion element according to any one of (1) to (11), wherein a voltage is individually applied to each of the first electrode and the second electrode.
  • The photoelectric conversion element includes: a first electrode and a second electrode arranged in parallel; a third electrode arranged opposite to the first electrode and the second electrode; a photoelectric conversion layer provided between the first electrode and the second electrode and the third electrode; and a semiconductor layer provided between the first electrode and the second electrode and the photoelectric conversion layer and including a first layer and a second layer stacked in order from the first electrode and second electrode side, the thickness of the first layer being smaller than the thickness of the second layer and being 3 nm or more and 5 nm or less.
  • The photodetection device according to (14), wherein the one or more photoelectric conversion regions are embedded in a semiconductor substrate, and the one or more photoelectric conversion units are arranged on the light incident surface side of the semiconductor substrate.
  • a multilayer wiring layer is formed on a surface of the semiconductor substrate opposite to the light incident surface.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Light Receiving Elements (AREA)

Abstract

A photoelectric conversion element according to an embodiment of the present disclosure is provided with: a first electrode and a second electrode that are disposed in parallel; a third electrode disposed opposite the first electrode and the second electrode; a photoelectric conversion layer provided between the first and second electrodes and the third electrode; and a semiconductor layer which is provided between the first and second electrodes and the photoelectric conversion layer, and includes a first layer and a second layer that are stacked successively from the side of the first and second electrodes, the first layer having a thickness that is smaller than the thickness of the second layer and is 3 nm to 5 nm inclusive.

Description

Photoelectric conversion element and photodetection device
 The present disclosure relates to a photoelectric conversion element using, for example, an organic material and a photodetection device equipped with the same.
 For example, Patent Document 1 discloses an image sensor in which, in a photoelectric conversion unit in which a first electrode, a photoelectric conversion layer, and a second electrode are stacked, a composite oxide layer made of an indium-gallium-zinc composite oxide (IGZO) is provided between the first electrode and the photoelectric conversion layer to improve the photoresponsiveness.
International Publication No. 2019/035252
 Incidentally, there is a demand for improved reliability in photoelectric conversion elements and photodetection devices.
 It is desirable to provide a photoelectric conversion element and a photodetection device that can improve reliability.
 A photoelectric conversion element according to an embodiment of the present disclosure includes: a first electrode and a second electrode arranged in parallel; a third electrode arranged opposite to the first electrode and the second electrode; a photoelectric conversion layer provided between the first electrode and the second electrode and the third electrode; and a semiconductor layer provided between the first electrode and the second electrode and the photoelectric conversion layer, the semiconductor layer including a first layer and a second layer stacked in order from the first electrode and second electrode side, the thickness of the first layer being smaller than the thickness of the second layer and being 3 nm or more and 5 nm or less.
 A photodetection device according to an embodiment of the present disclosure includes a plurality of pixels each provided with one or more photoelectric conversion elements, and includes, as the photoelectric conversion element, the photoelectric conversion element according to the above embodiment of the present disclosure.
 In the photoelectric conversion element according to the embodiment of the present disclosure and the photodetection device according to the embodiment, the semiconductor layer, which includes the first layer and the second layer stacked in order from the first electrode and second electrode side and in which the thickness of the first layer is smaller than the thickness of the second layer and is 3 nm or more and 5 nm or less, is provided between the photoelectric conversion layer and the first electrode and the second electrode arranged in parallel. This reduces the influence of fixed charges on the surface of the semiconductor layer while maintaining carrier conduction within the semiconductor layer.
FIG. 1 is a schematic cross-sectional view showing an example of the configuration of an image sensor according to an embodiment of the present disclosure.
FIG. 2 is a schematic plan view showing an example of the pixel configuration of an imaging device having the image sensor shown in FIG. 1.
FIG. 3 is a schematic cross-sectional view showing an example of the configuration of the photoelectric conversion section shown in FIG. 1.
FIG. 4 is a characteristic diagram showing the relationship between the film thickness of the first layer of the semiconductor layer shown in FIG. 3 and the amount of delayed charge.
FIG. 5 is a characteristic diagram showing the relationship between the film thickness of the first layer of the semiconductor layer shown in FIG. 3 and the drain current.
FIG. 6 is a characteristic diagram showing the relationship between the film thickness of the second layer of the semiconductor layer shown in FIG. 3 and the amount of variation in the threshold voltage.
FIG. 7A is a characteristic diagram showing simulation results of the relationship between the distance from the interface between the insulating layer and the semiconductor layer shown in FIG. 3 and the carrier concentration.
FIG. 7B is an enlarged view of a portion of FIG. 7A.
FIG. 8 is an equivalent circuit diagram of the image sensor shown in FIG. 1.
FIG. 9 is a schematic diagram showing the arrangement of the lower electrode of the image sensor shown in FIG. 1 and the transistors constituting the control section.
FIG. 10 is a cross-sectional view for explaining a method of manufacturing the image sensor shown in FIG. 1.
FIG. 11 is a cross-sectional view showing a step following FIG. 10.
FIG. 12 is a cross-sectional view showing a step following FIG. 11.
FIG. 13 is a cross-sectional view showing a step following FIG. 12.
FIG. 14 is a cross-sectional view showing a step following FIG. 13.
FIG. 15 is a cross-sectional view showing a step following FIG. 14.
FIG. 16 is a timing chart showing an example of the operation of the image sensor shown in FIG. 1.
FIG. 17 is a schematic cross-sectional view showing the configuration of a photoelectric conversion section according to Modification 1 of the present disclosure.
FIG. 18 is a schematic cross-sectional view showing the configuration of a photoelectric conversion section according to Modification 2 of the present disclosure.
FIG. 19A is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 3 of the present disclosure.
FIG. 19B is a schematic plan view showing an example of the pixel configuration of an imaging device having the image sensor shown in FIG. 19A.
FIG. 20A is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 4 of the present disclosure.
FIG. 20B is a schematic plan view showing an example of the pixel configuration of an imaging device having the image sensor shown in FIG. 20A.
FIG. 21 is a schematic cross-sectional view showing an example of the configuration of an image sensor according to Modification 5 of the present disclosure.
FIG. 22 is a block diagram showing the configuration of an imaging device using the image sensor shown in FIG. 1 and the like as a pixel.
FIG. 23 is a functional block diagram showing an example of an electronic device (camera) using the imaging device shown in FIG. 22.
FIG. 24A is a schematic diagram showing an example of the overall configuration of a photodetection system using the imaging device shown in FIG. 22.
FIG. 24B is a diagram showing an example of the circuit configuration of the photodetection system shown in FIG. 24A.
FIG. 25 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
FIG. 26 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
FIG. 27 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 28 is an explanatory diagram showing an example of installation positions of an outside-vehicle information detection section and an imaging section.
Hereinafter, an embodiment of the present disclosure will be described in detail with reference to the drawings. The following description is a specific example of the present disclosure, and the present disclosure is not limited to the following embodiment. The present disclosure is also not limited to the arrangement, dimensions, dimensional ratios, and the like of the components shown in the drawings. The description will be given in the following order.
 1. Embodiment (an example of an image sensor having, between a lower electrode and a photoelectric conversion layer, a semiconductor layer consisting of two layers (a first layer and a second layer) with a predetermined thickness ratio)
   1-1. Configuration of the image sensor
   1-2. Method of manufacturing the image sensor
   1-3. Signal acquisition operation of the image sensor
   1-4. Workings and effects
 2. Modification examples
   2-1. Modification 1 (another example of the configuration of the photoelectric conversion section)
   2-2. Modification 2 (another example of the configuration of the photoelectric conversion section)
   2-3. Modification 3 (an example of an image sensor that performs spectral separation using a color filter)
   2-4. Modification 4 (another example of an image sensor that performs spectral separation using a color filter)
   2-5. Modification 5 (an example of an image sensor in which a plurality of photoelectric conversion sections are stacked)
 3. Application examples
 4. Practical application examples
<1. Embodiment>
FIG. 1 shows the cross-sectional configuration of an image sensor (image sensor 10) according to an embodiment of the present disclosure. FIG. 2 schematically shows an example of the planar configuration of the image sensor 10 shown in FIG. 1, and FIG. 1 shows a cross section taken along the line I-I shown in FIG. 2. FIG. 3 schematically shows, on an enlarged scale, an example of the cross-sectional configuration of a main part (photoelectric conversion section 20) of the image sensor 10 shown in FIG. 1. The image sensor 10 constitutes one pixel (unit pixel P) that is repeatedly arranged in an array in a pixel section 1A of an imaging device (for example, the imaging device 1; see FIG. 22) such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor used in electronic apparatuses such as digital still cameras and video cameras. In the pixel section 1A, as shown in FIG. 2, a pixel unit 1a consisting of four unit pixels P arranged in, for example, two rows by two columns serves as a repeating unit, and the pixel units 1a are repeatedly arranged in an array along the row direction and the column direction.
In the image sensor 10 of the present embodiment, in the photoelectric conversion section 20 provided on a semiconductor substrate 30, a semiconductor layer 23 consisting of a plurality of layers is provided between a lower electrode 21, which consists of a readout electrode 21A and a storage electrode 21B, and a photoelectric conversion layer 24. The semiconductor layer 23 has, for example, a configuration in which a first layer 23A and a second layer 23B are stacked in this order from the lower electrode 21 side; the thickness of the first layer 23A is smaller than the thickness of the second layer 23B and is 3 nm or more and 5 nm or less. The readout electrode 21A corresponds to a specific example of the "second electrode" of the present disclosure, and the storage electrode 21B corresponds to a specific example of the "first electrode" of the present disclosure. The first layer 23A corresponds to a specific example of the "first layer" of the present disclosure, and the second layer 23B corresponds to a specific example of the "second layer" of the present disclosure.
(1-1. Configuration of the image sensor)
The image sensor 10 is, for example, of a so-called vertical spectral separation type in which one photoelectric conversion section 20 and two photoelectric conversion regions 32B and 32R are stacked in the vertical direction. The photoelectric conversion section 20 is provided on the back surface (first surface 30A) side of the semiconductor substrate 30. The photoelectric conversion regions 32B and 32R are embedded in the semiconductor substrate 30 and are stacked in the thickness direction of the semiconductor substrate 30.
The photoelectric conversion section 20 and the photoelectric conversion regions 32B and 32R selectively detect light in mutually different wavelength ranges and perform photoelectric conversion. For example, the photoelectric conversion section 20 acquires a green (G) color signal. The photoelectric conversion regions 32B and 32R acquire blue (B) and red (R) color signals, respectively, by utilizing the difference in absorption coefficient. This allows the image sensor 10 to acquire a plurality of types of color signals in one pixel without using color filters.
Note that in the present embodiment, a case will be described in which electrons, out of the electron-hole pairs (excitons) generated by photoelectric conversion, are read out as the signal charge (a case in which an n-type semiconductor region is used as the photoelectric conversion layer). In the drawings, the "+" (plus) attached to "p" or "n" indicates that the p-type or n-type impurity concentration is high.
On the front surface (second surface 30B) of the semiconductor substrate 30, for example, floating diffusions (floating diffusion layers) FD1 (a region 36B in the semiconductor substrate 30), FD2 (a region 37C in the semiconductor substrate 30), and FD3 (a region 38C in the semiconductor substrate 30), transfer transistors Tr2 and Tr3, an amplifier transistor (modulation element) AMP, a reset transistor RST, and a selection transistor SEL are provided. A multilayer wiring layer 40 is further provided on the second surface 30B of the semiconductor substrate 30 with a gate insulating layer 33 interposed therebetween. The multilayer wiring layer 40 has, for example, a configuration in which wiring layers 41, 42, and 43 are stacked within an insulating layer 44. In the peripheral portion of the semiconductor substrate 30, that is, in a peripheral region 1B around the pixel section 1A, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, input/output terminals 116, and the like, which will be described later, are provided.
In the drawings, the first surface 30A side of the semiconductor substrate 30 is denoted as a light incident side S1, and the second surface 30B side is denoted as a wiring layer side S2.
In the photoelectric conversion section 20, the semiconductor layer 23 and the photoelectric conversion layer 24, which is formed using an organic material, are stacked in this order from the lower electrode 21 side between the lower electrode 21 and an upper electrode 25 that are arranged to face each other. As described above, the semiconductor layer 23 includes the first layer 23A and the second layer 23B stacked in this order from the lower electrode 21 side. The photoelectric conversion layer 24 includes a p-type semiconductor and an n-type semiconductor, and has a bulk heterojunction structure within the layer. The bulk heterojunction structure is a p/n junction surface formed by mixing the p-type semiconductor and the n-type semiconductor.
The photoelectric conversion section 20 further includes an insulating layer 22 between the lower electrode 21 and the semiconductor layer 23. The insulating layer 22 is provided, for example, over the entire surface of the pixel section 1A, and has an opening 22H above the readout electrode 21A constituting the lower electrode 21. The readout electrode 21A is electrically connected to the semiconductor layer 23 through this opening 22H.
Although FIG. 1 shows an example in which the semiconductor layer 23, the photoelectric conversion layer 24, and the upper electrode 25 are formed separately for each image sensor 10, the semiconductor layer 23, the photoelectric conversion layer 24, and the upper electrode 25 may be provided, for example, as continuous layers common to a plurality of the image sensors 10.
Between the first surface 30A of the semiconductor substrate 30 and the lower electrode 21, for example, an insulating layer 26 and an interlayer insulating layer 27 are stacked. The insulating layer 26 has a configuration in which a layer having fixed charges (fixed charge layer) 26A and a dielectric layer 26B having an insulating property are stacked in this order from the semiconductor substrate 30 side.
The photoelectric conversion regions 32B and 32R make it possible to separate light in the vertical direction by utilizing the fact that, in the semiconductor substrate 30 made of a silicon substrate, the wavelength of absorbed light differs depending on the incident depth of the light; each of them has a pn junction in a predetermined region of the semiconductor substrate 30.
A through electrode 34 is provided between the first surface 30A and the second surface 30B of the semiconductor substrate 30. The through electrode 34 is electrically connected to the readout electrode 21A, and the photoelectric conversion section 20 is connected, via the through electrode 34, to the gate Gamp of the amplifier transistor AMP and to one source/drain region 36B of the reset transistor RST (reset transistor Tr1rst), which also serves as the floating diffusion FD1. As a result, in the image sensor 10, carriers (here, electrons) generated in the photoelectric conversion section 20 provided on the first surface 30A side of the semiconductor substrate 30 can be transferred well to the second surface 30B side of the semiconductor substrate 30 via the through electrode 34, and the characteristics can be improved.
The lower end of the through electrode 34 is connected to a connection portion 41A, which is a wiring in the wiring layer 41, and the connection portion 41A and the gate Gamp of the amplifier transistor AMP are connected via a lower first contact 45. The connection portion 41A and the floating diffusion FD1 (region 36B) are connected, for example, via a lower second contact 46. The upper end of the through electrode 34 is connected to the readout electrode 21A via, for example, a pad portion 39A and an upper first contact 39C.
A protective layer 51 is provided above the photoelectric conversion section 20. In the protective layer 51, for example, a wiring 52 that electrically connects the upper electrode 25 and a peripheral circuit section 130 around the pixel section 1A, and a light shielding film 53 are provided. Above the protective layer 51, optical members such as a planarization layer (not shown) and an on-chip lens 54 are further disposed.
In the image sensor 10 of the present embodiment, light that enters the photoelectric conversion section 20 from the light incident side S1 is absorbed by the photoelectric conversion layer 24. The excitons generated thereby move to the interface between the electron donor and the electron acceptor constituting the photoelectric conversion layer 24 and undergo exciton separation, that is, dissociate into electrons and holes. The carriers (electrons and holes) generated here are transported to the respective electrodes by diffusion due to the difference in carrier concentration and by the internal electric field due to the difference in work function between the anode (for example, the upper electrode 25) and the cathode (for example, the lower electrode 21), and are detected as a photocurrent. The transport directions of the electrons and holes can also be controlled by applying a potential between the lower electrode 21 and the upper electrode 25.
The configuration, materials, and the like of each part will be described in detail below.
The photoelectric conversion section 20 is an organic photoelectric conversion element that absorbs light corresponding to part or all of a selective wavelength range (for example, 450 nm or more and 650 nm or less), for example green light, and generates excitons.
The lower electrode 21 is composed of, for example, the readout electrode 21A and the storage electrode 21B arranged in parallel on the interlayer insulating layer 27. The readout electrode 21A is for transferring carriers generated in the photoelectric conversion layer 24 to the floating diffusion FD1, and is provided, for example, one for each pixel unit 1a consisting of four pixels arranged in two rows by two columns.
The readout electrode 21A is connected to the floating diffusion FD1 via, for example, the upper first contact 39C, the pad portion 39A, the through electrode 34, the connection portion 41A, and the lower second contact 46.
The storage electrode 21B is for accumulating, in the semiconductor layer 23, for example electrons as the signal charge out of the carriers generated in the photoelectric conversion layer 24, and is provided for each pixel. The storage electrode 21B is provided, for each unit pixel P, in a region that directly faces the light receiving surfaces of the photoelectric conversion regions 32B and 32R formed in the semiconductor substrate 30 and covers these light receiving surfaces. The storage electrode 21B is preferably larger than the readout electrode 21A, which allows more carriers to be accumulated.
The lower electrode 21 is made of a light-transmitting conductive film, for example, ITO (indium tin oxide). As the constituent material of the lower electrode 21, in addition to ITO, a tin oxide (SnO2)-based material to which a dopant is added, or a zinc oxide-based material obtained by adding a dopant to zinc oxide (ZnO), may be used. Examples of zinc oxide-based materials include aluminum zinc oxide (AZO) to which aluminum (Al) is added as a dopant, gallium zinc oxide (GZO) to which gallium (Ga) is added, and indium zinc oxide (IZO) to which indium (In) is added. In addition, IGZO, ITZO, CuI, InSbO4, ZnMgO, CuInO2, MgIn2O4, CdO, ZnSnO3, or the like may also be used.
The insulating layer 22 is for electrically separating the storage electrode 21B and the semiconductor layer 23. The insulating layer 22 is provided, for example, on the interlayer insulating layer 27 so as to cover the lower electrode 21. The insulating layer 22 has the opening 22H above the readout electrode 21A of the lower electrode 21, and the readout electrode 21A and the semiconductor layer 23 are electrically connected through this opening 22H. The insulating layer 22 is composed of, for example, a single-layer film made of one of silicon oxide (SiOx), silicon nitride (SiNx), silicon oxynitride (SiON), and the like, or a laminated film made of two or more of them. The thickness of the insulating layer 22 is, for example, 20 nm to 500 nm.
The semiconductor layer 23 is for accumulating the carriers (electrons) generated in the photoelectric conversion layer 24. As described above, the semiconductor layer 23 is provided between the lower electrode 21 and the photoelectric conversion layer 24, and has a laminated structure in which the first layer 23A and the second layer 23B are stacked in this order from the lower electrode 21 side. The electrons generated in the photoelectric conversion layer 24 are accumulated throughout the semiconductor layer 23, starting from the vicinity of the interface between the insulating layer 22 and the first layer 23A above the storage electrode 21B, and are transferred to the readout electrode 21A via the first layer 23A. As will be described in detail later, the electrons accumulated above the storage electrode 21B are transferred to the readout electrode 21A by controlling the potential of the storage electrode 21B to generate a potential gradient, and the electrons are then transferred from the readout electrode 21A to the floating diffusion FD1.
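The charge handling described above follows an accumulate-then-transfer sequence. The following minimal Python sketch only illustrates that sequence conceptually; the class and method names, the electron count, and the simplifying assumption that all stored charge moves when the potential gradient is reversed are ours and are not taken from the present disclosure.

class StorageNode:
    """Conceptual model of the semiconductor layer 23 above the storage electrode 21B."""

    def __init__(self):
        self.accumulated_e = 0        # electrons held near the insulating layer 22 interface
        self.floating_diffusion_e = 0

    def accumulate(self, photo_electrons: int, storage_bias_attractive: bool = True):
        # While the storage electrode is biased to attract electrons, photogenerated
        # electrons collect in the semiconductor layer instead of being read out.
        if storage_bias_attractive:
            self.accumulated_e += photo_electrons

    def transfer(self) -> int:
        # Changing the storage electrode potential creates a potential gradient toward
        # the readout electrode 21A; the stored electrons move there and then on to
        # the floating diffusion FD1 via the through electrode path.
        q = self.accumulated_e
        self.accumulated_e = 0
        self.floating_diffusion_e += q
        return q

node = StorageNode()
node.accumulate(1200)   # exposure period (illustrative electron count)
print(node.transfer())  # readout period -> 1200 electrons moved toward FD1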
The first layer 23A and the second layer 23B each have a predetermined thickness. Specifically, the thickness (t1) of the first layer 23A is smaller than the thickness (t2) of the second layer 23B; for example, the ratio (t1/t2) of the thickness (t1) of the first layer 23A to the thickness (t2) of the second layer 23B is 0.16 or less.
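As a quick numerical aid, the short Python sketch below (the function name and the example values are ours, not from the disclosure) checks a candidate pair of layer thicknesses against the relations stated above: t1 < t2, t1/t2 of 0.16 or less, and t1 between 3 nm and 5 nm.

def thickness_pair_ok(t1_nm: float, t2_nm: float) -> bool:
    """Check a (first layer, second layer) thickness pair against the stated relations."""
    return (
        t1_nm < t2_nm              # first layer thinner than second layer
        and t1_nm / t2_nm <= 0.16  # example thickness ratio given in the text
        and 3.0 <= t1_nm <= 5.0    # first layer between 3 nm and 5 nm
    )

print(thickness_pair_ok(3.0, 32.0))  # True  (e.g. 3 nm / 32 nm, ratio about 0.094)
print(thickness_pair_ok(5.0, 20.0))  # False (ratio 0.25 exceeds 0.16)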
The semiconductor layer 23 (the first layer 23A and the second layer 23B) can be formed using, for example, the following materials. In the image sensor 10 of the present embodiment, electrons among the carriers generated in the photoelectric conversion layer 24 are read out as the signal charge. For this reason, the semiconductor layer 23 can be formed using an n-type oxide semiconductor material. Specific examples include IGZO (an In-Ga-Zn-O-based oxide semiconductor), ITZO (an In-Sn-Zn-O-based oxide semiconductor), ZTO (a Zn-Sn-O-based oxide semiconductor), IGZTO (an In-Ga-Zn-Sn-O-based oxide semiconductor), GTO (a Ga-Sn-O-based oxide semiconductor), and IGO (an In-Ga-O-based oxide semiconductor). In addition, materials containing AlZnO, GaZnO, InZnO, or the like, obtained by adding aluminum (Al), gallium (Ga), indium (In), or the like as a dopant to the above oxide semiconductors, as well as CuI, InSbO4, ZnMgO, CuInO2, MgIn2O4, CdO, and the like, can be used.
The first layer 23A and the second layer 23B are preferably made of at least one of the above oxide semiconductor materials; among them, an indium oxide such as IGZO is suitably used. For example, the first layer 23A is formed using an indium oxide having a higher indium content ratio than the indium oxide constituting the second layer 23B. As an example, the first layer 23A can be formed using IGZO with In:Ga:Zn = 4:2:3, and the second layer 23B can be formed using IGZO with In:Ga:Zn = 1:1:1.
The first layer 23A and the second layer 23B are, for example, both crystalline or both amorphous. Alternatively, one of the first layer 23A and the second layer 23B may be crystalline and the other amorphous. When both the first layer 23A and the second layer 23B are crystalline, the first layer 23A may have a laminated structure of an amorphous layer and a crystalline layer. Specifically, a part of the first layer 23A (an initial layer with a thickness of several nm formed when depositing the first layer 23A) may be an amorphous layer. When both the first layer 23A and the second layer 23B are formed as crystalline layers, the first layer 23A serves as a seed crystal for the second layer 23B, so that the second layer 23B can be formed with good film quality and the defect levels at the interface between the first layer 23A and the second layer 23B can be reduced. When the first layer 23A is a crystalline layer and the second layer 23B is an amorphous layer, impurities in the layer are reduced compared with the case where the layer is formed directly on the insulating layer 22, so that defect levels caused by impurities can be reduced. In addition, since inhibition of crystal growth caused by impurities is also reduced, crystallinity can be improved. Also when the first layer 23A is an amorphous layer and the second layer 23B is a crystalline layer, and when both the first layer 23A and the second layer 23B are amorphous layers, silicon impurities are reduced, so that defect levels can be reduced.
FIG. 4 shows the relationship between the film thickness of the first layer 23A and the amount of delayed charge. The amount of delayed charge is the amount of charge still being transferred after a certain time has elapsed from the start of transfer when the charge accumulated in the semiconductor layer 23 is transferred to the floating diffusion FD1; the smaller the value of the amount of delayed charge, the better the carrier conductivity. FIG. 5 shows the relationship between the film thickness of the first layer 23A and the drain current. The drain current is the current that flows to the drain when the semiconductor layer 23 is used as the channel layer of a transistor; the smaller the value of the drain current, the higher the reliability. FIG. 6 shows the relationship between the film thickness of the second layer 23B and the amount of variation in the threshold voltage. FIG. 7A shows simulation results of the relationship between the distance from the interface between the insulating layer 22 and the semiconductor layer 23 and the carrier concentration. FIG. 7B is an enlarged view of the film thickness range of 0 nm to 10 nm shown in FIG. 7A, with the vertical axis normalized.
FIG. 4 shows that good carrier conductivity can be maintained as long as the first layer 23A has a certain thickness. FIG. 5 shows that reliability is not affected as long as the thickness of the first layer 23A is several nm or less. From the results of FIGS. 4 and 5, it can be said that by setting the thickness of the first layer 23A to, for example, 5 nm or less, reliability can be improved while maintaining carrier conductivity. On the other hand, the electrons accumulated in the semiconductor layer 23 collect in the vicinity of the interface between the insulating layer 22 and the first layer 23A. The number of accumulated electrons decreases with increasing distance from the interface between the insulating layer 22 and the first layer 23A; for example, at a distance of about 3 nm from the interface, the carrier concentration decreases by about one order of magnitude. For these reasons, the thickness of the first layer 23A is set to 3 nm or more and 5 nm or less.
FIG. 6 shows that variation in the threshold voltage (Vth) is reduced by making the second layer 23B thicker than a predetermined thickness. For example, when the second layer 23B is a thin film with a thickness of 20 nm, a back channel is generated by the fixed charges on the surface of the second layer 23B and the variation in the threshold voltage (Vth) becomes large, whereas when the second layer 23B is a thick film with a thickness of 60 nm, the variation in the threshold voltage (Vth) is reduced. On the other hand, when the film thickness distribution of the carrier concentration with electrons accumulated in the second layer 23B at an actual drive voltage is simulated, the carrier (electron) concentration decreases with increasing distance from the interface between the insulating layer 22 and the semiconductor layer 23, as shown in FIG. 7A. However, if there are many electrons at the surface of the semiconductor layer 23, more electrons are captured by the fixed charges, and the variation in the threshold voltage (Vth) increases. Therefore, if the carrier concentration in the state where no voltage is applied is, for example, 1.0E+16 cm^-3, and the electron concentration at the surface when electrons are accumulated in the semiconductor layer 23 is lower than that, the amount of variation in the threshold voltage (Vth) can be regarded as being within the acceptable range. In other words, from FIG. 7A, the thickness of the semiconductor layer 23 is 35 nm or more. Here, the thickness of the first layer 23A is set to 3 nm or more and 5 nm or less as described above. Therefore, the thickness of the second layer 23B is 32 nm or more.
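The lower bound on the second layer can be reproduced with the simple arithmetic below (Python); the only inputs are the values stated in this paragraph, and the variable names are ours.

total_min_nm = 35.0        # minimum semiconductor layer 23 thickness read from FIG. 7A
first_layer_min_nm = 3.0   # lower end of the 3 nm to 5 nm range for the first layer 23A

# With the first layer at its minimum thickness, the second layer 23B must make up
# the rest of the 35 nm total, giving the 32 nm lower bound stated in the text.
second_layer_min_nm = total_min_nm - first_layer_min_nm
print(second_layer_min_nm)  # 32.0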
The photoelectric conversion layer 24 converts light energy into electrical energy. The photoelectric conversion layer 24 is configured to include, for example, two or more organic materials (p-type semiconductor materials or n-type semiconductor materials) that each function as a p-type semiconductor or an n-type semiconductor. The photoelectric conversion layer 24 has, within the layer, a junction surface (p/n junction surface) between the p-type semiconductor material and the n-type semiconductor material. The p-type semiconductor relatively functions as an electron donor, and the n-type semiconductor relatively functions as an electron acceptor. The photoelectric conversion layer 24 provides a field in which the excitons generated upon light absorption separate into electrons and holes; specifically, the excitons separate into electrons and holes at the interface between the electron donor and the electron acceptor (the p/n junction surface).
In addition to the p-type semiconductor material and the n-type semiconductor material, the photoelectric conversion layer 24 may include an organic material that photoelectrically converts light in a predetermined wavelength range while transmitting light in other wavelength ranges, that is, a so-called dye material. When the photoelectric conversion layer 24 is formed using three kinds of organic materials, namely a p-type semiconductor material, an n-type semiconductor material, and a dye material, the p-type semiconductor material and the n-type semiconductor material are preferably materials that are transparent in the visible region (for example, 450 nm to 800 nm). The thickness of the photoelectric conversion layer 24 is, for example, 50 nm to 500 nm.
Examples of the organic materials constituting the photoelectric conversion layer 24 include quinacridone derivatives, naphthalene derivatives, anthracene derivatives, phenanthrene derivatives, tetracene derivatives, pyrene derivatives, perylene derivatives, and fluoranthene derivatives. The photoelectric conversion layer 24 is composed of a combination of two or more of the above organic materials. Depending on the combination, the above organic materials function as a p-type semiconductor or an n-type semiconductor.
The organic materials constituting the photoelectric conversion layer 24 are not particularly limited. In addition to the organic materials mentioned above, for example, polymers of phenylene vinylene, fluorene, carbazole, indole, pyrene, pyrrole, picoline, thiophene, acetylene, diacetylene, and the like, or derivatives thereof, can be used. Alternatively, metal complex dyes, cyanine dyes, merocyanine dyes, phenylxanthene dyes, triphenylmethane dyes, rhodacyanine dyes, xanthene dyes, macrocyclic azaannulene dyes, azulene dyes, naphthoquinone dyes, anthraquinone dyes, fused polycyclic aromatics such as pyrene, chain compounds in which aromatic rings or heterocyclic compounds are fused, compounds having two nitrogen-containing heterocycles such as quinoline, benzothiazole, or benzoxazole linked by a squarylium group and a croconic methine group as a linking chain, cyanine-like dyes linked by a squarylium group and a croconic methine group, and the like can be used. Examples of the metal complex dyes include dithiol metal complex dyes, metal phthalocyanine dyes, metal porphyrin dyes, and ruthenium complex dyes. Among these, ruthenium complex dyes are particularly preferable, but the materials are not limited to the above.
The upper electrode 25, like the lower electrode 21, is made of a light-transmitting conductive film, for example, ITO (indium tin oxide). As the constituent material of the upper electrode 25, in addition to ITO, a tin oxide (SnO2)-based material to which a dopant is added, or a zinc oxide-based material obtained by adding a dopant to zinc oxide (ZnO), may be used. Examples of zinc oxide-based materials include aluminum zinc oxide (AZO) to which aluminum (Al) is added as a dopant, gallium zinc oxide (GZO) to which gallium (Ga) is added, and indium zinc oxide (IZO) to which indium (In) is added. In addition, IGZO, ITZO, CuI, InSbO4, ZnMgO, CuInO2, MgIn2O4, CdO, ZnSnO3, or the like may also be used. The upper electrode 25 may be separated for each pixel, or may be formed as an electrode common to the pixels. The thickness of the upper electrode 25 is, for example, 10 nm to 200 nm.
Note that in the photoelectric conversion section 20, other layers may be provided between the lower electrode 21 and the photoelectric conversion layer 24 (for example, between the semiconductor layer 23 and the photoelectric conversion layer 24) and between the photoelectric conversion layer 24 and the upper electrode 25. For example, the photoelectric conversion section 20 may have, stacked in order from the lower electrode 21 side, the semiconductor layer 23, a buffer layer that also serves as an electron blocking film, the photoelectric conversion layer 24, a buffer layer that also serves as a hole blocking film, a work function adjustment layer, and the like. The photoelectric conversion layer 24 may also have, for example, a p-i-n bulk heterostructure in which a p-type blocking layer, a layer containing a p-type semiconductor and an n-type semiconductor (i layer), and an n-type blocking layer are stacked.
The insulating layer 26 covers the first surface 30A of the semiconductor substrate 30, and serves to reduce the interface states with the semiconductor substrate 30 and to suppress the generation of dark current from the interface with the semiconductor substrate 30. The insulating layer 26 also extends from the first surface 30A of the semiconductor substrate 30 over the side surface of an opening 34H (see FIG. 11) in which the through electrode 34 penetrating the semiconductor substrate 30 is formed. The insulating layer 26 has, for example, a laminated structure of the fixed charge layer 26A and the dielectric layer 26B.
The fixed charge layer 26A may be a film having positive fixed charges or a film having negative fixed charges. The fixed charge layer 26A is preferably formed using a semiconductor material or a conductive material having a wider band gap than the semiconductor substrate 30. This makes it possible to suppress the generation of dark current at the interface of the semiconductor substrate 30. Examples of the constituent materials of the fixed charge layer 26A include hafnium oxide (HfOx), aluminum oxide (AlOx), zirconium oxide (ZrOx), tantalum oxide (TaOx), titanium oxide (TiOx), lanthanum oxide (LaOx), praseodymium oxide (PrOx), cerium oxide (CeOx), neodymium oxide (NdOx), promethium oxide (PmOx), samarium oxide (SmOx), europium oxide (EuOx), gadolinium oxide (GdOx), terbium oxide (TbOx), dysprosium oxide (DyOx), holmium oxide (HoOx), thulium oxide (TmOx), ytterbium oxide (YbOx), lutetium oxide (LuOx), yttrium oxide (YOx), hafnium nitride (HfNx), aluminum nitride (AlNx), hafnium oxynitride (HfOxNy), and aluminum oxynitride (AlOxNy).
The dielectric layer 26B is for preventing reflection of light caused by the refractive index difference between the semiconductor substrate 30 and the interlayer insulating layer 27. The constituent material of the dielectric layer 26B is preferably a material having a refractive index between that of the semiconductor substrate 30 and that of the interlayer insulating layer 27. Examples of the constituent material of the dielectric layer 26B include silicon oxide, TEOS, silicon nitride, and silicon oxynitride (SiON).
The interlayer insulating layer 27 is composed of, for example, a single-layer film made of one of silicon oxide, silicon nitride, silicon oxynitride, and the like, or a laminated film made of two or more of these.
On the interlayer insulating layer 27, a shield electrode 28 is provided together with the lower electrode 21. The shield electrode 28 is for preventing capacitive coupling between adjacent pixel units 1a; for example, it is provided around the pixel unit 1a consisting of four pixels arranged in two rows by two columns, and a fixed potential is applied to it. The shield electrode 28 further extends, within the pixel unit 1a, between pixels adjacent to each other in the row direction (Z-axis direction) and the column direction (X-axis direction).
The semiconductor substrate 30 is made of, for example, an n-type silicon (Si) substrate, and has a p-well 31 in a predetermined region.
The photoelectric conversion regions 32B and 32R are each constituted by a photodiode (PD) having a pn junction in a predetermined region of the semiconductor substrate 30, and make it possible to separate light in the vertical direction by utilizing the fact that the wavelength of light absorbed in the Si substrate differs depending on the incident depth of the light. The photoelectric conversion region 32B, for example, selectively detects blue light and accumulates signal charges corresponding to blue, and is placed at a depth at which blue light can be efficiently photoelectrically converted. The photoelectric conversion region 32R, for example, selectively detects red light and accumulates signal charges corresponding to red, and is placed at a depth at which red light can be efficiently photoelectrically converted. Note that blue (B) corresponds to, for example, a wavelength range of 450 nm to 495 nm, and red (R) corresponds to, for example, a wavelength range of 620 nm to 750 nm; it is sufficient that each of the photoelectric conversion regions 32B and 32R can detect light in part or all of the respective wavelength range.
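As a rough illustration of why placing the two regions at different depths separates blue and red, the following minimal Python sketch applies the Beer-Lambert law with representative textbook-order absorption coefficients for crystalline silicon; these coefficient values and the example depths are illustrative assumptions, not values taken from the present disclosure.

import math

# Illustrative absorption coefficients of crystalline silicon (1/cm).
# Rough textbook-order values, not data from this disclosure.
ALPHA = {"blue_450nm": 2.5e4, "red_650nm": 3.0e3}

def remaining_fraction(alpha_per_cm: float, depth_um: float) -> float:
    """Fraction of light still present at a given depth (Beer-Lambert law)."""
    return math.exp(-alpha_per_cm * depth_um * 1e-4)  # convert um to cm

for label, alpha in ALPHA.items():
    for depth in (0.5, 3.0):  # hypothetical depths of shallow and deep regions (um)
        absorbed = 1.0 - remaining_fraction(alpha, depth)
        print(f"{label}: {absorbed:.0%} absorbed within {depth} um")

With these illustrative numbers, most of the blue light is absorbed within the shallower depth while a large part of the red light survives to the deeper region, which is the behavior the vertical spectral separation relies on.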
The photoelectric conversion region 32B is configured to include, for example, a p+ region serving as a hole accumulation layer and an n region serving as an electron accumulation layer. The photoelectric conversion region 32R has, for example, a p+ region serving as a hole accumulation layer and an n region serving as an electron accumulation layer (it has a p-n-p laminated structure). The n region of the photoelectric conversion region 32B is connected to the vertical transfer transistor Tr2. The p+ region of the photoelectric conversion region 32B bends along the transfer transistor Tr2 and is connected to the p+ region of the photoelectric conversion region 32R.
The gate insulating layer 33 is composed of, for example, a single-layer film made of one of silicon oxide, silicon nitride, silicon oxynitride, and the like, or a laminated film made of two or more of these.
The through electrode 34 is provided between the first surface 30A and the second surface 30B of the semiconductor substrate 30, functions as a connector between the photoelectric conversion section 20 and both the gate Gamp of the amplifier transistor AMP and the floating diffusion FD1, and serves as a transmission path for the carriers generated in the photoelectric conversion section 20. A reset gate Grst of the reset transistor RST is disposed next to the floating diffusion FD1 (the one source/drain region 36B of the reset transistor RST). This makes it possible to reset the carriers accumulated in the floating diffusion FD1 by means of the reset transistor RST.
The pad portions 39A and 39B, the upper first contact 39C, an upper second contact 39D, the lower first contact 45, the lower second contact 46, and the wiring 52 can be formed using, for example, a doped silicon material such as PDAS (Phosphorus Doped Amorphous Silicon), or a metal material such as aluminum (Al), tungsten (W), titanium (Ti), cobalt (Co), hafnium (Hf), or tantalum (Ta).
The protective layer 51 and the on-chip lens 54 are made of a light-transmitting material, and are composed of, for example, a single-layer film made of any one of silicon oxide, silicon nitride, silicon oxynitride, and the like, or a laminated film made of two or more of them. The thickness of the protective layer 51 is, for example, 100 nm to 30000 nm.
The light shielding film 53 is provided, for example, together with the wiring 52 in the protective layer 51 so as to cover at least the region of the readout electrode 21A that is in direct contact with the semiconductor layer 23, without covering the storage electrode 21B. The light shielding film 53 can be formed using, for example, tungsten (W), aluminum (Al), or an alloy of Al and copper (Cu).
FIG. 8 is an equivalent circuit diagram of the image sensor 10 shown in FIG. 1. FIG. 9 schematically shows the arrangement of the lower electrode 21 of the image sensor 10 shown in FIG. 1 and the transistors constituting the control section.
The reset transistor RST (reset transistor TR1rst) is for resetting the carriers transferred from the photoelectric conversion section 20 to the floating diffusion FD1, and is composed of, for example, a MOS transistor. Specifically, the reset transistor TR1rst is composed of the reset gate Grst, a channel formation region 36A, and source/drain regions 36B and 36C. The reset gate Grst is connected to a reset line RST1, and the one source/drain region 36B of the reset transistor TR1rst also serves as the floating diffusion FD1. The other source/drain region 36C constituting the reset transistor TR1rst is connected to a power supply line VDD.
The amplifier transistor AMP (amplifier transistor TR1amp) is a modulation element that modulates the amount of charge generated in the photoelectric conversion section 20 into a voltage, and is composed of, for example, a MOS transistor. Specifically, the amplifier transistor AMP is composed of the gate Gamp, a channel formation region 35A, and source/drain regions 35B and 35C. The gate Gamp is connected to the readout electrode 21A and to the one source/drain region 36B (floating diffusion FD1) of the reset transistor TR1rst via the lower first contact 45, the connection portion 41A, the lower second contact 46, the through electrode 34, and the like. One source/drain region 35B shares a region with the other source/drain region 36C constituting the reset transistor TR1rst, and is connected to the power supply line VDD.
The selection transistor SEL (selection transistor TR1sel) is composed of a gate Gsel, a channel formation region 34A, and source/drain regions 34B and 34C. The gate Gsel is connected to a selection line SEL1. One source/drain region 34B shares a region with the other source/drain region 35C constituting the amplifier transistor AMP, and the other source/drain region 34C is connected to a signal line (data output line) VSL1.
The transfer transistor TR2 (transfer transistor TR2trs) is for transferring the signal charge corresponding to blue, generated and accumulated in the photoelectric conversion region 32B, to the floating diffusion FD2. Since the photoelectric conversion region 32B is formed at a deep position from the second surface 30B of the semiconductor substrate 30, the transfer transistor TR2trs of the photoelectric conversion region 32B is preferably constituted by a vertical transistor. The transfer transistor TR2trs is connected to a transfer gate line TG2. The floating diffusion FD2 is provided in the region 37C near a gate Gtrs2 of the transfer transistor TR2trs. The carriers accumulated in the photoelectric conversion region 32B are read out to the floating diffusion FD2 via a transfer channel formed along the gate Gtrs2.
The transfer transistor TR3 (transfer transistor TR3trs) is for transferring the signal charge corresponding to red, generated and accumulated in the photoelectric conversion region 32R, to the floating diffusion FD3, and is composed of, for example, a MOS transistor. The transfer transistor TR3trs is connected to a transfer gate line TG3. The floating diffusion FD3 is provided in the region 38C near a gate Gtrs3 of the transfer transistor TR3trs. The carriers accumulated in the photoelectric conversion region 32R are read out to the floating diffusion FD3 via a transfer channel formed along the gate Gtrs3.
On the second surface 30B side of the semiconductor substrate 30, a reset transistor TR2rst, an amplifier transistor TR2amp, and a selection transistor TR2sel, which constitute the control section of the photoelectric conversion region 32B, are further provided. In addition, a reset transistor TR3rst, an amplifier transistor TR3amp, and a selection transistor TR3sel, which constitute the control section of the photoelectric conversion region 32R, are provided.
The reset transistor TR2rst is composed of a gate, a channel formation region, and source/drain regions. The gate of the reset transistor TR2rst is connected to a reset line RST2, and one source/drain region of the reset transistor TR2rst is connected to the power supply line VDD. The other source/drain region of the reset transistor TR2rst also serves as the floating diffusion FD2.
The amplifier transistor TR2amp is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the other source/drain region (floating diffusion FD2) of the reset transistor TR2rst. One source/drain region constituting the amplifier transistor TR2amp shares a region with the one source/drain region constituting the reset transistor TR2rst, and is connected to the power supply line VDD.
The selection transistor TR2sel is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to a selection line SEL2. One source/drain region constituting the selection transistor TR2sel shares a region with the other source/drain region constituting the amplifier transistor TR2amp. The other source/drain region constituting the selection transistor TR2sel is connected to a signal line (data output line) VSL2.
 The reset transistor TR3rst is composed of a gate, a channel formation region, and source/drain regions. The gate of the reset transistor TR3rst is connected to the reset line RST3, and one source/drain region of the reset transistor TR3rst is connected to the power supply line VDD. The other source/drain region of the reset transistor TR3rst also serves as the floating diffusion FD3.
 The amplifier transistor TR3amp is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the other source/drain region (floating diffusion FD3) of the reset transistor TR3rst. One source/drain region of the amplifier transistor TR3amp shares a region with one source/drain region of the reset transistor TR3rst and is connected to the power supply line VDD.
 The selection transistor TR3sel is composed of a gate, a channel formation region, and source/drain regions. The gate is connected to the selection line SEL3. One source/drain region of the selection transistor TR3sel shares a region with the other source/drain region of the amplifier transistor TR3amp. The other source/drain region of the selection transistor TR3sel is connected to the signal line (data output line) VSL3.
 The reset lines RST1, RST2, and RST3, the selection lines SEL1, SEL2, and SEL3, and the transfer gate lines TG2 and TG3 are each connected to the vertical drive circuit constituting the drive circuit. The signal lines (data output lines) VSL1, VSL2, and VSL3 are connected to the column signal processing circuit 112 constituting the drive circuit.
(1-2. Manufacturing method of the image sensor)
 The image sensor 10 of the present embodiment can be manufactured, for example, as follows.
 FIGS. 10 to 15 show a method of manufacturing the image sensor 10 in the order of steps. First, as shown in FIG. 10, for example, a p-well 31 is formed in the semiconductor substrate 30, and n-type photoelectric conversion regions 32B and 32R, for example, are formed in the p-well 31. A p+ region is formed near the first surface 30A of the semiconductor substrate 30.
 On the second surface 30B of the semiconductor substrate 30, likewise as shown in FIG. 10, n+ regions to become, for example, the floating diffusions FD1 to FD3 are formed, and then a gate insulating layer 33 and a gate wiring layer 47 including the gates of the transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplifier transistor AMP, and the reset transistor RST are formed. The transfer transistor Tr2, the transfer transistor Tr3, the selection transistor SEL, the amplifier transistor AMP, and the reset transistor RST are thereby formed. Furthermore, a multilayer wiring layer 40 consisting of wiring layers 41 to 43, which include a lower first contact 45, a lower second contact 46, and a connection portion 41A, and an insulating layer 44 is formed on the second surface 30B of the semiconductor substrate 30.
 As the base body of the semiconductor substrate 30, for example, an SOI (Silicon on Insulator) substrate in which the semiconductor substrate 30, a buried oxide film (not shown), and a holding substrate (not shown) are stacked is used. Although not shown in FIG. 10, the buried oxide film and the holding substrate are joined to the first surface 30A of the semiconductor substrate 30. Annealing is performed after the ion implantation.
 Next, a support substrate (not shown), another semiconductor base, or the like is bonded onto the multilayer wiring layer 40 provided on the second surface 30B side of the semiconductor substrate 30, and the structure is turned upside down. Subsequently, the semiconductor substrate 30 is separated from the buried oxide film and the holding substrate of the SOI substrate to expose the first surface 30A of the semiconductor substrate 30. The above steps can be performed with techniques used in ordinary CMOS processes, such as ion implantation and CVD (Chemical Vapor Deposition).
 Next, as shown in FIG. 11, the semiconductor substrate 30 is processed from the first surface 30A side by, for example, dry etching to form, for example, an annular opening 34H. As shown in FIG. 11, the opening 34H penetrates from the first surface 30A to the second surface 30B of the semiconductor substrate 30 and reaches, for example, the connection portion 41A.
 Subsequently, a fixed charge layer 26A and a dielectric layer 26B, for example, are formed in this order on the first surface 30A of the semiconductor substrate 30 and on the side surface of the opening 34H. The fixed charge layer 26A can be formed, for example, by depositing a hafnium oxide film or an aluminum oxide film using an atomic layer deposition (ALD) method. The dielectric layer 26B can be formed, for example, by depositing a silicon oxide film using a plasma CVD method. Next, pad portions 39A and 39B, in which a barrier metal made of a laminated film of titanium and titanium nitride (Ti/TiN film) and a tungsten film are stacked, are formed at predetermined positions on the dielectric layer 26B. The pad portions 39A and 39B can thereby be used as a light shielding film. Thereafter, an interlayer insulating layer 27 is formed on the dielectric layer 26B and the pad portions 39A and 39B, and the surface of the interlayer insulating layer 27 is planarized by a CMP (Chemical Mechanical Polishing) method.
 Subsequently, as shown in FIG. 12, openings 27H1 and 27H2 are formed above the pad portions 39A and 39B, respectively, and these openings 27H1 and 27H2 are then filled with a conductive material such as Al to form an upper first contact 39C and an upper second contact 39D.
 Next, as shown in FIG. 13, a conductive film 21x is deposited on the interlayer insulating layer 27 by, for example, a sputtering method and is then patterned using a photolithography technique. Specifically, after a photoresist PR is formed at a predetermined position on the conductive film 21x, the conductive film 21x is processed by dry etching or wet etching. Thereafter, by removing the photoresist PR, the readout electrode 21A and the storage electrode 21B are formed as shown in FIG. 14.
 Subsequently, as shown in FIG. 15, the insulating layer 22, the semiconductor layer 23 consisting of the first layer 23A and the second layer 23B, the photoelectric conversion layer 24, and the upper electrode 25 are formed. The insulating layer 22 is formed by depositing a silicon oxide film using, for example, an ALD method and then planarizing its surface by a CMP method. Thereafter, an opening 22H is formed above the readout electrode 21A by, for example, wet etching. The semiconductor layer 23 (the first layer 23A and the second layer 23B) can be formed by, for example, a sputtering method. The photoelectric conversion layer 24 is formed by, for example, a vacuum deposition method. The upper electrode 25 is formed by, for example, a sputtering method, like the lower electrode 21. Finally, the protective layer 51 including the wiring 52 and the light shielding film 53 and the on-chip lens 54 are disposed on the upper electrode 25. Through the above steps, the image sensor 10 shown in FIG. 1 is completed.
 Note that, as described above, when other layers containing organic materials, such as a buffer layer also serving as an electron blocking film, a buffer layer also serving as a hole blocking film, or a work function adjustment layer, are formed between the semiconductor layer 23 and the photoelectric conversion layer 24 and between the photoelectric conversion layer 24 and the upper electrode 25, it is desirable to form the layers continuously in a vacuum process (an integrated vacuum process). The method of forming the photoelectric conversion layer 24 is not necessarily limited to a vacuum deposition method; for example, a spin coating technique or a printing technique may be used. Furthermore, as methods of forming the transparent electrodes (the lower electrode 21 and the upper electrode 25), besides the sputtering method and depending on the material constituting the transparent electrodes, there may be cited physical vapor deposition (PVD) methods such as vacuum deposition, reactive deposition, electron beam deposition, and ion plating, the pyrosol method, methods of thermally decomposing organometallic compounds, spray methods, dip methods, various CVD methods including MOCVD, electroless plating, and electrolytic plating.
(1-3. Signal acquisition operation of the image sensor)
 In the image sensor 10, when light enters the photoelectric conversion unit 20 via the on-chip lens 54, the light passes through the photoelectric conversion unit 20 and then the photoelectric conversion regions 32B and 32R in this order, and in the course of passing through them, light of each of the colors green (G), blue (B), and red (R) is photoelectrically converted. The signal acquisition operation for each color is described below.
(Acquisition of a green signal by the photoelectric conversion unit 20)
 Of the light incident on the image sensor 10, green light is first selectively detected (absorbed) and photoelectrically converted in the photoelectric conversion unit 20.
 The photoelectric conversion unit 20 is connected to the gate Gamp of the amplifier transistor TR1amp and to the floating diffusion FD1 via the through electrode 34. Accordingly, of the excitons generated in the photoelectric conversion unit 20, the electrons are extracted from the lower electrode 21 side, transferred to the second surface 30S2 side of the semiconductor substrate 30 via the through electrode 34, and accumulated in the floating diffusion FD1. At the same time, the amount of charge generated in the photoelectric conversion unit 20 is modulated into a voltage by the amplifier transistor TR1amp.
 A reset gate Grst of the reset transistor TR1rst is disposed next to the floating diffusion FD1. The carriers accumulated in the floating diffusion FD1 are thereby reset by the reset transistor TR1rst.
 Since the photoelectric conversion unit 20 is connected not only to the amplifier transistor TR1amp but also to the floating diffusion FD1 via the through electrode 34, the carriers accumulated in the floating diffusion FD1 can be easily reset by the reset transistor TR1rst.
 In contrast, if the through electrode 34 and the floating diffusion FD1 were not connected, it would be difficult to reset the carriers accumulated in the floating diffusion FD1, and the carriers would have to be pulled out to the upper electrode 25 side by applying a large voltage. The photoelectric conversion layer 24 might be damaged as a result. In addition, a structure that allows resetting in a short time invites an increase in dark-time noise, which is a trade-off, so such a structure is difficult to adopt.
 FIG. 16 shows an example of the operation of the image sensor 10. (A) shows the potential at the storage electrode 21B, (B) shows the potential at the floating diffusion FD1 (readout electrode 21A), and (C) shows the potential at the gate (Gsel) of the reset transistor TR1rst. In the image sensor 10, voltages are applied individually to the readout electrode 21A and the storage electrode 21B.
 In the image sensor 10, during the accumulation period, the drive circuit applies a potential V1 to the readout electrode 21A and a potential V2 to the storage electrode 21B. Here, the potentials V1 and V2 satisfy V2 > V1. As a result, carriers (signal charges; electrons) generated by photoelectric conversion are attracted to the storage electrode 21B and accumulated in the region of the semiconductor layer 23 facing the storage electrode 21B (accumulation period). Incidentally, the potential of the region of the semiconductor layer 23 facing the storage electrode 21B takes on more negative values as photoelectric conversion proceeds over time. Holes are sent out from the upper electrode 25 to the drive circuit.
 In the image sensor 10, a reset operation is performed in the latter part of the accumulation period. Specifically, at timing t1, the scanning section changes the voltage of the reset signal RST from a low level to a high level. In the unit pixel P, the reset transistor TR1rst thereby turns on; as a result, the voltage of the floating diffusion FD1 is set to the power supply voltage, and the voltage of the floating diffusion FD1 is reset (reset period).
 After the reset operation is completed, the carriers are read out. Specifically, at timing t2, the drive circuit applies a potential V3 to the readout electrode 21A and a potential V4 to the storage electrode 21B. Here, the potentials V3 and V4 satisfy V3 < V4. As a result, the carriers accumulated in the region corresponding to the storage electrode 21B are read out from the readout electrode 21A to the floating diffusion FD1. That is, the carriers accumulated in the semiconductor layer 23 are read out to the control section (transfer period).
 After the read operation is completed, the drive circuit again applies the potential V1 to the readout electrode 21A and the potential V2 to the storage electrode 21B. Carriers generated by photoelectric conversion are thereby attracted to the storage electrode 21B and accumulated in the region of the photoelectric conversion layer 24 facing the storage electrode 21B (accumulation period).
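 The sequence described above can be summarized, purely for illustration, by the following sketch. It is not part of the disclosure; it only encodes the order of the accumulation, reset, and transfer periods and the potential inequalities stated in the text (V2 > V1 and V3 < V4), without implying any concrete voltage values.

# Conceptual sketch (not part of the disclosure) of the drive sequence described above.
drive_sequence = [
    ("accumulation", "V1 on readout electrode 21A", "V2 on storage electrode 21B", "V2 > V1",
     "signal electrons collect in semiconductor layer 23 over electrode 21B"),
    ("reset",        "-", "-", "RST high at t1",
     "floating diffusion FD1 is set to the power supply voltage"),
    ("transfer",     "V3 on readout electrode 21A", "V4 on storage electrode 21B", "V3 < V4",
     "accumulated charge is read out via electrode 21A to FD1"),
    ("accumulation", "V1 on readout electrode 21A", "V2 on storage electrode 21B", "V2 > V1",
     "the next frame's charge begins to accumulate"),
]

for phase, v_read, v_store, condition, effect in drive_sequence:
    print(f"{phase:12s} | {v_read:28s} | {v_store:28s} | {condition:14s} | {effect}")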
(Acquisition of blue and red signals by the photoelectric conversion regions 32B and 32R)
 Subsequently, of the light that has passed through the photoelectric conversion unit 20, blue light is absorbed and photoelectrically converted in the photoelectric conversion region 32B, and red light in the photoelectric conversion region 32R, in that order. In the photoelectric conversion region 32B, electrons corresponding to the incident blue light are accumulated in the n region of the photoelectric conversion region 32B, and the accumulated electrons are transferred to the floating diffusion FD2 by the transfer transistor Tr2. Similarly, in the photoelectric conversion region 32R, electrons corresponding to the incident red light are accumulated in the n region of the photoelectric conversion region 32R, and the accumulated electrons are transferred to the floating diffusion FD3 by the transfer transistor Tr3.
(1-4. Operation and effects)
 In the image sensor 10 of the present embodiment, in the photoelectric conversion unit 20, the semiconductor layer 23, in which the first layer 23A and the second layer 23B are stacked in this order from the lower electrode 21 side, is provided between the lower electrode 21 consisting of the readout electrode 21A and the storage electrode 21B and the photoelectric conversion layer 24, and the thickness of the first layer 23A is made smaller than that of the second layer 23B and set to 3 nm or more and 5 nm or less. This reduces the influence of fixed charges at the surface of the semiconductor layer 23 while maintaining carrier conduction within the semiconductor layer 23. This is explained below.
 In recent years, stacked image sensors in which a plurality of photoelectric conversion units are stacked in the vertical direction have been under development as image sensors constituting CCD image sensors, CMOS image sensors, and the like. A stacked image sensor has, for example, a configuration in which two photoelectric conversion regions, each consisting of a photodiode (PD), are formed so as to be stacked within a silicon (Si) substrate, and a photoelectric conversion unit having a photoelectric conversion layer containing an organic material is provided above the Si substrate.
 A stacked image sensor requires a structure for accumulating and transferring the signal charges generated in each photoelectric conversion unit. In the photoelectric conversion unit, for example, of the pair of electrodes arranged to face each other with the photoelectric conversion layer in between, the electrode on the photoelectric conversion region side is composed of two electrodes, a first electrode and a charge storage electrode, so that the signal charges generated in the photoelectric conversion layer can be accumulated. In such an image sensor, the signal charges are temporarily accumulated above the charge storage electrode and then transferred to the floating diffusion FD in the Si substrate. This makes it possible to completely deplete the charge accumulation section and erase the carriers at the start of exposure. As a result, phenomena such as an increase in kTC noise, worsening of random noise, and degradation of captured image quality can be suppressed.
 Now, in an image sensor having a plurality of electrodes on the photoelectric conversion region side as described above, the photoresponsiveness has been improved by providing a composite oxide layer made of an indium-gallium-zinc composite oxide (IGZO) between the first electrode, which includes the charge storage electrode, and the photoelectric conversion layer, as mentioned earlier. This composite oxide layer has a two-layer structure and is provided for the purpose of preventing carriers from stagnating on their way from the photoelectric conversion layer into the composite oxide layer. However, this composite oxide layer causes deterioration in reliability. This deterioration in reliability is presumed to originate from the large number of oxygen defects in the oxide semiconductor layer (lower layer) provided on the first electrode side and from the fixed charges at the surface of the oxide semiconductor layer (upper layer) provided on the photoelectric conversion layer side.
 In contrast, in the present embodiment, the semiconductor layer 23, in which the first layer 23A and the second layer 23B are stacked in this order from the lower electrode 21 side, is provided between the lower electrode 21 consisting of the readout electrode 21A and the storage electrode 21B and the photoelectric conversion layer 24, and the thickness of the first layer 23A is made smaller than that of the second layer 23B and set to 3 nm or more and 5 nm or less. Oxygen defects in the first layer 23A are thereby reduced, and carrier conduction within the semiconductor layer 23 can be maintained. In addition, by making the second layer 23B thicker than the first layer 23A (for example, so that the ratio (t1/t2) of the thickness (t1) of the first layer 23A to the thickness (t2) of the second layer 23B is 0.16 or less), the influence of fixed charges at the surface of the semiconductor layer 23 is reduced, and the amount of variation in the threshold voltage (Vth) is reduced.
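 As a worked illustration of the dimensional relationship stated above (an example only, not an additional limitation of the disclosure), the two conditions on the first layer 23A can be combined as follows; the value t1 = 4 nm is chosen merely as one point within the stated 3 nm to 5 nm range.

\[
3\,\mathrm{nm} \le t_1 \le 5\,\mathrm{nm}, \qquad \frac{t_1}{t_2} \le 0.16 \;\Longrightarrow\; t_2 \ge \frac{t_1}{0.16}
\]
\[
\text{e.g. } t_1 = 4\,\mathrm{nm} \;\Rightarrow\; t_2 \ge \frac{4\,\mathrm{nm}}{0.16} = 25\,\mathrm{nm}.
\]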
 As described above, the reliability of the image sensor 10 of the present embodiment can be improved.
 Next, modifications (Modifications 1 to 5) of the present disclosure will be described. In the following, components similar to those of the above embodiment are denoted by the same reference numerals, and their description is omitted as appropriate.
<2. Modifications>
(2-1. Modification 1)
 FIG. 17 schematically shows a cross-sectional configuration of the main part (photoelectric conversion unit 20A) of an image sensor according to Modification 1 of the present disclosure. The photoelectric conversion unit 20A of this modification differs from the above embodiment in that a protective layer 29 is provided between the semiconductor layer 23 and the photoelectric conversion layer 24.
 The protective layer 29 prevents desorption of oxygen from the oxide semiconductor material constituting the semiconductor layer 23. Examples of the material constituting the protective layer 29 include TiO2, titanium silicide oxide (TiSiO), niobium oxide (Nb2O5), and TaOx. Even a thickness of, for example, one atomic layer is effective for the protective layer 29, and the thickness is preferably, for example, 0.5 nm or more and 10 nm or less.
 As described, in this modification, the protective layer 29 is provided between the semiconductor layer 23 and the photoelectric conversion layer 24, so that desorption of oxygen from the surface of the semiconductor layer 23 can be reduced. The generation of traps at the interface between the semiconductor layer 23 (specifically, the second layer 23B) and the photoelectric conversion layer 24 is thereby reduced. It also becomes possible to prevent signal charges (electrons) from flowing back from the semiconductor layer 23 side to the photoelectric conversion layer 24. Accordingly, in addition to the effects of the above embodiment, the decrease in reliability due to oxygen desorption can be reduced.
(2-2. Modification 2)
 FIG. 18 schematically shows a cross-sectional configuration of the main part (photoelectric conversion unit 20B) of an image sensor according to Modification 2 of the present disclosure. The photoelectric conversion unit 20B of this modification has, in addition to the configuration of the photoelectric conversion unit 20A in Modification 1 above, a third layer 23C further provided on the second layer 23B.
 In the semiconductor layer 23 of this modification, the first layer 23A, the second layer 23B, and the third layer 23C are stacked in this order from the lower electrode 21 side, and the first layer 23A and the second layer 23B have the same configurations as in the above embodiment.
 The third layer 23C suppresses oxygen vacancies in the semiconductor layer 23 and is amorphous. Like the first layer 23A and the second layer 23B, the third layer 23C can be formed using an indium oxide semiconductor, specifically IGZO or IGO. Since the bond between zinc (Zn) and oxygen (O) is weaker than that between indium (In) and oxygen, forming the third layer 23C from IGO, which contains no Zn, can further suppress oxygen vacancies in the semiconductor layer 23. The thickness of the third layer 23C is, for example, 1 nm or more and 10 nm or less. The third layer 23C corresponds to a specific example of the "third layer" of the present disclosure.
 As described, in this modification, the semiconductor layer 23 has a three-layer structure consisting of the first layer 23A, the second layer 23B, and the third layer 23C, and the protective layer 29 is further provided between the semiconductor layer 23 and the photoelectric conversion layer 24. Desorption of oxygen from the surface of the semiconductor layer 23 (specifically, the third layer 23C) can thereby be further prevented, and reliability can be further improved.
 Furthermore, the present technology can also be applied to image sensors having the following configurations.
(2-3. Modification 3)
 FIG. 19A schematically shows a cross-sectional configuration of an image sensor 10A according to Modification 3 of the present disclosure. FIG. 19B schematically shows an example of the planar configuration of the image sensor 10A shown in FIG. 19A, and FIG. 19A shows a cross section taken along the line III-III shown in FIG. 19B. The image sensor 10A is, for example, a stacked image sensor in which a photoelectric conversion region 32 and a photoelectric conversion unit 60 are stacked. In the pixel section 1A of an imaging device (for example, the imaging device 1) provided with this image sensor 10A, as in the above embodiment and as shown in FIG. 19B, a pixel unit 1a consisting of, for example, four pixels arranged in 2 rows x 2 columns serves as a repeating unit and is repeatedly arranged in an array along the row direction and the column direction.
 In the image sensor 10A of this modification, color filters 55 that selectively transmit red light (R), green light (G), and blue light (B) are provided above the photoelectric conversion unit 60 (on the light incidence side S1), one for each unit pixel P. Specifically, in the pixel unit 1a consisting of four pixels arranged in 2 rows x 2 columns, two color filters that selectively transmit green light (G) are arranged on one diagonal, and color filters that selectively transmit red light (R) and blue light (B) are arranged one each on the orthogonal diagonal. In the unit pixels (Pr, Pg, Pb) provided with the respective color filters, the corresponding color light is detected, for example, in the photoelectric conversion unit 60. That is, in the pixel section 1A, the pixels (Pr, Pg, Pb) that detect red light (R), green light (G), and blue light (B), respectively, are arranged in a Bayer pattern, as illustrated below.
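 The following is an illustrative sketch (not part of the disclosure) of the Bayer arrangement just described: the G pixels sit on one diagonal of the 2 x 2 pixel unit 1a, and R and B sit on the other; which corner holds R and which holds B is chosen arbitrarily here, since the text does not specify it.

# Illustrative sketch (not part of the disclosure) of the 2 x 2 pixel unit 1a and its tiling.
import numpy as np

bayer_unit = np.array([["Pg", "Pr"],
                       ["Pb", "Pg"]])          # G on one diagonal, R and B on the other
pixel_section = np.tile(bayer_unit, (4, 4))    # repeated along rows and columns (8 x 8 pixels)
print(pixel_section)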
 The photoelectric conversion unit 60 consists of, for example, a lower electrode 61, an insulating layer 62, a semiconductor layer 63, a photoelectric conversion layer 64, and an upper electrode 65, each of which has the same configuration as the corresponding element of the photoelectric conversion unit 20 in the above embodiment. The photoelectric conversion region 32 detects light in a wavelength range different from that of the photoelectric conversion unit 60.
 In the image sensor 10A, of the light transmitted through the color filters 55, light in the visible range (red light (R), green light (G), and blue light (B)) is absorbed in the photoelectric conversion unit 60 of the unit pixel (Pr, Pg, Pb) provided with the corresponding color filter, while other light, for example light in the infrared range (for example, 700 nm or more and 1000 nm or less) (infrared light (IR)), passes through the photoelectric conversion unit 60. The infrared light (IR) that has passed through the photoelectric conversion unit 60 is detected in the photoelectric conversion region 32 of each unit pixel Pr, Pg, Pb, and a signal charge corresponding to the infrared light (IR) is generated in each unit pixel Pr, Pg, Pb. That is, the imaging device 1 including the image sensor 10A can simultaneously generate both a visible light image and an infrared light image.
(2-4. Modification 4)
 FIG. 20A schematically shows a cross-sectional configuration of an image sensor 10B according to Modification 4 of the present disclosure. FIG. 20B schematically shows an example of the planar configuration of the image sensor 10B shown in FIG. 20A, and FIG. 20A shows a cross section taken along the line IV-IV shown in FIG. 20B. In Modification 3 above, an example was shown in which the color filters 55 that selectively transmit red light (R), green light (G), and blue light (B) are provided above the photoelectric conversion unit 60 (on the light incidence side S1); however, the color filters 55 may instead be provided between the photoelectric conversion region 32 and the photoelectric conversion unit 60, for example as shown in FIG. 20A.
 In the image sensor 10B, for example, the color filters 55 have a configuration in which, within the pixel unit 1a, a color filter that selectively transmits at least red light (R) (color filter 55R) and a color filter that selectively transmits at least blue light (B) (color filter 55B) are arranged on a diagonal with respect to each other. The photoelectric conversion unit 60 (photoelectric conversion layer 64) is configured to selectively absorb wavelengths corresponding to green light, for example, as in the above embodiment. This makes it possible to acquire signals corresponding to R, G, and B in the photoelectric conversion regions (photoelectric conversion regions 32R, 32G) respectively disposed below the photoelectric conversion unit 60 and the color filters 55R and 55B. In the image sensor 10B of this modification, the area of each of the R, G, and B photoelectric conversion portions can be made larger than in an image sensor having an ordinary Bayer arrangement, so the S/N ratio can be improved.
(2-5. Modification 5)
 FIG. 21 schematically shows a cross-sectional configuration of an image sensor 10C according to Modification 5 of the present disclosure. The image sensor 10C of this modification has two photoelectric conversion units 20 and 80 and one photoelectric conversion region 32 stacked in the vertical direction.
 The photoelectric conversion units 20 and 80 and the photoelectric conversion region 32 selectively detect light in mutually different wavelength ranges and perform photoelectric conversion. For example, the photoelectric conversion unit 20 acquires a green (G) color signal, the photoelectric conversion unit 80 acquires a blue (B) color signal, and the photoelectric conversion region 32 acquires a red (R) color signal. The image sensor 10C can thereby acquire a plurality of types of color signals in a single pixel without using color filters.
 The photoelectric conversion unit 80 is stacked, for example, above the photoelectric conversion unit 20 and, like the photoelectric conversion unit 20, has a configuration in which a lower electrode 81, a semiconductor layer 83 including, for example, a first semiconductor layer 83A and a second semiconductor layer 83B, a photoelectric conversion layer 84, and an upper electrode 85 are stacked in this order from the first surface 30A side of the semiconductor substrate 30. Like the lower electrode 21 of the photoelectric conversion unit 20, the lower electrode 81 is composed of a readout electrode 81A and a storage electrode 81B, which are electrically isolated from each other by an insulating layer 82. The insulating layer 82 has an opening 82H above the readout electrode 81A. An interlayer insulating layer 87 is provided between the photoelectric conversion unit 80 and the photoelectric conversion unit 20.
 A through electrode 88, which penetrates the interlayer insulating layer 87 and the photoelectric conversion unit 20 and is electrically connected to the readout electrode 21A of the photoelectric conversion unit 20, is connected to the readout electrode 81A. Furthermore, the readout electrode 81A is electrically connected via the through electrodes 34 and 88 to the floating diffusion FD provided in the semiconductor substrate 30 and can temporarily store the carriers generated in the photoelectric conversion layer 84. The readout electrode 81A is also electrically connected via the through electrodes 34 and 88 to the amplifier transistor AMP and the like provided on the semiconductor substrate 30.
<3. Application examples>
(Application example 1)
 FIG. 22 shows an example of the overall configuration of an imaging device (imaging device 1) including the image sensor (for example, the image sensor 10) shown in FIG. 1 and elsewhere.
 The imaging device 1 is, for example, a CMOS image sensor that takes in incident light (image light) from a subject through an optical lens system (not shown), converts the amount of the incident light imaged on the imaging surface into an electric signal on a pixel-by-pixel basis, and outputs it as pixel signals. The imaging device 1 has the pixel section 1A as an imaging area on the semiconductor substrate 30 and, in the peripheral region of the pixel section 1A, has, for example, a vertical drive circuit 111, a column signal processing circuit 112, a horizontal drive circuit 113, an output circuit 114, a control circuit 115, and an input/output terminal 116.
 The pixel section 1A has, for example, a plurality of unit pixels P arranged two-dimensionally in a matrix. For the unit pixels P, for example, a pixel drive line Lread (specifically, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column. The pixel drive line Lread transmits drive signals for reading out signals from the pixels. One end of each pixel drive line Lread is connected to the output terminal of the vertical drive circuit 111 corresponding to the associated row.
 The vertical drive circuit 111 is a pixel driving section that is composed of a shift register, an address decoder, and the like and drives the unit pixels P of the pixel section 1A, for example, row by row. The signals output from the unit pixels P in the pixel row selectively scanned by the vertical drive circuit 111 are supplied to the column signal processing circuit 112 through the respective vertical signal lines Lsig. The column signal processing circuit 112 is composed of amplifiers, horizontal selection switches, and the like provided for each vertical signal line Lsig.
 The horizontal drive circuit 113 is composed of a shift register, an address decoder, and the like and sequentially drives the horizontal selection switches of the column signal processing circuit 112 while scanning them. By this selective scanning by the horizontal drive circuit 113, the signals of the pixels transmitted through the respective vertical signal lines Lsig are sequentially output to a horizontal signal line 121 and transmitted to the outside of the semiconductor substrate 30 through the horizontal signal line 121.
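 The readout flow described in the preceding paragraphs can be modeled, purely conceptually, as follows. This sketch is not part of the disclosure and does not represent the actual circuits; it only shows the order in which rows are selected, columns are received in parallel, and samples leave the chip serially. The function name and data types are hypothetical.

# Conceptual model (not part of the disclosure) of row-by-row selection, parallel column
# reception over the vertical signal lines Lsig, and serial output over line 121.
def read_out_frame(pixel_array):
    """pixel_array: 2-D list of pixel values (rows x columns)."""
    serial_output = []                      # what appears on the horizontal signal line 121
    for row in pixel_array:                 # vertical drive circuit 111: selects one row at a time
        column_samples = list(row)          # column signal processing circuit 112: one sample per Lsig
        for sample in column_samples:       # horizontal drive circuit 113: scans the columns in turn
            serial_output.append(sample)
        # the next row is selected after the horizontal scan completes
    return serial_output

# Example: a 2 x 3 pixel array read out in row-major order
print(read_out_frame([[11, 12, 13],
                      [21, 22, 23]]))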
 The output circuit 114 performs signal processing on the signals sequentially supplied from the column signal processing circuits 112 via the horizontal signal line 121 and outputs the processed signals. The output circuit 114 may, for example, perform only buffering, or may perform black level adjustment, column variation correction, various kinds of digital signal processing, and the like.
 The circuit portion consisting of the vertical drive circuit 111, the column signal processing circuit 112, the horizontal drive circuit 113, the horizontal signal line 121, and the output circuit 114 may be formed directly on the semiconductor substrate 30, or may be provided in an external control IC. These circuit portions may also be formed on another substrate connected by a cable or the like.
 The control circuit 115 receives a clock supplied from outside the semiconductor substrate 30, data instructing the operation mode, and the like, and outputs data such as internal information of the imaging device 1. The control circuit 115 further has a timing generator that generates various timing signals, and controls the driving of peripheral circuits such as the vertical drive circuit 111, the column signal processing circuit 112, and the horizontal drive circuit 113 on the basis of the various timing signals generated by the timing generator.
 The input/output terminal 116 exchanges signals with the outside.
(Application example 2)
 The imaging device 1 described above can be applied to various electronic apparatuses, for example, imaging systems such as digital still cameras and digital video cameras, mobile phones having an imaging function, and other apparatuses having an imaging function.
 FIG. 23 is a block diagram showing an example of the configuration of an electronic apparatus 1000.
 As shown in FIG. 23, the electronic apparatus 1000 includes an optical system 1001, the imaging device 1, and a DSP (Digital Signal Processor) 1002; the DSP 1002, a memory 1003, a display device 1004, a recording device 1005, an operation system 1006, and a power supply system 1007 are connected via a bus 1008, and the apparatus can capture still images and moving images.
 The optical system 1001 is configured with one or more lenses and takes in incident light (image light) from a subject to form an image on the imaging surface of the imaging device 1.
 The imaging device 1 described above is used as the imaging device 1. The imaging device 1 converts the amount of incident light imaged on the imaging surface by the optical system 1001 into an electric signal on a pixel-by-pixel basis and supplies it to the DSP 1002 as pixel signals.
 The DSP 1002 performs various kinds of signal processing on the signal from the imaging device 1 to obtain an image and temporarily stores the image data in the memory 1003. The image data stored in the memory 1003 is recorded in the recording device 1005 or supplied to the display device 1004 to display the image. The operation system 1006 receives various operations from the user and supplies operation signals to the blocks of the electronic apparatus 1000, and the power supply system 1007 supplies the power necessary for driving the blocks of the electronic apparatus 1000.
(Application example 3)
 FIG. 24A schematically shows an example of the overall configuration of a photodetection system 2000 including the imaging device 1. FIG. 24B shows an example of the circuit configuration of the photodetection system 2000. The photodetection system 2000 includes a light emitting device 2001 as a light source section that emits infrared light L2 and a photodetection device 2002 as a light receiving section having photoelectric conversion elements. The imaging device 1 described above can be used as the photodetection device 2002. The photodetection system 2000 may further include a system control section 2003, a light source drive section 2004, a sensor control section 2005, a light-source-side optical system 2006, and a camera-side optical system 2007.
 The photodetection device 2002 can detect light L1 and light L2. The light L1 is ambient light from the outside that has been reflected by a subject (measurement object) 2100 (FIG. 24A). The light L2 is light that is emitted by the light emitting device 2001 and then reflected by the subject 2100. The light L1 is, for example, visible light, and the light L2 is, for example, infrared light. The light L1 can be detected in the photoelectric conversion unit of the photodetection device 2002, and the light L2 can be detected in the photoelectric conversion region of the photodetection device 2002. Image information on the subject 2100 can be obtained from the light L1, and information on the distance between the subject 2100 and the photodetection system 2000 can be obtained from the light L2. The photodetection system 2000 can be mounted on, for example, an electronic apparatus such as a smartphone or a moving body such as a car. The light emitting device 2001 can be configured with, for example, a semiconductor laser, a surface-emitting semiconductor laser, or a vertical cavity surface emitting laser (VCSEL). As a method by which the photodetection device 2002 detects the light L2 emitted from the light emitting device 2001, an iTOF method, for example, can be adopted, but the method is not limited to this. In the iTOF method, the photoelectric conversion unit can measure the distance to the subject 2100 by, for example, time-of-flight (TOF). As the method by which the photodetection device 2002 detects the light L2 emitted from the light emitting device 2001, a structured light method or a stereo vision method can also be adopted, for example. In the structured light method, the distance between the photodetection system 2000 and the subject 2100 can be measured by projecting light of a predetermined pattern onto the subject 2100 and analyzing the degree of distortion of the pattern. In the stereo vision method, the distance between the photodetection system 2000 and the subject can be measured by, for example, using two or more cameras and acquiring two or more images of the subject 2100 viewed from two or more different viewpoints. The light emitting device 2001 and the photodetection device 2002 can be synchronously controlled by the system control section 2003.
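 For reference, the disclosure only names time-of-flight as the measurement principle; the standard round-trip relation shown below is a general illustration of it and is not taken from the disclosure itself.

\[
d = \frac{c \,\Delta t}{2},
\]
where \(d\) is the distance to the subject 2100, \(c\) is the speed of light, and \(\Delta t\) is the measured round-trip time of the light L2 between its emission by the light emitting device 2001 and its detection by the photodetection device 2002.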
<4. Practical application examples>
(Example of application to an endoscopic surgery system)
 The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
 FIG. 25 is a diagram showing an example of the schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
 FIG. 25 shows an operator (surgeon) 11131 performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000. As illustrated, the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical tools 11110 such as a pneumoperitoneum tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
 The endoscope 11100 is composed of a lens barrel 11101, a region of predetermined length from the distal end of which is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
 An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; the light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated through the objective lens toward an observation target in the body cavity of the patient 11132. The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
 An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is focused onto the image sensor by the optical system. The observation light is photoelectrically converted by the image sensor, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
 The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and a display device 11202. Furthermore, the CCU 11201 receives the image signal from the camera head 11102 and performs on the image signal various kinds of image processing, such as development processing (demosaic processing), for displaying an image based on the image signal.
 表示装置11202は、CCU11201からの制御により、当該CCU11201によって画像処理が施された画像信号に基づく画像を表示する。 The display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under control from the CCU 11201.
 光源装置11203は、例えばLED(light emitting diode)等の光源から構成され、術部等を撮影する際の照射光を内視鏡11100に供給する。 The light source device 11203 is composed of a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 11100 when photographing the surgical site or the like.
 入力装置11204は、内視鏡手術システム11000に対する入力インタフェースである。ユーザは、入力装置11204を介して、内視鏡手術システム11000に対して各種の情報の入力や指示入力を行うことができる。例えば、ユーザは、内視鏡11100による撮像条件(照射光の種類、倍率及び焦点距離等)を変更する旨の指示等を入力する。 The input device 11204 is an input interface for the endoscopic surgery system 11000. The user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204. For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100.
 The treatment tool control device 11205 controls driving of the energy treatment tool 11112 for cauterizing or incising tissue, sealing blood vessels, and the like. The pneumoperitoneum device 11206 feeds gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, in order to secure a field of view for the endoscope 11100 and a working space for the operator. The recorder 11207 is a device capable of recording various types of information relating to the surgery. The printer 11208 is a device capable of printing various types of information relating to the surgery in various formats such as text, images, or graphs.
 Note that the light source device 11203, which supplies irradiation light to the endoscope 11100 when the surgical site is imaged, can be configured as a white light source composed of, for example, an LED, a laser light source, or a combination thereof. When the white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to irradiate the observation target with laser light from each of the RGB laser light sources in a time-division manner and to control driving of the image sensor of the camera head 11102 in synchronization with the irradiation timing, thereby capturing images corresponding to R, G, and B in a time-division manner. According to this method, a color image can be obtained without providing a color filter on the image sensor.
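 As a rough illustration of the time-division color capture just described, the following sketch assembles one color frame from three successive monochrome exposures taken while the R, G, and B lasers are fired in turn. It is only a conceptual example; the frame shapes, the capture() helper, and the fixed R-G-B firing order are assumptions introduced for illustration and are not part of this disclosure.

    import numpy as np

    def compose_color_frame(frame_r, frame_g, frame_b):
        """Stack three monochrome exposures (one per laser color) into an RGB image.

        Each input is a 2-D array captured while only the corresponding laser
        was lit, so no color filter is needed on the sensor itself.
        """
        assert frame_r.shape == frame_g.shape == frame_b.shape
        return np.stack([frame_r, frame_g, frame_b], axis=-1)

    # Hypothetical usage: capture() is assumed to return one monochrome frame
    # synchronized with the currently lit laser.
    # color = compose_color_frame(capture("R"), capture("G"), capture("B"))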
 Further, driving of the light source device 11203 may be controlled so that the intensity of the output light is changed at predetermined time intervals. By controlling driving of the image sensor of the camera head 11102 in synchronization with the timing of the light intensity changes to acquire images in a time-division manner and combining those images, an image with a high dynamic range free of blocked-up shadows and blown-out highlights can be generated.
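 A minimal sketch of the kind of high-dynamic-range compositing described here is shown below, assuming two frames of the same scene captured under low and high illumination intensity with a known intensity ratio. The weighting scheme, the normalized [0, 1] pixel scale, and the ratio value are illustrative assumptions only, not a description of the actual processing in the CCU 11201.

    import numpy as np

    def fuse_hdr(frame_low, frame_high, gain_ratio=4.0, sat_level=0.95):
        """Combine a low-intensity and a high-intensity exposure of the same scene.

        Pixels that are saturated in the brighter frame are taken from the darker
        frame (scaled by the known illumination ratio), avoiding blown-out
        highlights while keeping shadow detail.
        """
        frame_low = frame_low.astype(np.float32)
        frame_high = frame_high.astype(np.float32)
        saturated = frame_high >= sat_level
        fused = np.where(saturated, frame_low * gain_ratio, frame_high)
        return fused / gain_ratio  # normalize back to a common scale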
 Further, the light source device 11203 may be configured to be able to supply light in a predetermined wavelength band compatible with special light observation. In special light observation, for example, so-called narrow band imaging is performed, in which predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast by irradiating light of a narrower band than the irradiation light used in normal observation (that is, white light), utilizing the wavelength dependence of light absorption in body tissue. Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained from fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue can be irradiated with excitation light and the fluorescence from the body tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally injected into the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 11203 can be configured to be able to supply narrow band light and/or excitation light compatible with such special light observation.
 図26は、図25に示すカメラヘッド11102及びCCU11201の機能構成の一例を示すブロック図である。 FIG. 26 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 25.
 カメラヘッド11102は、レンズユニット11401と、撮像部11402と、駆動部11403と、通信部11404と、カメラヘッド制御部11405と、を有する。CCU11201は、通信部11411と、画像処理部11412と、制御部11413と、を有する。カメラヘッド11102とCCU11201とは、伝送ケーブル11400によって互いに通信可能に接続されている。 The camera head 11102 includes a lens unit 11401, an imaging section 11402, a driving section 11403, a communication section 11404, and a camera head control section 11405. The CCU 11201 includes a communication section 11411, an image processing section 11412, and a control section 11413. Camera head 11102 and CCU 11201 are communicably connected to each other by transmission cable 11400.
 レンズユニット11401は、鏡筒11101との接続部に設けられる光学系である。鏡筒11101の先端から取り込まれた観察光は、カメラヘッド11102まで導光され、当該レンズユニット11401に入射する。レンズユニット11401は、ズームレンズ及びフォーカスレンズを含む複数のレンズが組み合わされて構成される。 The lens unit 11401 is an optical system provided at the connection part with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
 撮像部11402を構成する撮像素子は、1つ(いわゆる単板式)であってもよいし、複数(いわゆる多板式)であってもよい。撮像部11402が多板式で構成される場合には、例えば各撮像素子によってRGBそれぞれに対応する画像信号が生成され、それらが合成されることによりカラー画像が得られてもよい。あるいは、撮像部11402は、3D(dimensional)表示に対応する右目用及び左目用の画像信号をそれぞれ取得するための1対の撮像素子を有するように構成されてもよい。3D表示が行われることにより、術者11131は術部における生体組織の奥行きをより正確に把握することが可能になる。なお、撮像部11402が多板式で構成される場合には、各撮像素子に対応して、レンズユニット11401も複数系統設けられ得る。 The imaging element configuring the imaging unit 11402 may be one (so-called single-plate type) or multiple (so-called multi-plate type). When the imaging unit 11402 is configured with a multi-plate type, for example, image signals corresponding to RGB are generated by each imaging element, and a color image may be obtained by combining them. Alternatively, the imaging unit 11402 may be configured to include a pair of imaging elements for respectively acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. By performing 3D display, the operator 11131 can more accurately grasp the depth of the living tissue at the surgical site. Note that when the imaging section 11402 is configured with a multi-plate type, a plurality of lens units 11401 may be provided corresponding to each imaging element.
 また、撮像部11402は、必ずしもカメラヘッド11102に設けられなくてもよい。例えば、撮像部11402は、鏡筒11101の内部に、対物レンズの直後に設けられてもよい。 Furthermore, the imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
 駆動部11403は、アクチュエータによって構成され、カメラヘッド制御部11405からの制御により、レンズユニット11401のズームレンズ及びフォーカスレンズを光軸に沿って所定の距離だけ移動させる。これにより、撮像部11402による撮像画像の倍率及び焦点が適宜調整され得る。 The drive unit 11403 is constituted by an actuator, and moves the zoom lens and focus lens of the lens unit 11401 by a predetermined distance along the optical axis under control from the camera head control unit 11405. Thereby, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
 通信部11404は、CCU11201との間で各種の情報を送受信するための通信装置によって構成される。通信部11404は、撮像部11402から得た画像信号をRAWデータとして伝送ケーブル11400を介してCCU11201に送信する。 The communication unit 11404 is configured by a communication device for transmitting and receiving various information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 to the CCU 11201 via the transmission cable 11400 as RAW data.
 Furthermore, the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies it to the camera head control unit 11405. The control signal includes information relating to imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
 Note that the imaging conditions described above, such as the frame rate, exposure value, magnification, and focus, may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, the endoscope 11100 is equipped with a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function.
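 The auto-exposure behavior mentioned here can be pictured as a simple feedback loop that nudges the exposure value toward a target mean brightness computed from the acquired image signal. The sketch below is a hedged illustration only; the target level, step size, and limits are arbitrary assumptions and are not taken from this disclosure.

    def update_exposure(current_ev, mean_brightness,
                        target=0.45, gain=2.0, ev_min=-6.0, ev_max=6.0):
        """Return a new exposure value that moves the frame's mean brightness
        toward the target level (brightness assumed normalized to [0, 1])."""
        error = target - mean_brightness
        new_ev = current_ev + gain * error
        return max(ev_min, min(ev_max, new_ev))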
 カメラヘッド制御部11405は、通信部11404を介して受信したCCU11201からの制御信号に基づいて、カメラヘッド11102の駆動を制御する。 The camera head control unit 11405 controls the drive of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
 通信部11411は、カメラヘッド11102との間で各種の情報を送受信するための通信装置によって構成される。通信部11411は、カメラヘッド11102から、伝送ケーブル11400を介して送信される画像信号を受信する。 The communication unit 11411 is configured by a communication device for transmitting and receiving various information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
 また、通信部11411は、カメラヘッド11102に対して、カメラヘッド11102の駆動を制御するための制御信号を送信する。画像信号や制御信号は、電気通信や光通信等によって送信することができる。 Furthermore, the communication unit 11411 transmits a control signal for controlling the drive of the camera head 11102 to the camera head 11102. The image signal and control signal can be transmitted by electrical communication, optical communication, or the like.
 画像処理部11412は、カメラヘッド11102から送信されたRAWデータである画像信号に対して各種の画像処理を施す。 The image processing unit 11412 performs various image processing on the image signal, which is RAW data, transmitted from the camera head 11102.
 制御部11413は、内視鏡11100による術部等の撮像、及び、術部等の撮像により得られる撮像画像の表示に関する各種の制御を行う。例えば、制御部11413は、カメラヘッド11102の駆動を制御するための制御信号を生成する。 The control unit 11413 performs various controls related to the imaging of the surgical site etc. by the endoscope 11100 and the display of the captured image obtained by imaging the surgical site etc. For example, the control unit 11413 generates a control signal for controlling the drive of the camera head 11102.
 Furthermore, the control unit 11413 causes the display device 11202 to display a captured image showing the surgical site or the like based on the image signal that has undergone image processing by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the shapes and colors of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist generated during use of the energy treatment tool 11112, and the like. When displaying the captured image on the display device 11202, the control unit 11413 may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced and the operator 11131 can proceed with the surgery reliably.
 カメラヘッド11102及びCCU11201を接続する伝送ケーブル11400は、電気信号の通信に対応した電気信号ケーブル、光通信に対応した光ファイバ、又はこれらの複合ケーブルである。 The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with electrical signal communication, an optical fiber compatible with optical communication, or a composite cable thereof.
 ここで、図示する例では、伝送ケーブル11400を用いて有線で通信が行われていたが、カメラヘッド11102とCCU11201との間の通信は無線で行われてもよい。 Here, in the illustrated example, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
 以上、本開示に係る技術が適用され得る内視鏡手術システムの一例について説明した。本開示に係る技術は、以上説明した構成のうち、撮像部11402に適用され得る。撮像部11402に本開示に係る技術を適用することにより、検出精度が向上する。 An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above. By applying the technology according to the present disclosure to the imaging unit 11402, detection accuracy is improved.
 なお、ここでは、一例として内視鏡手術システムについて説明したが、本開示に係る技術は、その他、例えば、顕微鏡手術システム等に適用されてもよい。 Although an endoscopic surgery system has been described here as an example, the technology according to the present disclosure may be applied to other systems, such as a microsurgical system.
(Example of application to a mobile body)
 The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility vehicle, an airplane, a drone, a ship, a robot, a construction machine, or an agricultural machine (tractor).
 図27は、本開示に係る技術が適用され得る移動体制御システムの一例である車両制御システムの概略的な構成例を示すブロック図である。 FIG. 27 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology according to the present disclosure can be applied.
 車両制御システム12000は、通信ネットワーク12001を介して接続された複数の電子制御ユニットを備える。図27に示した例では、車両制御システム12000は、駆動系制御ユニット12010、ボディ系制御ユニット12020、車外情報検出ユニット12030、車内情報検出ユニット12040、及び統合制御ユニット12050を備える。また、統合制御ユニット12050の機能構成として、マイクロコンピュータ12051、音声画像出力部12052、及び車載ネットワークI/F(interface)12053が図示されている。 The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 27, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050. Further, as the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output section 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
 The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
 ボディ系制御ユニット12020は、各種プログラムにしたがって車体に装備された各種装置の動作を制御する。例えば、ボディ系制御ユニット12020は、キーレスエントリシステム、スマートキーシステム、パワーウィンドウ装置、あるいは、ヘッドランプ、バックランプ、ブレーキランプ、ウィンカー又はフォグランプ等の各種ランプの制御装置として機能する。この場合、ボディ系制御ユニット12020には、鍵を代替する携帯機から発信される電波又は各種スイッチの信号が入力され得る。ボディ系制御ユニット12020は、これらの電波又は信号の入力を受け付け、車両のドアロック装置、パワーウィンドウ装置、ランプ等を制御する。 The body system control unit 12020 controls the operations of various devices installed in the vehicle body according to various programs. For example, the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, or a fog lamp. In this case, radio waves transmitted from a portable device that replaces a key or signals from various switches may be input to the body control unit 12020. The body system control unit 12020 receives input of these radio waves or signals, and controls the door lock device, power window device, lamp, etc. of the vehicle.
 The vehicle exterior information detection unit 12030 detects information outside the vehicle in which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image of the exterior of the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform processing for detecting objects such as people, vehicles, obstacles, signs, or characters on the road surface, or may perform distance detection processing.
 撮像部12031は、光を受光し、その光の受光量に応じた電気信号を出力する光センサである。撮像部12031は、電気信号を画像として出力することもできるし、測距の情報として出力することもできる。また、撮像部12031が受光する光は、可視光であっても良いし、赤外線等の非可視光であっても良い。 The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of received light. The imaging unit 12031 can output the electrical signal as an image or as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or non-visible light such as infrared rays.
 The in-vehicle information detection unit 12040 detects information about the interior of the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the in-vehicle information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
 The microcomputer 12051 calculates control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation for the vehicle, following travel based on the inter-vehicle distance, vehicle speed maintenance travel, collision warning for the vehicle, and lane departure warning for the vehicle.
 In addition, the microcomputer 12051 can perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the in-vehicle information detection unit 12040.
 Furthermore, the microcomputer 12051 can output control commands to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
 音声画像出力部12052は、車両の搭乗者又は車外に対して、視覚的又は聴覚的に情報を通知することが可能な出力装置へ音声及び画像のうちの少なくとも一方の出力信号を送信する。図27の例では、出力装置として、オーディオスピーカ12061、表示部12062及びインストルメントパネル12063が例示されている。表示部12062は、例えば、オンボードディスプレイ及びヘッドアップディスプレイの少なくとも一つを含んでいてもよい。 The audio and image output unit 12052 transmits an output signal of at least one of audio and images to an output device that can visually or audibly notify information to the occupants of the vehicle or to the outside of the vehicle. In the example of FIG. 27, an audio speaker 12061, a display section 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
 図28は、撮像部12031の設置位置の例を示す図である。 FIG. 28 is a diagram showing an example of the installation position of the imaging section 12031.
 図28では、撮像部12031として、撮像部12101,12102,12103,12104,12105を有する。 In FIG. 28, the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
 撮像部12101,12102,12103,12104,12105は、例えば、車両12100のフロントノーズ、サイドミラー、リアバンパ、バックドア及び車室内のフロントガラスの上部等の位置に設けられる。フロントノーズに備えられる撮像部12101及び車室内のフロントガラスの上部に備えられる撮像部12105は、主として車両12100の前方の画像を取得する。サイドミラーに備えられる撮像部12102,12103は、主として車両12100の側方の画像を取得する。リアバンパ又はバックドアに備えられる撮像部12104は、主として車両12100の後方の画像を取得する。車室内のフロントガラスの上部に備えられる撮像部12105は、主として先行車両又は、歩行者、障害物、信号機、交通標識又は車線等の検出に用いられる。 The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle 12100. An imaging unit 12101 provided in the front nose and an imaging unit 12105 provided above the windshield inside the vehicle mainly acquire images in front of the vehicle 12100. Imaging units 12102 and 12103 provided in the side mirrors mainly capture images of the sides of the vehicle 12100. An imaging unit 12104 provided in the rear bumper or back door mainly captures images of the rear of the vehicle 12100. The imaging unit 12105 provided above the windshield inside the vehicle is mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
 Note that FIG. 28 shows an example of the imaging ranges of the imaging units 12101 to 12104. An imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and an imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above can be obtained.
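 The overhead image mentioned above is obtained by projecting each camera's image onto a common ground plane and blending the projections. The sketch below assumes the per-camera top-view projections have already been computed (for example, by a perspective warp calibrated per camera) and simply averages them where they overlap; the inputs and validity masks are assumptions introduced only for illustration.

    import numpy as np

    def blend_top_views(warped_views, masks):
        """Average pre-warped top-view images where their valid regions overlap.

        warped_views: list of HxWx3 float arrays already projected to the ground plane.
        masks:        list of HxW boolean arrays marking valid pixels of each view.
        """
        acc = np.zeros_like(warped_views[0], dtype=np.float32)
        count = np.zeros(masks[0].shape, dtype=np.float32)
        for view, mask in zip(warped_views, masks):
            acc[mask] += view[mask]
            count[mask] += 1.0
        count = np.maximum(count, 1.0)          # avoid division by zero
        return acc / count[..., None]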
 撮像部12101ないし12104の少なくとも1つは、距離情報を取得する機能を有していてもよい。例えば、撮像部12101ないし12104の少なくとも1つは、複数の撮像素子からなるステレオカメラであってもよいし、位相差検出用の画素を有する撮像素子であってもよい。 At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of image sensors, or may be an image sensor having pixels for phase difference detection.
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the three-dimensional object that is closest on the traveling path of the vehicle 12100 and that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, it is possible to perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without relying on the driver's operation.
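 A conceptual sketch of the preceding-vehicle extraction described above is given below: from per-object distance measurements taken at two instants, it derives the relative speed and then picks the nearest on-path object moving in roughly the same direction as the host vehicle. The object record fields and thresholds are assumptions introduced only for this illustration, not part of the disclosed control logic.

    def pick_preceding_vehicle(objects, host_speed_kmh, dt, min_speed_kmh=0.0):
        """objects: list of dicts with 'dist_now' and 'dist_prev' in meters
        (distance along the host vehicle's path) and 'on_path' (bool).
        Returns the nearest on-path object whose estimated absolute speed is
        at least min_speed_kmh in the host's direction of travel, or None.
        """
        best = None
        for obj in objects:
            if not obj["on_path"]:
                continue
            # Positive closing speed means the gap to the object is shrinking.
            closing_kmh = (obj["dist_prev"] - obj["dist_now"]) / dt * 3.6
            object_speed_kmh = host_speed_kmh - closing_kmh
            if object_speed_kmh < min_speed_kmh:
                continue
            if best is None or obj["dist_now"] < best["dist_now"]:
                best = obj
        return best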
 For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
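 The collision-risk check described in this paragraph can be illustrated with a time-to-collision estimate compared against a set value: when the estimate falls below the threshold, a warning or forced deceleration would be triggered. The threshold values and the returned action labels are illustrative assumptions, not the disclosed criterion.

    def collision_risk(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
        """Return 'brake', 'warn', or 'none' from a simple time-to-collision test.

        distance_m:        current distance to the obstacle
        closing_speed_mps: positive when the gap is shrinking
        """
        if closing_speed_mps <= 0.0:
            return "none"                      # gap is constant or growing
        ttc = distance_m / closing_speed_mps   # seconds until contact
        if ttc < ttc_threshold_s / 2.0:
            return "brake"                     # forced deceleration / avoidance
        if ttc < ttc_threshold_s:
            return "warn"                      # audio / display warning
        return "none"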
 At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras and a procedure of performing pattern matching processing on a series of feature points representing the outline of an object to determine whether the object is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
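 The pedestrian-recognition flow described above (feature-point extraction followed by pattern matching on the object outline) can be pictured with the toy sketch below, which scores a candidate outline against a stored pedestrian template by the fraction of nearby points. Real systems use far more elaborate features and classifiers; the template, tolerance, and threshold here are assumptions for illustration only.

    import numpy as np

    def matches_pedestrian(outline_points, template_points, threshold=0.7, tol=2.0):
        """Crude pattern match: fraction of template outline points that have a
        candidate outline point within `tol` pixels, after centering both sets."""
        outline = np.asarray(outline_points, dtype=float)
        template = np.asarray(template_points, dtype=float)
        outline -= outline.mean(axis=0)
        template -= template.mean(axis=0)
        hits = 0
        for p in template:
            if np.min(np.linalg.norm(outline - p, axis=1)) <= tol:
                hits += 1
        return hits / len(template) >= threshold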
 Although the present disclosure has been described above with reference to the embodiment, Modifications 1 to 5, the application examples, and the practical application examples, the content of the present disclosure is not limited to the above embodiment and the like, and various modifications are possible. For example, in the above embodiment, the image sensor has a structure in which the photoelectric conversion unit 20 that detects green light and the photoelectric conversion regions 32B and 32R that detect blue light and red light, respectively, are stacked, but the content of the present disclosure is not limited to such a structure. For example, red light or blue light may be detected in the photoelectric conversion unit, and green light may be detected in the photoelectric conversion regions.
 Furthermore, the number of these photoelectric conversion units and photoelectric conversion regions and their ratio are not limited; two or more photoelectric conversion units may be provided, and color signals of a plurality of colors may be obtained with the photoelectric conversion units alone.
 Furthermore, in the above embodiment and the like, an example was described in which the lower electrode 21 is composed of two electrodes, the readout electrode 21A and the storage electrode 21B; however, three or more electrodes may be provided, such as a transfer electrode and a discharge electrode in addition to these.
 なお、本明細書中に記載された効果はあくまで例示であって限定されるものではなく、また、他の効果があってもよい。 Note that the effects described in this specification are merely examples and are not limiting, and other effects may also be present.
 なお、本技術は以下のような構成を取ることも可能である。以下の構成の本技術によれば、並列配置されてなる第1の電極および第2の電極と光電変換層との間に、第1の電極および第2の電極の側から順に積層された第1の層および第2の層を含み、第1の層の厚みは第2の層の厚みよりも小さく、且つ、3nm以上5nm以下である半導体層を設けるようにした。これにより、半導体層内におけるキャリア伝導を維持しつつ、半導体層表面の固定電荷の影響を軽減する。よって、信頼性を向上させることが可能となる。
(1)
 並列配置されてなる第1の電極および第2の電極と、
 前記第1の電極および前記第2の電極と対向配置された第3の電極と、
 前記第1の電極および前記第2の電極と前記第3の電極との間に設けられた光電変換層と、
 前記第1の電極および前記第2の電極と前記光電変換層との間に設けられ、前記第1の電極および前記第2の電極の側から順に積層された第1の層および第2の層を含み、前記第1の層の厚みは前記第2の層の厚みよりも小さく、且つ、3nm以上5nm以下である半導体層と
 を備えた光電変換素子。
(2)
 前記第1の層の膜厚(t1)と前記第2の層の膜厚(t2)との比(t1/t2)は0.16以下である、前記(1)に記載の光電変換素子。
(3)
 前記第1の層および前記第2の層は、ともにインジウム酸化物を用いて形成されており、
 前記第1の層を構成する第1のインジウム酸化物に含まれるインジウムの含有比率は、前記第2の層を構成する第2のインジウム酸化物に含まれるインジウムの含有比率よりも高い、前記(1)または(2)に記載の光電変換素子。
(4)
 前記第2の層の膜厚は32nm以上である、前記(1)乃至(3)のうちのいずれか1つに記載の光電変換素子。
(5)
 前記第1の層および前記第2の層は共に結晶性を有している、前記(1)乃至(4)のうちのいずれか1つに記載の光電変換素子。
(6)
 前記第1の層および前記第2の層は共にアモルファス性を有している、前記(1)乃至(4)のうちのいずれか1つに記載の光電変換素子。
(7)
 前記半導体層は、前記光電変換層と前記第2の層との間にアモルファス性を有する第3の層をさらに有する、前記(1)乃至(6)のうちのいずれか1つに記載の光電変換素子。
(8)
 前記第3の層の膜厚は、1nm以上10nm以下である、前記(7)に記載の光電変換素子。
(9)
 前記第1の電極および前記第2の電極と、前記半導体層との間に設けられると共に、前記第1の電極の上方に開口を有する絶縁層をさらに有し、
 前記第2の電極と前記半導体層とは、前記開口を介して電気的に接続されている、前記(1)乃至(8)のうちのいずれか1つに記載の光電変換素子。
(10)
 前記光電変換層と前記半導体層との間に無機材料を含む保護層をさらに有する、前記(1)乃至(9)のうちのいずれか1つに記載の光電変換素子。
(11)
 前記第1の電極および前記第2の電極は、前記光電変換層に対して光入射面とは反対側に配置されている、前記(1)乃至(10)のうちのいずれか1つに記載の光電変換素子。
(12)
 前記第1の電極および前記第2の電極は、それぞれ個別に電圧が印加される、前記(1)乃至(11)のうちのいずれか1つに記載の光電変換素子。
(13)
 1または複数の光電変換素子がそれぞれ設けられている複数の画素を備え、
 前記光電変換素子は、
 並列配置されてなる第1の電極および第2の電極と、
 前記第1の電極および前記第2の電極と対向配置された第3の電極と、
 前記第1の電極および前記第2の電極と前記第3の電極との間に設けられた光電変換層と、
 前記第1の電極および前記第2の電極と前記光電変換層との間に設けられ、前記第1の電極および前記第2の電極の側から順に積層された第1の層および第2の層を含み、前記第1の層の厚みは前記第2の層の厚みよりも小さく、且つ、3nm以上5nm以下である半導体層と
 を有する光検出装置。
(14)
 前記光電変換素子は、前記1または複数の光電変換素子とは異なる波長域の光電変換を行う1または複数の光電変換領域をさらに有する、前記(13)に記載の光検出装置。
(15)
 前記1または複数の光電変換領域は半導体基板に埋め込み形成され、
 前記1または複数の光電変換部は前記半導体基板の光入射面側に配置されている、前記(14)に記載の光検出装置。
(16)
 前記半導体基板の前記光入射面とは反対側の面に多層配線層が形成されている、前記(15)に記載の光検出装置。
(17)
 並列配置されてなる第1の電極および第2の電極と、
 前記第1の電極および前記第2の電極と対向配置された第3の電極と、
 前記第1の電極および前記第2の電極と前記第3の電極との間に設けられた光電変換層と、
 前記第1の電極および前記第2の電極と前記光電変換層との間に設けられ、前記第1の電極および前記第2の電極の側から順に積層された第1の層および第2の層を含み、前記第1の層の膜厚(t1)と前記第2の層の膜厚(t2)との比(t1/t2)が0.16以下となる半導体層と
 を備えた光電変換素子。
 Note that the present technology can also have the following configurations. According to the present technology having the following configurations, a semiconductor layer including a first layer and a second layer stacked in order from the side of the first electrode and the second electrode is provided between the photoelectric conversion layer and the first and second electrodes arranged in parallel, the thickness of the first layer being smaller than the thickness of the second layer and being 3 nm or more and 5 nm or less. This reduces the influence of fixed charges on the surface of the semiconductor layer while maintaining carrier conduction within the semiconductor layer. It is therefore possible to improve reliability.
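 As an illustrative cross-check by the editor (not an additional statement of the disclosure), the numerical conditions quoted below are mutually consistent: taking the largest first-layer thickness allowed here, t1 = 5 nm, the ratio condition of configuration (2), t1/t2 ≤ 0.16, requires t2 ≥ 5 nm / 0.16 = 31.25 nm, which agrees with the 32 nm lower bound on the second layer given in configuration (4).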
(1)
 A photoelectric conversion element including:
 a first electrode and a second electrode arranged in parallel;
 a third electrode arranged to face the first electrode and the second electrode;
 a photoelectric conversion layer provided between the third electrode and the first and second electrodes; and
 a semiconductor layer provided between the photoelectric conversion layer and the first and second electrodes, the semiconductor layer including a first layer and a second layer stacked in order from the side of the first electrode and the second electrode, the thickness of the first layer being smaller than the thickness of the second layer and being 3 nm or more and 5 nm or less.
(2)
The photoelectric conversion element according to (1) above, wherein the ratio (t1/t2) between the thickness (t1) of the first layer and the thickness (t2) of the second layer is 0.16 or less.
(3)
Both the first layer and the second layer are formed using indium oxide,
 The content ratio of indium contained in the first indium oxide constituting the first layer is higher than the content ratio of indium contained in the second indium oxide constituting the second layer. The photoelectric conversion element according to (1) or (2) above.
(4)
The photoelectric conversion element according to any one of (1) to (3), wherein the second layer has a film thickness of 32 nm or more.
(5)
The photoelectric conversion element according to any one of (1) to (4), wherein the first layer and the second layer both have crystallinity.
(6)
The photoelectric conversion element according to any one of (1) to (4), wherein both the first layer and the second layer have amorphous properties.
(7)
 The photoelectric conversion element according to any one of (1) to (6) above, wherein the semiconductor layer further includes a third layer having an amorphous property between the photoelectric conversion layer and the second layer.
(8)
The photoelectric conversion element according to (7) above, wherein the third layer has a thickness of 1 nm or more and 10 nm or less.
(9)
further comprising an insulating layer provided between the first electrode and the second electrode and the semiconductor layer and having an opening above the first electrode,
The photoelectric conversion element according to any one of (1) to (8), wherein the second electrode and the semiconductor layer are electrically connected through the opening.
(10)
The photoelectric conversion element according to any one of (1) to (9), further comprising a protective layer containing an inorganic material between the photoelectric conversion layer and the semiconductor layer.
(11)
 The photoelectric conversion element according to any one of (1) to (10) above, wherein the first electrode and the second electrode are arranged on a side opposite to a light incident surface with respect to the photoelectric conversion layer.
(12)
The photoelectric conversion element according to any one of (1) to (11), wherein a voltage is applied to each of the first electrode and the second electrode individually.
(13)
 A photodetection device including a plurality of pixels each provided with one or a plurality of photoelectric conversion elements, the photoelectric conversion element including:
 a first electrode and a second electrode arranged in parallel;
 a third electrode arranged to face the first electrode and the second electrode;
 a photoelectric conversion layer provided between the third electrode and the first and second electrodes; and
 a semiconductor layer provided between the photoelectric conversion layer and the first and second electrodes, the semiconductor layer including a first layer and a second layer stacked in order from the side of the first electrode and the second electrode, the thickness of the first layer being smaller than the thickness of the second layer and being 3 nm or more and 5 nm or less.
(14)
The photodetector according to (13), wherein the photoelectric conversion element further includes one or more photoelectric conversion regions that perform photoelectric conversion in a wavelength range different from that of the one or more photoelectric conversion elements.
(15)
The one or more photoelectric conversion regions are embedded in a semiconductor substrate,
The photodetecting device according to (14), wherein the one or more photoelectric conversion units are arranged on the light incident surface side of the semiconductor substrate.
(16)
The photodetecting device according to (15), wherein a multilayer wiring layer is formed on a surface of the semiconductor substrate opposite to the light incident surface.
(17)
 A photoelectric conversion element including:
 a first electrode and a second electrode arranged in parallel;
 a third electrode arranged to face the first electrode and the second electrode;
 a photoelectric conversion layer provided between the third electrode and the first and second electrodes; and
 a semiconductor layer provided between the photoelectric conversion layer and the first and second electrodes, the semiconductor layer including a first layer and a second layer stacked in order from the side of the first electrode and the second electrode, a ratio (t1/t2) of the film thickness (t1) of the first layer to the film thickness (t2) of the second layer being 0.16 or less.
 This application claims priority based on Japanese Patent Application No. 2022-039791 filed with the Japan Patent Office on March 15, 2022, the entire contents of which are incorporated herein by reference.
 It is understood that those skilled in the art may conceive various modifications, combinations, sub-combinations, and alterations depending on design requirements and other factors, and that these fall within the scope of the appended claims and their equivalents.

Claims (16)

  1.  並列配置されてなる第1の電極および第2の電極と、
     前記第1の電極および前記第2の電極と対向配置された第3の電極と、
     前記第1の電極および前記第2の電極と前記第3の電極との間に設けられた光電変換層と、
     前記第1の電極および前記第2の電極と前記光電変換層との間に設けられ、前記第1の電極および前記第2の電極の側から順に積層された第1の層および第2の層を含み、前記第1の層の厚みは前記第2の層の厚みよりも小さく、且つ、3nm以上5nm以下である半導体層と
     を備えた光電変換素子。
    A photoelectric conversion element comprising:
    a first electrode and a second electrode arranged in parallel;
    a third electrode arranged to face the first electrode and the second electrode;
    a photoelectric conversion layer provided between the third electrode and the first and second electrodes; and
    a semiconductor layer provided between the photoelectric conversion layer and the first and second electrodes, the semiconductor layer including a first layer and a second layer stacked in order from a side of the first electrode and the second electrode, a thickness of the first layer being smaller than a thickness of the second layer and being 3 nm or more and 5 nm or less.
  2.  前記第1の層の膜厚(t1)と前記第2の層の膜厚(t2)との比(t1/t2)は0.16以下である、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein the ratio (t1/t2) between the thickness (t1) of the first layer and the thickness (t2) of the second layer is 0.16 or less.
  3.  前記第1の層および前記第2の層は、ともにインジウム酸化物を用いて形成されており、
     前記第1の層を構成する第1のインジウム酸化物に含まれるインジウムの含有比率は、前記第2の層を構成する第2のインジウム酸化物に含まれるインジウムの含有比率よりも高い、請求項1に記載の光電変換素子。
    Both the first layer and the second layer are formed using indium oxide,
    A content ratio of indium contained in the first indium oxide constituting the first layer is higher than a content ratio of indium contained in the second indium oxide constituting the second layer. The photoelectric conversion element according to claim 1.
  4.  前記第2の層の膜厚は32nm以上である、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein the second layer has a film thickness of 32 nm or more.
  5.  前記第1の層および前記第2の層は共に結晶性を有している、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein both the first layer and the second layer have crystallinity.
  6.  前記第1の層および前記第2の層は共にアモルファス性を有している、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein both the first layer and the second layer have amorphous properties.
  7.  前記半導体層は、前記光電変換層と前記第2の層との間にアモルファス性を有する第3の層をさらに有する、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein the semiconductor layer further includes a third layer having an amorphous property between the photoelectric conversion layer and the second layer.
  8.  前記第3の層の膜厚は、1nm以上10nm以下である、請求項7に記載の光電変換素子。 The photoelectric conversion element according to claim 7, wherein the third layer has a thickness of 1 nm or more and 10 nm or less.
  9.  前記第1の電極および前記第2の電極と、前記半導体層との間に設けられると共に、前記第1の電極の上方に開口を有する絶縁層をさらに有し、
     前記第2の電極と前記半導体層とは、前記開口を介して電気的に接続されている、請求項1に記載の光電変換素子。
    further comprising an insulating layer provided between the first electrode and the second electrode and the semiconductor layer and having an opening above the first electrode,
    The photoelectric conversion element according to claim 1, wherein the second electrode and the semiconductor layer are electrically connected through the opening.
  10.  前記光電変換層と前記半導体層との間に無機材料を含む保護層をさらに有する、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, further comprising a protective layer containing an inorganic material between the photoelectric conversion layer and the semiconductor layer.
  11.  前記第1の電極および前記第2の電極は、前記光電変換層に対して光入射面とは反対側に配置されている、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein the first electrode and the second electrode are arranged on a side opposite to a light incident surface with respect to the photoelectric conversion layer.
  12.  前記第1の電極および前記第2の電極は、それぞれ個別に電圧が印加される、請求項1に記載の光電変換素子。 The photoelectric conversion element according to claim 1, wherein a voltage is applied to each of the first electrode and the second electrode individually.
  13.  1または複数の光電変換素子がそれぞれ設けられている複数の画素を備え、
     前記光電変換素子は、
     並列配置されてなる第1の電極および第2の電極と、
     前記第1の電極および前記第2の電極と対向配置された第3の電極と、
     前記第1の電極および前記第2の電極と前記第3の電極との間に設けられた光電変換層と、
     前記第1の電極および前記第2の電極と前記光電変換層との間に設けられ、前記第1の電極および前記第2の電極の側から順に積層された第1の層および第2の層を含み、前記第1の層の厚みは前記第2の層の厚みよりも小さく、且つ、3nm以上5nm以下である半導体層と
     を有する光検出装置。
    A photodetection device comprising a plurality of pixels each provided with one or a plurality of photoelectric conversion elements, the photoelectric conversion element comprising:
    a first electrode and a second electrode arranged in parallel;
    a third electrode arranged to face the first electrode and the second electrode;
    a photoelectric conversion layer provided between the third electrode and the first and second electrodes; and
    a semiconductor layer provided between the photoelectric conversion layer and the first and second electrodes, the semiconductor layer including a first layer and a second layer stacked in order from a side of the first electrode and the second electrode, a thickness of the first layer being smaller than a thickness of the second layer and being 3 nm or more and 5 nm or less.
  14.  前記光電変換素子は、前記1または複数の光電変換素子とは異なる波長域の光電変換を行う1または複数の光電変換領域をさらに有する、請求項13に記載の光検出装置。 The photodetection device according to claim 13, wherein the photoelectric conversion element further includes one or more photoelectric conversion regions that perform photoelectric conversion in a wavelength range different from that of the one or more photoelectric conversion elements.
  15.  前記1または複数の光電変換領域は半導体基板に埋め込み形成され、
     前記1または複数の光電変換部は前記半導体基板の光入射面側に配置されている、請求項14に記載の光検出装置。
    The one or more photoelectric conversion regions are embedded in a semiconductor substrate,
    The photodetection device according to claim 14, wherein the one or more photoelectric conversion sections are arranged on a light incident surface side of the semiconductor substrate.
  16.  前記半導体基板の前記光入射面とは反対側の面に多層配線層が形成されている、請求項15に記載の光検出装置。 16. The photodetection device according to claim 15, wherein a multilayer wiring layer is formed on a surface of the semiconductor substrate opposite to the light incident surface.
PCT/JP2023/008332 2022-03-15 2023-03-06 Photoelectric conversion element and optical detection device WO2023176551A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022039791 2022-03-15
JP2022-039791 2022-03-15

Publications (1)

Publication Number Publication Date
WO2023176551A1 true WO2023176551A1 (en) 2023-09-21

Family

ID=88023025

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/008332 WO2023176551A1 (en) 2022-03-15 2023-03-06 Photoelectric conversion element and optical detection device

Country Status (1)

Country Link
WO (1) WO2023176551A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009088430A (en) * 2007-10-03 2009-04-23 Sony Corp Solid-state imaging apparatus, manufacturing method thereof, and imaging apparatus
JP2013123065A (en) * 2010-03-26 2013-06-20 Semiconductor Energy Lab Co Ltd Semiconductor device
JP2020191446A (en) * 2019-05-17 2020-11-26 三星電子株式会社Samsung Electronics Co.,Ltd. Photoelectric conversion element, organic sensor including the same, and electronic device
WO2021161699A1 (en) * 2020-02-12 2021-08-19 ソニーグループ株式会社 Imaging element, laminated imaging element, solid-state imaging device, and inorganic oxide semiconductor material



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23770507

Country of ref document: EP

Kind code of ref document: A1