WO2018139110A1 - Light receiving element, method for manufacturing light receiving element, imaging element, and electronic device
- Publication number: WO2018139110A1 (application PCT/JP2017/045422)
- Authority: WIPO (PCT)
- Prior art keywords
- photoelectric conversion
- conversion layer
- light receiving
- receiving element
- semiconductor material
- Prior art date
Classifications
- H01L27/144—Devices controlled by radiation
- H01L27/146—Imager structures
- H01L27/14601—Structural or functional details thereof
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H01L27/14609—Pixel-elements with integrated switching, control, storage or amplification elements
- H01L27/14625—Optical elements or arrangements associated with the device
- H01L27/14636—Interconnect structures
- H01L27/14665—Imagers using a photoconductor layer
- H01L27/14669—Infrared imagers
- H01L27/14683—Processes or apparatus peculiar to the manufacture or treatment of these devices or parts thereof
- H01L31/02—Details of semiconductor devices sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation
- H04N5/33—Transforming infrared radiation
- H04N23/20—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/702—SSIS architectures characterised by non-identical, non-equidistant or non-planar pixel layout
Definitions
- The present disclosure relates to a light receiving element used in, for example, an infrared sensor, a method for manufacturing the same, an imaging element, and an electronic device.
- A light receiving element used in such an infrared sensor has, for example, a photoelectric conversion layer containing a III-V group semiconductor such as InGaAs (indium gallium arsenide).
- In FIG. 2, electric charges are generated by absorbing infrared rays (that is, photoelectric conversion is performed).
- A light receiving element according to an embodiment of the present disclosure includes a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, which are arranged in different regions in plan view.
- A method for manufacturing a light receiving element according to an embodiment of the present disclosure includes forming a first photoelectric conversion layer and a second photoelectric conversion layer that are disposed in different regions in plan view and are separated from each other by an insulating film; the first photoelectric conversion layer contains a first inorganic semiconductor material, and the second photoelectric conversion layer contains a second inorganic semiconductor material different from the first inorganic semiconductor material.
- In the light receiving element and the method for manufacturing the same according to the embodiments of the present disclosure, the first photoelectric conversion layer and the second photoelectric conversion layer contain different inorganic semiconductor materials (the first inorganic semiconductor material and the second inorganic semiconductor material). A wavelength band capable of photoelectric conversion can therefore be set individually for each of the first photoelectric conversion layer and the second photoelectric conversion layer.
- An image sensor according to an embodiment of the present disclosure includes the light receiving element according to the embodiment of the present disclosure.
- An electronic apparatus according to an embodiment of the present disclosure includes the imaging element according to the embodiment of the present disclosure.
- In the light receiving element according to the embodiment of the present disclosure, the first photoelectric conversion layer and the second photoelectric conversion layer include different inorganic semiconductor materials. The wavelength band in which photoelectric conversion can be performed can therefore be shifted between the first photoelectric conversion layer and the second photoelectric conversion layer, so that photoelectric conversion can be performed over a wide wavelength band overall.
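The reason different inorganic semiconductor materials cover different wavelength bands follows from the semiconductor bandgap: a photon is absorbed only if its energy h·c/λ exceeds Eg, giving a long-wavelength cutoff of roughly λc [µm] ≈ 1.24 / Eg [eV]. A minimal sketch of this relation is given below; the bandgap values are approximate room-temperature literature figures, not values taken from this publication:

```python
# Absorption cutoff of a semiconductor photodetector:
# lambda_c [um] ~= 1.24 / Eg [eV]   (from E = h*c / lambda)

# Approximate room-temperature bandgaps in eV (illustrative values only).
BANDGAP_EV = {
    "InGaAs (lattice-matched to InP)": 0.75,  # cutoff ~1.7 um (short IR)
    "InAs": 0.35,                             # cutoff ~3.5 um
    "InSb": 0.17,                             # cutoff ~7.3 um (mid IR)
}

def cutoff_wavelength_um(bandgap_ev: float) -> float:
    """Return the longest absorbable wavelength in micrometers."""
    return 1.24 / bandgap_ev

for material, eg in BANDGAP_EV.items():
    print(f"{material}: cutoff ~ {cutoff_wavelength_um(eg):.1f} um")
```

A smaller bandgap thus shifts the convertible band toward the mid-infrared, which is why a layer of InAsSb or InSb responds to longer wavelengths than an InGaAs layer.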
- FIG. 3A is a cross-sectional view illustrating a process following the process in FIG. 3B. A cross-sectional view illustrating the configuration of a light receiving element according to a comparative example.
- FIG. 10 is a cross-sectional view illustrating a configuration of a light receiving element according to Modification 1.
- FIG. 10 is a cross-sectional view illustrating a configuration of a light receiving element according to Modification 2.
- FIG. 8 is a cross-sectional view illustrating another example of the light receiving element illustrated in FIG. 7.
- A cross-sectional view for explaining one step of the method for manufacturing the light receiving element shown in FIG. A cross-sectional view illustrating the step following FIG. 9A. A cross-sectional view illustrating the step following FIG. 9B.
- A cross-sectional view illustrating a process following the process in FIG. 10B.
- 10 is a cross-sectional view illustrating a configuration of a light receiving element according to Modification 3.
- A cross-sectional view for explaining one step of the method for manufacturing the light receiving element shown in FIG.
- A functional block diagram showing an example of an electronic device (camera) using the imaging element shown in FIG. 1. A diagram showing an example of the schematic configuration of an endoscopic surgery system. A block diagram showing an example of the functional configuration of a camera head and a CCU. A block diagram showing an example of the schematic configuration of a vehicle control system. An explanatory diagram showing an example of installation positions of a vehicle exterior information detection unit and an imaging unit.
- Embodiment: an example of a light receiving element having photoelectric conversion layers composed of different inorganic semiconductor materials
- Modification 1: an example having photoelectric conversion layers of different sizes in plan view
- Modification 2: an example in which the light incident surface side is flat
- Modification 3: an example of longitudinal spectral separation
- Application Example 1: an example of an image sensor
- Application Example 2: an example of electronic equipment
- Application Example 1: an application example to an endoscopic surgery system
- Application Example 2: an application example to a moving body
- FIG. 1 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1) according to an embodiment of the present disclosure.
- The light receiving element 1 is applied to, for example, an infrared sensor using an inorganic semiconductor material such as a III-V group semiconductor.
- The light receiving element 1 has a plurality of light receiving unit regions P (pixels P1, P2, P3, P4, P5, ... Pn) arranged two-dimensionally. FIG. 1 shows a cross-sectional configuration of a portion corresponding to five pixels P (pixels P1 to P5).
- the light receiving element 1 has a ROIC (readout integrated circuit) substrate 11.
- the first electrode 21, the first contact layer 22, the photoelectric conversion layer 23, the second contact layer 24, and the second electrode 25 are provided on the ROIC substrate 11 in this order.
- The first electrode 21, the first contact layer 22, the photoelectric conversion layer 23, and the second contact layer 24 are provided separately for each pixel P, and the second electrode 25 is provided in common for the plurality of pixels P.
- Light (for example, light having wavelengths in the visible and infrared regions) enters the light receiving element 1 from the second electrode 25 side and reaches the photoelectric conversion layer 23. For example, light having wavelengths in the visible region is photoelectrically converted by the pixels P1 to P3, and light having wavelengths in the infrared region is photoelectrically converted by the pixels P4 and P5.
- The light receiving element 1 includes a protective film 12 between the first electrode 21 and the ROIC substrate 11, and the protective film 12 is provided with a through electrode 12E connected to the first electrode 21.
- the light receiving element 1 has an insulating film 13 between adjacent pixels P.
- the light receiving element 1 has a passivation film 14 and a color filter layer 15 in this order on the second electrode 25.
- Light passing through the color filter layer 15 and the passivation film 14 is incident on the photoelectric conversion layer 23.
- Hereinafter, the configuration of each part will be described. Since the pixels P1 to P5 have the same configuration except for the photoelectric conversion layer 23, the description of each part other than the photoelectric conversion layer 23 is common to all pixels P.
- the ROIC substrate 11 is composed of, for example, a silicon (Si) substrate and a multilayer wiring layer on the silicon substrate, and the ROIC is provided on the multilayer wiring layer.
- An electrode containing, for example, copper (Cu) is provided for each pixel P at a position near the protective film 12 in the multilayer wiring layer, and this electrode is in contact with the through electrode 12E.
- The first electrode 21 is an electrode (anode) to which a voltage for reading the signal charges generated in the photoelectric conversion layer 23 (holes or electrons; for convenience, the signal charges are described below as holes) is supplied, and is provided for each pixel P.
- the first electrode 21 is smaller than the first contact layer 22 in plan view, and is in contact with a substantially central portion of the first contact layer 22.
- One first electrode 21 is disposed for each pixel P, and the first electrodes 21 of adjacent pixels P are electrically separated by the protective film 12.
- The first electrode 21 is made of, for example, any one of titanium (Ti), tungsten (W), titanium nitride (TiN), platinum (Pt), gold (Au), germanium (Ge), palladium (Pd), zinc (Zn), nickel (Ni), and aluminum (Al), or an alloy containing at least one of them.
- the first electrode 21 may be a single film of such a constituent material, or may be a laminated film combining two or more kinds.
- the first contact layer 22 is provided between the first electrode 21 and the photoelectric conversion layer 23 and is in contact therewith.
- One first contact layer 22 is disposed for each pixel P, and the first contact layers 22 of adjacent pixels P are electrically isolated by the insulating film 13.
- The first contact layer 22 is a region through which the signal charges generated in the photoelectric conversion layer 23 move, and is made of, for example, an inorganic semiconductor material containing a p-type impurity, such as InP (indium phosphide) doped with Zn (zinc).
- the contact surface with the first electrode 21 is arranged on the same plane between the pixels P. That is, the contact surfaces of the plurality of first contact layers 22 with the first electrode 21 constitute the same plane.
- The photoelectric conversion layer 23, provided between the first electrode 21 and the second electrode 25, absorbs light of a predetermined wavelength to generate signal charges, and contains an inorganic semiconductor material such as a III-V group semiconductor.
- Examples of the inorganic semiconductor material constituting the photoelectric conversion layer 23 include Ge (germanium), InGaAs (indium gallium arsenide), Ex. InGaAs (extended InGaAs), InAsSb (indium arsenide antimonide), InAs (indium arsenide), InSb (indium antimonide), and HgCdTe (mercury cadmium telluride).
- One photoelectric conversion layer 23 is arranged for each pixel P, and adjacent photoelectric conversion layers 23 are electrically separated from each other by the insulating film 13.
- the pixel P1 has a photoelectric conversion layer 23A
- the pixel P2 has a photoelectric conversion layer 23B
- the pixel P3 has a photoelectric conversion layer 23C
- the pixel P4 has a photoelectric conversion layer 23D
- the pixel P5 has a photoelectric conversion layer 23E.
- the photoelectric conversion layers 23A to 23E are arranged at different positions in plan view.
- the inorganic semiconductor material included in the photoelectric conversion layer 23A (or the photoelectric conversion layers 23B to 23D) is different from the inorganic semiconductor material included in the photoelectric conversion layer 23E.
- the photoelectric conversion layer 23E corresponds to a specific example of the first photoelectric conversion layer of the present technology
- the photoelectric conversion layer 23A (or photoelectric conversion layers 23B to 23D) corresponds to a specific example of the second photoelectric conversion layer of the present technology.
- The photoelectric conversion layers 23A, 23B, and 23C mainly photoelectrically convert light having wavelengths in the visible region.
- The photoelectric conversion layer 23A absorbs light in the blue wavelength range (for example, wavelengths of 500 nm or less), the photoelectric conversion layer 23B absorbs light in the green wavelength range (for example, wavelengths of 500 nm to 600 nm), and the photoelectric conversion layer 23C absorbs light in the red wavelength range (for example, wavelengths of 600 nm to 800 nm), and signal charges are thereby generated.
- the photoelectric conversion layers 23A to 23C are made of, for example, an i-type III-V group semiconductor. Examples of III-V semiconductors used in the photoelectric conversion layers 23A to 23C include InGaAs (indium gallium arsenide).
- the photoelectric conversion layers 23A, 23B, and 23C have different thicknesses.
- the photoelectric conversion layer 23A is the thinnest, followed by the photoelectric conversion layer 23B and then the photoelectric conversion layer 23C.
- the thickness of the photoelectric conversion layer 23A is 500 nm or less
- the thickness of the photoelectric conversion layer 23B is 700 nm or less
- the thickness of the photoelectric conversion layer 23C is 800 nm or less.
- the photoelectric conversion layer 23D mainly converts light having a wavelength in the short infrared region (for example, a wavelength of 1 μm to 10 μm).
- the photoelectric conversion layer 23D is made of, for example, an i-type III-V group semiconductor, and made of, for example, InGaAs (indium gallium arsenide).
- the photoelectric conversion layer 23D is, for example, thicker than the photoelectric conversion layers 23A to 23C, and its thickness is, for example, 1 μm to 10 μm.
- the photoelectric conversion layer 23E mainly converts light having a wavelength in the mid-infrared region (for example, a wavelength of 3 μm to 10 μm).
- the photoelectric conversion layer 23E is made of, for example, an i-type III-V semiconductor different from that of the photoelectric conversion layers 23A to 23D. Specifically, InAsSb (indium arsenide antimonide), InSb (indium antimonide), or the like can be used for the photoelectric conversion layer 23E.
- the thickness of the photoelectric conversion layer 23E is different from the thicknesses of the photoelectric conversion layers 23A to 23C, and is, for example, 3 μm to 10 μm.
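The per-pixel parameters given above can be collected into one small lookup table. The sketch below is purely illustrative: the dict layout, field names, and helper function are not from the patent; the numeric ranges are the examples quoted in the text (note that, as stated, the short- and mid-infrared example ranges overlap).

```python
# Example layer parameters quoted above (wavelength bands and maximum
# example thicknesses, both in nm). Illustrative table only.
LAYERS = {
    "P1": {"layer": "23A", "material": "InGaAs",      "band_nm": (0, 500),       "max_thickness_nm": 500},
    "P2": {"layer": "23B", "material": "InGaAs",      "band_nm": (500, 600),     "max_thickness_nm": 700},
    "P3": {"layer": "23C", "material": "InGaAs",      "band_nm": (600, 800),     "max_thickness_nm": 800},
    "P4": {"layer": "23D", "material": "InGaAs",      "band_nm": (1000, 10000),  "max_thickness_nm": 10000},
    "P5": {"layer": "23E", "material": "InAsSb/InSb", "band_nm": (3000, 10000),  "max_thickness_nm": 10000},
}

def responding_pixels(wavelength_nm):
    """Pixels whose example wavelength band contains the given wavelength."""
    return [p for p, s in LAYERS.items()
            if s["band_nm"][0] <= wavelength_nm <= s["band_nm"][1]]
```

For instance, 450 nm light falls only in the band of pixel P1, while 5 μm light falls in the overlapping example bands of P4 and P5.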
- the second contact layer 24 is provided between the photoelectric conversion layer 23 and the second electrode 25 and is in contact therewith.
- One second contact layer 24 is arranged for one pixel P, and in the adjacent pixel P, the second contact layer 24 is electrically isolated by the insulating film 13.
- the second contact layer 24 is a region through which charges discharged from the second electrode 25 move, and is made of, for example, a compound semiconductor containing n-type impurities, such as InP (indium phosphide); Si (silicon) is an example of such an n-type impurity.
- the second electrode 25 is provided on the light incident side of the second contact layer 24, in contact with it, as an electrode common to the pixels P.
- the second electrode 25 (cathode) is for discharging charges that are not used as signal charges among the charges generated in the photoelectric conversion layer 23.
- the second electrode 25 is made of a conductive film that can transmit incident light such as infrared rays, for example, ITO (indium tin oxide) or ITiO (In 2 O 3 —TiO 2 ).
- the protective film 12 is provided so as to cover one surface (surface on the light incident side) of the ROIC substrate 11.
- the protective film 12 is made of, for example, an inorganic insulating material. Examples of the inorganic insulating material include silicon nitride (SiN), aluminum oxide (Al 2 O 3 ), silicon oxide (SiO 2 ), and hafnium oxide (HfO 2 ).
- the protective film 12 may have a laminated structure including a plurality of films.
- the through electrode 12E provided in the protective film 12 is for connecting the wiring of the ROIC substrate 11 and the first electrode 21, and is provided for each pixel P.
- the through electrode 12E is made of copper, for example.
- the insulating film 13 covers the side surface of the first contact layer 22, the side surface of the photoelectric conversion layer 23, and the side surface of the second contact layer 24.
- the insulating film 13 is for separating adjacent photoelectric conversion layers 23 for each pixel P, and a region between the adjacent photoelectric conversion layers 23 is filled with the insulating film 13.
- the insulating film 13 includes, for example, an oxide such as silicon oxide (SiO x ) or aluminum oxide (Al 2 O 3 ).
- the insulating film 13 may be configured by a laminated structure including a plurality of films.
- the insulating film 13 may be made of a silicon (Si) -based insulating material such as silicon oxynitride (SiON), carbon-containing silicon oxide (SiOC), silicon nitride (SiN), or silicon carbide (SiC).
- the passivation film 14 covers the second electrode 25 and is provided between the second electrode 25 and the color filter layer 15.
- the passivation film 14 may have an antireflection function.
- as the material of the passivation film 14, silicon nitride (SiN), aluminum oxide (Al 2 O 3 ), silicon oxide (SiO 2 ), tantalum oxide (Ta 2 O 3 ), or the like can be used.
- the color filter layer 15 is provided on the passivation film 14 (on the light incident surface side of the passivation film 14).
- the color filter layer 15 includes, for example, a blue filter for the pixel P1, a green filter for the pixel P2, and a red filter for the pixel P3.
- the color filter layer 15 may include a visible light cut filter in the pixels P4 and P5.
- the light receiving element 1 may have an on-chip lens (for example, an on-chip lens 17 in FIG. 8 described later) for collecting incident light toward the photoelectric conversion layer 23 on the color filter layer 15.
- the light receiving element 1 can be manufactured as follows, for example. FIGS. 2A to 3C show the manufacturing process of the light receiving element 1 in the order of steps, for regions corresponding to the pixels P3 to P5.
- a substrate 31 made of, for example, silicon (Si) is prepared, and an insulating film 13 made of, for example, silicon oxide (SiO 2 ) or silicon nitride (SiN) is formed on the substrate 31.
- next, openings (openings 13C to 13E corresponding to the pixels P3 to P5) are formed in regions of the insulating film 13 corresponding to the respective pixels P, and the second contact layer 24 is formed in the openings. Specifically, this is performed as follows. First, the insulating film 13 is patterned using, for example, photolithography and dry etching to form the openings 13C to 13E. The openings 13C to 13E are formed for each pixel P and include portions a1 and a2 having different opening widths.
- the portion a2 is the opening in which the photoelectric conversion layer 23 is formed in a later step, and its depth is adjusted for each pixel P according to the thickness of the photoelectric conversion layer 23 to be formed.
- the light receiving element 1 can be easily manufactured by adjusting the thickness of the photoelectric conversion layer 23 according to the depth of the portion a2.
- the portion a1 has a higher aspect ratio than the portion a2, and is formed as a trench or a hole at the bottom of the portion a2.
- the aspect ratio of the portion a1 is, for example, 1.5 or more.
- the portion a1 penetrates the insulating film 13 from the portion a2 and also extends into a part of the substrate 31 (a portion on the insulating film 13 side).
- next, alkali anisotropic etching is performed on the surface of the substrate 31 exposed in the portion a1.
- alkali anisotropic etching of the silicon substrate (substrate 31) depends strongly on crystal plane orientation, and the etching rate in the (111) plane direction is extremely low.
- etching therefore stops at the (111) planes, and a plurality of (111) planes are formed on the etched surface.
- next, a buffer layer 32 made of InP is formed, extending from the plurality of (111) surfaces of the substrate 31 to the portion a1 of the insulating film 13, using MOCVD (metal-organic chemical vapor deposition) or MBE (molecular beam epitaxy).
- a photoelectric conversion layer 23 is formed in each opening (openings 13C to 13E) (FIGS. 2B and 2C).
- the photoelectric conversion layer 23 is formed using, for example, a hard mask 33.
- the photoelectric conversion layers 23C to 23E are formed in the openings 13C to 13E as follows. First, photoelectric conversion layers 23C and 23D made of, for example, InGaAs (indium gallium arsenide) are formed in the openings 13C and 13D by epitaxial growth in a state where the opening 13E is covered with the hard mask 33.
- next, a photoelectric conversion layer 23E made of, for example, InAsSb (indium arsenide antimonide) or InSb (indium antimonide) is formed in the opening 13E by epitaxial growth.
- InP is epitaxially grown on the photoelectric conversion layer 23 to form the first contact layer 22.
- the surface of the first contact layer 22 is planarized by CMP (Chemical Mechanical Polishing), for example.
- after the constituent material of the first electrode 21 is deposited on the flattened surface of the first contact layer 22, it is patterned using photolithography and etching, thereby forming the first electrode 21 (FIG. 2E).
- next, the protective film 12 and the through electrode 12E are formed. Specifically, after the protective film 12 is formed on the first electrode 21 and the insulating film 13, a through hole is formed in a region of the protective film 12 corresponding to the central portion of the first electrode 21 using, for example, photolithography and dry etching. Thereafter, a through electrode 12E made of, for example, copper is formed in the through hole.
- the through electrode 12E is joined to the electrode of the ROIC substrate 11. This bonding is performed by, for example, Cu—Cu bonding.
- the substrate 31 is thinned by, for example, a polishing machine, and the thinned substrate 31 and the buffer layer 32 are removed by, for example, etching to expose the surface of the second contact layer 24 (FIG. 3B).
- the second electrode 25, the passivation film 14, and the color filter layer 15 are formed in this order to complete the light receiving element 1 shown in FIG.
- the photoelectric conversion layers 23A to 23D of the pixels P1 to P4 and the photoelectric conversion layer 23E of the pixel P5 are made of different inorganic semiconductor materials.
- the photoelectric conversion layers 23A to 23D can be given different thicknesses. This makes it easy to set the wavelength band in which photoelectric conversion is possible for each of the photoelectric conversion layers 23A to 23E (pixels P1 to P5).
- for example, light in the blue wavelength range can be photoelectrically converted by the photoelectric conversion layer 23A (pixel P1), light in the green wavelength range by the photoelectric conversion layer 23B (pixel P2), light in the red wavelength range by the photoelectric conversion layer 23C (pixel P3), light having a wavelength in the short infrared region by the photoelectric conversion layer 23D (pixel P4), and light having a wavelength in the mid-infrared region by the photoelectric conversion layer 23E (pixel P5). This will be described below.
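The long-wavelength limit of each photoelectric conversion layer is set chiefly by the bandgap of its inorganic semiconductor material, via λc [μm] ≈ 1.24 / Eg [eV]. The bandgap figures below are approximate room-temperature textbook values, not taken from the patent; they merely illustrate why InGaAs suits the visible and short-infrared pixels while a material such as InAsSb or InSb is needed for the mid-infrared pixel P5.

```python
# Approximate room-temperature bandgaps (eV); textbook values, not from the patent.
BANDGAP_EV = {
    "InGaAs": 0.74,   # lattice-matched In0.53Ga0.47As
    "InAs":   0.354,
    "InSb":   0.17,
}

def cutoff_wavelength_um(material):
    """Long-wavelength detection cutoff: lambda_c [um] = 1.2398 / Eg [eV]."""
    return 1.2398 / BANDGAP_EV[material]
```

With these values, InGaAs cuts off near 1.7 μm (short infrared), InAs near 3.5 μm, and InSb near 7.3 μm (mid-infrared).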
- FIG. 4 shows a cross-sectional configuration of a light receiving element (light receiving element 100) according to a comparative example.
- in the light receiving element 100, adjacent pixels P are not separated by an insulating film; the first contact layer 122, the photoelectric conversion layer 123, the second contact layer 124, and the second electrode 125 are provided in common to all the pixels P. The first electrode 121 is separated for each pixel P.
- FIGS. 5A to 5C show the manufacturing process of the light receiving element 100.
- in the light receiving element 100, first, the photoelectric conversion layer 123 and the first contact layer 122 are formed on the substrate 124A by, for example, epitaxial growth (FIG. 5A), and then the protective film 12 and a through electrode (not shown) are formed.
- the through electrode and the electrode of the ROIC substrate 11 are bonded by, for example, Cu—Cu bonding (FIG. 5B).
- the substrate 124A is thinned to form the second contact layer 124 (FIG. 5C).
- finally, the second electrode 125, a passivation film, and a color filter layer are formed, for example, to complete the light receiving element 100.
- in contrast, the light receiving element 1 is provided with photoelectric conversion layers 23A to 23E having different constituent materials or different thicknesses, so that light in different wavelength ranges can be selectively photoelectrically converted among the pixels P. For example, light having a wavelength in the visible region is selectively converted in the pixels P1 to P3, light having a wavelength in the short infrared region in the pixel P4, and light having a wavelength in the mid-infrared region in the pixel P5.
- Such a light receiving element 1 can be easily formed by forming the photoelectric conversion layer 23 in the opening of the insulating film 13 provided for each pixel P (for example, the openings 13C to 13E in FIG. 2A).
- as described above, in the present embodiment, the photoelectric conversion layers 23A to 23D and the photoelectric conversion layer 23E include different inorganic semiconductor materials, so the wavelength band capable of photoelectric conversion can be shifted between them. Further, since the photoelectric conversion layers 23A to 23D have thicknesses different from each other, the wavelength band capable of photoelectric conversion can also be shifted among them. Therefore, photoelectric conversion can be performed over a wide wavelength band.
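The role of thickness can be illustrated with the Beer-Lambert law: the fraction of incident light absorbed in a layer is 1 − exp(−αd), and since the absorption coefficient α falls off toward the band edge, a thicker layer captures more of the longer-wavelength light. A minimal sketch (the α value in the comment is a hypothetical illustration, not a figure from the patent):

```python
import math

def absorbed_fraction(alpha_per_cm, thickness_cm):
    """Beer-Lambert law: fraction of incident light absorbed in a layer of
    the given thickness with absorption coefficient alpha."""
    return 1.0 - math.exp(-alpha_per_cm * thickness_cm)

# Hypothetical alpha of 1e4 cm^-1 near the band edge: a 500 nm layer
# absorbs roughly 39% of the light, an 800 nm layer roughly 55%.
```

This is why, in the example above, the red-absorbing layer 23C (800 nm or less) is made thicker than the blue-absorbing layer 23A (500 nm or less).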
- FIG. 6 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1A) according to Modification 1 of the above embodiment.
- like the light receiving element 1A, photoelectric conversion layers 23 having different widths (widths W3 and W4) may be provided. Except for this point, the light receiving element 1A has the same configuration and effects as the light receiving element 1.
- the width W4 of the photoelectric conversion layer 23D is larger than the width W3 of the photoelectric conversion layer 23C.
- the widths of the photoelectric conversion layers 23A and 23B are substantially the same as the width W3, and the width of the photoelectric conversion layer 23E is larger than the width W4.
- the photoelectric conversion layer 23C and the photoelectric conversion layer 23D have, for example, different sizes in plan view and different lengths (sizes in the direction orthogonal to the widths W3 and W4).
- the photoelectric conversion layer 23C and the photoelectric conversion layer 23D may be different in only one of the widths W3, W4 and the length.
- FIG. 7 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1B) according to Modification 2.
- in the above embodiment, the surface on the ROIC substrate 11 side (specifically, the contact surface of the first contact layer 22 with the first electrode 21) is flat, but the surface on the light incident side may be flat instead.
- specifically, the contact surfaces of the second contact layers 24 with the second electrode 25 may lie on the same plane across the pixels P. That is, in the light receiving element 1B, the contact surfaces of the plurality of second contact layers 24 with the second electrode 25 form a single plane. Except for this point, the light receiving element 1B has the same configuration and effects as the light receiving element 1.
- the light receiving element 1B may have an on-chip lens (on-chip lens 17).
- the on-chip lens 17 is provided on the color filter layer 15 via a passivation film 16.
- the focus design of the on-chip lens 17 is easy, and the on-chip lens 17 can be easily formed.
- the light receiving element 1B can be manufactured, for example, as follows.
- FIGS. 9A to 10C show the manufacturing steps of the light receiving element 1B in the order of steps, for regions corresponding to the pixels P1 to P3.
- first, openings (openings 13A to 13C corresponding to the pixels P1 to P3) are formed in regions of the insulating film 13 corresponding to the respective pixels P, and the second contact layer 24 is formed in the openings (FIG. 9A). At this time, by making the depth of the portion a2 the same between the pixels P, the contact surfaces of the second contact layers 24 with the second electrode 25 are arranged on the same plane across the pixels P.
- a photoelectric conversion layer 23 is formed in each opening (openings 13A to 13C) (FIG. 9B).
- the photoelectric conversion layers 23A to 23C are formed, for example, by epitaxially growing InGaAs (indium gallium arsenide) and then adjusting the thickness between the pixels P by etching.
- the first contact layer 22 and the first electrode 21 are formed on the photoelectric conversion layer 23 in this order. Subsequently, after forming the protective film 12 and the through electrode 12E, the through electrode 12E is bonded to the electrode of the ROIC substrate 11 as shown in FIG. 10A.
- the substrate 31 is thinned, and the thinned substrate 31 and the buffer layer 32 are removed by, for example, etching to expose the surface of the second contact layer 24 (FIG. 10B).
- the second electrode 25, the passivation film 14 and the color filter layer 15 are formed in this order to complete the light receiving element 1B shown in FIG. 10C.
- the surface on the light incident surface side may be flat between the pixels P, and in this case as well, the same effect as in the above embodiment can be obtained.
- the focus design of the on-chip lens 17 is facilitated.
- FIG. 11 illustrates a cross-sectional configuration of the pixel P5 of the light receiving element (light receiving element 1C) according to Modification 3.
- in the light receiving element 1C, another photoelectric conversion layer (photoelectric conversion layer 23EA) may be stacked in the thickness direction of the photoelectric conversion layer 23E.
- vertical spectroscopy can thereby be performed within a pixel.
- except for this point, the light receiving element 1C has the same configuration and effects as the light receiving element 1.
- the photoelectric conversion layer 23EA (third photoelectric conversion layer) is stacked in the thickness direction of the photoelectric conversion layer 23E, and is provided at a position partially overlapping the photoelectric conversion layer 23E in plan view.
- the photoelectric conversion layer 23EA is made of an inorganic semiconductor material different from the photoelectric conversion layer 23E.
- the photoelectric conversion layer 23EA mainly converts light having a wavelength in the short infrared region and is made of InGaAs (indium gallium arsenide).
- for example, two photoelectric conversion layers 23EA are provided in the pixel P5, arranged at the same position in the thickness direction.
- One photoelectric conversion layer 23EA may be provided in the pixel P5, or three or more photoelectric conversion layers 23EA may be provided.
- the first electrode 21A is provided on the surface of the photoelectric conversion layer 23EA facing the ROIC substrate 11, and the first electrode 21A is connected to the ROIC substrate 11 through the through electrode 12EA in the insulating film 13.
- a first contact layer 22A is provided between the photoelectric conversion layer 23EA and the first electrode 21A.
- a second contact layer 24A and a second electrode 25 are stacked in this order on the light incident surface of the photoelectric conversion layer 23EA.
- FIG. 12 shows one process when manufacturing the light receiving element 1C.
- the light receiving element 1C can be formed in the same manner as described in the above embodiment.
- in the light receiving element 1C, within one pixel P5, for example, light L1 having a wavelength in the mid-infrared region is photoelectrically converted by the photoelectric conversion layer 23E, and light L2 having a wavelength in the short infrared region is photoelectrically converted by the photoelectric conversion layer 23EA.
- a plurality of photoelectric conversion layers may be provided in the stacking direction in one pixel P. Even in such a case, an effect equivalent to that of the first embodiment can be obtained. In addition, since vertical spectroscopy can be performed within one pixel P, the pixel P can be easily miniaturized.
- FIG. 11 shows the case where the photoelectric conversion layer 23EA is provided in the pixel P5
- the photoelectric conversion layer 23EA may be provided in the other pixels P (for example, the pixels P1 to P4) together with the pixel P5.
- the photoelectric conversion layer 23EA may be provided in another pixel P without providing the photoelectric conversion layer 23EA in the pixel P5.
- FIG. 14 shows a functional configuration of the imaging element 2 using the element structure of the light receiving element 1 (or the light receiving elements 1A to 1C, hereinafter collectively referred to as the light receiving element 1) described in the above-described embodiment and the like.
- the imaging element 2 is, for example, an infrared image sensor, and includes, for example, a pixel unit 10P including the light receiving element 1 and a circuit unit 20 that drives the pixel unit 10P.
- the circuit unit 20 includes, for example, a row scanning unit 131, a horizontal selection unit 133, a column scanning unit 134, and a system control unit 132.
- the pixel unit 10P has, for example, a plurality of pixels P (light receiving elements 1) arranged two-dimensionally in a matrix.
- a pixel drive line Lread (for example, a row selection line and a reset control line) is wired for each pixel row, and a vertical signal line Lsig is wired for each pixel column.
- the pixel drive line Lread transmits a drive signal for reading a signal from the pixel P.
- One end of the pixel drive line Lread is connected to an output end corresponding to each row of the row scanning unit 131.
- the row scanning unit 131 includes a shift register, an address decoder, and the like, and is a pixel driving unit that drives each pixel P of the pixel unit 10P in units of rows, for example.
- a signal output from each pixel P in the pixel row selected and scanned by the row scanning unit 131 is supplied to the horizontal selection unit 133 through each of the vertical signal lines Lsig.
- the horizontal selection unit 133 is configured by an amplifier, a horizontal selection switch, and the like provided for each vertical signal line Lsig.
- the column scanning unit 134 includes a shift register, an address decoder, and the like, and drives the horizontal selection switches in the horizontal selection unit 133 in order while scanning. By the selective scanning by the column scanning unit 134, the signal of each pixel transmitted through each of the vertical signal lines Lsig is sequentially output to the horizontal signal line 135 and is input to the signal processing unit (not shown) through the horizontal signal line 135.
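The scanning described above can be summarized as a nested loop: row selection by the row scanning unit, then sequential output of each vertical signal line by the column scanning unit. The function below is a hypothetical sketch of the resulting output order, not circuitry from the patent:

```python
def readout_order(n_rows, n_cols):
    """Output sequence of pixel signals: the row scanning unit (131) selects
    one pixel row at a time; within the selected row, the column scanning
    unit (134) transfers each vertical signal line Lsig in turn onto the
    horizontal signal line (135). Returns the (row, col) sequence."""
    return [(r, c) for r in range(n_rows) for c in range(n_cols)]
```

For a 2-row, 3-column array this yields row 0's three columns in order, then row 1's.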
- in the imaging element 2, for example, a substrate 2A having the pixel unit 10P and a substrate 2B having the circuit unit 20 are stacked.
- the configuration is not limited to this, and the circuit unit 20 may be formed on the same substrate as the pixel unit 10P, or may be provided in an external control IC.
- the circuit unit 20 may be formed on another substrate connected by a cable or the like.
- the system control unit 132 receives a clock given from the outside, data for instructing an operation mode, and the like, and outputs data such as internal information of the image sensor 2.
- the system control unit 132 further includes a timing generator that generates various timing signals, and performs drive control of the row scanning unit 131, the horizontal selection unit 133, the column scanning unit 134, and the like based on the various timing signals generated by the timing generator.
- FIG. 16 shows a schematic configuration of an electronic apparatus 3 (camera) as an example.
- the electronic device 3 is a camera capable of taking a still image or a moving image, for example.
- the electronic device 3 includes the imaging element 2, an optical system (optical lens) 310, a shutter device 311, a drive unit 313 that drives the imaging element 2 and the shutter device 311, and a signal processing unit 312.
- the optical system 310 guides image light (incident light) from the subject to the image sensor 2.
- the optical system 310 may be composed of a plurality of optical lenses.
- the shutter device 311 controls the light irradiation period and the light shielding period to the image sensor 2.
- the drive unit 313 controls the transfer operation of the image sensor 2 and the shutter operation of the shutter device 311.
- the signal processing unit 312 performs various signal processing on the signal output from the image sensor 2.
- the video signal Dout after the signal processing is stored in a storage medium such as a memory, or is output to a monitor or the like.
- the light receiving element 1 described in the present embodiment and the like can be applied to the following electronic devices (capsule endoscopes and moving bodies such as vehicles).
- ⁇ Application example 1 (endoscopic surgery system)>
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 17 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology (present technology) according to the present disclosure can be applied.
- FIG. 17 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
- the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
- in the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may be configured as a so-called flexible endoscope having a flexible lens barrel.
- An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
- a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
- the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
- the image signal is transmitted to a camera control unit (CCU) 11201 as RAW data.
- the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
- the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
- the light source device 11203 includes a light source such as an LED (light emitting diode), and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100.
- the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
- the pneumoperitoneum device 11206 passes gas into the body cavity via the pneumoperitoneum tube 11111.
- the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
- the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
- the light source device 11203, which supplies irradiation light to the endoscope 11100 when the surgical site is imaged, can be configured by, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
- when a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image.
- the driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and then combining those images, an image with a high dynamic range free from so-called blocked-up shadows and blown-out highlights can be generated.
- the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
- in special light observation, for example, by utilizing the wavelength dependence of light absorption in body tissue, so-called narrow band imaging is performed, in which light in a band narrower than the irradiation light during normal observation (that is, white light) is irradiated and a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast.
- fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
- in fluorescence observation, the body tissue is irradiated with excitation light to observe fluorescence from the body tissue (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally administered to the body tissue and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
- the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
- FIG. 18 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
- the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
- the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
- the number of imaging elements constituting the imaging unit 11402 may be one (so-called single-plate type) or more than one (so-called multi-plate type).
- in the case of the multi-plate type, image signals corresponding to RGB may be generated by the respective imaging elements and combined to obtain a color image.
- the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
- a plurality of lens units 11401 can be provided corresponding to each imaging element.
- the imaging unit 11402 is not necessarily provided in the camera head 11102.
- the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
- the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
- the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
- the control signal includes information about imaging conditions, such as information for designating the frame rate of the captured image, information for designating the exposure value at the time of imaging, and/or information for designating the magnification and focus of the captured image.
- the imaging conditions such as the frame rate, exposure value, magnification, and focus may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal.
- in the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
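As an illustration of the automatic setting mentioned above, an AE loop can be sketched as a simple feedback update of the exposure value until the measured image brightness reaches a target. The proportional update rule, the target value, and all names are illustrative assumptions, not the CCU's actual algorithm.

```python
def auto_exposure(measure_brightness, exposure=1.0, target=0.5,
                  gain=0.8, iterations=20):
    """Iteratively adjust exposure so measured brightness nears target.

    measure_brightness -- callable mapping an exposure value to the mean
                          normalized brightness of the resulting image
    """
    for _ in range(iterations):
        brightness = measure_brightness(exposure)
        if abs(brightness - target) < 0.01:
            break
        # multiplicative update: brightness is roughly proportional
        # to exposure, so nudge exposure toward the target
        exposure *= 1.0 + gain * (target - brightness)
    return exposure

# Toy scene model: brightness grows linearly with exposure, then clips.
scene = lambda e: min(1.0, 0.25 * e)
e = auto_exposure(scene)  # converges near e = 2.0 for this scene
```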
- the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
- the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
- the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
- the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
- the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
- control unit 11413 causes the display device 11202 to display a picked-up image showing the surgical part or the like based on the image signal subjected to the image processing by the image processing unit 11412.
- the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
- for example, by detecting the shape, color, and the like of the edges of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like.
- the control unit 11413 may superimpose various types of surgery support information on the image of the surgical site using the recognition results. By superimposing the surgery support information and presenting it to the operator 11131, the burden on the operator 11131 can be reduced, and the operator 11131 can proceed with the surgery reliably.
- the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
- communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 19 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
- a microcomputer 12051, a sound image output unit 12052, and an in-vehicle network I / F (Interface) 12053 are illustrated as a functional configuration of the integrated control unit 12050.
- the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
- the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
- the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
- the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
- the body system control unit 12020 can receive radio waves transmitted from a portable device that substitutes for a key, or signals from various switches.
- the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
- the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
- the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
- the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
- the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on a road surface, and the like based on the received image.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
- the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
- the vehicle interior information detection unit 12040 detects vehicle interior information.
- a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that images the driver. Based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate a control target value of the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and output a control command to the drive system control unit 12010.
- the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including collision avoidance or impact mitigation for the vehicle, follow-up traveling based on the inter-vehicle distance, constant-speed traveling, vehicle collision warning, and vehicle lane departure warning.
- the microcomputer 12051 can perform cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
- the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle exterior information detection unit 12030.
- for example, the microcomputer 12051 can perform cooperative control aimed at preventing glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
- the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
- the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
- FIG. 20 is a diagram illustrating an example of an installation position of the imaging unit 12031.
- the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
- the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
- the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
- the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
- the imaging unit 12105 provided on the upper part of the windshield in the passenger compartment is mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
- FIG. 20 shows an example of the shooting range of the imaging units 12101 to 12104.
- the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided in the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can determine the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the traveling path of the vehicle 12100 that is traveling at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100.
- furthermore, the microcomputer 12051 can set in advance an inter-vehicle distance to be secured behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
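The follow-up control just described can be sketched as a simple longitudinal controller: command acceleration or braking from the gap to the preceding vehicle and the relative speed. The proportional-derivative form, the gains, and all names are illustrative assumptions, not the patent's control law.

```python
def follow_control(gap_m, rel_speed_mps, target_gap_m=30.0,
                   k_gap=0.08, k_speed=0.5, accel_limit=2.0):
    """Return a longitudinal acceleration command [m/s^2].

    gap_m         -- measured distance to the preceding vehicle
    rel_speed_mps -- preceding-vehicle speed minus own speed
                     (negative means we are closing in)
    """
    accel = k_gap * (gap_m - target_gap_m) + k_speed * rel_speed_mps
    # clamp to a comfortable range; negative values mean braking
    return max(-accel_limit, min(accel_limit, accel))

# Too close and closing fast -> brake (follow-up stop behaviour).
brake_cmd = follow_control(gap_m=15.0, rel_speed_mps=-3.0)
# Large gap at matched speed -> accelerate (follow-up start behaviour).
accel_cmd = follow_control(gap_m=60.0, rel_speed_mps=0.0)
```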
- based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data related to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract the data, and use it for automatic avoidance of obstacles.
- for example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
- the microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
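One way to picture the risk-based decision above is to derive a risk score from the time-to-collision and compare it against a set value to choose between a warning and forced deceleration. The thresholds, the risk mapping, and all names are illustrative assumptions.

```python
def collision_risk(distance_m, closing_speed_mps):
    """Risk in [0, 1]; higher means a collision is more imminent."""
    if closing_speed_mps <= 0.0:          # not closing in -> no risk
        return 0.0
    ttc = distance_m / closing_speed_mps  # time to collision [s]
    return min(1.0, 2.0 / ttc)            # ttc <= 2 s saturates to 1.0

def assist_action(distance_m, closing_speed_mps, set_value=0.5):
    """Pick the driving-assistance response for one obstacle."""
    risk = collision_risk(distance_m, closing_speed_mps)
    if risk >= 1.0:
        return "forced_deceleration"  # via drive system control unit 12010
    if risk >= set_value:
        return "warn_driver"          # via speaker 12061 / display 12062
    return "none"

far_action  = assist_action(50.0, 5.0)   # 10 s to collision
near_action = assist_action(12.0, 4.0)   # 3 s to collision
urgent      = assist_action(6.0, 6.0)    # 1 s to collision
```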
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure for extracting feature points in the captured images of the imaging units 12101 to 12104 used as infrared cameras, and a procedure for performing pattern matching processing on the series of feature points indicating the contour of an object to determine whether the object is a pedestrian.
- when the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
- the audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
- the technology according to the present disclosure can be applied to the imaging unit 12031 among the configurations described above.
- by applying the technology according to the present disclosure to the imaging unit 12031, a captured image that is easier to see can be obtained, and driver fatigue can thereby be reduced.
- the light receiving element 1 described in this embodiment and the like can also be applied to electronic devices such as surveillance cameras, biometric authentication systems, and thermography.
- the surveillance camera is, for example, a night vision system (night vision).
- when the light receiving element 1 is applied as an in-vehicle camera, it is less susceptible to headlights and weather; for example, a captured image can be obtained without being affected by smoke, fog, and the like, and the shape of an object can be recognized.
- non-contact temperature measurement is possible with thermography, which can also detect temperature distribution and heat generation.
- in addition, the light receiving element 1 can be applied to electronic devices that detect flame, moisture, gas, and the like.
- the layer configuration of the light receiving element described in the above embodiment is an example, and other layers may be provided.
- the material and thickness of each layer are examples, and are not limited to those described above.
- in the above embodiment, the case where the first electrode 21 and the first contact layer 22 are in contact with each other and the second contact layer 24 and the second electrode 25 are in contact with each other has been described; however, another layer may be provided between the first electrode 21 and the first contact layer 22, or between the second contact layer 24 and the second electrode 25.
- for example, the first contact layer 22 may include an n-type impurity, and the second contact layer 24 may include a p-type impurity.
- the present disclosure may be configured as follows.
- a light receiving element including: a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view; an insulating film that separates the plurality of photoelectric conversion layers from one another; a first inorganic semiconductor material included in the first photoelectric conversion layer; and a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
- further including a third photoelectric conversion layer that is provided in the thickness direction of the first photoelectric conversion layer and overlaps a part of the first photoelectric conversion layer in plan view.
- at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light having a wavelength in the infrared region and generate charge; the light receiving element according to any one of (1) to (3).
- at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light having a wavelength in the visible region and generate charge.
- at least one of the first inorganic semiconductor material and the second inorganic semiconductor material is any one of Ge, InGaAs, Ex.InGaAs, InAsSb, InAs, InSb, and HgCdTe; the light receiving element according to any one of (1) to (5).
- a first electrode electrically connected to each of the first photoelectric conversion layer and the second photoelectric conversion layer, and a ROIC (readout integrated circuit) substrate electrically connected to each first electrode.
- the light receiving element according to (7) further including a first contact layer provided between the first electrode and each of the first photoelectric conversion layer and the second photoelectric conversion layer.
- the light receiving element according to (8) wherein contact surfaces of the plurality of first contact layers with the first electrode are provided on the same plane.
- the second opening is covered with a hard mask when the first inorganic semiconductor material is epitaxially grown in the first opening, and the first opening is covered with a hard mask when the second inorganic semiconductor material is epitaxially grown in the second opening.
- An imaging device comprising: a second inorganic semiconductor material that is included in the second photoelectric conversion layer and is different from the first inorganic semiconductor material.
- an electronic apparatus having an imaging element that includes: a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view; an insulating film that separates the plurality of photoelectric conversion layers from one another; a first inorganic semiconductor material included in the first photoelectric conversion layer; and a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Power Engineering (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Condensed Matter Physics & Semiconductors (AREA)
- Computer Hardware Design (AREA)
- Microelectronics & Electronic Packaging (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Solid State Image Pick-Up Elements (AREA)
- Light Receiving Elements (AREA)
- Transforming Light Signals Into Electric Signals (AREA)
Abstract
Description
1. Embodiment (example of a light receiving element having photoelectric conversion layers made of mutually different inorganic semiconductor materials)
2. Modification 1 (example having photoelectric conversion layers of mutually different sizes in plan view)
3. Modification 2 (example in which the light incident surface side is flat)
4. Modification 3 (example of vertical spectral separation)
5. Application example 1 (example of an imaging element)
6. Application example 2 (example of an electronic apparatus)
7. Practical application example 1 (application to an endoscopic surgery system)
8. Practical application example 2 (application to a mobile body)
[Configuration]
FIG. 1 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1) according to an embodiment of the present disclosure. The light receiving element 1 is applied to, for example, an infrared sensor using an inorganic semiconductor material such as a III-V semiconductor, and includes, for example, a plurality of two-dimensionally arranged light receiving unit regions P (pixels P1, P2, P3, P4, P5, ..., Pn). FIG. 1 shows the cross-sectional configuration of a portion corresponding to five pixels P (pixels P1 to P5).
The light receiving element 1 can be manufactured, for example, as follows. FIGS. 2A to 3C illustrate the manufacturing steps of the light receiving element 1 in order. FIGS. 2A to 3C show the regions corresponding to pixels P3 to P5.
In the light receiving element 1, when light (for example, light of wavelengths in the visible and infrared regions) enters the photoelectric conversion layer 23 through the color filter layer 15, the passivation film 14, the second electrode 25, and the second contact layer 24, the light is absorbed in the photoelectric conversion layer 23. As a result, pairs of holes and electrons are generated in the photoelectric conversion layer 23 (the light is photoelectrically converted). At this time, when a predetermined voltage is applied to the first electrode 21, for example, a potential gradient is generated in the photoelectric conversion layer 23, and one of the generated charge types (for example, holes) moves as a signal charge to the first contact layer 22 and is collected from the first contact layer 22 to the first electrode 21. This signal charge is read out by the ROIC substrate 11.
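The readout described above can be pictured with a back-of-the-envelope model: photons absorbed in the photoelectric conversion layer 23 generate hole-electron pairs, and the collected signal charge is the photon flux times the quantum efficiency times the integration time times the elementary charge. The quantum efficiency and flux values below are illustrative assumptions, not values from the document.

```python
Q_E = 1.602e-19  # elementary charge [C]

def signal_electrons(photon_flux, quantum_eff, integration_time):
    """Number of collected carriers = photons/s * QE * time."""
    return photon_flux * quantum_eff * integration_time

def signal_charge(photon_flux, quantum_eff, integration_time):
    """Collected charge [C] read out by the ROIC for one pixel."""
    return signal_electrons(photon_flux, quantum_eff, integration_time) * Q_E

# Example: 1e12 photons/s on a pixel, 70 % quantum efficiency,
# 10 ms integration time.
n = signal_electrons(1e12, 0.7, 0.01)   # number of carriers
q = signal_charge(1e12, 0.7, 0.01)      # charge in coulombs
```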
In the light receiving element 1 of the present embodiment, the photoelectric conversion layers 23A to 23D of the pixels P1 to P4 and the photoelectric conversion layer 23E of the pixel P5 are made of mutually different inorganic semiconductor materials. The photoelectric conversion layers 23A to 23D can also be adjusted to mutually different thicknesses. This makes it easy to set the photoelectrically convertible wavelength band for each of the photoelectric conversion layers 23A to 23E (pixels P1 to P5). For example, the element can be configured so that the photoelectric conversion layer 23A (pixel P1) photoelectrically converts light in the blue wavelength range, the photoelectric conversion layer 23B (pixel P2) light in the green wavelength range, the photoelectric conversion layer 23C (pixel P3) light in the red wavelength range, the photoelectric conversion layer 23D (pixel P4) light of wavelengths in the short-wavelength infrared region, and the photoelectric conversion layer 23E (pixel P5) light of wavelengths in the mid-wavelength infrared region. This is described below.
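The per-pixel wavelength bands follow from the bandgaps of the chosen materials: a semiconductor absorbs light only up to the cutoff wavelength set by its bandgap, roughly lambda_c [um] = 1.24 / Eg [eV]. The sketch below illustrates this for some of the materials named in the document; the bandgap values are approximate room-temperature literature figures, not values specified here.

```python
# Approximate room-temperature bandgaps [eV] (illustrative literature values).
BANDGAP_EV = {
    "Ge":     0.66,
    "InGaAs": 0.75,   # lattice-matched In0.53Ga0.47As
    "InAs":   0.354,
    "InSb":   0.17,
}

def cutoff_um(material):
    """Longest detectable wavelength [um]: lambda_c = 1.24 / Eg."""
    return 1.24 / BANDGAP_EV[material]

def detects(material, wavelength_um):
    """A photon is absorbed only if its wavelength is below the cutoff."""
    return wavelength_um <= cutoff_um(material)

# InGaAs covers the short-wave IR (cutoff ~1.65 um), while InSb reaches
# into the mid-wave IR (cutoff ~7 um) -- matching the idea of assigning
# different materials to, e.g., pixels P4 and P5.
```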
FIG. 6 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1A) according to Modification 1 of the above embodiment. As in the light receiving element 1A, photoelectric conversion layers 23 of mutually different widths (widths W3 and W4) may be provided. Except for this point, the light receiving element 1A has the same configuration and effects as the light receiving element 1.
FIG. 7 illustrates a cross-sectional configuration of a light receiving element (light receiving element 1B) according to Modification 2. In the above embodiment, the case where the surface on the ROIC substrate 11 side (specifically, the contact surface of the first contact layer 22 with the first electrode 21) is flat has been exemplified, but the surface on the light incident side may be flat instead. Specifically, as in the light receiving element 1B, the contact surfaces of the second contact layers 24 with the second electrode 25 may be provided on the same plane across the pixels P. That is, in the light receiving element 1B, the contact surfaces of the plurality of second contact layers 24 with the second electrode 25 constitute the same plane. Except for this point, the light receiving element 1B has the same configuration and effects as the light receiving element 1.
FIG. 11 illustrates the cross-sectional configuration of the pixel P5 of a light receiving element (light receiving element 1C) according to Modification 3. As in this modification, another photoelectric conversion layer (photoelectric conversion layer 23EA) may be stacked in the thickness direction of the photoelectric conversion layer 23E. Such a light receiving element 1C enables vertical spectral separation. Except for this point, the light receiving element 1C has the same configuration and effects as the light receiving element 1.
FIG. 14 illustrates the functional configuration of an imaging element 2 using the element structure of the light receiving element 1 (or the light receiving elements 1A to 1C, hereinafter collectively referred to as the light receiving element 1) described in the above embodiment and modifications. The imaging element 2 is, for example, an infrared image sensor, and has, for example, a pixel section 10P including the light receiving elements 1 and a circuit section 20 that drives the pixel section 10P. The circuit section 20 has, for example, a row scanning section 131, a horizontal selection section 133, a column scanning section 134, and a system control section 132.
The above-described imaging element 2 can be applied to various types of electronic apparatuses, for example a camera capable of imaging in the infrared region. FIG. 16 shows a schematic configuration of an electronic apparatus 3 (camera) as one example. The electronic apparatus 3 is, for example, a camera capable of shooting still images or moving images, and has the imaging element 2, an optical system (optical lens) 310, a shutter device 311, a drive section 313 that drives the imaging element 2 and the shutter device 311, and a signal processing section 312.
The technology according to the present disclosure can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
The technology according to the present disclosure can also be applied to other products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
(1)
A light receiving element including:
a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view;
an insulating film that separates the plurality of photoelectric conversion layers from one another;
a first inorganic semiconductor material included in the first photoelectric conversion layer; and
a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
(2)
The light receiving element according to (1), in which the thickness of the first photoelectric conversion layer and the thickness of the second photoelectric conversion layer differ from each other.
(3)
The light receiving element according to (1) or (2), further including a third photoelectric conversion layer that is provided in the thickness direction of the first photoelectric conversion layer and overlaps a part of the first photoelectric conversion layer in plan view,
in which the third photoelectric conversion layer includes a third inorganic semiconductor material different from the first inorganic semiconductor material.
(4)
The light receiving element according to any one of (1) to (3), in which at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light of a wavelength in the infrared region and generate charge.
(5)
The light receiving element according to any one of (1) to (4), in which at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light of a wavelength in the visible region and generate charge.
(6)
The light receiving element according to any one of (1) to (5), in which at least one of the first inorganic semiconductor material and the second inorganic semiconductor material is any one of Ge, InGaAs, Ex.InGaAs, InAsSb, InAs, InSb, and HgCdTe.
(7)
The light receiving element according to any one of (1) to (6), further including:
first electrodes electrically connected to the first photoelectric conversion layer and the second photoelectric conversion layer, respectively; and
a ROIC (readout integrated circuit) substrate electrically connected to each of the first electrodes.
(8)
The light receiving element according to (7), further including a first contact layer provided between the first electrode and each of the first photoelectric conversion layer and the second photoelectric conversion layer.
(9)
The light receiving element according to (8), in which contact surfaces of the plurality of first contact layers with the first electrode are provided on the same plane.
(10)
The light receiving element according to any one of (7) to (9), further including a second electrode opposed to the first electrode with each of the first photoelectric conversion layer and the second photoelectric conversion layer in between.
(11)
The light receiving element according to (10), further including a second contact layer provided between the second electrode and each of the first photoelectric conversion layer and the second photoelectric conversion layer.
(12)
The light receiving element according to (11), in which contact surfaces of the plurality of second contact layers with the second electrode are provided on the same plane.
(13)
The light receiving element according to any one of (10) to (12), in which the second electrode is provided in common to the first photoelectric conversion layer and the second photoelectric conversion layer.
(14)
The light receiving element according to any one of (1) to (13), in which, in plan view, the size of the first photoelectric conversion layer and the size of the second photoelectric conversion layer differ from each other.
(15)
A method of manufacturing a light receiving element, the method including, for a plurality of photoelectric conversion layers that are arranged in different regions in plan view and separated from one another by an insulating film:
forming a first photoelectric conversion layer so as to contain a first inorganic semiconductor material; and
forming a second photoelectric conversion layer so as to contain a second inorganic semiconductor material different from the first inorganic semiconductor material.
(16)
The method of manufacturing a light receiving element according to (15), in which the first photoelectric conversion layer and the second photoelectric conversion layer are formed by
forming the insulating film having a first opening and a second opening on a substrate, and
epitaxially growing the first inorganic semiconductor material in the first opening and the second inorganic semiconductor material in the second opening.
(17)
The method of manufacturing a light receiving element according to (16), in which the second opening is covered with a hard mask when the first inorganic semiconductor material is epitaxially grown in the first opening, and the first opening is covered with a hard mask when the second inorganic semiconductor material is epitaxially grown in the second opening.
(18)
An imaging element including:
a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view;
an insulating film that separates the plurality of photoelectric conversion layers from one another;
a first inorganic semiconductor material included in the first photoelectric conversion layer; and
a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
(19)
An electronic apparatus having an imaging element, the imaging element including:
a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view;
an insulating film that separates the plurality of photoelectric conversion layers from one another;
a first inorganic semiconductor material included in the first photoelectric conversion layer; and
a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
Claims (19)
- A light receiving element including: a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view; an insulating film that separates the plurality of photoelectric conversion layers from one another; a first inorganic semiconductor material included in the first photoelectric conversion layer; and a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
- The light receiving element according to claim 1, in which the thickness of the first photoelectric conversion layer and the thickness of the second photoelectric conversion layer differ from each other.
- The light receiving element according to claim 1, further including a third photoelectric conversion layer that is provided in the thickness direction of the first photoelectric conversion layer and overlaps a part of the first photoelectric conversion layer in plan view, in which the third photoelectric conversion layer includes a third inorganic semiconductor material different from the first inorganic semiconductor material.
- The light receiving element according to claim 1, in which at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light of a wavelength in the infrared region and generate charge.
- The light receiving element according to claim 1, in which at least one of the first photoelectric conversion layer and the second photoelectric conversion layer is configured to absorb light of a wavelength in the visible region and generate charge.
- The light receiving element according to claim 1, in which at least one of the first inorganic semiconductor material and the second inorganic semiconductor material is any one of Ge, InGaAs, Ex.InGaAs, InAsSb, InAs, InSb, and HgCdTe.
- The light receiving element according to claim 1, further including: first electrodes electrically connected to the first photoelectric conversion layer and the second photoelectric conversion layer, respectively; and a ROIC (readout integrated circuit) substrate electrically connected to each of the first electrodes.
- The light receiving element according to claim 7, further including a first contact layer provided between the first electrode and each of the first photoelectric conversion layer and the second photoelectric conversion layer.
- The light receiving element according to claim 8, in which contact surfaces of the plurality of first contact layers with the first electrode are provided on the same plane.
- The light receiving element according to claim 7, further including a second electrode opposed to the first electrode with each of the first photoelectric conversion layer and the second photoelectric conversion layer in between.
- The light receiving element according to claim 10, further including a second contact layer provided between the second electrode and each of the first photoelectric conversion layer and the second photoelectric conversion layer.
- The light receiving element according to claim 11, in which contact surfaces of the plurality of second contact layers with the second electrode are provided on the same plane.
- The light receiving element according to claim 10, in which the second electrode is provided in common to the first photoelectric conversion layer and the second photoelectric conversion layer.
- The light receiving element according to claim 1, in which, in plan view, the size of the first photoelectric conversion layer and the size of the second photoelectric conversion layer differ from each other.
- A method of manufacturing a light receiving element, the method including, for a plurality of photoelectric conversion layers that are arranged in different regions in plan view and separated from one another by an insulating film: forming a first photoelectric conversion layer so as to contain a first inorganic semiconductor material; and forming a second photoelectric conversion layer so as to contain a second inorganic semiconductor material different from the first inorganic semiconductor material.
- The method of manufacturing a light receiving element according to claim 15, in which the first photoelectric conversion layer and the second photoelectric conversion layer are formed by forming the insulating film having a first opening and a second opening on a substrate, and epitaxially growing the first inorganic semiconductor material in the first opening and the second inorganic semiconductor material in the second opening.
- The method of manufacturing a light receiving element according to claim 16, in which the second opening is covered with a hard mask when the first inorganic semiconductor material is epitaxially grown in the first opening, and the first opening is covered with a hard mask when the second inorganic semiconductor material is epitaxially grown in the second opening.
- An imaging element including: a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view; an insulating film that separates the plurality of photoelectric conversion layers from one another; a first inorganic semiconductor material included in the first photoelectric conversion layer; and a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
- An electronic apparatus having an imaging element, the imaging element including: a plurality of photoelectric conversion layers, including a first photoelectric conversion layer and a second photoelectric conversion layer, arranged in mutually different regions in plan view; an insulating film that separates the plurality of photoelectric conversion layers from one another; a first inorganic semiconductor material included in the first photoelectric conversion layer; and a second inorganic semiconductor material that is included in the second photoelectric conversion layer and differs from the first inorganic semiconductor material.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020197018328A KR102498922B1 (ko) | 2017-01-24 | 2017-12-19 | 수광 소자, 수광 소자의 제조 방법, 촬상 소자 및 전자 기기 |
KR1020237001688A KR102609022B1 (ko) | 2017-01-24 | 2017-12-19 | 수광 소자, 수광 소자의 제조 방법, 촬상 소자 및 전자 기기 |
CN202410126139.4A CN118173566A (zh) | 2017-01-24 | 2017-12-19 | 光接收器件 |
US16/477,969 US20200127039A1 (en) | 2017-01-24 | 2017-12-19 | Light-receiving device, method of manufacturing light receiving device, imaging device, and electronic apparatus |
DE112017006908.4T DE112017006908T5 (de) | 2017-01-24 | 2017-12-19 | Lichtempfangselement, verfahren zum produzieren eines lichtempfangselements, bildgebungselement und elektronische vorrichtung |
JP2018564163A JP6979974B2 (ja) | 2017-01-24 | 2017-12-19 | 受光素子の製造方法 |
CN201780083783.6A CN110226228B (zh) | 2017-01-24 | 2017-12-19 | 光接收器件、光接收器件制造方法、摄像器件和电子设备 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-010187 | 2017-01-24 | ||
JP2017010187 | 2017-01-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018139110A1 true WO2018139110A1 (ja) | 2018-08-02 |
Family
ID=62979468
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/045422 WO2018139110A1 (ja) | 2017-01-24 | 2017-12-19 | 受光素子、受光素子の製造方法、撮像素子および電子機器 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20200127039A1 (ja) |
JP (1) | JP6979974B2 (ja) |
KR (2) | KR102609022B1 (ja) |
CN (2) | CN110226228B (ja) |
DE (1) | DE112017006908T5 (ja) |
TW (1) | TWI781976B (ja) |
WO (1) | WO2018139110A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020080124A1 (ja) * | 2018-10-16 | 2020-04-23 | ソニーセミコンダクタソリューションズ株式会社 | 半導体素子およびその製造方法 |
WO2021187055A1 (ja) * | 2020-03-17 | 2021-09-23 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像装置及び電子機器 |
WO2021240998A1 (ja) * | 2020-05-26 | 2021-12-02 | ソニーセミコンダクタソリューションズ株式会社 | 固体撮像素子 |
WO2024111195A1 (ja) * | 2022-11-21 | 2024-05-30 | パナソニックIpマネジメント株式会社 | 撮像装置及び撮像装置の製造方法 |
US12027557B2 (en) | 2018-10-16 | 2024-07-02 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI727507B (zh) * | 2019-11-19 | 2021-05-11 | 瑞昱半導體股份有限公司 | 信號處理裝置與信號處理方法 |
CN112953562B (zh) * | 2019-11-26 | 2024-02-09 | 瑞昱半导体股份有限公司 | 信号处理装置与信号处理方法 |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6093893A (ja) * | 1983-10-28 | 1985-05-25 | Toshiba Corp | Color solid-state imaging device |
JPH02275670A (ja) * | 1989-01-18 | 1990-11-09 | Canon Inc | Photoelectric conversion device and image reading device |
JPH09213923A (ja) * | 1996-01-30 | 1997-08-15 | NEC Corp | Solid-state imaging device |
JP2003332551A (ja) * | 2002-05-08 | 2003-11-21 | Canon Inc | Color imaging element and color light receiving element |
JP2007287977A (ja) * | 2006-04-18 | 2007-11-01 | Fujifilm Corp | Method for manufacturing solid-state imaging element, and solid-state imaging element |
JP2011171646A (ja) * | 2010-02-22 | 2011-09-01 | Rohm Co Ltd | Color solid-state imaging device |
JP2011258729A (ja) * | 2010-06-08 | 2011-12-22 | Panasonic Corp | Solid-state imaging device and method for manufacturing the same |
JP2012114160A (ja) * | 2010-11-22 | 2012-06-14 | Panasonic Corp | Solid-state imaging device and method for manufacturing the same |
JP2015149422A (ja) * | 2014-02-07 | 2015-08-20 | Sony Corporation | Light receiving element, imaging element, and imaging device |
JP2016033978A (ja) * | 2014-07-31 | 2016-03-10 | Canon Inc | Photoelectric conversion device and imaging system |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4866656B2 (ja) * | 2006-05-18 | 2012-02-01 | Fujifilm Corporation | Photoelectric conversion film laminated color solid-state imaging device |
JP2010010478A (ja) * | 2008-06-27 | 2010-01-14 | Fujifilm Corp | Photoelectric conversion device, method for manufacturing photoelectric conversion device, and imaging device |
JP5353200B2 (ja) * | 2008-11-20 | 2013-11-27 | Sony Corporation | Solid-state imaging device and imaging apparatus |
TW201119019A (en) * | 2009-04-30 | 2011-06-01 | Corning Inc | CMOS image sensor on stacked semiconductor-on-insulator substrate and process for making same |
JP5564847B2 (ja) * | 2009-07-23 | 2014-08-06 | Sony Corporation | Solid-state imaging device, method for manufacturing the same, and electronic apparatus |
JP2011129873A (ja) * | 2009-11-17 | 2011-06-30 | Sony Corp | Solid-state imaging device, method for manufacturing the same, and electronic apparatus |
JP5509962B2 (ja) * | 2010-03-19 | 2014-06-04 | Sony Corporation | Solid-state imaging device, method for manufacturing the same, and electronic apparatus |
JP2011243704A (ja) * | 2010-05-17 | 2011-12-01 | Panasonic Corp | Solid-state imaging device |
JP5534981B2 (ja) * | 2010-06-30 | 2014-07-02 | Toshiba Corporation | Solid-state imaging device |
JP2013012556A (ja) * | 2011-06-28 | 2013-01-17 | Sony Corp | Solid-state imaging device, method for manufacturing the same, and electronic apparatus |
JP2013131553A (ja) * | 2011-12-20 | 2013-07-04 | Toshiba Corp | Solid-state imaging device |
JP2014127499A (ja) | 2012-12-25 | 2014-07-07 | Sumitomo Electric Ind Ltd | Light receiving device, method for manufacturing the same, and sensing apparatus |
JP2014127545A (ja) * | 2012-12-26 | 2014-07-07 | Sony Corp | Solid-state imaging element and solid-state imaging device including the same |
JP5873847B2 (ja) * | 2013-03-29 | 2016-03-01 | Fujifilm Corporation | Solid-state imaging element and imaging device |
CN104064575B (zh) * | 2014-07-04 | 2017-08-25 | OmniVision Technologies (Shanghai) Co., Ltd. | Backside-illuminated CMOS image sensor and method for manufacturing the same |
WO2016121597A1 (ja) * | 2015-01-29 | 2016-08-04 | Toray Industries, Inc. | Phenanthroline derivative, and electronic device, light emitting element, and photoelectric conversion element containing the same |
JP2016152265A (ja) * | 2015-02-16 | 2016-08-22 | Toshiba Corporation | Solid-state imaging element |
JP2017010187A (ja) | 2015-06-19 | 2017-01-12 | Fuji Xerox Co., Ltd. | Image processing apparatus and image processing program |
2017
- 2017-12-19 WO PCT/JP2017/045422 patent/WO2018139110A1/ja active Application Filing
- 2017-12-19 DE DE112017006908.4T patent/DE112017006908T5/de active Pending
- 2017-12-19 JP JP2018564163A patent/JP6979974B2/ja active Active
- 2017-12-19 CN CN201780083783.6A patent/CN110226228B/zh active Active
- 2017-12-19 CN CN202410126139.4A patent/CN118173566A/zh active Pending
- 2017-12-19 KR KR1020237001688A patent/KR102609022B1/ko active IP Right Grant
- 2017-12-19 KR KR1020197018328A patent/KR102498922B1/ko active IP Right Grant
- 2017-12-19 US US16/477,969 patent/US20200127039A1/en not_active Abandoned
2018
- 2018-01-03 TW TW107100127A patent/TWI781976B/zh active
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020080124A1 (ja) * | 2018-10-16 | 2020-04-23 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
JPWO2020080124A1 (ja) * | 2018-10-16 | 2021-09-30 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
JP7372258B2 (ja) | 2018-10-16 | 2023-10-31 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
TWI821431B (zh) * | 2018-10-16 | 2023-11-11 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
US12027557B2 (en) | 2018-10-16 | 2024-07-02 | Sony Semiconductor Solutions Corporation | Semiconductor element and method of manufacturing the same |
WO2021187055A1 (ja) * | 2020-03-17 | 2021-09-23 | Sony Semiconductor Solutions Corporation | Solid-state imaging device and electronic apparatus |
WO2021240998A1 (ja) * | 2020-05-26 | 2021-12-02 | Sony Semiconductor Solutions Corporation | Solid-state imaging element |
WO2024111195A1 (ja) * | 2022-11-21 | 2024-05-30 | Panasonic IP Management Co., Ltd. | Imaging device and method for manufacturing imaging device |
Also Published As
Publication number | Publication date |
---|---|
TW201841354A (zh) | 2018-11-16 |
TWI781976B (zh) | 2022-11-01 |
US20200127039A1 (en) | 2020-04-23 |
KR102498922B1 (ko) | 2023-02-13 |
KR102609022B1 (ko) | 2023-12-04 |
DE112017006908T5 (de) | 2019-10-02 |
CN118173566A (zh) | 2024-06-11 |
CN110226228B (zh) | 2024-02-20 |
CN110226228A (zh) | 2019-09-10 |
KR20190107663A (ko) | 2019-09-20 |
JP6979974B2 (ja) | 2021-12-15 |
KR20230012667A (ko) | 2023-01-26 |
JPWO2018139110A1 (ja) | 2019-11-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7014785B2 (ja) | Photoelectric conversion element and imaging element | |
US11476285B2 (en) | Light-receiving device, imaging device, and electronic apparatus | |
WO2018194030A1 (ja) | Semiconductor element, method of manufacturing the same, and electronic apparatus | |
JP6979974B2 (ja) | Method for manufacturing light receiving element | |
JP6903896B2 (ja) | Method for manufacturing light receiving element | |
JP2024012345A (ja) | Semiconductor element and method of manufacturing the same | |
EP3404715B1 (en) | Light receiving element, method for manufacturing light receiving element, image capturing element and electronic device | |
WO2020137282A1 (ja) | Imaging element and electronic apparatus | |
CN113366656A (zh) | Light receiving element, method for manufacturing light receiving element, and imaging device | |
US11961863B2 (en) | Imaging element, semiconductor element, and electronic apparatus | |
TWI844563B (zh) | Imaging element, semiconductor element, and electronic apparatus | |
JP7344114B2 (ja) | Imaging element and electronic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | EP: The EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 17893707 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20197018328 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2018564163 Country of ref document: JP Kind code of ref document: A |
|
122 | EP: PCT application non-entry in European phase |
Ref document number: 17893707 Country of ref document: EP Kind code of ref document: A1 |