WO2019207978A1 - Image capture element and method for manufacturing image capture element


Info

Publication number
WO2019207978A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixel
photoelectric conversion
color filter
incident
Application number
PCT/JP2019/009673
Other languages
English (en)
Japanese (ja)
Inventor
Hironobu Fukagawa
Kenji Ikeda
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2019207978A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise
    • H04N25/61 Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L27/144 Devices controlled by radiation
    • H01L27/146 Imager structures

Definitions

  • The present disclosure relates to an image sensor and a method for manufacturing the image sensor. Specifically, it relates to an image sensor having a light shielding region outside the region where pixels are arranged, and to a method for manufacturing such an image sensor.
  • In an imaging apparatus such as a camera, a configuration downsized by arranging the imaging element close to the photographing lens is used.
  • In such an imaging apparatus, a phenomenon is known in which the amount of light incident on peripheral pixels is reduced relative to the amount of light incident on the central pixels of the image sensor, so that the sensitivity of the peripheral pixels of the image sensor decreases. This phenomenon is called shading.
  • Pupil correction is performed to correct this decrease in sensitivity.
  • An on-chip lens is disposed in the above-described pixel, and light from the subject is condensed on the pixel by the on-chip lens. Since the light from the subject is vertically incident on the pixel arranged at the center of the pixel region, the on-chip lens is arranged at the center of the pixel. On the other hand, in the pixels arranged at the peripheral edge of the image sensor, the light from the subject is incident obliquely, so that the on-chip lens is shifted from the center of the pixel toward the center of the image sensor. Thereby, oblique incident light can be condensed on the pixel, and shading can be corrected.
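  • As a rough illustration of this pupil correction, the following Python sketch models the on-chip lens shift as growing linearly with distance from the center of the pixel region; the coefficient k, the function name, and the example coordinates are hypothetical, not values from this publication.

```python
def pupil_correction_shift(px, py, cx, cy, k=0.02):
    """Displacement of an on-chip lens toward the array center (cx, cy).

    A minimal linear model of pupil correction: the farther the pixel
    (px, py) lies from the center, the more obliquely light from the
    subject arrives, so the larger the shift. The coefficient k is a
    hypothetical design parameter, not a value from this publication.
    """
    dx, dy = px - cx, py - cy
    return -k * dx, -k * dy  # negative sign: shift toward the center

print(pupil_correction_shift(2000, 1500, 2000, 1500))  # center pixel: no shift
print(pupil_correction_shift(3900, 1500, 2000, 1500))  # peripheral pixel: shifted 38 px toward the center
```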
  • As an imaging device in which an upper layer film and an on-chip lens are sequentially stacked on a plurality of pixels formed on a semiconductor substrate, an imaging device that changes the correction amount of the on-chip lens position related to pupil correction has been proposed (see, for example, Patent Document 1). In this device, the correction amount of the on-chip lens position is changed according to the distance from the center of the region where the pixels of the image sensor are arranged to the on-chip lens, and according to the film thickness of the upper layer film at the position of the on-chip lens.
  • An insulating film, a color filter, and a planarizing film are disposed between the on-chip lens and the semiconductor substrate.
  • These insulating film, color filter, and planarizing film constitute the above-mentioned upper layer film.
  • The planarization film is a film for flattening the surface of the color filter. Its film thickness is therefore relatively large and changes greatly within the pixel region. Since the above-described conventional technique corrects the decrease in sensitivity only by pupil correction, there is a problem that the decrease in sensitivity cannot be sufficiently corrected when the film thickness of the planarization film or the like changes greatly.
  • The present disclosure has been made in view of the above-described problems, and an object thereof is to reduce the change in sensitivity in pixels arranged at the peripheral portion of the pixel region.
  • The present disclosure has been made in order to solve the above-described problems.
  • The first aspect of the present disclosure is an image sensor including: a pixel region in which pixels are arranged, each pixel including a photoelectric conversion unit that performs photoelectric conversion based on incident light, a light shielding film in which an opening is disposed and which shields the incident light while transmitting it through the opening, a color filter that transmits light of a predetermined wavelength out of the incident light, a planarization film that planarizes the surface of the color filter, and an on-chip lens that is disposed adjacent to the planarization film and condenses the incident light on the photoelectric conversion unit through the opening of the light shielding film; and a light shielding region that is adjacent to the pixel region and in which light-shielded pixels are arranged, each being the pixel including a light shielding film in which no opening is disposed. In this image sensor, the amount of light incident on the photoelectric conversion unit is adjusted in the pixels in the vicinity of the light shielding region.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be further adjusted according to the color filter of the pixel.
  • the amount of light incident on the photoelectric conversion unit in the pixel near the light shielding region may be adjusted by changing the shape of the light shielding film.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be adjusted by changing the shape of the on-chip lens.
  • the amount of light incident on the photoelectric conversion unit in the pixel in the vicinity of the light shielding region may be adjusted by changing the refractive index of the on-chip lens.
  • the amount of incident light on the photoelectric conversion unit in the pixel near the light shielding area may be adjusted by changing the shape of the color filter.
  • The second aspect of the present disclosure is a method for manufacturing an image sensor, including: a pixel forming step of forming a plurality of pixels, the pixel forming step including a photoelectric conversion unit forming step of forming a photoelectric conversion unit that performs photoelectric conversion based on incident light, a first light shielding film forming step of forming a light shielding film in which an opening is disposed and which shields the incident light while transmitting it through the opening, a color filter forming step of forming a color filter that transmits light of a predetermined wavelength, a planarization film forming step of forming a planarization film that planarizes the surface of the color filter, and an on-chip lens forming step of forming an on-chip lens; and a light-shielded pixel forming step of forming light-shielded pixels around the pixel region in which the plurality of formed pixels are arranged, the light-shielded pixel forming step including the photoelectric conversion unit forming step, a second light shielding film forming step of forming a light shielding film in which no opening is disposed, the color filter forming step, the planarization film forming step, and the on-chip lens forming step. The pixel forming step further includes a step of adjusting the amount of light incident on the photoelectric conversion unit in the pixels in the vicinity of the light shielding region where the light-shielded pixels are formed.
  • By adopting such aspects, the amount of light incident on the photoelectric conversion unit is adjusted in the pixels arranged in the vicinity of the boundary between the light shielding region and the pixel region. This brings about the effect that the amount of incident light is adjusted in accordance with the change in sensitivity in the pixels in the vicinity of the light shielding region, which is caused by the change in the film thickness of the planarization film or the like.
  • FIG. 4 is a top view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of the image signal according to the embodiment of the present disclosure.
  • FIG. 7 is a top view illustrating a configuration example of a pixel according to the first embodiment of the present disclosure.
  • FIG. 11 is a top view illustrating a configuration example of a pixel according to the second embodiment of the present disclosure.
  • FIG. 13 is a top view illustrating a configuration example of a pixel according to the third embodiment of the present disclosure.
  • FIG. 1 is a diagram illustrating a configuration example of an imaging element according to an embodiment of the present disclosure.
  • the image pickup device 1 of FIG. 1 includes a pixel array unit 10, a vertical drive unit 20, a column signal processing unit 30, and a control unit 40.
  • the pixel array unit 10 is configured by arranging the pixels 100 in a two-dimensional grid.
  • the pixel 100 generates an image signal corresponding to the irradiated light.
  • the pixel 100 includes a photoelectric conversion unit that generates charges according to the irradiated light.
  • the pixel 100 further includes a pixel circuit. This pixel circuit generates an image signal based on the charges generated by the photoelectric conversion unit. The generation of the image signal is controlled by a control signal generated by the vertical drive unit 20 described later.
  • In the pixel array unit 10, signal lines 21 and 31 are arranged in an XY matrix.
  • the signal line 21 is a signal line that transmits a control signal for the pixel circuit in the pixel 100, and is arranged for each row of the pixel array unit 10 and wired in common to the pixels 100 arranged in each row.
  • the signal line 31 is a signal line that transmits an image signal generated by the pixel circuit of the pixel 100, and is arranged for each column of the pixel array unit 10, and is wired in common to the pixels 100 arranged in each column.
  • the vertical drive unit 20 generates a control signal for the pixel circuit of the pixel 100.
  • the vertical drive unit 20 transmits the generated control signal to the pixel 100 via the signal line 21 shown in FIG.
  • the column signal processing unit 30 processes the image signal generated by the pixel 100.
  • the column signal processing unit 30 processes the image signal transmitted from the pixel 100 via the signal line 31 shown in FIG.
  • the processing in the column signal processing unit 30 corresponds to, for example, analog-digital conversion that converts an analog image signal generated in the pixel 100 into a digital image signal.
  • the image signal processed by the column signal processing unit 30 is output as an image signal of the image sensor 1.
  • the control unit 40 controls the entire image sensor 1.
  • the control unit 40 controls the image sensor 1 by generating and outputting a control signal for controlling the vertical driving unit 20 and the column signal processing unit 30.
  • the control signal generated by the control unit 40 is transmitted to the vertical drive unit 20 and the column signal processing unit 30 through signal lines 41 and 42, respectively.
  • FIG. 2 is a diagram illustrating a configuration example of the pixel array unit according to the embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating a configuration example of the pixel array unit 10.
  • the pixel array unit 10 shown in FIG. 1 includes a pixel area 110 and a light shielding area 120.
  • the pixel area 110 is an area where the pixel 100 described in FIG. 1 is arranged.
  • Pixels 100a and 100b in the figure represent the pixels 100 arranged at the center and the periphery of the pixel region 110, respectively.
  • the light shielding area 120 is an area where the light shielding pixels 200 are arranged.
  • the light shielding pixel 200 is a pixel in which light from a subject is shielded, and is a pixel used for detecting a black level of an image signal.
  • a plurality of light-shielding pixels 200 are arranged around the pixel region 110 to form the light-shielding region 120.
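  • As a sketch of how the black level detected by such light-shielded (optically black) pixels is commonly used, the following subtracts a per-row black estimate from the raw signal; the layout assumed here (shielded columns at the left edge of the readout) is an illustrative simplification of the light shielding region 120 surrounding the pixel region 110.

```python
import numpy as np

def subtract_black_level(raw, n_ob_cols=16):
    """Subtract the black level estimated from optically black pixels.

    Assumes the first n_ob_cols columns of `raw` come from light-shielded
    pixels (an illustrative geometry, not that of FIG. 2).
    """
    black = raw[:, :n_ob_cols].mean(axis=1, keepdims=True)  # per-row black level
    return np.clip(raw - black, 0.0, None)  # remove the dark offset, clamp negatives

raw = np.random.poisson(100, size=(8, 64)).astype(float) + 64.0  # 64 = dark offset
corrected = subtract_black_level(raw)
```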
  • FIG. 3 is a cross-sectional view illustrating a configuration example of a pixel according to an embodiment of the present disclosure.
  • This figure is a cross-sectional view illustrating a configuration example of the pixels 100a and 100b and the light-shielding pixel 200, taken along the line A-A′ in FIG. 2.
  • this figure is a diagram illustrating a basic configuration of a pixel of the present disclosure.
  • The pixel 100 and the light-shielding pixel 200 in the figure each include a semiconductor substrate 121, a wiring region composed of a wiring layer 124 and an insulating layer 123, a support substrate 125, an insulating film 126, a light-shielding film 131, a color filter 141, a planarization film 161, and an on-chip lens 151.
  • the semiconductor substrate 121 is a semiconductor substrate on which the photoelectric conversion part of the pixel 100 and the semiconductor part of the pixel circuit are formed.
  • the photoelectric conversion unit 101 is described as an example.
  • the photoelectric conversion unit 101 is formed in a p-type well region formed in the semiconductor substrate 121.
  • the semiconductor substrate 121 constitutes a well region.
  • An n-type semiconductor region 122 is formed in the p-type well region.
  • a photodiode serving as the photoelectric conversion unit 101 is configured by a pn junction formed between the p-type well region and the n-type semiconductor region 122.
  • the wiring layer 124 is a wiring that connects elements formed on the semiconductor substrate 121.
  • the wiring layer 124 is also used for transmitting control signals and image signals for the pixels 100.
  • the wiring layer 124 can be composed of, for example, copper (Cu) or tungsten (W).
  • the insulating layer 123 insulates the wiring layer 124.
  • The insulating layer 123 can be made of, for example, silicon oxide (SiO2).
  • The wiring layer 124 and the insulating layer 123 constitute the wiring region. Note that the wiring region in the figure is arranged on the surface of the semiconductor substrate 121.
  • the support substrate 125 is a substrate that is disposed adjacent to the wiring region and supports the semiconductor substrate 121.
  • the support substrate 125 is a substrate that improves the strength when the image pickup device 1 is manufactured.
  • The insulating film 126 is a film that is disposed on the back surface of the semiconductor substrate 121 and insulates the semiconductor substrate 121. This insulating film 126 can be made of, for example, SiO2.
  • the color filter 141 is an optical filter that transmits light of a predetermined wavelength among incident light.
  • For example, color filters 141 that transmit red light, green light, and blue light can be used. In each pixel 100, any one of these three types of color filters 141 is disposed.
  • the color filter 141 is configured to have a different thickness depending on the wavelength of light to be transmitted.
  • the color filter 141 corresponding to green light is configured to be thicker than the color filter 141 corresponding to red light and blue light. This is due to the characteristics of the color filter 141 and restrictions on the manufacturing process.
  • the on-chip lens 151 is a lens that condenses incident light on the photoelectric conversion unit 101.
  • the on-chip lens 151 has a hemispherical shape and collects incident light through the color filter 141.
  • the on-chip lens 151 can be made of, for example, an acrylic resin.
  • The image pickup device 1 shown in FIG. 1 is a back-illuminated image pickup device in which the color filter 141 and the on-chip lens 151 are disposed on the back surface of the semiconductor substrate and which images incident light entering from the back surface of the semiconductor substrate 121.
  • the on-chip lens 151 is shifted in the direction of the center of the pixel region 110 with respect to the center of the photoelectric conversion unit 101 by pupil correction.
  • the planarizing film 161 is a film that planarizes the surface of the color filter 141. As described above, the surface of the color filter 141 has a different film thickness for each corresponding color. For this reason, the planarization film 161 is disposed, and the surface on which the on-chip lens 151 is formed is planarized.
  • the planarization film 161 can be made of the same material as the on-chip lens 151, for example. Specifically, when the on-chip lens 151 is formed, the surface of the color filter 141 can be flattened by thickly coating the surface of the color filter 141 with the material of the on-chip lens 151.
  • the light shielding film 131 is a film that shields incident light.
  • the light shielding film 131 is configured in different shapes in the pixel 100 and the light shielding pixel 200.
  • a light shielding film 131 having an opening 132 is disposed in the pixel 100.
  • Incident light transmitted through the on-chip lens 151 and the color filter 141 is irradiated to the photoelectric conversion unit 101 through the opening 132.
  • the light shielding film 131 disposed in the pixel 100 shields light incident obliquely from the adjacent pixel 100. Specifically, the light transmitted through the color filter 141 of the adjacent pixel 100 is prevented from entering the photoelectric conversion unit 101 of the pixel 100 itself. Thereby, the occurrence of crosstalk can be prevented.
  • the light shielding film 131 can be made of metal, for example.
  • In the light-shielding pixel 200, a light-shielding film 131 having no opening is disposed. For this reason, in the light-shielding pixel 200, all light from the subject is shielded.
  • the image signal generated by such a light shielding pixel 200 is a signal corresponding to the black level of the image signal generated by the pixel 100.
  • the light shielding film 131 and the color filter 141 can be formed simultaneously in the pixel region and the light shielding region.
  • In the pixel 100, the color filter 141 is disposed adjacent to the insulating film 126 in the opening 132. That is, the color filter 141 is embedded in the opening 132 of the light shielding film 131.
  • In the light-shielding pixel 200, on the other hand, the color filter 141 is laminated on the surface of the light shielding film 131. For this reason, the height from the semiconductor substrate 121 to the surface of the color filter 141 is greater in the light-shielding pixel 200 than in the pixel 100. Thus, a step is generated in the surface height of the color filter 141 between the pixel region 110 and the light shielding region 120.
  • When the planarization film 161 is formed on such a pixel array unit 10, the unevenness of the surface of the color filter 141 for each pixel 100 is planarized. On the other hand, the step between the pixel region 110 and the light shielding region 120 is not planarized, and the planarization film 161 is formed at different heights in the pixel region 110 and the light shielding region 120. Further, as shown in the figure, in the pixel 100b arranged in the vicinity of the light shielding region 120, the height of the planarization film 161 gradually decreases from the end of the pixel region 110 toward its center.
  • In the pixel 100b, therefore, the distance between the on-chip lens 151 and the photoelectric conversion unit 101 differs from that in the pixel 100a, and the condensing position of incident light by the on-chip lens 151 also differs from that in the pixel 100a.
  • FIG. 4 is a top view illustrating a configuration example of the pixel according to the embodiment of the present disclosure.
  • This figure is a top view showing a configuration example of the on-chip lens 151 and the light shielding film 131 described in FIG.
  • In the figure, a dotted rectangle represents the pixel 100, a solid-line rectangle represents the opening 132 of the light shielding film 131, and a dashed circle represents the on-chip lens 151.
  • “R”, “G”, and “B” in the figure represent the types of color filters 141 arranged in the pixels 100.
  • the pixel 100 in which “R”, “G”, and “B” are described represents the pixel 100 in which the color filters 141 corresponding to red light, green light, and blue light are arranged, respectively.
  • the pixels 100 on which the color filters 141 corresponding to red light, green light, and blue light are arranged are referred to as a red pixel 100, a green pixel 100, and a blue pixel 100, respectively.
  • An example in which the red pixels 100, the green pixels 100, and the blue pixels 100 are arranged in a Bayer array is shown in the figure.
  • The Bayer arrangement is a pixel arrangement method in which the green pixels 100 are arranged in a checkered pattern, and the red pixels 100 and the blue pixels 100 are arranged between the green pixels 100 (a minimal generator sketch follows this figure description).
  • In the figure, a represents the configuration of the pixel 100 (pixel 100a) at the center of the pixel region 110, and b represents the configuration of the pixel 100 (pixel 100b) at the peripheral portion of the pixel region 110.
  • the on-chip lens 151 is disposed at the center of the pixel 100a.
  • In the pixel 100b, pupil correction is performed, and the on-chip lens 151 is shifted toward the center of the pixel region 110 (to the left side in b of the figure).
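  • A minimal sketch of the Bayer arrangement described above, generating the color-filter label of each pixel site; the RGGB phase chosen here is one possible phase, an assumption for illustration.

```python
import numpy as np

def bayer_pattern(h, w):
    """Color labels of a Bayer array: green on a checkerboard, with red
    and blue filling the remaining sites (RGGB phase assumed)."""
    cfa = np.empty((h, w), dtype="<U1")
    cfa[0::2, 0::2] = "R"
    cfa[0::2, 1::2] = "G"
    cfa[1::2, 0::2] = "G"
    cfa[1::2, 1::2] = "B"
    return cfa

print(bayer_pattern(4, 4))  # rows alternate R G R G / G B G B
```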
  • FIG. 5 is a diagram illustrating an example of an image signal according to the embodiment of the present disclosure. This figure shows the relationship between the position of the pixel 100 in the pixel array unit 10 and the image signal.
  • In the figure, a, b, and c represent the image signals of the green pixel 100, the red pixel 100, and the blue pixel 100, respectively. The vertical axis represents the level of the image signal, the horizontal axis represents the position of the pixel 100 along the line A-A′, and the solid lines represent the graphs of the image signals.
  • The image signal of the pixels 100 arranged in the central portion of the pixel array unit 10 has a relatively high level, while the pixels 100 arranged in the peripheral portion of the pixel array unit 10 generate image signals of a lower level.
  • In the pixels 100 arranged in the central portion of the pixel array unit 10, light from the subject enters perpendicularly. On the other hand, light from the subject is incident obliquely on the pixels 100 arranged in the peripheral portion. The amount of light per area of the opening 132 described with reference to FIG. 3 is thereby reduced, so the incident light of the pixel 100 decreases and the level of the image signal is lowered in the pixels 100 arranged in the peripheral portion.
  • The dotted line in the figure is a graph showing the image signal based on this decrease in the amount of incident light. The phenomenon in which the level of the image signal of the pixels 100 arranged at the peripheral edge of the pixel array unit 10 decreases in this way is called shading.
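  • The classification above (H04N25/61) names the classical cos⁴ falloff as one source of this shading; as an illustration, the sketch below builds the corresponding correction gain map. The focal-length parameter is hypothetical, and a real correction would be calibrated rather than computed from this formula alone.

```python
import numpy as np

def cos4_gain_map(h, w, f_pix=2500.0):
    """Per-pixel gain undoing the cos^4 illumination falloff.

    f_pix: effective focal length of the taking lens in pixel units
    (a hypothetical value for this sketch).
    """
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    r = np.hypot(xx - (w - 1) / 2, yy - (h - 1) / 2)  # radius from array center
    theta = np.arctan(r / f_pix)                      # incidence angle per pixel
    return 1.0 / np.cos(theta) ** 4                   # inverse of the falloff

gain = cos4_gain_map(300, 400)  # 1.0 at the center, rising toward the edges
```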
  • Furthermore, the level of the image signal changes in the pixels 100 in the vicinity of the light shielding region. This is because, as described above, in the pixels 100 in the vicinity of the light shielding region, the optical path length changes as the film thickness of the planarization film 161 increases. The characteristics of the image signal also differ among the green pixel 100, the red pixel 100, and the blue pixel 100. Specifically, a relatively high level image signal is generated in the green pixel 100 in a of the figure. In the red pixel 100 in b of the figure, a still higher level image signal is generated (region 302). On the other hand, a relatively low level image signal is generated in the blue pixel 100 in c of the figure.
  • the refractive index when passing through the on-chip lens 151 differs depending on the wavelength of light.
  • blue light having a short wavelength has a high refractive index, and is thus focused on the photoelectric conversion unit 101 in the vicinity of the surface of the semiconductor substrate 121. For this reason, the blue pixel 100 is easily affected by the film thickness of the planarization film 161.
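  • As a rough model of why a thicker planarization film 161 moves the condensing position, and why blue light, focused closest to the substrate surface, is affected most, one can use the standard paraxial estimate for inserting a plane-parallel layer of thickness t and refractive index n into the converging beam; this is a textbook relation used here as a model, not a formula from the publication:

```latex
\Delta z = t\left(1 - \frac{1}{n}\right)
```

  • Each additional increment of planarization-film thickness thus pushes the focus deeper by the fraction (1 - 1/n) of that increment, so the pixels 100b under the thickened film near the light shielding region 120 see a shifted condensing position.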
  • FIG. 6 is a cross-sectional view illustrating a configuration example of the imaging element according to the first embodiment of the present disclosure.
  • This figure is a cross-sectional view illustrating a configuration example of the pixel 100a and the pixel 100b.
  • The pixel 100a is described as a comparative example and is the same as the pixel 100a described in FIG. 3.
  • The pixel 100b in the figure differs from the pixel 100a in the shape of the light shielding film 131. Specifically, the size of the opening 132 of its light shielding film 131 differs from that of the opening 132 of the pixel 100a.
  • the openings 132b and 132c in the same figure correspond to the openings of the light shielding film 131 disposed in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 7 is a top view illustrating a configuration example of the pixel according to the first embodiment of the present disclosure.
  • This figure shows the configuration of the on-chip lenses 151 and the light shielding film 131 of the pixels 100b, similarly to b in FIG. 4.
  • Openings 132a, 132b, and 132c in the figure represent the openings of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • These openings 132a, 132b, and 132c are configured to be larger than the opening 132 described in FIG. 4. Enlarging an opening increases the amount of incident light.
  • Large openings 132 are arranged in the order of the blue pixel 100, the green pixel 100, and the red pixel 100.
  • the incident light amount of the green pixel 100 and the red pixel 100 with relatively high sensitivity is decreased, and the incident light amount of the blue pixel 100 with relatively low sensitivity is increased.
  • the incident light quantity of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of an image signal can be corrected.
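  • A sketch of the per-color sizing logic just described: if the incident light amount is taken as roughly proportional to the opening area, a relative opening scale per color can be derived from measured signal levels near the light shielding region. The function and all numbers below are hypothetical.

```python
def opening_area_scale(rel_level, target=1.0):
    """Relative opening-area scale per color near the light shielding
    region, assuming incident light is proportional to opening area."""
    return {color: target / level for color, level in rel_level.items()}

# Hypothetical levels near the light shielding region: red and green run
# high, blue runs low (cf. FIG. 5), so the blue opening is enlarged the
# most, then green, then red, matching the ordering described above.
print(opening_area_scale({"R": 1.12, "G": 1.05, "B": 0.85}))
```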
  • FIGS. 8 and 9 are diagrams illustrating an example of a method for manufacturing the imaging element according to the first embodiment of the present disclosure.
  • These figures show the manufacturing process of the imaging device 1, specifically the manufacturing processes of the pixels 100a and 100b and the light-shielding pixel 200 described in FIG. 6.
  • the n-type semiconductor region 122 is formed in the p-type well region formed in the semiconductor substrate 121, and the photoelectric conversion part 101 is formed (a in FIG. 8).
  • This step is an example of the photoelectric conversion part forming step described in the claims.
  • a wiring region (not shown) is formed on the semiconductor substrate 121, and a support substrate 125 (not shown) is bonded.
  • Next, an insulating film 126 is formed on the back surface of the semiconductor substrate 121. This can be performed, for example, by depositing the material of the insulating film 126, such as SiO2, using CVD (Chemical Vapor Deposition) (b in FIG. 8).
  • a metal film 401 and a resist 402 as materials for the light shielding film 131 are sequentially stacked on the surface of the insulating film 126.
  • Next, openings 403, 403a (not shown), 403b, and 403c are formed in the resist 402. These openings are formed with sizes and at positions corresponding to the openings 132, 132a, 132b, and 132c described in FIGS. 6 and 7 (c in FIG. 8).
  • the metal film 401 is etched using the resist 402 as a mask. This can be performed, for example, by dry etching.
  • the light shielding film 131 including the openings 132, 132a (not shown), 132b, and 132c can be formed (d in FIG. 8). Thereby, the pixels 100a and 100b and the light shielding film 131 of the light shielding pixel 200 can be formed simultaneously. Further, by changing the opening of the light shielding film 131 of the pixel 100b to 132a, 132b, and 132c, the amount of light incident on the photoelectric conversion unit 101 of the pixel 100b can be adjusted.
  • This step is an example of the first light shielding film forming step and the second light shielding film forming step described in the claims. This step is also an example of the step of adjusting the amount of incident light described in the claims.
  • Next, the color filter 141 is formed on the surfaces of the insulating film 126 and the light shielding film 131. This is performed for each type of color filter 141. For example, a resin that is the material of the color filter 141 corresponding to green is applied, openings are formed in the regions where the color filters 141 of the red pixels 100 and the blue pixels 100 are to be disposed, and the resin is cured. Next, these openings are filled with resins that are the materials of the color filters 141 corresponding to red and blue (e in FIG. 9). This step is an example of the color filter forming step described in the claims.
  • Next, a resin 404 that is the material of the on-chip lens is applied to the surface of the color filter 141. Thereby, the surface of the color filter 141 is planarized. The vicinity of the surface of the resin 404 later constitutes the on-chip lens 151, and the resin 404 in the region adjacent to the color filter 141 forms the planarization film 161 (f in FIG. 9). This step is an example of the planarization film forming step described in the claims.
  • the surface of the resin 404 is processed into a hemispherical shape to form an on-chip lens 151.
  • This can be performed, for example, by placing a resist having the same shape as the on-chip lens 151 on the surface of the resin 404 and performing dry etching to transfer the shape of the resist to the resin 404 (g in FIG. 9).
  • This step is an example of an on-chip lens forming step described in the claims.
  • the pixels 100a and 100b in the pixel region 110 can be formed, and the light-shielding pixel 200 in the light-shielding region can be formed.
  • As described above, the imaging device 1 changes the shape of the light shielding film 131 of the pixels 100 arranged in the vicinity of the light shielding region for each pixel 100, according to the type of the color filter 141 of the pixel 100. Thereby, the amount of light incident on the pixels 100 arranged in the vicinity of the light shielding region can be adjusted, and the change in sensitivity can be corrected.
  • Second Embodiment: In the image pickup device 1 of the first embodiment described above, the shape of the opening 132 of the light shielding film 131 of the pixel 100b is changed with respect to the pixel 100a. On the other hand, the imaging device 1 according to the second embodiment of the present disclosure differs from the first embodiment in that the shape of the on-chip lens is changed.
  • FIG. 10 is a cross-sectional view illustrating a configuration example of an imaging element according to the second embodiment of the present disclosure.
  • the pixel 100b in the figure is different from the pixel 100b described in FIG. 6 in that on-chip lenses 152b and 152c are provided instead of the on-chip lens 151.
  • the on-chip lenses 152b and 152c are on-chip lenses configured in a shape different from the on-chip lens 151. As will be described later, the on-chip lenses 152b and 152c are configured to have a smaller curvature than the on-chip lens 151, and correspond to the on-chip lenses disposed in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 11 is a top view illustrating a configuration example of a pixel according to the second embodiment of the present disclosure.
  • the on-chip lenses 152a, 152b, and 152c in the figure represent the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • The on-chip lenses 152a, 152b, and 152c are configured to have a smaller curvature than the on-chip lens 151. Reducing the curvature lengthens the condensing distance, which can be adjusted to match the increase in the thickness of the planarization film 161. Further, as shown in the figure, the on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100 are configured with a larger shape in this order, that is, with a smaller curvature in this order.
  • Thereby, the condensing position of blue light in the blue pixel 100 can be set to the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121.
  • The condensing position in the red pixel 100 and the green pixel 100, whose on-chip lenses are configured with a relatively large curvature, can be set near the end of the photoelectric conversion unit 101. Thereby, the sensitivity of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
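  • Both the curvature adjustment of this embodiment and the refractive-index adjustment of the third embodiment below can be read off the standard thin plano-convex lens estimate, where R is the radius of curvature of the on-chip lens and n its refractive index; this is a textbook relation used here as a model, not a formula from the publication:

```latex
f \approx \frac{R}{n - 1}
```

  • Reducing the curvature (increasing R) or reducing n lengthens the condensing distance f, which pushes the focus back toward the photoelectric conversion unit 101 when the planarization film 161 thickens.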
  • the imaging device 1 adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light-shielding region by changing the shape of the on-chip lens. Thereby, a change in sensitivity of the pixel 100 arranged in the vicinity of the light shielding region can be corrected.
  • the shape of the on-chip lens of the pixel 100b is changed with respect to the pixel 100a.
  • the imaging device 1 according to the third embodiment of the present disclosure is different from the above-described second embodiment in that the refractive index of the on-chip lens is changed.
  • FIG. 12 is a cross-sectional view illustrating a configuration example of an imaging element according to the third embodiment of the present disclosure.
  • The pixel 100b in the figure differs from the pixel 100b described in FIG. 10 in that on-chip lenses 153b and 153c are provided instead of the on-chip lenses 152b and 152c.
  • The on-chip lenses 153b and 153c are on-chip lenses configured to have a refractive index different from that of the on-chip lens 151. As will be described later, the on-chip lenses 153b and 153c are configured to have a refractive index smaller than that of the on-chip lens 151, and correspond to the on-chip lenses disposed in the green pixel 100 and the blue pixel 100, respectively.
  • FIG. 13 is a top view illustrating a configuration example of a pixel according to the third embodiment of the present disclosure.
  • On-chip lenses 153a, 153b, and 153c in the figure represent on-chip lenses of the red pixel 100, the green pixel 100, and the blue pixel 100, respectively.
  • the on-chip lenses 153a, 153b, and 153c are configured to have a smaller refractive index than the on-chip lens 151.
  • the on-chip lenses 153a, 153b, and 153c are configured to have a small refractive index in this order.
  • Thereby, the condensing distance can be increased, and the influence of the change in the condensing position caused by the increase in the thickness of the planarization film 161 can be reduced.
  • By making the refractive index of the on-chip lens 153c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the condensing position of blue light in the blue pixel 100 can be shifted to the photoelectric conversion unit 101 near the surface of the semiconductor substrate 121. Thereby, the incident light quantities of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
  • the on-chip lenses 153a, 153b, and 153c can be made of a material different from that of the on-chip lens 151.
  • the on-chip lenses 153a, 153b, and 153c can also be configured as on-chip lenses having different refractive indexes by changing materials.
  • the on-chip lenses 153a, 153b, and 153c can be formed, for example, by performing the on-chip lens forming process described in FIG. 9g after forming the planarizing film 161, respectively.
  • As described above, the imaging device 1 adjusts the amount of light incident on the pixels 100 arranged in the vicinity of the light shielding region by changing the refractive index of the on-chip lens 151. Thereby, the change in sensitivity of the pixels 100 arranged in the vicinity of the light shielding region can be corrected.
  • the shape of the opening 132 of the light shielding film 131 of the pixel 100b is changed with respect to the pixel 100a.
  • the imaging device 1 according to the fourth embodiment of the present disclosure is different from the above-described first embodiment in that the shape of the color filter 141 is changed.
  • FIG. 14 is a cross-sectional view illustrating a configuration example of an imaging element according to the fourth embodiment of the present disclosure.
  • the pixel 100b in the figure is different from the pixel 100b described in FIG. 6 in that color filters 142b and 142c are provided instead of the color filter 141.
  • The color filters 142b and 142c are color filters configured in a shape different from that of the color filter 141. Specifically, the color filters 142b and 142c are formed with a film thickness smaller than that of the color filter 141, and correspond to the color filters arranged in the green pixel 100 and the blue pixel 100, respectively. Although not shown, a color filter 142a having a thickness smaller than that of the color filter 141 is also disposed in the red pixel 100. The color filters 142a, 142b, and 142c are configured with a smaller film thickness in this order.
  • Thereby, the incident light quantity of the pixel 100b can be increased, and the influence of the change in the incident light quantity accompanying the increase in the film thickness of the planarization film 161 can be reduced.
  • By making the film thickness of the color filter 142c of the blue pixel 100 smaller than those of the red pixel 100 and the green pixel 100, the incident light quantity of the blue pixel 100 can be increased. Thereby, the incident light quantities of the red pixel 100, the green pixel 100, and the blue pixel 100 can be adjusted, and the level of the image signal can be corrected.
  • The color filters 142a, 142b, and 142c have rounded corners. This prevents so-called vignetting, in which light incident on the adjacent pixel 100b is blocked.
  • the imaging device 1 adjusts the amount of incident light on the pixels 100 arranged in the vicinity of the light shielding region by changing the shape of the color filter 142. Thereby, a change in sensitivity of the pixel 100 arranged in the vicinity of the light shielding region can be corrected.
  • the technology according to the present disclosure can be applied to various products.
  • the present technology may be realized as an imaging element mounted on an imaging device such as a camera.
  • FIG. 15 is a block diagram illustrating a schematic configuration example of a camera that is an example of an imaging apparatus to which the present technology can be applied.
  • The camera 1000 shown in FIG. 15 includes a lens 1001, an image sensor 1002, an imaging control unit 1003, a lens driving unit 1004, an image processing unit 1005, an operation input unit 1006, a frame memory 1007, a display unit 1008, and a recording unit 1009.
  • the lens 1001 is a photographing lens of the camera 1000.
  • the lens 1001 collects light from the subject and makes it incident on an image sensor 1002 described later to form an image of the subject.
  • the imaging element 1002 is a semiconductor element that images light from the subject condensed by the lens 1001.
  • the image sensor 1002 generates an analog image signal corresponding to the irradiated light, converts it into a digital image signal, and outputs it.
  • the imaging control unit 1003 controls imaging in the imaging element 1002.
  • the imaging control unit 1003 controls the imaging element 1002 by generating a control signal and outputting the control signal to the imaging element 1002.
  • the imaging control unit 1003 can perform autofocus in the camera 1000 based on the image signal output from the imaging element 1002.
  • the autofocus is a system that detects the focal position of the lens 1001 and automatically adjusts it.
  • As such a method, a method (image plane phase difference autofocus) in which the focal position is detected from an image plane phase difference detected by phase difference pixels arranged in the image sensor 1002, or a method (contrast autofocus) in which the position where the contrast of the image is highest is detected as the focal position, can be applied.
  • the imaging control unit 1003 adjusts the position of the lens 1001 via the lens driving unit 1004 based on the detected focal position, and performs autofocus.
  • the imaging control unit 1003 can be configured by, for example, a DSP (Digital Signal Processor) equipped with firmware.
  • the lens driving unit 1004 drives the lens 1001 based on the control of the imaging control unit 1003.
  • the lens driving unit 1004 can drive the lens 1001 by changing the position of the lens 1001 using a built-in motor.
  • The image processing unit 1005 processes the image signal generated by the image sensor 1002. This processing includes, for example, demosaicing, which generates for each pixel the image signals of the missing colors among the image signals corresponding to red, green, and blue; noise reduction, which removes noise from the image signal; and encoding of the image signal.
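  • As an illustration of the demosaicing step mentioned above, the following bilinear sketch fills in the missing colors of a Bayer mosaic; it is an illustrative stand-in, not the actual algorithm of the image processing unit 1005.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(raw, cfa):
    """Bilinear demosaic: for each color plane, keep the measured samples
    and fill the remaining sites with a weighted average of measured
    neighbors. `cfa` holds per-pixel labels "R"/"G"/"B" of the mosaic.
    """
    k = np.array([[0.25, 0.5, 0.25],
                  [0.5,  1.0, 0.5],
                  [0.25, 0.5, 0.25]])
    out = np.zeros(raw.shape + (3,))
    for i, color in enumerate("RGB"):
        mask = (cfa == color).astype(float)
        interp = convolve(raw * mask, k, mode="mirror") / convolve(mask, k, mode="mirror")
        out[..., i] = np.where(mask > 0, raw, interp)  # keep measured samples
    return out
```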
  • the image processing unit 1005 can be configured by, for example, a microcomputer equipped with firmware.
  • the operation input unit 1006 receives an operation input from the user of the camera 1000.
  • As the operation input unit 1006, for example, a push button or a touch panel can be used.
  • the operation input received by the operation input unit 1006 is transmitted to the imaging control unit 1003 and the image processing unit 1005. Thereafter, processing according to the operation input, for example, processing such as imaging of a subject is started.
  • the frame memory 1007 is a memory for storing frames that are image signals for one screen.
  • the frame memory 1007 is controlled by the image processing unit 1005 and holds a frame in the course of image processing.
  • the display unit 1008 displays the image processed by the image processing unit 1005.
  • a liquid crystal panel can be used for the display unit 1008.
  • the recording unit 1009 records the image processed by the image processing unit 1005.
  • For the recording unit 1009, for example, a memory card or a hard disk can be used.
  • The camera to which the present technology can be applied has been described above.
  • the present technology can be applied to the image sensor 1002 among the configurations described above.
  • the image sensor 1 described in FIG. 1 can be applied to the image sensor 1002.
  • a change in sensitivity of the pixel 100 in the vicinity of the light-shielding region can be corrected, and deterioration in image quality of an image generated by the camera 1000 can be prevented.
  • FIG. 16 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure can be applied.
  • FIG. 16 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
  • The endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as an insufflation tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • In the illustrated example, an endoscope 11100 configured as a so-called rigid endoscope having a rigid lens barrel 11101 is shown, but the endoscope 11100 may be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is irradiated toward the observation target in the body cavity of the patient 11132 through the objective lens.
  • The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) by the endoscope 11100.
  • the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
  • the pneumoperitoneum device 11206 passes gas into the body cavity via the insufflation tube 11111.
  • the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
  • the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies the irradiation light when the surgical site is imaged to the endoscope 11100 can be configured by, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
  • In a case where a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image.
  • The driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining those images, an image with a high dynamic range free from so-called blackout and overexposure can be generated.
  • the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed, in which the wavelength dependence of light absorption in body tissue is utilized and light in a band narrower than that of the irradiation light during normal observation (that is, white light) is irradiated, whereby predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast.
  • fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
  • In fluorescence observation, the body tissue is irradiated with excitation light and fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally administered to the body tissue and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
  • FIG. 17 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 includes an imaging element.
  • The imaging unit 11402 may include one image sensor (so-called single-plate type) or a plurality of image sensors (so-called multi-plate type). In the case of the multi-plate type, for example, image signals corresponding to RGB may be generated by the respective image sensors, and a color image may be obtained by combining them.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (Dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
  • the imaging unit 11402 is not necessarily provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • the control signal includes, for example, information for designating the frame rate of the captured image, information for designating the exposure value at the time of imaging, and / or information for designating the magnification and focus of the captured image. Contains information about the condition.
  • The imaging conditions such as the frame rate, exposure value, magnification, and focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
  • the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
  • the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a picked-up image showing the surgical part or the like based on the image signal subjected to the image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
  • For example, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and the like by detecting the shapes, colors, and the like of the edges of objects included in the captured image.
  • the control unit 11413 may display various types of surgery support information superimposed on the image of the surgical unit using the recognition result. Surgery support information is displayed in a superimposed manner and presented to the operator 11131, thereby reducing the burden on the operator 11131 and allowing the operator 11131 to proceed with surgery reliably.
  • the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
  • In the illustrated example, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to the imaging unit 11402 among the configurations described above.
  • specifically, the imaging element 1 in FIG. 1 can be applied to the imaging unit 11402.
  • by applying the technology according to the present disclosure, a high-quality image of the surgical site can be obtained, so that the surgeon can check the surgical site reliably.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, electric vehicle, hybrid electric vehicle, motorcycle, bicycle, personal mobility device, airplane, drone, ship, or robot.
  • FIG. 18 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • as functional configurations of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • for example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives the input of these radio waves or signals and controls the door lock device, power window device, lamps, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
  • the driver state detection unit 12041 includes, for example, a camera that images the driver. Based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the degree of fatigue or concentration of the driver, or may determine whether the driver is dozing off; a toy metric for such a determination is sketched below.
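  • a minimal sketch, assuming a per-frame eye-openness ratio has already been extracted from the driver camera; the PERCLOS-style closure threshold of 0.2 is a hypothetical choice, not a value from this disclosure.

      def fatigue_score(eye_open_ratios: list[float],
                        closed_thresh: float = 0.2) -> float:
          """Toy PERCLOS-style metric: the fraction of recent frames in which
          the eyes were mostly closed; higher values suggest drowsiness."""
          if not eye_open_ratios:
              return 0.0
          closed = sum(1 for r in eye_open_ratios if r < closed_thresh)
          return closed / len(eye_open_ratios)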
  • the microcomputer 12051 can calculate control target values for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation of the vehicle, following traveling based on the inter-vehicle distance, vehicle-speed-maintaining traveling, vehicle collision warning, vehicle lane departure warning, and the like.
  • the microcomputer 12051 can also perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
  • for example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as switching from high beam to low beam, by controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030; a toy rule of this kind is sketched below.
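  • a minimal sketch of such a rule; the 100 m high-beam range is a hypothetical parameter, not a value from this disclosure.

      def headlamp_mode(oncoming_dist_m: float, preceding_dist_m: float,
                        beam_range_m: float = 100.0) -> str:
          """Toy anti-glare rule: drop to low beam whenever another vehicle
          is within the assumed high-beam range."""
          if min(oncoming_dist_m, preceding_dist_m) < beam_range_m:
              return "low_beam"
          return "high_beam"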
  • the audio image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly conveying information to the occupants of the vehicle or to the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 19 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper portion of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 19 shows an example of the imaging ranges of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door.
  • for example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained; a sketch of the per-camera warping step appears below.
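  • a minimal sketch of one building block of such an overhead view, assuming OpenCV is available; src_pts are four ground-plane points taken from an assumed camera calibration, and the output size is an arbitrary illustrative choice.

      import cv2
      import numpy as np

      def birds_eye_patch(frame: np.ndarray, src_pts,
                          out_size: tuple = (400, 600)) -> np.ndarray:
          """Warp one camera view onto a top-down patch; an overhead image is
          then composed by blending the warped patches of all four cameras."""
          w, h = out_size
          dst_pts = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
          M = cv2.getPerspectiveTransform(np.float32(src_pts), dst_pts)
          return cv2.warpPerspective(frame, M, out_size)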
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
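  • as a purely illustrative aid for the stereo-camera case, the standard pinhole disparity-to-depth relation is sketched below; the focal length and baseline are hypothetical calibration values, not values from this disclosure.

      def disparity_to_depth(disparity_px: float, focal_px: float,
                             baseline_m: float) -> float:
          """Pinhole stereo relation: depth = f * B / d."""
          if disparity_px <= 0:
              return float("inf")   # no measurable disparity: treat as far away
          return focal_px * baseline_m / disparity_px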
  • based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 determines the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in this distance (the relative speed with respect to the vehicle 12100). From these, the microcomputer 12051 can extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
  • furthermore, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like; a toy spacing controller is sketched below.
  • in this way, cooperative control for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
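  • a minimal sketch of such follow-up control; the time gap, minimum gap, gains, and acceleration limit are hypothetical tuning values, not values from this disclosure.

      def follow_accel(distance_m: float, rel_speed_mps: float,
                       ego_speed_mps: float, time_gap_s: float = 1.8,
                       min_gap_m: float = 5.0, kp: float = 0.4,
                       kv: float = 0.8, accel_limit: float = 2.0) -> float:
          """Toy spacing controller: command an acceleration (m/s^2) toward a
          desired gap; rel_speed_mps is preceding-vehicle speed minus ego
          speed (positive when the gap is opening)."""
          desired_gap = min_gap_m + time_gap_s * ego_speed_mps
          accel = kp * (distance_m - desired_gap) + kv * rel_speed_mps
          return max(-accel_limit, min(accel_limit, accel))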
  • for example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
  • the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • the microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle. In a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can perform driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010; a toy time-to-collision check of this kind is sketched below.
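  • a minimal sketch, using time-to-collision (TTC) as a stand-in for the collision-risk value; the 2.5 s warning threshold is a hypothetical choice, not a value from this disclosure.

      def collision_warning(distance_m: float, closing_speed_mps: float,
                            ttc_warn_s: float = 2.5) -> tuple:
          """Toy collision-risk check: compute TTC = distance / closing speed
          and warn when it drops below the set value."""
          if closing_speed_mps <= 0:
              return float("inf"), False   # gap opening: no collision course
          ttc = distance_m / closing_speed_mps
          return ttc, ttc < ttc_warn_s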
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured images of the imaging units 12101 to 12104.
  • such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian; a stock detector of this kind is sketched below.
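  • a minimal sketch, assuming OpenCV is available: its stock HOG + linear-SVM people detector serves as a stand-in for the feature-extraction and pattern-matching procedure described above (it is not the procedure of this disclosure).

      import cv2

      hog = cv2.HOGDescriptor()
      hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

      def detect_pedestrians(frame):
          """Feature extraction plus pattern matching via HOG + SVM;
          expects an 8-bit image."""
          rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
          return rects   # (x, y, w, h) boxes around pedestrian candidates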
  • when the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
  • the audio image output unit 12052 may also control the display unit 12062 so as to display an icon or the like indicating a pedestrian at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
  • specifically, the imaging element 1 in FIG. 1 can be applied to the imaging unit 12031.
  • the present technology may also be configured as follows.
  • an imaging element including: a photoelectric conversion unit that performs photoelectric conversion based on incident light; a light-shielding film in which an opening is disposed, the film transmitting the incident light at the opening while blocking the incident light elsewhere; a color filter that transmits a predetermined wavelength of the incident light; a planarizing film that planarizes the surface of the color filter; and an on-chip lens that is disposed adjacent to the planarizing film and condenses the incident light on the photoelectric conversion unit via the color filter and the opening of the light-shielding film.
  • in the manufacturing method of the imaging element, the step of forming the light-shielding pixels is a step of forming the light-shielding pixels around a pixel region in which the plurality of pixels to be formed are arranged.
  • the manufacturing method of the imaging element further includes a step of adjusting the amount of incident light to the photoelectric conversion unit in the pixels in the vicinity of the light-shielding region, which is the region where the light-shielding pixels are formed; a toy digital analogue of this adjustment is sketched below.
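  • a purely illustrative sketch: the patent adjusts the amount of incident light optically, but a digital counterpart (a small extra gain on pixels bordering the light-shielded, optical-black area) conveys the idea. The border width, gain, and the assumption that the shielded area abuts the right edge are hypothetical; integer raw data (e.g., uint16) is assumed.

      import numpy as np

      def compensate_border_gain(raw: np.ndarray, border_px: int = 8,
                                 gain: float = 1.05) -> np.ndarray:
          """Apply a small extra gain to the pixel columns that border the
          assumed optical-black region on the right edge of the frame."""
          out = raw.astype(np.float32)
          out[:, -border_px:] *= gain
          return np.clip(out, 0, np.iinfo(raw.dtype).max).astype(raw.dtype)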

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Electromagnetism (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

The purpose of the present invention is to reduce changes in the sensitivity of pixels arranged in the periphery of a pixel region. An imaging element comprises a pixel region and a light-shielding region. In the pixel region, pixels are arranged that comprise: a photoelectric conversion unit for performing photoelectric conversion based on incident light; a light-shielding film having an opening, the film transmitting the incident light at the opening while blocking the incident light elsewhere; a color filter transmitting a predetermined wavelength of the incident light; a planarizing film for planarizing a surface of the color filter; and an on-chip lens that is disposed adjacent to the planarizing film and condenses the incident light on the photoelectric conversion unit via the color filter and the opening of the light-shielding film. In the light-shielding region, light-shielding pixels, which are pixels provided with a light-shielding film having no opening, are arranged. The light-shielding region is adjacent to the pixel region. The amount of light entering the photoelectric conversion unit in the pixels adjacent to the light-shielding region is adjusted.
PCT/JP2019/009673 2018-04-26 2019-03-11 Élément de capture d'image et procédé de fabrication d'élément de capture d'image WO2019207978A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-084688 2018-04-26
JP2018084688A JP2019192802A (ja) 2018-04-26 2018-04-26 撮像素子および撮像素子の製造方法

Publications (1)

Publication Number Publication Date
WO2019207978A1 true WO2019207978A1 (fr) 2019-10-31

Family

ID=68293847

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/009673 WO2019207978A1 (fr) 2018-04-26 2019-03-11 Élément de capture d'image et procédé de fabrication d'élément de capture d'image

Country Status (2)

Country Link
JP (1) JP2019192802A (fr)
WO (1) WO2019207978A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022118657A1 (fr) * 2020-12-04 2022-06-09 ソニーセミコンダクタソリューションズ株式会社 Élément d'imagerie à semi-conducteur et appareil électronique
WO2023068172A1 (fr) * 2021-10-20 2023-04-27 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'imagerie

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021175048A (ja) * 2020-04-22 2021-11-01 ソニーセミコンダクタソリューションズ株式会社 電子機器
WO2024048488A1 (fr) * 2022-08-31 2024-03-07 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'imagerie à semi-conducteurs

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07176708A (ja) * 1993-12-21 1995-07-14 Matsushita Electron Corp 固体撮像装置
JP2009164247A (ja) * 2007-12-28 2009-07-23 Sony Corp 固体撮像装置とその製造方法、カメラ及び電子機器
JP2010199668A (ja) * 2009-02-23 2010-09-09 Sony Corp 固体撮像装置および電子機器
JP2011176325A (ja) * 2011-03-22 2011-09-08 Sony Corp 固体撮像装置及び電子機器
JP2012064924A (ja) * 2010-08-17 2012-03-29 Canon Inc マイクロレンズアレイの製造方法、固体撮像装置の製造方法および固体撮像装置
JP2012124377A (ja) * 2010-12-09 2012-06-28 Sony Corp 固体撮像装置とその製造方法、及び電子機器
JP2017059589A (ja) * 2015-09-14 2017-03-23 キヤノン株式会社 固体撮像素子及び撮像装置

Also Published As

Publication number Publication date
JP2019192802A (ja) 2019-10-31

Similar Documents

Publication Publication Date Title
WO2018043654A1 (fr) Dispositif d'imagerie à semi-conducteurs et son procédé de fabrication, et appareil électronique
JP6951866B2 (ja) 撮像素子
WO2018139278A1 (fr) Élément de capture d'image, procédé de fabrication, et dispositif électronique
WO2019207978A1 (fr) Élément de capture d'image et procédé de fabrication d'élément de capture d'image
US20230008784A1 (en) Solid-state imaging device and electronic device
JP2018200980A (ja) 撮像装置および固体撮像素子、並びに電子機器
US11284046B2 (en) Imaging element and imaging device for controlling polarization of incident light
US11889206B2 (en) Solid-state imaging device and electronic equipment
WO2020137203A1 (fr) Élément d'imagerie et dispositif d'imagerie
US20240030250A1 (en) Solid-state imaging device and electronic apparatus
US20230103730A1 (en) Solid-state imaging device
US20240006443A1 (en) Solid-state imaging device, imaging device, and electronic apparatus
WO2022091576A1 (fr) Dispositif d'imagerie à semi-conducteurs et appareil électronique
WO2022009674A1 (fr) Boîtier de semi-conducteur et procédé de production de boîtier de semi-conducteur
WO2020195180A1 (fr) Élément d'imagerie et dispositif d'imagerie
WO2019176302A1 (fr) Élément d'imagerie et procédé de fabrication d'élément d'imagerie
WO2024095832A1 (fr) Photodétecteur, appareil électronique et élément optique
WO2023195316A1 (fr) Dispositif de détection de lumière
WO2023162496A1 (fr) Dispositif d'imagerie
WO2023013156A1 (fr) Élément d'imagerie et dispositif électronique
WO2023195315A1 (fr) Dispositif de détection de lumière
WO2023021787A1 (fr) Dispositif de détection optique et son procédé de fabrication
WO2024029408A1 (fr) Dispositif d'imagerie
WO2023127498A1 (fr) Dispositif de détection de lumière et instrument électronique
WO2023058326A1 (fr) Dispositif d'imagerie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19793021

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19793021

Country of ref document: EP

Kind code of ref document: A1