US7531781B2: Imager with image-taking portions optimized to detect separated wavelength components

Info

Publication number
US7531781B2
Authority
US
United States
Prior art keywords
wavelength
image
light
wavelength component
detected
Prior art date
Legal status
Expired - Fee Related
Application number
US11/358,885
Other languages
English (en)
Other versions
US20070051876A1 (en)
Inventor
Hirofumi Sumi
Atsushi Toda
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp
Assigned to Sony Corporation (assignment of assignors' interest). Assignors: Toda, Atsushi; Sumi, Hirofumi
Publication of US20070051876A1
Application granted
Publication of US7531781B2


Classifications

    • H01L27/14601: Imager structures; structural or functional details thereof
    • H01L27/14621: Coatings; colour filter arrangements
    • H01L27/14625: Optical elements or arrangements associated with the device
    • H01L27/14645: Photodiode arrays; MOS imagers; colour imagers
    • H01L27/14649: Photodiode arrays; MOS imagers; infrared imagers
    • H01L27/14685: Manufacture or treatment; process for coatings or optical elements
    • G02B5/282: Interference filters designed for the infrared light, reflecting for infrared and transparent for visible light, e.g. heat reflectors, laser protection
    • G02B5/285: Interference filters comprising deposited thin solid films
    • H04N23/11: Cameras or camera modules comprising electronic image sensors, for generating image signals from visible and infrared light wavelengths
    • H04N23/21: Cameras or camera modules comprising electronic image sensors, for generating image signals from near infrared [NIR] radiation only
    • H04N23/45: Cameras or camera modules comprising electronic image sensors, for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/84: Camera processing pipelines; components thereof, for processing colour signals
    • H04N25/17: Colour separation based on photon absorption depth, e.g. full colour resolution obtained simultaneously at each pixel location
    • H04N5/33: Transforming infrared radiation

Definitions

  • the present invention contains subject matter related to Japanese Patent Application JP 2005-050212 filed in the Japanese Patent Office on Feb. 25, 2005, the entire contents of which are incorporated herein by reference.
  • the present invention relates to an imager.
  • the imager includes a chip type, a package type, or a module type imaging device and a camera.
  • the present invention relates to a signal acquisition technique suitably applied, for example, to a solid-state imaging device using a semiconductor device capable of detecting and reading a physical value distribution in the form of electrical signals.
  • In such a device, unit constituent elements which have sensitivity to an electromagnetic wave, such as light or radioactive rays, input from the outside are arranged, and the physical value distribution is converted into electrical signals by these unit constituent elements.
  • the present invention relates to the structure in which images of respective wavelength components are independently obtained, that is, for example, an image of a visible light component and an image of a wavelength component (such as infrared light) other than visible light are independently obtained.
  • a semiconductor device for detecting a physical value distribution has been used in various fields, the semiconductor device having a plurality of unit constituent elements (such as pixels) arranged in lines or in a matrix, each of which has sensitivity to the change in physical value including electromagnetic waves, such as light or radioactive rays, input from the outside.
  • Examples of such devices include CCD (charge-coupled device), MOS (metal-oxide semiconductor), and CMOS (complementary metal-oxide semiconductor) devices.
  • the devices read a physical value distribution in the form of electrical signals which are obtained through conversion of the above distribution by unit constituent elements (pixels in a solid-state imaging device).
  • photodiodes used as a photoelectric transducer (light-receiving element; photosensor) provided in an image-taking portion (pixel portion) of a device detect an electromagnetic wave, such as light or radioactive rays, input from the outside so as to generate and store signal charges, and the signal charges (photoelectrons) thus stored are read as image information.
  • The structure described in Japanese Unexamined Patent Application Publication No. 2004-103964 is a single-plate system in which a visible image and an infrared image are separately obtained by exploiting the difference in position of light absorption in the depth direction of a semiconductor between wavelengths.
  • the structure described in Japanese Unexamined Patent Application Publication Nos. 10-210486, 2002-369049, and 06-121325 is a multiple plate system in which visible light and infrared light are separately received by respective imaging devices using a wavelength separation optical system such as a wavelength separation mirror or prism as an incident optical system.
  • In the structure described in Japanese Unexamined Patent Application Publication No. 10-210486, infrared light emitted from an infrared irradiation device is irradiated onto an object, and at the same time, in an image-taking portion, an infrared component of light from the object is reflected by a cold mirror and is made incident on one sensor.
  • a visible light component passing through the cold mirror is further separated into a red color component, a green color component, and a blue color component by three dichroic mirrors, and the above components are then made incident on respective sensors.
  • the visible light side has the structure similar to that of a related three-plate system, that is, wavelength separation is performed in a wavelength region including R, G, and B colors to obtain respective sensitivities, so that an image is formed.
  • three sensors are necessary and the cost is increased thereby; however, since the pixel size can be increased, the sensitivity can be improved.
  • In the structure described in Japanese Unexamined Patent Application Publication No. 2002-369049, an iris diaphragm is provided for an image-taking lens system, an optical filter which transmits only light having a wavelength of approximately 770 to 950 nm (the infrared wavelength region) is used as a blade of the iris diaphragm, and visible light and infrared light are further separated from each other by a dichroic mirror.
  • an infrared-cut filter is provided at the transmitted visible light side, and a visible light-cut filter is provided at the reflected infrared light side.
  • the visible light and the infrared light are then made incident on respective sensors so as to separately obtain a visible image and an infrared image.
  • The reason the diaphragm portion is given a function of absorbing (or reflecting) infrared light is that the structure is intended for three-dimensional measurement applications.
  • the structure described in Japanese Unexamined Patent Application Publication No. 09-166493 is a single-plate system in which a rotary type wavelength separation optical system is used as an incident optical system and in which visible light and infrared light are received by the same imaging device.
  • insertion and extraction of an infrared-cut filter is performed using a rotation mechanism, so that when the infrared-cut filter is inserted, a visible color image is output which is not influenced by near-infrared light and infrared light, and when the infrared-cut filter is extracted, an image is output which is obtained by adding the light intensity of visible light and that of near-infrared light.
  • The structure described in Japanese Unexamined Patent Application Publication No. 09-130678 is one in which visible light and infrared light are received by the same imaging device using a diaphragm optical system having a wavelength separation function as an incident optical system.
  • FIGS. 23A and 23B are views each illustrating the structure of the sensor described in Japanese Unexamined Patent Application Publication No. 2004-103964, FIG. 23A is a graph showing optical absorption spectral characteristics of a semiconductor layer, and FIG. 23B is a schematic cross-sectional view of the structure of a semiconductor.
  • As shown in FIG. 23A, the optical absorption coefficient of a silicon (Si) semiconductor decreases in the order of blue, green, red, and infrared light. That is, for the blue, green, red, and infrared components of incident light L1, the difference between wavelengths in the depth at which light is absorbed in the semiconductor is exploited, and layers for detecting visible light (blue, green, and red) and infrared light are provided in the Si semiconductor at successive depths from the surface, as shown in FIG. 23B.
  • In this structure, the amount of light that can be detected is not decreased from a theoretical point of view; however, when red light and green light pass through the layer in which blue light is detected, they are absorbed to a certain extent and are then detected as blue light.
  • Even when a blue signal is not present, when a green signal and a red signal enter the semiconductor, a signal also appears in the layer in which blue light is detected; hence, a pseudo signal is generated, so that sufficient color reproducibility may not be obtained.
  • In addition, the device inevitably becomes large and complicated.
  • Furthermore, such an image sensor can only output an electrical signal obtained by synthesizing the visible image and the infrared image, and may not output only the visible image or only the infrared image.
  • When the visible color image and the infrared image are independently obtained by performing matrix calculation on the outputs of pixels provided with four types of color filters, each having its own filter characteristic, the two images can be independently and simultaneously output; however, the separation properties have a limitation.
  • the visible image obtained by the separation may not have sufficient color reproducibility, and the infrared image unfavorably contains an unnecessary component of the visible image.
  • the present invention was made in consideration of the problems described above, and it is desirable to provide a new structure in which at least one of the above problems is solved.
  • For example, it is desirable to provide an imager having a new structure in which a visible color image having superior color reproducibility and a near-infrared image which contains substantially no visible light component can be independently obtained.
  • As another example, a structure is provided which, when image-taking of a visible image and image-taking of an infrared image are performed simultaneously, solves the problem that the hue of the visible image differs from that of the original object, as is caused when an infrared-cut filter is removed. Accordingly, in the structure described above, an image of invisible light (an invisible image), such as infrared light or ultraviolet light, and a visible image having superior color reproducibility, that is, an accurate hue, can be simultaneously obtained. Furthermore, as still another example, a structure is also provided which suppresses the increase in cost caused by the use of a thick glass infrared-cut filter, as in a common imaging sensor.
  • An imager of an embodiment according to the present invention has a separation portion which separates an electromagnetic wave carrying an image into wavelength components; and image-taking portions detecting images of the wavelength components thus separated by the separation portion.
  • at least one of the image-taking portions has a detecting part optimized to detect a wavelength component which is to be detected.
  • “to have a detecting part optimized to detect a wavelength component which is to be detected” means, in short, that wavelength separation is also performed in the image-taking portion.
  • the wavelength separation is first performed in an incident system which is provided before the image-taking portions, and the wavelength separation is then also performed in at least one of the image-taking portions, so that the wavelength separation is performed a plurality of times.
  • a separation portion having superior separation properties is used as the separation portion which performs the wavelength separation in the incident system provided before the image-taking portions.
  • a spectral filter is preferably used which is made of a predetermined substrate having an optical transparency to an electromagnetic wave and a laminate member provided thereon.
  • In the laminate member, layers having predetermined thicknesses are laminated to each other, and adjacent layers have different refractive indexes.
  • the laminate member is formed to reflect one wavelength of the electromagnetic wave and transmit another wavelength thereof.
  • a laminate member is provided on an incident surface side of the detecting part on which an electromagnetic wave is incident.
  • the laminate member described above has the structure in which layers having predetermined thicknesses are laminated to each other and adjacent layers have different refractive indexes so as to reflect a predetermined wavelength region component of the electromagnetic wave and so as to transmit the rest thereof.
  • wavelength separation may be performed using a monolayer film having a predetermined thickness. Even in the case in which a monolayer film is used, when the thickness thereof is changed, properties can be obtained which reflect a predetermined wavelength region component of an electromagnetic wave and transmit the rest thereof. Hence, a wavelength separation member using this monolayer film may be provided at the incident surface side of the detecting part on which an electromagnetic wave is incident.
  • the rest does not mean all wavelength components other than a reflection wavelength region component which is to be reflected but may be at least a component that does not practically contain the reflection wavelength region component.
  • Here, "does not practically contain the reflection wavelength region component" means that the influence of the reflection wavelength region component is substantially absent, although it may be slightly present in some cases. The reason for this is that when a signal in which the influence of the reflection wavelength region component can be ignored is obtained at the transmission wavelength region side, no problem occurs. Likewise, when a signal in which the influence of the transmission wavelength region component can be ignored is obtained at the reflection wavelength region side, no problem occurs.
  • a spectral detecting part may be used having the structure in which a spectral filter using the above laminate member or monolayer film is integrally formed on the front surface of a light-receiving part.
  • the structure may be used in which wavelength separation is performed using the difference in position of light absorption between wavelengths in the depth direction of a semiconductor forming a detecting part, and in which while influence of a wavelength component other than a wavelength component which is to be detected by the detecting part is suppressed, the wavelength component which is to be detected by the above detecting part can be detected.
  • When the wavelength component other than that to be detected can also be detected, this detection result may be used to correct the wavelength component to be detected by this detecting part and a wavelength component to be detected by another detecting part.
  • the structure is formed in which wavelength separation is surely performed in the incident system provided before the image-taking portions and is also performed in at least one of the image-taking portions.
  • Since the wavelength separation is performed in both the incident system and the image-taking system, the wavelength separation properties can be improved. For example, a visible image which is not influenced by infrared light and which has an accurate hue, and an infrared image which is not influenced by visible light, can be simultaneously obtained.
  • the cost can be decreased, and the size of the device can also be decreased.
  • an imaging device of an embodiment according to the present invention includes a spectroscopic portion which separates incident light into transmitted light and reflected light depending on frequency of the incident light, and an imaging area receiving the transmitted light.
  • the spectroscopic portion includes a multilayer structure having refraction index distribution.
  • FIG. 1 is a schematic view showing the structure of an imager of a first embodiment according to the present invention
  • FIG. 2 is a schematic view illustrating a spectral filter which separates an electromagnetic wave into predetermined wavelengths using a dielectric laminate film;
  • FIG. 3 is a schematic view illustrating a basic structure of a spectral filter using a dielectric laminate film
  • FIG. 4 is a schematic view illustrating a basic concept of a method for designing a laminate film
  • FIG. 5 is a graph showing reflection spectra illustrating a basic concept of a method for designing a laminate film
  • FIG. 6 is a graph showing reflection spectra illustrating a basic concept of a method for designing a laminate film
  • FIGS. 7A and 7B are views each illustrating the condition of a reflection central wavelength λ0 (views each showing the concept of a reflection spectrum);
  • FIG. 8 is a graph showing reflection spectra illustrating the condition of a reflection central wavelength λ0
  • FIG. 9 is a graph showing reflection spectra illustrating the condition of a reflection central wavelength λ0
  • FIG. 10 shows a schematic structure of a spectral image sensor for single-wavelength separation using a laminate film according to the first embodiment
  • FIG. 11 includes graphs showing reflection spectra of a spectral image sensor for single-wavelength separation using a laminate film according to the first embodiment
  • FIG. 12 includes graphs showing detailed reflection spectra of a spectral image sensor for single-wavelength separation using a laminate film according to the first embodiment
  • FIG. 13 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to the first embodiment
  • FIG. 14 is a graph showing a reflection spectrum illustrating a spectral image sensor for single-wavelength separation using a laminate film according to the first embodiment
  • FIG. 15 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to a second embodiment
  • FIG. 16 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to the second embodiment
  • FIG. 17 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to a third embodiment
  • FIG. 18 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to a fourth embodiment
  • FIG. 19 is a schematic view illustrating the structure of a spectral image sensor for single-wavelength separation using a laminate film according to a fifth embodiment
  • FIGS. 20A and 20B are schematic views showing the structure of an imager of the second embodiment according to the present invention.
  • FIGS. 21A, 21B, and 21C are schematic views showing the structure of an imager of the third embodiment according to the present invention.
  • FIGS. 22A and 22B are schematic views showing one example of a signal acquisition method of the second and the third embodiments according to the present invention.
  • FIGS. 23A and 23B are views schematically illustrating the structure of a sensor described in Japanese Unexamined Patent Application Publication No. 2004-103964.
  • FIG. 1 is a schematic view showing the structure of an imager of the first embodiment according to the present invention.
  • An imager 100 of this first embodiment is an imager capable of independently obtaining a visible color image and a near-infrared image.
  • the imager 100 of the first embodiment has a taking lens 102 which guides light L carrying an image of an object Z to an image-taking portion side to form an image, and an optical member (hereinafter referred to as “wavelength separation optical system”) 104 for wavelength separation which separates incident light L 1 passing through the taking lens 102 into visible light VL and infrared light IR which is one example of invisible light.
  • the wavelength separation optical system 104 is one example of a wavelength separation portion which separates light passing through the taking lens 102 into a plurality of wavelengths, the light being one example of an electromagnetic wave carrying an image. That is, the imager 100 has the structure in which wavelength separation is performed in an incident system.
  • the imager 100 of the first embodiment also has an infrared image-taking portion 110 having an image sensor 112 , a visible image-taking portion 120 having an image sensor 122 , and an imaging signal processing portion 130 , the image sensor 112 receiving the infrared light IR obtained by separation using the wavelength separation optical system 104 and forming an infrared image, the image sensor 122 receiving the visible light VL obtained by separation using the wavelength separation optical system 104 and forming a visible image, the imaging signal processing portion 130 processing imaging signals SIR and SV output from the infrared image-taking portion 110 and the visible image-taking portion 120 , respectively.
  • the imager 100 incorporates an optical image which represents the object Z and which contains the infrared light IR by using the taking lens 102 and separates the optical image into an infrared image (infrared light image) and a visible image (visible light image).
  • predetermined signal processing such as color signal separation into R, G, and B components
  • a color image signal and an infrared image signal are output, or a mixed image signal formed of the aforementioned two signals is output.
  • the taking lens 102 is a lens which is formed of an optical material such as quartz or sapphire transmitting light having a wavelength of approximately 380 to 2,200 nm and which incorporates an optical image containing the infrared light IR, so that the optical image is made incident on the wavelength separation optical system 104 while light of the optical image is being collected.
  • the wavelength separation optical system 104 is provided on the optical axis of the taking lens 102 so that the incident light is separated into the visible light VL and the infrared light IR as invisible light, and while the infrared light IR is made incident on the infrared image-taking portion 110 after being totally reflected, the visible light is made incident on the visible image-taking portion 120 after passing through the wavelength separation optical system 104 .
  • the structure described above is a first feature of the imager 100 according to the first embodiment.
  • a mirror 105 is mounted, for example, at an angle of 45° relative to the optical axis, the mirror 105 totally reflecting the infrared light IR at a long wavelength side and transmitting all wavelength region components of the visible light VL having a wavelength shorter than that of the infrared light IR.
  • As the mirror 105, in order to obtain superior wavelength separation properties, a mirror composed of an optically transparent substrate 105a and an optical filter 105b provided on a surface thereof is used.
  • The optical filter 105b is a multilayered thin film (optical interference film) formed by depositing dielectric materials, and the substrate 105a is formed of an optical material such as quartz, sapphire, or a high-molecular-weight material.
  • the optical image emitted from the taking lens 102 thus obtained is incorporated and is then separated using the optical interference and reflection by the optical filter 105 b into two wavelength regions as shown in the figure, that is, into the visible light VL having a wavelength of approximately 380 to 700 nm and the infrared light IR having a wavelength of approximately 700 to 2,200 nm, so that an infrared image thus obtained is made incident on the infrared image-taking portion 110 and so that a visible image is also made incident on the visible image-taking portion 120 .
  • a second feature of the imager 100 according to the first embodiment is that a detecting part (image sensor) optimized to detect a wavelength component which is to be detected is provided for at least one of the image-taking portions 110 and 120 .
  • the image sensor 122 optimized to detect the visible light VL is provided in the visible image-taking portion 120 which detects the visible light VL and a shorter wavelength side of the infrared light IR.
  • The “optimized image sensor” indicates an image sensor having a wavelength separation structure in which the amount of any wavelength component other than the wavelength component to be detected is made as small as possible in the imaging signal of the wavelength component to be detected.
  • the wavelength separation is performed twice, that is, the wavelength separation is performed on the light path by the wavelength separation optical system 104 and by the image sensor; hence, the size of the optical system can be decreased, and the wavelength separation properties can also be advantageously improved.
  • a spectral image sensor (spectral detecting part) 11 is provided at the incident surface side of the image sensor 122 on which an electromagnetic wave is incident, the spectral image sensor 11 having the structure which performs wavelength separation using a dielectric laminate film.
  • the dielectric laminate film is made of a laminate member in which layers having predetermined thicknesses are laminated to each other and in which adjacent layers have different refractive indexes from each other, the laminate member having properties which reflect a wavelength component (infrared light IR component in this embodiment) of the incident light (electromagnetic wave) other than that to be detected and which transmit the rest (visible light VL component in this embodiment) of the incident light.
  • As the image sensor, a CCD type, a CMOS type, or any other type of structure may be used.
  • The features of the laminate member described above may also be stated as follows: the wavelength component to be detected (visible light VL in this embodiment) of the incident light (electromagnetic wave) is transmitted, and the rest thereof (infrared light IR in this embodiment) is reflected.
  • In order to deal with the case in which the infrared light IR component leaks to the visible light VL detection side, the image sensor 122 used in the visible image-taking portion 120 has a spectral image sensor structure using a dielectric laminate film which is optimized to detect the visible light VL. Accordingly, without substantially receiving the influence of the infrared light IR, the visible image can be obtained independently of the infrared image.
  • When the wavelength separation optical system 104 reflects the infrared light IR to the infrared image-taking portion 110 side and transmits the visible light VL to the visible image-taking portion 120 side, the component at the visible light side depends on the separation capability of the wavelength separation optical system 104; although the visible light VL is the primary component, several percent of the infrared light IR is still inevitably contained therein.
  • With the spectral image sensor structure described above, this leak of the infrared light IR can be optically excluded, and only photoelectrons of the incident visible light VL component are converted into electrical signals.
  • Alternatively, even when wavelength separation is not performed on a light path, if spectral filters using a dielectric multilayer film are integrally formed on the visible light detecting pixels (particularly, each of the R, G, and B pixels) of one image sensor, and such spectral filters are not formed on the infrared light detecting pixels, the visible image and the infrared image can be independently and simultaneously obtained.
  • In that case, however, the color reproducibility of the visible image obtained by the wavelength separation may be decreased by an amount corresponding to the amount of leak, and the leak of the visible image may unfavorably appear in the infrared image in some cases.
  • the structure of this first embodiment is different from the structure described in Japanese Unexamined Patent Application Publication Nos. 10-210486 and 06-121325 in which the wavelength components separated by the wavelength separation optical system are made incident on the respective sensors having the structures equivalent to each other so that the visible image and the infrared image are independently obtained.
  • Since the structure is formed such that the wavelength separation is also performed in the image-taking system, followed by detection, the overall wavelength separation properties can be improved.
  • the structure of the first embodiment is different from that described in Japanese Unexamined Patent Application Publication No. 10-210486 in which the visible light component passing through the cold mirror is further separated into a red color, a green color, and a blue color component by the three dichroic mirrors and in which the components thus obtained are made incident on the respective sensors so as to obtain respective R, G, and B images of the visible light VL.
  • In that structure, three sensors are necessary for the visible light VL, and although the sensitivity is improved, the cost is disadvantageously increased.
  • According to the structure of the first embodiment, however, this problem does not occur.
  • the structure of the first embodiment is also different from that described in Japanese Unexamined Patent Application Publication No. 2002-369049 in which wavelength separation is performed twice on the light path, and in which components thus separated are made incident on the respective sensors having structures similar to each other so as to independently obtain the visible image and the infrared image.
  • In the method described in Japanese Unexamined Patent Application Publication No. 2002-369049, since the wavelength separation is performed twice on the light path, the optical system becomes disadvantageously large and complicated. In addition, problems of sensitivity, blurring, and the like may occur. However, according to the structure of the first embodiment, these problems do not occur.
  • Since an infrared-cut filter, which is one example of a subtractive filter, is not necessary, the cost can be significantly decreased.
  • Since an infrared-cut filter having a certain thickness and weight is not used, the optical system can be decreased in size and weight.
  • Since an insertion and extraction mechanism for an infrared-cut filter is not necessary, the device does not become large and complicated. Compared to the case in which an existing glass infrared-cut filter is used, the cost can be advantageously decreased, and in addition, owing to the smaller size, an imager such as a digital camera having superior portability can be provided.
  • an infrared-cut filter which is weak as a whole may be provided between the wavelength separation optical system 104 (mirror 105 ) and the visible image-taking portion 120 (at the light-receiving side of the visible image-taking portion 120 ).
  • For example, when an infrared-cut filter of 50% or less is provided, the infrared light IR can be cut to a level at which the visible light VL is not substantially influenced.
  • a visible light-cut filter which is weak as a whole may be provided between the wavelength separation optical system 104 (mirror 105 ) and the infrared image-taking portion 110 (at the light-receiving side of the infrared image-taking portion 110 ).
  • the visible light VL can be cut to a level at which the infrared light IR is not substantially influenced.
  • a color filter may be provided at the light-receiving side of the image sensor 112 , the color filter transmitting at least the infrared light IR, which is the reflection wavelength region component, and also transmitting a predetermined wavelength component of the visible light VL which is the transmission wavelength region component.
  • When the visible light-cut filter is not provided at the light-receiving side of the image sensor 112, the visible light VL component leaks to the side detecting the infrared light IR, and an infrared image mixed with a visible image of this leak component is obtained.
  • In this case, the intensities of the blue, red, and green colors detected by the three color pixels R, G, and B, which are obtained by the image sensor 122, must be subtracted (decreased) from the signal obtained by the image sensor 112.
  • When a green color filter which transmits the infrared light IR and green color light is provided as the visible light-cut filter, a component containing both the infrared light IR and green visible light LG is obtained from the image sensor 112; however, by taking the difference from the green color component made only of the visible light VL, which is obtained from the image sensor 122, an image of the infrared light IR which is not substantially influenced by the visible light VL (green color light G in this case) can be obtained.
  • Although the green color filter must be provided at the light-receiving side of the image sensor 112, compared to the case in which the intensities of the blue, red, and green colors detected by the three pixels R, G, and B are subtracted without providing the green color filter, the processing becomes easier.
  • When a black color filter which transmits the infrared light IR and absorbs only the visible light VL is provided as the visible light-cut filter, a component made only of the infrared light IR is obtained from the image sensor 112, and an infrared image which is not substantially influenced by the visible light VL can be obtained without performing the difference processing.
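  • As a rough illustration of the difference processing described above, the following Python sketch subtracts the green component obtained from the image sensor 122 from the (infrared + green) signal obtained from the image sensor 112. The function name, the calibration gain, and the synthetic frames are illustrative assumptions, not part of the patent.

```python
import numpy as np

def extract_infrared(ir_plus_green, green_only, gain=1.0):
    """Per-pixel difference processing: the sensor behind the green filter
    (image sensor 112) sees IR + G, while the G channel of the visible-light
    sensor (image sensor 122) sees G only. Subtracting the scaled green
    component leaves an approximately IR-only image. `gain` is a hypothetical
    factor compensating for sensitivity differences between the two sensors."""
    ir = ir_plus_green.astype(np.float64) - gain * green_only.astype(np.float64)
    return np.clip(ir, 0.0, None)

# Tiny synthetic example (arbitrary digital numbers)
s112 = np.array([[120, 200], [90, 60]], dtype=np.uint16)  # IR + green from sensor 112
g122 = np.array([[40, 150], [30, 10]], dtype=np.uint16)   # green only from sensor 122
print(extract_infrared(s112, g122))                       # approximately IR-only values
```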
  • Charge signals read from the image sensors 112 and 122, that is, an infrared imaging signal SIR carrying an infrared image and a visible imaging signal SV carrying a visible image, are sent to the imaging signal processing portion 130, where they are amplified to a predetermined level and then converted from analog signals into digital signals.
  • The digital image signal is processed by gamma correction and then separated into R, G, and B color separation signals; these signals are converted into brightness signals and color signals, or into picture signals formed from these two types of signals, and are then output.
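  • The visible-path processing chain just described (amplification, A/D conversion, gamma correction, color separation, and conversion into brightness and color signals) could be sketched as follows. The gain, bit depth, and BT.601 luminance weights are illustrative assumptions rather than values taken from the patent.

```python
import numpy as np

def process_visible(raw_rgb, gain=2.0, gamma=1.0 / 2.2, bits=10):
    """Minimal sketch of the visible signal path: amplify, quantize (a stand-in
    for A/D conversion), gamma-correct, then convert the R, G, B color
    separation signals into a brightness (luminance) signal and two simple
    color-difference signals."""
    full_scale = 2 ** bits - 1
    digital = np.clip(np.round(raw_rgb * gain), 0, full_scale) / full_scale
    corrected = digital ** gamma                      # gamma correction
    r, g, b = corrected[..., 0], corrected[..., 1], corrected[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b             # brightness signal (BT.601 weights)
    cb, cr = b - y, r - y                             # simple color signals
    return y, cb, cr
```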
  • When the mirror 105 using a dielectric laminate film is applied to the wavelength separation optical system 104 so as to cut the infrared light IR and transmit the visible light VL, and the infrared light IR and the visible light VL are received by the respective image sensors, that is, when the infrared light is cut or not cut depending on the situation, image-taking using only the visible light VL and image-taking using only the infrared light IR can be performed simultaneously, or image-taking using only the visible light VL and image-taking using both the infrared light IR and the visible light VL can be performed simultaneously.
  • Image-taking of a monochrome image or a color image in the daytime can be performed without receiving any influence of the infrared light IR, and at night, image-taking can be performed using the infrared light IR. Whenever necessary, the visible image and the infrared image can be simultaneously output. Even in this case, in the daytime, an image made only of the infrared light IR can be obtained without receiving any influence of the visible light VL.
  • a monochrome image made of the visible light VL can be obtained which is not substantially influenced by the infrared light IR.
  • the calculation processing with the infrared light IR component is not necessary.
  • When color filters having predetermined wavelength transmission properties in the visible light VL region, acting as optical members which separate the visible light VL into predetermined wavelength region components, are provided on the image sensor 122 so as to correspond to pixels (a unit pixel matrix), an image made of a specific wavelength region in the visible light VL region can be obtained which is not substantially influenced by the infrared light IR.
  • the visible light VL can be separated into wavelengths (into colors), and when synthesis processing is performed based on pixel signals obtained from these pixels having respective colors, a color image (visible color image) made of the visible light VL which is not substantially influenced by the infrared light IR can be obtained.
  • In this case as well, the calculation processing with the infrared light IR component is not necessary.
  • a monochrome image or a color image of the visible light VL and an “image relating to the infrared light IR” may always be independently obtained.
  • the “image relating to the infrared light IR” means an image of the infrared light IR which is not substantially influenced by the visible light VL, or an image made of a mixture of the infrared light IR and the visible light VL.
  • Image-taking (monochrome image-taking or color image-taking) of an image of the visible light VL which is not substantially influenced by the infrared light IR and image-taking of an image containing both the infrared light IR and the visible light VL can also be simultaneously performed.
  • image-taking of an image of the infrared light IR which is not substantially influenced by the visible light VL can also be performed.
  • the above “which is not substantially influenced” may also mean “which may be slightly influenced” as long as a clear difference may not be generally recognized by human visual sensation. That is, it may be acceptable when an infrared image (one example of physical information) which can ignore the influence of a transmission wavelength region (visible light VL) is obtained at the infrared light IR side and when a general image (one example of physical information) which can ignore the influence of a reflection wavelength region component (infrared light IR) is obtained at the visible light VL side.
  • As the color filters, primary color filters may be used, such as the three primary color filters of the visible light VL (wavelength λ in the range of 380 to 780 nm).
  • In the visible light VL, there are a blue color component (for example, the transmittance is approximately 1 at a wavelength λ of 400 to 500 nm and approximately zero at other wavelengths), a green color component (for example, approximately 1 at a wavelength λ of 500 to 600 nm and approximately zero at other wavelengths), and a red color component (for example, approximately 1 at a wavelength λ of 600 to 700 nm and approximately zero at other wavelengths).
  • Complementary color filters have approximately zero transmittance with respect to one of the three visible primary color components: yellow (Ye) (for example, the transmittance is approximately zero at a wavelength λ of 400 to 500 nm and approximately 1 at other wavelengths), magenta (Mg) (for example, approximately zero at a wavelength λ of 500 to 600 nm and approximately 1 at other wavelengths), and cyan (Cy) (for example, approximately zero at a wavelength λ of 600 to 700 nm and approximately 1 at other wavelengths).
  • When complementary color filters, having colors complementary to the respective primary colors, are used for the transmitted light in the visible light region, the sensitivity of the imager can be improved.
  • Conversely, when primary color filters are used, signal processing is advantageously easy since primary color signals can be obtained without performing difference processing.
  • a transmittance of approximately 1 indicates an ideal state and may include the case in which the transmittance in a specific wavelength region is significantly larger than that in the other wavelength regions. Hence, a filter may have a transmittance of other than 1 in a part of the specific wavelength region.
  • a transmittance of approximately zero indicates an ideal state as described above and may include the case in which the transmittance in a specific wavelength region is significantly smaller than that in the other wavelength regions. Hence, a filter may have a transmittance of other than zero in a part of the specific wavelength region.
  • Any primary color filter or complementary color filter may be used as long as it transmits a wavelength region component of a predetermined color (primary color or complementary color) in the visible light VL region, which is the transmission wavelength region component, regardless of whether the infrared light IR region, which is the reflection wavelength region component, is transmitted, that is, regardless of the transmittance for the infrared light IR.
  • Even when each filter has sensitivity to the infrared light region and transmits light in the infrared light region, no problem occurs.
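  • To make the idealized passbands above concrete, the following sketch encodes the primary and complementary filter characteristics as simple wavelength ranges. The dictionary names and the step-function idealization are assumptions; real filters have gradual edges and, as noted above, generally also transmit infrared light.

```python
# Idealized visible-light passbands in nm: transmittance ~1 inside a range, ~0 outside.
PRIMARY = {"B": [(400, 500)], "G": [(500, 600)], "R": [(600, 700)]}
# Each complementary filter blocks one primary band and passes the rest of 400-700 nm.
COMPLEMENTARY = {"Ye": [(500, 700)], "Mg": [(400, 500), (600, 700)], "Cy": [(400, 600)]}

def transmits(filter_bands, wavelength_nm):
    """True if the idealized filter passes the given visible wavelength."""
    return any(lo <= wavelength_nm < hi for lo, hi in filter_bands)

print(transmits(PRIMARY["G"], 550))         # True: the green filter passes 550 nm
print(transmits(COMPLEMENTARY["Mg"], 550))  # False: magenta blocks 500 to 600 nm
```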
  • FIG. 2 is a view illustrating the concept of wavelength separation in which light, which is an electromagnetic wave, is separated into predetermined wavelengths using a dielectric laminate film.
  • A dielectric laminate film 1 is a laminate member composed of layers, each having a predetermined thickness dj (j is an integer of 1 or more; hereinafter, j has the same meaning), laminated to each other, in which the refractive index nj of one layer differs from that of the adjacent layer (the difference in refractive index being represented by Δn). Accordingly, as described later, properties can be obtained which reflect a predetermined wavelength region component of the electromagnetic wave and transmit the rest thereof.
  • In counting the number of dielectric layers 1_j forming the dielectric laminate film 1, the thick layers provided at the two sides (the 0-th layer 1_0 and the k-th layer 1_k) are not counted, and the layers are counted from the first layer toward the k-th layer side.
  • That is, the dielectric laminate film 1 itself is regarded as being formed without the two thick layers (the 0-th layer 1_0 and the k-th layer 1_k).
  • In such a structure, the reflectance (or transmittance) tends to depend on the wavelength λ to a certain extent due to the interference in the dielectric laminate film 1.
  • When the difference Δn in refractive index is increased, this effect becomes more significant.
  • When the dielectric laminate film 1 has a periodic structure or satisfies a certain condition (for example, the thickness d of each layer is approximately λ/(4n)), the reflectance of light in a specific wavelength region (specific wavelength region light) is effectively increased so that most of it becomes a reflected light component L2 (that is, the transmittance is decreased), and the reflectance of the other light is decreased so that most of it becomes a transmitted light component L3 (that is, the transmittance is increased).
  • Here, the wavelength λ is a central wavelength in the specific wavelength region, and n indicates the refractive index of the layer.
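  • Written out, the quarter-wave condition above (presumably the conditional equation (1) referred to later, which is not reproduced in this text) is dj ≈ λ/(4·nj) for each dielectric layer 1_j; for example, a layer with a refractive index of about 2.0 tuned to a central wavelength of 900 nm would be roughly 900/(4 × 2.0) ≈ 113 nm thick.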
  • FIG. 3 is a conceptual view of the basic structure of the spectral filter 10 suitably applied to the mirror 105 using a dielectric laminate film.
  • FIG. 3 shows the case in which light is separated into infrared light IR and visible light VL.
  • When the dielectric laminate film 1 is formed so as to have a high reflectance for the infrared light IR having a wavelength λ in the infrared region (primarily at the long wavelength side of 780 nm or more), which is at a longer wavelength side than the visible light VL, the infrared light IR can be cut.
  • As for the dielectric layers 1_j forming the dielectric laminate film 1, since the dielectric laminate film 1 is formed of a plurality of layers, at least two types of materials are used; in addition, when the dielectric laminate film 1 is formed of at least three layers, all the layers may be formed of materials different from each other, or two (or more) types of materials may be laminated alternately or in an arbitrary order.
  • the dielectric laminate film 1 may be basically formed of a first and a second layer material, and a part of the dielectric laminate film 1 may be replaced with a third layer material (or with two new types of layer materials or more).
  • FIGS. 4 to 6 are views illustrating a basic concept of a method for designing the dielectric laminate film 1 .
  • the dielectric laminate film 1 is formed of basic first and second layer materials so as to selectively reflect the infrared light IR.
  • The dielectric laminate film 1 used in this embodiment is sandwiched at its two sides by thick layers made of silicon oxide (hereinafter referred to as "SiO2"), the layer provided at the light incident side being called the 0-th layer and the layer opposite thereto the k-th layer, and a plurality of dielectric layers 1_j made of the first and the second layer materials are laminated to each other so as to form the dielectric laminate film 1.
  • In this embodiment, common materials are used as the first and the second layer materials forming the dielectric layers: silicon nitride Si3N4 (hereinafter referred to as "SiN") is used as the first layer material, silicon oxide SiO2 is used as the second layer material, and these two types of materials are laminated alternately.
  • Here, dj (j indicates a layer number, with the same meaning as described above) is the thickness of the dielectric layer 1_j forming the dielectric laminate film 1; nj is the refractive index of the dielectric layer 1_j; and λ0 is the central wavelength (hereinafter referred to as the "reflection central wavelength") of the reflection wavelength region.
  • As described above, in counting the number of dielectric layers 1_j forming the dielectric laminate film 1, the thick SiO2 layers provided at the two sides are not counted, and the layers are counted from the first layer toward the k-th layer side.
  • the structure of a SiN layer/a SiO 2 layer/a SiN layer is a three-layered structure
  • a SiN layer/a SiO 2 layer/a SiN layer/a SiO 2 layer/a SiN layer is a five-layered structure.
  • in FIG. 4, a seven-layered structure is shown.
  • the reflection central wavelength λ0 of the infrared light IR of the reflection wavelength region is set to 900 nm
  • the silicon nitride SiN forming the odd-numbered layers has a refractive index of 2.3
  • the silicon oxide SiO 2 forming the 0-th layer, the k-th layer, and the even-numbered layers has a refractive index of 1.46
  • the difference Δn in refractive index is 0.57.
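The layer thicknesses of the FIG. 4 structure can be reproduced under one assumption: that equation (1), which is not reproduced in this excerpt, is the usual quarter-wave condition dj × nj = λ0/4 (consistent with the 0.154 μm SiO 2 value quoted later for the k-th layer). A minimal Python sketch under that assumption, not taken from the patent:

    # Quarter-wave layer thicknesses, assuming equation (1) is d_j * n_j = lambda_0 / 4.
    # The refractive indices and the 900 nm design wavelength follow the text above.
    LAMBDA0_NM = 900.0   # reflection central wavelength
    N_SIN = 2.3          # refractive index quoted for SiN in this excerpt
    N_SIO2 = 1.46        # refractive index quoted for SiO2

    def quarter_wave_thickness_nm(n, lambda0_nm=LAMBDA0_NM):
        """Thickness d satisfying d * n = lambda0 / 4."""
        return lambda0_nm / (4.0 * n)

    print(f"SiN layer : {quarter_wave_thickness_nm(N_SIN):.1f} nm")    # about 98 nm
    print(f"SiO2 layer: {quarter_wave_thickness_nm(N_SIO2):.1f} nm")   # about 154 nm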
  • FIG. 5 is a graph showing the reflectances R (reflection spectra) calculated by the effective Fresnel coefficient method while changing the number of layers, formed of the common materials, in the structure shown in FIG. 4; from these results, the layer-number dependence of the reflection spectra can be obtained.
  • it is understood that as the number of layers is increased, the reflectance R, which has a peak around 900 nm (the reflection central wavelength λ0 of the infrared light IR), is increased. Furthermore, it is understood that when a wavelength of 900 nm is selected as the reflection central wavelength λ0, the infrared light IR and the visible light VL are approximately separated from each other. According to the results shown in FIG. 5, when the number of layers is set to 5 or more, a reflectance R of 0.5 or more can be obtained, and in particular, when the number of layers is set to 7 or more, the reflectance R can preferably be increased to more than 0.7.
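The layer-number dependence itself can be reproduced with a generic thin-film calculation. The sketch below uses the standard characteristic-matrix (transfer-matrix) method for an alternating SiN/SiO 2 quarter-wave stack at 900 nm; the thick SiO 2 layers on both sides are modeled as semi-infinite media with n = 1.46. This is a stand-in illustration, not the patent's own "effective Fresnel coefficient method", so the numbers will differ in detail from FIG. 5.

    import numpy as np

    def stack_reflectance(wavelength_nm, layers, n_in=1.46, n_out=1.46):
        """Normal-incidence reflectance of a thin-film stack via 2x2 characteristic matrices.
        `layers` is a list of (refractive_index, thickness_nm) from the incident side."""
        M = np.eye(2, dtype=complex)
        for n, d in layers:
            delta = 2.0 * np.pi * n * d / wavelength_nm          # phase thickness of the layer
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        m11, m12, m21, m22 = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
        r = ((n_in * m11 + n_in * n_out * m12 - m21 - n_out * m22) /
             (n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22))
        return abs(r) ** 2

    def alternating_stack(num_layers, lambda0_nm=900.0, n_hi=2.3, n_lo=1.46):
        """Odd layers SiN (high index), even layers SiO2 (low index), each a quarter wave thick."""
        indices = [n_hi if j % 2 == 1 else n_lo for j in range(1, num_layers + 1)]
        return [(n, lambda0_nm / (4.0 * n)) for n in indices]

    for k in (3, 5, 7):
        print(f"{k} layers: R(900 nm) = {stack_reflectance(900.0, alternating_stack(k)):.2f}")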
  • FIG. 6 is a graph illustrating the thickness dependence of the reflectance R (the change in reflectance caused by variation in thickness of the dielectric layer 1 — j ).
  • the results shown are obtained by calculation for the case in which a seven-layered structure is formed and in which the thickness of each dielectric layer is changed by ±10%.
  • the conditional equation (1) gives an ideal value obtained by calculation based on the Fresnel coefficient method; however, in practice, the condition represented by equation (1) is mild and has a wide allowable range. For example, it is understood by calculation based on the Fresnel coefficient method that even when the thickness dj has an error of ±10%, the reflectance can still be effectively increased.
  • a sufficient reflectance R, such as 0.5 or more, can be obtained, and it is understood that strong reflection occurs over the entire infrared light IR region (primarily on the long-wavelength side of 780 nm or more).
  • when the thickness dj of the dielectric layer 1_j is practically set within the range represented by the following equation (2), the reflectance can be effectively and significantly increased.
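Equation (2) itself does not appear in this excerpt. Taking equation (1) to be the quarter-wave condition dj × nj = λ0/4 and applying the ±10% tolerance discussed above, it presumably has a form such as 0.9 × λ0/(4 × nj) ≤ dj ≤ 1.1 × λ0/(4 × nj); this is a hedged reconstruction, not the patent's exact expression.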
  • FIGS. 7A to 9 are views illustrating the condition of the reflection central wavelength λ0.
  • the numeric condition of the thickness dj depends on the spectral band width ΔλIR of the infrared reflection region.
  • the reflection central wavelength λ0 may be selected so as to reflect the infrared light IR in the range of 0.78 μm to 0.95 μm.
  • the band width ΔλIR of the infrared reflection region is increased when the difference Δn in refractive index of the dielectric laminate film 1 is large, and to the contrary, when the difference Δn in refractive index is small, the band width ΔλIR is decreased.
  • the band width ΔλIR of the infrared reflection region is decreased when the multilayered film is made of SiN/SiO 2
  • the band width ΔλIR is increased when the multilayered film is made of Si/SiO 2 .
  • when the multilayered film is made of SiN/SiO 2 (the difference Δn in refractive index being 0.57), it is understood from the calculation results based on the reflection central wavelengths λ0 of 780 nm and 950 nm, shown in the reflection spectra in FIG. 8, that the above conditions are almost satisfied when the reflection central wavelength λ0 is set in the range of 780 to 950 nm.
  • the results shown in FIG. 8 are obtained from a laminate structure shown in FIG. 13, which will be described later, by calculation in which only the film thickness dj of the dielectric layer 1_j is changed so as to obtain central wavelengths λ0 of 780 nm and 950 nm.
  • when the multilayered film is made of Si/SiO 2 (the difference Δn in refractive index being 2.64) and the central wavelength λ0 is in the range of 900 to 1,100 nm, as in the reflection spectra shown in FIG. 9, the above conditions are almost satisfied.
  • the reflection central wavelength λ0 satisfies the following equation (3-1).
  • the following equation (3-2) is satisfied.
  • These equations mean that the reflection central wavelength λ0 is ideally set around 900 nm: 780 nm ≤ λ0 ≤ 1100 nm (3-1); 850 nm ≤ λ0 ≤ 1000 nm (3-2).
  • the materials described above are merely examples, and the effect described above is not limited to the combination of a silicon oxide SiO 2 layer and a silicon nitride SiN layer. It is estimated by calculation that when materials are selected so as to obtain a difference in refractive index of 0.3 or more, and preferably 0.5 or more, an effect similar to that described above can be obtained.
  • the SiN film may have variation in composition to a certain extent which is caused by formation conditions.
  • for the dielectric layer 1_j forming the dielectric laminate film 1, besides silicon oxide SiO 2 and silicon nitride SiN, there may be used oxides such as alumina Al 2 O 3 , zirconia ZrO 2 (refractive index: 2.05), titanium dioxide TiO 2 (refractive index: 2.3 to 2.55), magnesium oxide MgO, and zinc oxide ZnO (refractive index: 2.1); high molecular weight materials such as polycarbonate PC (refractive index: 1.58) and an acrylic resin PMMA (refractive index: 1.49); and semiconductor materials such as silicon carbide SiC (refractive index: 2.65) and germanium Ge (refractive index: 4 to 5.5).
  • the spectral filter 10 can be formed so as to have properties which may not be obtained from glass. That is, by using a plastic, a light-weight spectral filter having superior durability (resistance to high temperature, high humidity, and impact) can be formed.
  • FIGS. 10 to 14 are views illustrating one embodiment of a spectral image sensor 11 suitably applied to the image sensor 122 using the dielectric laminate film 1 .
  • This spectral image sensor 11 is formed by a basic method for designing the spectral filter 10 using the dielectric laminate film 1 .
  • a design example of the spectral image sensor 11 will be described in which the dielectric laminate film 1 which selectively reflects the infrared light IR is provided on a semiconductor element layer so as to receive a visible light VL component while the infrared light IR is cut.
  • the basic structure of the spectral image sensor 11 contains a semiconductor element layer and the spectral filter 10 provided on a light-receiving portion thereof. Only by this basic structure described above, a spectral image sensor used for single-wavelength separation (that is, for image-taking of a monochrome image) is formed; however, when predetermined colors (for example, R, G, and B) of color separation filters are provided at respective light-receiving portions of the spectral image sensor 11 , a spectral image sensor used for image-taking of a color image can be formed.
  • when the dielectric laminate film 1 described with reference to FIGS. 4 to 6 is formed on a semiconductor element layer which is provided with a detection element such as a silicon (Si) photodetector and which has a refractive index larger than that of each layer 1_j forming the dielectric laminate film 1, the distance from the semiconductor element layer to the dielectric laminate film 1, that is, the thickness dk of the silicon oxide SiO 2 layer forming the k-th layer (the dielectric layer 1_k), is important.
  • FIG. 11 includes views of reflection spectra for illustrating the relationship between the total reflected light LRtotal and the thickness dk of the silicon oxide SiO 2 layer forming the dielectric layer 1_k.
  • the dielectric laminate film 1 having a seven-layered structure shown in FIG. 4 is used, and the results are obtained by calculation while the thickness dk of the dielectric layer 1_k is changed.
  • the horizontal axis and the vertical axis represent the wavelength λ (μm) and the reflectance R, respectively.
  • the half width of the dip in the infrared region is 30 nm or less when the thickness dk is 2.5 μm or more, and in particular, since the half width of the dip is decreased to 20 nm or less when the thickness dk is 5.0 μm or more, the half width is sufficiently small relative to the generally broad spectrum of natural light, and hence an averaged reflectance is obtained. Furthermore, from the spectra obtained when the thickness dk is 0.3 to 1.0 μm, it is also understood that the reflectance in the visible light VL region is high. As a result, the thickness dk is preferably approximately 0.154 μm; that is, the value that satisfies equation (1) is most suitable.
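As a cross-check (plain arithmetic, again assuming equation (1) is the quarter-wave condition dk × n = λ0/4): dk = 900 nm / (4 × 1.46) ≈ 154 nm, which is the 0.154 μm value given above.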
  • FIG. 12 includes views of reflection spectra for illustrating the relationship between the reflectance R and the thickness dk of the silicon oxide SiO 2 layer forming the dielectric layer 1_k.
  • the results are shown which are obtained by changing the thickness dk at around 0.154 μm.
  • the horizontal axis and the vertical axis represent the wavelength λ (μm) and the reflectance R, respectively.
  • the optimum structure of the spectral image sensor 11 practically has a dielectric laminate film 1 A of an eight-layered structure including the dielectric layer 1_k, which is the k-th layer, and the reflection spectrum obtained by calculation is as shown in FIG. 14.
  • the dielectric laminate film 1 A has the structure in which a layer made of silicon oxide SiO 2 used as the second layer material is periodically provided four times on a silicon substrate.
  • FIGS. 15 to 19 are views showing modified examples of the spectral filter 10 and the spectral image sensor 11 .
  • the above spectral filter 10 has a basic structure using the dielectric laminate film 1 , and in addition, various modifications can be made.
  • the above spectral image sensor 11 has a basic structure in which the spectral filter 10 using the dielectric laminate film 1 is formed on a light-receiving portion such as a CMOS or a CCD, and various modifications can also be made.
  • when a third layer is additionally provided, the reflection in the visible light region can also be decreased (hereinafter referred to as "the first modified example").
  • when the spectral image sensor 11 is formed in which, between the third layer added in the first modified example and the silicon substrate, a fourth layer (such as a silicon oxide SiO 2 layer) having a refractive index smaller than that of the third layer is provided, a dark current can be further decreased (hereinafter referred to as "the second modified example").
  • when a fifth layer (such as a silicon Si layer having a thickness of 61 nm and a refractive index of 4.1, which is higher than that of silicon nitride SiN and that of silicon oxide SiO 2 ) is additionally provided which has a refractive index higher than that of the basic first and second layer materials forming this dielectric laminate film 1, the number of dielectric layers 1_j forming the dielectric laminate film 1 can be decreased (hereinafter referred to as "the third modified example").
  • in the spectral image sensor 11 in which a plurality of such fifth layers (for example, silicon Si layers each having a thickness of 61 nm and a refractive index of 4.1, higher than that of silicon nitride SiN and that of silicon oxide SiO 2 ) are additionally provided, each having a refractive index higher than that of the basic first and second layer materials forming this dielectric laminate film 1, the number of layers can be further decreased (hereinafter referred to as "the fourth modified example").
  • the third and fourth modified examples of the spectral image sensor 11 can also be applied to the spectral filter 10 .
  • the reflection in the visible light region can also be decreased.
  • although the reflectance of a blue color B component (at a wavelength of approximately 420 nm) and that of a green color G component (at a wavelength of approximately 520 nm) are slightly increased, the reflectance of a red color R component (at a wavelength of approximately 600 nm) can be sufficiently decreased, and hence the visible light VL and the infrared light IR can be appropriately separated.
  • the spectral image sensor 11 is formed of the spectral filter 10 using the dielectric laminate film 1 ; however, the image sensor is not limited thereto, and any image sensor may be used as long as a member having properties, which reflect a predetermined wavelength region component of an electromagnetic wave and which transmit the rest thereof, is provided at an incident surface side on which the electromagnetic wave is incident.
  • by using a monolayer film having a predetermined thickness instead of the dielectric laminate film 1, a spectral filter can also be formed.
  • the reason for this is that when the thickness of the monolayer film is changed, an effect can be obtained which reflects a predetermined wavelength region component of an electromagnetic wave and transmits the rest thereof.
  • the cost can be advantageously decreased as compared to that of the laminate film.
  • the above monolayer film is not only applied to the spectral image sensor 11 but can also be applied to the wavelength separation optical system 104 , and the mirror 105 using a monolayer film having a predetermined thickness can be applied to the wavelength separation optical system 104 .
  • FIGS. 20A and 20B are schematic views showing the structure of an imager of a second embodiment according to the present invention.
  • the imager 100 of the second embodiment is the same as that of the first embodiment in which the visible color image and the near-infrared image are independently obtained; however, as shown in FIG. 20A , in the imager of this embodiment, the infrared image-taking portion 110 includes the image sensor 112 optimized to detect the infrared light IR.
  • the imaging signal processing portion 130 separates the visible light VL component and the infrared light IR component by calculation processing using a signal obtained from the infrared image-taking portion 110 and a signal obtained from the visible image-taking portion 120 .
  • the image sensor 122 does not always have to be equivalent to the spectral image sensor 11, which is formed so as not to detect the infrared light IR. As shown in the figure, the image sensor 122 may detect the infrared light IR component besides the visible light VL component. Of course, it is more preferable that the image sensor 122 be equivalent to the spectral image sensor 11 which is formed so as not to detect the infrared light IR.
  • an image sensor 112 _IR+VL is used for the infrared image-taking portion 110, the image sensor 112 _IR+VL being optimized to detect the infrared light IR and having the structure in which a visible image and an infrared image are separately obtained by using the difference in position of light absorption in the depth direction of a semiconductor between wavelengths.
  • the image sensor 112 _IR+VL is used which can detect a wavelength component (the infrared light IR component in this case) to be detected while influence of a wavelength component (the visible light VL component in this case) other than that to be detected is suppressed.
  • the infrared image can be independently obtained.
  • a visible image can be obtained in a shallow region below the surface of the semiconductor, and when components of this visible image are synthesized with the visible image obtained by the image sensor 122 of the visible image-taking portion 120 , the sensitivity can be improved.
  • a light component thus separated to the infrared light side depends on the separation capability of the wavelength separation optical system 104 , and although the light component is primarily composed of the infrared light IR, several percent of the visible light VL is still contained.
  • the image sensor 112 _IR+VL is used which has the structure using the difference in position of light absorption in the depth direction of a semiconductor substrate between wavelengths.
  • electrons photoelectrically converted by the visible light VL having a wavelength of less than 780 nm are absorbed in a silicon Si substrate at a relatively small distance, such as approximately 5 μm, in the depth direction from the surface thereof.
  • the remaining light component, that is, electrons photoelectrically converted by the infrared light IR having a wavelength of 780 nm or more, is absorbed in a region deeper than 5 μm.
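How strongly this depth separation works can be illustrated with the Beer-Lambert law. The absorption coefficients below are approximate room-temperature literature values for crystalline silicon, not figures from the patent, and the 5 μm boundary follows the text above:

    import math

    # Approximate absorption coefficients of crystalline silicon (1/cm), literature values.
    ALPHA_PER_CM = {450: 2.5e4, 550: 7.0e3, 650: 2.8e3, 750: 1.3e3, 850: 5.4e2, 950: 1.6e2}

    def fraction_absorbed(alpha_per_cm, depth_um):
        """Beer-Lambert fraction of incident photons absorbed within depth_um of the surface."""
        return 1.0 - math.exp(-alpha_per_cm * depth_um * 1e-4)   # 1 um = 1e-4 cm

    for wavelength_nm, alpha in ALPHA_PER_CM.items():
        shallow = fraction_absorbed(alpha, 5.0)
        print(f"{wavelength_nm} nm: {shallow:.0%} absorbed above 5 um, {1.0 - shallow:.0%} below")

Under these assumed values, the shorter visible wavelengths are captured almost entirely in the shallow region while most of the infrared light reaches the deeper region; the partial overlap near 750 to 850 nm corresponds to the mutual influence around the boundary discussed further below.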
  • an electrical non-connection region is provided at the boundary between the detection regions for the above two types of electrons.
  • in the infrared image-taking portion 110, when the signal components of the visible light VL (with a wavelength of less than 780 nm) which are photoelectrically converted in the shallow region are excluded, and when only the component of the infrared light IR (with a wavelength of 780 nm or more) which is photoelectrically converted in the deep region of the semiconductor layer is used, an electrical signal made only of the incident infrared light IR component is obtained.
  • in the infrared image-taking portion 110, the infrared image can be obtained which is not substantially influenced by the visible light VL.
  • an electrical signal of a visible light VL component can also be obtained by the image sensor 112 .
  • when passing through the region in which the visible light VL is detected, the infrared light IR is absorbed to a certain extent, and the infrared light IR component thus absorbed is mistakenly detected as the visible light VL; as a result, the visible light VL is influenced by the infrared light IR component.
  • signal components GV_IR of the visible light VL (such as a wavelength of less than 780 nm) which are photoelectrically converted in the shallow region below the surface of the semiconductor layer and a signal component GIR of the infrared light IR (such as a wavelength of 780 nm or more) which is photoelectrically converted in the deep region of the semiconductor layer can also be separated and detected.
  • the image sensor 112 _IR+VL for the infrared light IR can also be practically used as an image sensor used for both the infrared light IR and visible light VL.
  • the leak of the visible light VL component to the infrared image-taking portion 110 means that the visible light VL component at the visible image-taking portion 120 side is decreased thereby, and that, in terms of the visible imaging signal SV obtained by the image sensor 122, the light-receiving sensitivity of the visible image is decreased.
  • when the visible imaging signal SV_IR obtained by the image sensor 112 _IR+VL at the infrared image-taking portion 110 side and the visible imaging signal SV obtained by the image sensor 122 are synthesized in the imaging signal processing portion 130, the decrease caused by the leak to the infrared image-taking portion 110 side can be corrected, and hence the sensitivity of the visible image can be improved.
  • a wavelength of the infrared light IR and a wavelength of the red color component of the visible light VL, which are detected around the boundary therebetween, influence each other to a certain extent through the absorption.
  • in the imaging signal processing portion 130, when the infrared imaging signal SIR obtained by the image sensor 112 _IR+VL is corrected (for example, by difference processing) using the visible imaging signal SV_IR obtained by the image sensor 112 _IR+VL at the infrared image-taking portion 110 side, the influence of the red color component in the visible light VL can also be suppressed.
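A schematic sketch of the synthesis and difference processing described in the last few paragraphs; the function and the scalar coefficient k_red are illustrative assumptions, and only the signal names SV, SV_IR, and SIR come from the text:

    def correct_second_embodiment(sv, sv_ir, sir, k_red=1.0):
        """sv:    visible imaging signal SV from image sensor 122
        sv_ir: visible component SV_IR read out by image sensor 112_IR+VL
        sir:   infrared imaging signal SIR from image sensor 112_IR+VL
        k_red: illustrative scaling for the difference processing (assumption)."""
        visible = sv + sv_ir            # recover the visible light that leaked to the infrared path
        infrared = sir - k_red * sv_ir  # suppress red-component cross-talk by difference processing
        return visible, infrared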
  • FIGS. 21A , 21 B and 21 C are schematic views showing the structure of an imager of the third embodiment according to the present invention.
  • the imager 100 of the third embodiment is the same as that of the first and the second embodiments in which the visible color image and the near-infrared image are independently obtained; however, as shown in FIG. 21A , in the imager of this embodiment, the infrared image-taking portion 110 is provided with the image sensor 112 optimized to detect the infrared light IR, and the visible image-taking portion 120 is also provided with the image sensor 122 optimized to detect the visible light VL. That is, the structure of this embodiment is formed in combination of the first embodiment and the second embodiment.
  • the imaging signal processing portion 130 separates the visible light VL component and the infrared light IR component by calculation processing using a signal obtained from the infrared image-taking portion 110 and a signal obtained from the visible image-taking portion 120 .
  • the image sensor 112 _IR+VL is used for the infrared image-taking portion 110, the image sensor being optimized to detect the infrared light IR and having the structure in which a visible image and an infrared image are separately obtained by using the difference in position of light absorption in the depth direction of a semiconductor between wavelengths.
  • an image sensor 122 _VL+IR is used at the visible image-taking portion 120 side, the image sensor being optimized to detect the visible light VL and having the structure in which a visible image and an infrared image are separately obtained by using the difference in position of light absorption in the depth direction of a semiconductor between wavelengths.
  • the image sensor 122 _VL+IR is used which can detect a wavelength component (the visible light VL component in this case) which is to be detected while the influence of a wavelength component (the infrared light IR component in this case) other than that to be detected is suppressed.
  • an image sensor having the same structure as that shown in FIG. 21B is used, and on the light-receiving surface thereof, predetermined colors (such as 124 R, 124 G, and 124 B) of color separation filters 124 are provided for the respective light-receiving portions (pixels), so that image-taking of a color image can be performed.
  • at the visible image-taking portion 120 side, when the signal component of the infrared light IR (with a wavelength of 780 nm or more) which is photoelectrically converted in a deep region of the semiconductor layer is excluded, and when only the components of the visible light VL (with a wavelength of less than 780 nm) which are photoelectrically converted in a shallow region below the surface of the semiconductor layer are used, electrical signals made only of the incident visible light VL component are obtained for the respective colors.
  • in the visible image-taking portion 120, the visible image can be obtained which is not substantially influenced by the infrared light IR.
  • a signal component GIR of the infrared light IR (such as a wavelength of 780 nm or more) which is photoelectrically converted in the deep region of the semiconductor layer and signal components GV_IR for respective colors of the visible light VL (such as a wavelength of less than 780 nm) which are photoelectrically converted in the shallow region below the surface of the semiconductor layer can be separated and detected.
  • the image sensor 122 _VL+IR for the visible light VL can also be practically used as an image sensor used for both the infrared light IR and visible light VL.
  • the leak of the infrared light IR component to the visible image-taking portion 120 means that the infrared light IR component at the infrared image-taking portion 110 side is decreased thereby, and that, in terms of the infrared imaging signal SIR obtained by the image sensor 112, the light-receiving sensitivity of the infrared image is decreased.
  • when the infrared imaging signal SIR obtained by the image sensor 112 _IR+VL at the infrared image-taking portion 110 side and the infrared imaging signal SIR_VL obtained by the image sensor 122 _VL+IR are synthesized in the imaging signal processing portion 130, the decrease caused by the leak to the visible image-taking portion 120 side can be corrected, and hence the sensitivity of the infrared image can be improved.
  • a wavelength of the infrared light IR and a wavelength of the red color component of the visible light VL, which are detected around the boundary therebetween, influence each other to a certain extent through the absorption.
  • in the imaging signal processing portion 130, when the visible imaging signal SV (particularly the red color component) obtained by the image sensor 122 _VL+IR is corrected (for example, by difference processing) using the infrared imaging signal SIR_VL obtained by the image sensor 122 _VL+IR at the visible image-taking portion 120 side, the influence of the infrared light IR component can be suppressed.
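The third embodiment corrects in the symmetric direction; again a rough sketch, with the coefficient k and the per-color dictionary as illustrative assumptions:

    def correct_third_embodiment(sir, sir_vl, sv_rgb, k=1.0):
        """sir:    infrared imaging signal SIR from image sensor 112_IR+VL
        sir_vl: infrared component SIR_VL read out by image sensor 122_VL+IR
        sv_rgb: per-color visible signals from 122_VL+IR, e.g. {"R": ..., "G": ..., "B": ...}
        k:      illustrative scaling for the difference processing (assumption)."""
        infrared = sir + sir_vl       # recover the infrared light that leaked to the visible path
        visible = dict(sv_rgb)
        visible["R"] -= k * sir_vl    # correct mainly the red component by difference processing
        return visible, infrared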
  • FIGS. 22A and 22B are views showing an example of a signal reading method using the image sensors 112 _IR+VL and 122 _VL+IR used in the second and the third embodiment, each image sensor having the structure in which the visible image and the infrared image are separately obtained using the difference in position of light absorption in the depth direction of a semiconductor between wavelengths.
  • FIG. 22A shows a circuit diagram
  • FIG. 22B shows a signal timing diagram.
  • for the photoelectric transducers 732 W (or R, G, and B) and 732 IR, transfer gates 734 W (or R, G, and B) and 734 IR for the respective wavelengths are provided, respectively.
  • the photoelectric transducers 732 W (or R, G, and B) and 732 IR are connected to an amplifier 705 in a pixel through the respective transfer gates 734 W (or R, G, and B) and 734 IR, the amplifier 705 including an amplifier transistor 740 and a reset transistor 736 .
  • the amplifier transistor 740 is connected to a vertical signal line 751 via a vertical selection transistor 742 .
  • a pixel signal is output.
  • while a selection pulse SEL is being supplied to the vertical selection transistor 742 in a vertical line which is to be read, reading pulses TW (or R, G, and B) and TIR are supplied to the transfer gates 734 W (or R, G, and B) and 734 IR, respectively, for reading the respective signal charges, and a reset pulse RST is supplied to the reset transistor 736, so that the floating diffusion 738 is reset.
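One possible ordering of the pulses named above, written out as a sketch; the actual timing relationships are defined by FIG. 22B, so the ordering of the reset and transfer pulses here is an assumption for illustration only:

    def row_readout_sequence():
        """Yield a schematic pulse sequence for one selected row (illustrative ordering)."""
        yield ("SEL", "high")   # select the row via the vertical selection transistor 742
        yield ("RST", "pulse")  # reset the floating diffusion 738 through the reset transistor 736
        yield ("TW", "pulse")   # transfer and read the visible (W, or R/G/B) signal charge
        yield ("RST", "pulse")  # reset again before the second transfer
        yield ("TIR", "pulse")  # transfer and read the infrared signal charge
        yield ("SEL", "low")    # deselect the row

    for pulse in row_readout_sequence():
        print(pulse)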
  • the aspects of the present invention are not limited to the embodiments described above, and it is not necessary that all the combinations of the characteristics described in the above embodiments be included in the solving means described in the specification.
  • the above embodiments include aspects of the present invention at various stages, and when the disclosed constituent elements are appropriately used in combination, various aspects of the present invention can be extracted. Even when some of the constituent elements described in the embodiments are deleted, as long as the above effect can be obtained, the structure formed without those constituent elements can also be extracted as one aspect of the present invention.
  • the above-described techniques are not limited to a technique in which visible light and infrared light are separated and in which a visible image and an infrared image are separately obtained by independently detecting the above separated light components.
  • visible light and ultraviolet light can be separated and can be detected, and ultraviolet light can be simultaneously detected together with visible light and can be formed into an image.
  • as for the visible light which is simultaneously detected, in addition to detection of a monochrome image without performing wavelength separation, when the visible light region is separated, for example, into three primary color components using respective color filters as described above, a color image can also be detected.
  • ultraviolet image information which is not visible to the naked eye can also be obtained. Accordingly, the above device will be widely used as a key device, such as an optical-synthesis monitoring camera, for a new information system.
  • the infrared light IR is a reflection wavelength region component
  • the visible light VL having a wavelength shorter than that of the infrared light IR is a transmission wavelength region component.
  • when the wavelength separation optical system 104 using the dielectric laminate film 1 is configured so that the visible light VL is handled as a reflection wavelength region component and light (such as ultraviolet light) having a wavelength shorter than that of the visible light VL is handled as a transmission wavelength region component, the visible light VL and the light (such as ultraviolet light) having a shorter wavelength can be separated and detected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Electromagnetism (AREA)
  • Computer Hardware Design (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Color Television Image Signal Generators (AREA)
  • Studio Devices (AREA)
  • Solid State Image Pick-Up Elements (AREA)
US11/358,885 2005-02-25 2006-02-21 Imager with image-taking portions optimized to detect separated wavelength components Expired - Fee Related US7531781B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPP2005-050212 2005-02-25
JP2005050212A JP4839632B2 (ja) 2005-02-25 2005-02-25 Imaging device

Publications (2)

Publication Number Publication Date
US20070051876A1 US20070051876A1 (en) 2007-03-08
US7531781B2 true US7531781B2 (en) 2009-05-12

Family

ID=37045255

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/358,885 Expired - Fee Related US7531781B2 (en) 2005-02-25 2006-02-21 Imager with image-taking portions optimized to detect separated wavelength components

Country Status (2)

Country Link
US (1) US7531781B2 (ja)
JP (1) JP4839632B2 (ja)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7442629B2 (en) 2004-09-24 2008-10-28 President & Fellows Of Harvard College Femtosecond laser-induced formation of submicrometer spikes on a semiconductor substrate
US7057256B2 (en) 2001-05-25 2006-06-06 President & Fellows Of Harvard College Silicon-based visible and near-infrared optoelectric devices
KR101513311B1 (ko) 2006-09-29 2015-04-22 University of Florida Research Foundation, Inc. Method and apparatus for infrared sensing and display
KR100877069B1 (ko) 2007-04-23 2009-01-09 Samsung Electronics Co., Ltd. Image capturing apparatus and method
US20090160981A1 (en) * 2007-12-20 2009-06-25 Micron Technology, Inc. Apparatus including green and magenta pixels and method thereof
JP2009260411A (ja) * 2008-04-11 2009-11-05 Olympus Corp Imaging device
WO2010033127A1 (en) * 2008-09-22 2010-03-25 Sionyx, Inc. Response-enhanced monolithic-hybrid pixel
US7968834B2 (en) 2008-09-22 2011-06-28 Sionyx, Inc. Response-enhanced monolithic-hybrid pixel
KR101441589B1 (ko) 2008-10-07 2014-09-29 Samsung Electronics Co., Ltd. Apparatus for optically fusing a visible light image and a far-infrared image
US8330840B2 (en) * 2009-08-06 2012-12-11 Aptina Imaging Corporation Image sensor with multilayer interference filters
US9911781B2 (en) 2009-09-17 2018-03-06 Sionyx, Llc Photosensitive imaging devices and associated methods
US9673243B2 (en) 2009-09-17 2017-06-06 Sionyx, Llc Photosensitive imaging devices and associated methods
KR20110050063A (ko) * 2009-11-06 2011-05-13 Samsung Electronics Co., Ltd. Pixel and image processing devices including the same
JP2011151269A (ja) * 2010-01-22 2011-08-04 Rohm Co Ltd Imaging device
US8692198B2 (en) 2010-04-21 2014-04-08 Sionyx, Inc. Photosensitive imaging devices and associated methods
EP2583312A2 (en) 2010-06-18 2013-04-24 Sionyx, Inc. High speed photosensitive devices and associated methods
KR101419419B1 (ko) * 2011-02-03 2014-07-14 Broadcom Corporation Method and system for generating 3D video from monoscopic 2D video and corresponding depth information
BR112013031013A2 (pt) * 2011-06-06 2016-11-29 Nanoholdings Llc Infrared imaging device integrating an IR up-conversion device with a CMOS image sensor
US9496308B2 (en) 2011-06-09 2016-11-15 Sionyx, Llc Process module for increasing the response of backside illuminated photosensitive imagers and associated methods
CA2840498A1 (en) 2011-06-30 2013-01-03 University Of Florida Research Foundation, Inc. A method and apparatus for detecting infrared radiation with gain
US20130016203A1 (en) 2011-07-13 2013-01-17 Saylor Stephen D Biometric imaging devices and associated methods
US9064764B2 (en) 2012-03-22 2015-06-23 Sionyx, Inc. Pixel isolation elements, devices, and associated methods
CN102780841A (zh) * 2012-07-24 2012-11-14 苏州工业园区七星电子有限公司 一种基于近红外光谱透雾***
JP5820845B2 (ja) * 2012-09-07 2015-11-24 Canon Components, Inc. Illumination device, image sensor unit, and paper sheet identification device
JP6466346B2 (ja) 2013-02-15 2019-02-06 Sionyx, LLC High dynamic range CMOS image sensor having anti-blooming characteristics and associated method
US9939251B2 (en) 2013-03-15 2018-04-10 Sionyx, Llc Three dimensional imaging utilizing stacked imager devices and associated methods
US9209345B2 (en) 2013-06-29 2015-12-08 Sionyx, Inc. Shallow trench textured regions and associated methods
JP6236932B2 (ja) * 2013-07-02 2017-11-29 NOF Corporation Wavelength-selective reflective film for transfer, and transfer method and transfer molded article using the same
IN2014MU03621A (ja) * 2013-11-18 2015-10-09 Jds Uniphase Corp
JP2016033694A (ja) * 2014-07-30 2016-03-10 Toshiba Tec Corporation Object recognition device and object recognition program
GB201421512D0 (en) 2014-12-03 2015-01-14 Melexis Technologies Nv A semiconductor pixel unit for simultaneously sensing visible light and near-infrared light, and a semiconductor sensor comprising same
CN104822033B (zh) * 2015-05-05 2017-09-01 Taiyuan University of Technology Visual sensor fusing infrared and visible light images and method of using the same
JP2018529214A (ja) 2015-06-11 2018-10-04 University Of Florida Research Foundation, Inc. Monodisperse IR-absorbing nanoparticles and related methods and devices
JP6410203B1 (ja) * 2017-02-21 2018-10-24 Nanolux Co., Ltd. Solid-state imaging element and imaging device
US11223781B2 (en) 2017-08-21 2022-01-11 Sony Corporation Image-capturing apparatus and image-capturing method
CN109474770B (zh) * 2017-09-07 2021-09-14 Huawei Technologies Co., Ltd. Imaging apparatus and imaging method
US11153514B2 (en) * 2017-11-30 2021-10-19 Brillnics Singapore Pte. Ltd. Solid-state imaging device, method for driving solid-state imaging device, and electronic apparatus
JP7121892B2 (ja) * 2018-05-15 2022-08-19 株式会社三井光機製作所 Optical module and optical device
JP2020099016A (ja) * 2018-12-19 2020-06-25 株式会社オーケー社鹿児島 Weak discharge imaging system
CN110519502A (zh) * 2019-09-24 2019-11-29 远形时空科技(北京)有限公司 Sensor integrating a depth camera and an ordinary camera, and implementation method
JP7398284B2 (ja) 2020-01-21 2023-12-14 Nippon Hoso Kyokai (NHK) Color image capturing device
JP2022147021A (ja) * 2021-03-23 2022-10-06 Sony Semiconductor Solutions Corporation Imaging element, imaging device, and method for controlling imaging element
US20230254475A1 (en) * 2022-02-04 2023-08-10 Meta Platforms, Inc. Color tuned optical modules with color calibration operations
JP7442848B2 (ja) * 2022-03-30 2024-03-05 華晋グローバル株式会社 Imaging device and method for controlling imaging device
WO2024013142A1 (en) * 2022-07-15 2024-01-18 Lightcode Photonics Oü Image capture device with wavelength separation device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3153050B2 (ja) * 1993-07-05 2001-04-03 Matsushita Electronics Corporation Incandescent lamp
JPH0894830A (ja) * 1994-09-27 1996-04-12 Toshiba Lighting & Technol Corp Optical interference film, bulb, dichroic mirror, and bulb with a dichroic mirror
JP3496498B2 (ja) * 1998-01-22 2004-02-09 Matsushita Electric Industrial Co., Ltd. Incandescent lamp
JP2002000560A (ja) * 2000-06-16 2002-01-08 Matsushita Electric Ind Co Ltd Imaging device
JP4817584B2 (ja) * 2002-05-08 2011-11-16 Canon Inc. Color imaging element

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4957371A (en) * 1987-12-11 1990-09-18 Santa Barbara Research Center Wedge-filter spectrometer
JPH06121325A (ja) 1992-10-07 1994-04-28 Nippon Hoso Kyokai <Nhk> Color imaging device
US5910816A (en) * 1995-06-07 1999-06-08 Stryker Corporation Imaging system with independent processing of visible an infrared light energy
JPH09130678A (ja) 1995-10-27 1997-05-16 Nikon Corp Solid-state imaging device
JPH09166493A (ja) 1995-12-15 1997-06-24 Nikon Corp Imaging device, imaging method, and light-receiving device
US6013912A (en) * 1996-11-19 2000-01-11 Commissariat A L'energie Atomique Multispectral semiconductor resonant-cavity detector sensitive in at least two wavelength bands
JPH10210486A (ja) 1997-01-21 1998-08-07 Sony Corp Image capturing apparatus and method
US6082858A (en) * 1998-04-29 2000-07-04 Carnegie Mellon University Apparatus and method of monitoring a subject's eyes using two different wavelengths of light
JP2002142228A (ja) 2000-10-31 2002-05-17 Toyota Central Res & Dev Lab Inc Imaging device
JP2002369049A (ja) 2001-06-08 2002-12-20 Pentax Corp Image detection device and diaphragm device
JP2004103964A (ja) 2002-09-12 2004-04-02 Foundation For Nara Institute Of Science & Technology Solid-state imaging element and imaging device using the element
US20040125222A1 (en) * 2002-12-30 2004-07-01 Bradski Gary R. Stacked semiconductor radiation sensors having color component and infrared sensing capability
US6794219B1 (en) * 2003-07-28 2004-09-21 Eastman Kodak Company Method for creating a lateral overflow drain, anti-blooming structure in a charge coupled device
US20060114551A1 (en) * 2003-11-10 2006-06-01 Matsushita Electric Industrial Co., Ltd. Imaging device and an imaging method
US7138619B1 (en) * 2004-09-28 2006-11-21 Rockwell Collins, Inc. Method and apparatus for coincident viewing at a plurality of wavelengths
US20060091284A1 (en) * 2004-10-20 2006-05-04 Viens Jean F Multi-spectral pixel and focal plane array

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120268607A1 (en) * 2007-03-29 2012-10-25 Kabushiki Kaisha Toshiba Image processing system and image acquisition method
US8723980B2 (en) * 2007-03-29 2014-05-13 Kabushiki Kaisha Toshiba Image processing system and image acquisition method
US7961229B2 (en) * 2007-03-30 2011-06-14 Fujifilm Corporation Image pickup apparatus and method using visible light and infrared
US20080239091A1 (en) * 2007-03-30 2008-10-02 Fujifilm Corporation Image pickup apparatus and method
US8994792B2 (en) 2010-08-27 2015-03-31 Broadcom Corporation Method and system for creating a 3D video from a monoscopic 2D video and corresponding depth information
US20120050494A1 (en) * 2010-08-27 2012-03-01 Xuemin Chen Method and system for creating a view-angle dependent 2d and/or 3d image/video utilizing a monoscopic video camera array
US20120050477A1 (en) * 2010-08-27 2012-03-01 Jeyhan Karaoguz Method and System for Utilizing Depth Information for Providing Security Monitoring
US20120050490A1 (en) * 2010-08-27 2012-03-01 Xuemin Chen Method and system for depth-information based auto-focusing for a monoscopic video camera
US20140066781A1 (en) * 2012-08-28 2014-03-06 Electronics And Telecommunications Research Institute Medical diagnosis device and method for controlling the device
US20140098192A1 (en) * 2012-10-10 2014-04-10 Samsung Electronics Co., Ltd. Imaging optical system and 3d image acquisition apparatus including the imaging optical system
US9667944B2 (en) * 2012-10-10 2017-05-30 Samsung Electronics Co., Ltd. Imaging optical system and 3D image acquisition apparatus including the imaging optical system
US9998730B2 (en) 2012-10-10 2018-06-12 Samsung Electronics Co., Ltd. Imaging optical system and 3D image acquisition apparatus including the imaging optical system
US20190361252A1 (en) * 2017-01-16 2019-11-28 Sony Corporation Branching optical system, imaging apparatus, and imaging system
US10788676B2 (en) * 2017-01-16 2020-09-29 Sony Corporation Branching optical system, imaging apparatus, and imaging system
US20220103732A1 (en) * 2020-09-29 2022-03-31 Aac Optics Solutions Pte. Ltd. Imaging assembly and camera

Also Published As

Publication number Publication date
JP2006238093A (ja) 2006-09-07
JP4839632B2 (ja) 2011-12-21
US20070051876A1 (en) 2007-03-08

Similar Documents

Publication Publication Date Title
US7531781B2 (en) Imager with image-taking portions optimized to detect separated wavelength components
KR102201627B1 (ko) Solid-state imaging element and imaging apparatus
CN102447826B (zh) Visible and infrared dual-mode imaging ***
US8134191B2 (en) Solid-state imaging device, signal processing method, and camera
TWI770168B (zh) Induced transmission filter
US8227883B2 (en) Solid-state imaging device and camera
KR101244147B1 (ko) Physical information acquisition method, physical information acquisition device, and semiconductor device
JP5617063B1 (ja) Near-infrared cut filter
JP5070742B2 (ja) Information acquisition method, information acquisition device, semiconductor device, and signal processing device
JP4899008B2 (ja) Improved color photodetector array and method for manufacturing the same
US20140347493A1 (en) Image-capturing device and filter
JP4887915B2 (ja) Solid-state imaging device
CN101146182A (zh) Image sensor and digital camera
JP2006190958A (ja) Physical information acquisition method, physical information acquisition apparatus, and method for manufacturing a semiconductor device for detecting a physical quantity distribution in which a plurality of unit components are arrayed
JP2000151933A (ja) Imaging element and method for manufacturing the same
US20130181113A1 (en) Solid-state imaging equipment
EP3450938B1 (en) An image sensor and an imaging apparatus
WO2021161961A1 (ja) Optical filter and method for manufacturing the same, optical sensor, and solid-state imaging element
WO2024086959A1 (en) Stacked sensor for simultaneouly detecting visible light and infrared light

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUMI, HIROFUMI;TODA, ATSUSHI;REEL/FRAME:017399/0792;SIGNING DATES FROM 20060317 TO 20060320

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20210512