CN112119624A - Image sensor, imaging device and mobile platform - Google Patents
- Publication number: CN112119624A
- Application number: CN201980032071.0A
- Authority
- CN
- China
- Prior art keywords
- pixel unit
- pixel
- photosensitive element
- image sensor
- tube
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H01L27/14603—Special geometry or disposition of pixel-elements, address-lines or gate-electrodes
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/58—Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
- H04N25/46—Extracting pixel data from image sensors by controlling scanning circuits, e.g. by combining or binning pixels
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/709—Circuitry for control of the power supply
Abstract
An image sensor (10), an imaging device (100), and a mobile platform. The image sensor (10) includes a plurality of pixel units (12), each pixel unit (12) including a photosensitive element (122) located within the pixel unit (12). The plurality of pixel units (12) include a first pixel unit (121) for imaging, and a second pixel unit (123) and a third pixel unit (125) for focusing. The photosensitive element (122) of the first pixel unit (121) is larger than the photosensitive element (122) of the second pixel unit (123) and larger than the photosensitive element (122) of the third pixel unit (125). The photosensitive element (122) of the second pixel unit (123) is located on the right side of the second pixel unit (123), and the photosensitive element (122) of the third pixel unit (125) is located on the left side of the third pixel unit (125).
Description
Technical Field
The present disclosure relates to the field of camera technologies, and in particular, to an image sensor, an imaging device, and a mobile platform for phase detection and auto-focusing.
Background
In the related art, a camera has a focusing function so that it can acquire a clear image. One focusing technique is Phase Detection Auto Focus (PDAF). In this technique, some pixels of the image sensor are dedicated to focusing; they include left half-covered pixels and right half-covered pixels. In a typical implementation, a half-covered pixel is formed by placing a light-blocking metal shield above part of the photosensitive surface of the pixel. However, when light strikes the shield, part of it is reflected, and the reflected light enters the surrounding normally photosensitive pixels through reflection or refraction at nearby interfaces. This interferes with the signals of those pixels and degrades the imaging quality of the normally photosensitive pixels around the focusing pixels.
Summary of the Invention
The application provides an image sensor, an imaging device and a mobile platform.
The image sensor of the embodiments of the present application includes a plurality of pixel units, each of which includes a photosensitive element located within the pixel unit. The plurality of pixel units include a first pixel unit for imaging, and a second pixel unit and a third pixel unit for focusing. The photosensitive element of the first pixel unit is larger than the photosensitive element of the second pixel unit and larger than the photosensitive element of the third pixel unit. The photosensitive element of the second pixel unit is located on the right side of the second pixel unit, and the photosensitive element of the third pixel unit is located on the left side of the third pixel unit.
In the image sensor of the embodiments of the present application, because the photosensitive elements of the second and third pixel units used for focusing are reduced in size, the additional metal light-shielding layer can be omitted. This reduces the interference of extra reflected light with the surrounding first pixel units used for imaging, so that the quality of the image formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
The imaging device of the embodiment of the present application includes the image sensor of the above embodiment.
In the imaging device of the embodiments of the present application, because the photosensitive elements of the second and third pixel units used for focusing are reduced in size, the additional metal light-shielding layer can be omitted. This reduces the interference of extra reflected light with the surrounding first pixel units used for imaging, so that the quality of the image formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
The mobile platform of the embodiment of the application comprises the imaging device of the embodiment.
In the mobile platform of the embodiments of the present application, because the photosensitive elements of the second and third pixel units used for focusing are reduced in size, the additional metal light-shielding layer can be omitted. This reduces the interference of extra reflected light with the surrounding first pixel units used for imaging, so that the quality of the image formed by the imaging pixel units around the focusing pixel units remains stable and consistent.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic view of a pixel structure of a related art image sensor;
fig. 2 is a schematic view of a pixel arrangement of a related art image sensor;
FIG. 3 is a graph showing the results of the response of the left half covered pixel row and the right half covered pixel row of a related art image sensor to light at adjacent locations;
FIG. 4 is a schematic diagram of a left half masking pixel structure of a related art image sensor;
FIG. 5 is a schematic diagram of a pixel unit structure of an image sensor according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of a pixel unit arrangement of an image sensor according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram illustrating a result of response of a second pixel cell row and a third pixel cell row of the image sensor to light at adjacent positions according to the embodiment of the present application;
FIG. 8 is a circuit schematic of a pixel cell according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an image sensor of an embodiment of the present application sensing light;
fig. 10 is a block schematic diagram of an imaging device according to an embodiment of the present application;
fig. 11 is a schematic structural view of an imaging device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the present application, it is to be understood that the terms "center," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," and the like are used in the orientations and positional relationships indicated in the drawings for convenience and simplicity of description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed in a particular orientation, or be operated in a particular manner; they are not to be construed as limiting the present application. Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it is to be noted that the terms "mounted," "connected," and "coupled" are to be construed broadly unless otherwise explicitly stated or limited: a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct, indirect through an intermediate medium, or an internal communication between two elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise direct contact of the first and second features, or may comprise contact of the first and second features not directly but through another feature in between. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The disclosure herein provides many different embodiments or examples for implementing different configurations of the present application. In order to simplify the disclosure of the present application, specific example components and arrangements are described herein. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, examples of various specific processes and materials are provided herein, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
Referring to fig. 1, in a related art image sensor, the pixel array includes normal photosensitive pixels (Normal pixel) 102 and pixels for focusing: left half-covered pixels (Left shift pixel) 104 and right half-covered pixels (Right shift pixel) 106, which are formed by placing a light-blocking metal cover layer (metal shield) 108 above the photosensitive surface of the pixel. One arrangement of these three pixel types across the pixel array is shown in fig. 2. When the image sensor performs imaging, the pixel signals of all the left half-covered pixels 104 are extracted individually as one row, and the pixel signals of all the right half-covered pixels 106 are extracted individually as another row. When focusing is accurate, since the left half-covered pixels 104 and the right half-covered pixels 106 are physically arranged close to each other, the image luminance distributions of the left half-covered pixel row 101 and the right half-covered pixel row 103 substantially overlap. When focusing is inaccurate, as shown in fig. 3, the blocking of light by the metal cover layer 108 makes the responses of the left half-covered pixel row 101 and the right half-covered pixel row 103 to light at adjacent positions clearly different (the height of the rectangle below each pixel in the figure represents the intensity of the correspondingly collected signal), so the luminance distributions of the two rows are shifted relative to each other. The shift amount has a one-to-one correspondence with the lens movement required for focusing. Therefore, from the luminance-distribution shift between the left half-covered pixel row 101 and the right half-covered pixel row 103, the distance the current lens must move for accurate focusing can be calculated, and a motor can be controlled to move the lens accordingly. The lens does not need to be moved repeatedly, so the focusing speed can be greatly increased. This manner of focusing is referred to as the phase detection autofocus technique.
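The shift-estimation step of this paragraph can be sketched in Python. This is an illustrative model rather than the patent's implementation: the function name, the sum-of-squared-differences matching, and the demo signal are all assumptions, and a real controller would additionally map the estimated shift to a lens travel distance through a calibration table.

```python
import math

def estimate_focus_shift(left_row, right_row, max_shift=8):
    """Return the offset (in pixels) that best aligns the right-side
    focusing-pixel row with the left-side one, by minimizing the summed
    squared difference over the region unaffected by edge wrap-around."""
    n = len(left_row)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err = 0.0
        # Compare left_row[i] against right_row[i - s] over valid indices.
        for i in range(max_shift, n - max_shift):
            d = left_row[i] - right_row[i - s]
            err += d * d
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift

# Demo: a smooth bump pattern shifted by 3 pixels between the two rows.
left = [math.exp(-((i - 30) ** 2) / 18.0) for i in range(64)]
right = [left[(i + 3) % 64] for i in range(64)]
shift = estimate_focus_shift(left, right)
```

With the demo data, `shift` recovers the 3-pixel offset between the luminance distributions of the two rows, which is the quantity the motor-control step converts into a lens movement.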
However, as can be seen from fig. 4, when light strikes the light-blocking metal cover layer 108, part of the light is reflected, and the reflected light enters the surrounding normal photosensitive pixels 102 through reflection or refraction at nearby interfaces. This interferes with the pixel signals of the normal photosensitive pixels 102 and thus reduces the imaging quality of the normal photosensitive pixels 102 around the phase-detection autofocus pixels.
Accordingly, the present embodiment proposes an image sensor 10. Referring to fig. 5, the image sensor 10 of the present embodiment includes a plurality of pixel units 12, and each pixel unit 12 includes a photosensitive element 122 located in the pixel unit 12. The plurality of pixel units 12 include a first pixel unit 121 for imaging, a second pixel unit 123 for focusing, and a third pixel unit 125 for focusing. The size of the photosensitive element 122 of the first pixel unit 121 is larger than the size of the photosensitive element 122 of the second pixel unit 123 and larger than the size of the photosensitive element 122 of the third pixel unit 125. The photosensitive element 122 of the second pixel unit 123 is located at the right side of the second pixel unit 123, and the photosensitive element 122 of the third pixel unit 125 is located at the left side of the third pixel unit 125.
In the image sensor 10 of the embodiment of the present application, the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing are reduced in size, so that an additional metal light-shielding layer can be omitted. This reduces the interference of extra reflected light with the surrounding first pixel units 121 used for imaging, and the quality of the image formed by the imaging pixel units 12 around the focusing pixel units 12 remains stable and consistent.
In the illustrated embodiment, the planar shape of the photosensitive element 122 of each pixel unit 12 is substantially square, and the pixel units are arranged in rows and columns to form a pixel array. In other embodiments, the planar shape of the photosensitive element 122 of the pixel unit 12 may be another polygon or another shape, and the arrangement is not limited to a row-column arrangement; neither is limited herein.
Specifically, image sensor 10 also includes a lens layer 14, a filter layer 16, and a buffer layer 18. The lens layer 14 includes a plurality of lenses 142, and each lens 142 corresponds to one pixel unit 12 and is located above the pixel unit 12. The lens 142 may be a micro lens (micro lens), and the lens 142 is used to converge the incident light. A filter layer 16 is disposed between the pixel unit 12 and the lens layer 14, and the filter layer 16 is used for filtering out stray light in incident light. In one example, filter layer 16 may be a color filter layer, such that incident light (visible light) is divided into light with different colors (e.g., R, B, G, W, etc.) and then incident on photosensitive element 122 through the color filter layer. In other examples, filter layer 16 may also be an infrared filter layer or other filter layer, selected based on the function of image sensor 10. Buffer layer 18 is located between pixel cells 12 and filter layer 16. Buffer layer 18 may be used to protect the photodiodes, to adjust the optical path into the pixel cells, and to maintain the flatness of filter layer 16. In fig. 5, lens layer 14, filter layer 16, and buffer layer 18 are stacked sequentially from top to bottom to form an Optical stack (Optical stack). The image sensor 10 includes a silicon device layer including a silicon layer 124 and a photosensitive element 122 disposed on the silicon layer 124, that is, the silicon device layer includes a plurality of pixel units 12, and the pixel units 12 include the silicon layer 124 and the photosensitive element 122. The photosensitive element 122 may be a Photodiode (PD). Each pixel cell 12 may include one or more photodiodes.
It is understood that in the present application the pixel array includes three pixel structures. In the first pixel unit 121 (Normal pixel), the photosensitive element 122 is larger; incident light is converged by the lens 142 and then enters the photosensitive element 122, where photons are fully absorbed, generating photo-generated electrons that are retained in the photosensitive element 122. In the second pixel unit 123 (Right PD pixel), the photosensitive element 122 is smaller than that of the first pixel unit 121; when incident light converged by the lens 142 enters the silicon device layer, only electrons generated by light entering the photosensitive element 122 are retained in it, while photo-generated charges outside the photosensitive element 122 are not collected and therefore do not contribute to the signal. The third pixel unit 125 (Left PD pixel) works on the same principle as the second pixel unit 123: only photon-excited electrons entering its photosensitive element 122 are collected and used as signals for subsequent processing.
In one example, the arrangement of the three pixel structures across the pixel array is shown schematically in fig. 6. When imaging with the image sensor 10, the pixel signals of all the second pixel units 123 are extracted individually as one row, and the pixel signals of all the third pixel units 125 are extracted individually as another row. When focusing is accurate, since the second pixel units 123 and the third pixel units 125 are physically arranged close to each other, the image luminance distributions of the second pixel unit row 127 and the third pixel unit row 129 substantially overlap. When focusing is inaccurate, as shown in fig. 7, charges generated by light striking the silicon layer 124 outside the photosensitive element 122 are not collected, i.e., they do not contribute to the signal intensity, so the responses of the second pixel unit row 127 and the third pixel unit row 129 to light at adjacent positions are clearly different (the height of the rectangle below each pixel in the figure represents the correspondingly collected signal intensity). The luminance distributions of the two rows are therefore shifted relative to each other, and the shift amount has a one-to-one correspondence with the lens movement required for focusing. Accordingly, from the luminance-distribution shift between the second pixel unit row 127 and the third pixel unit row 129, the distance the current lens must move for accurate focusing can be calculated, and a motor can be controlled to move the lens to achieve accurate focusing without moving the lens repeatedly, greatly increasing the focusing speed.
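The row-extraction step described above (gathering the signals of all second pixel units into one virtual row and all third pixel units into another) can be illustrated with a small sketch. The function name and the 'N'/'R'/'L' kind-map encoding are assumptions made for illustration, not part of the patent:

```python
def extract_focus_rows(frame, kind_map):
    """Collect the signals of all second pixel units (marked 'R', with a
    right-side photodiode) into one virtual row and all third pixel units
    (marked 'L') into another, scanning the array in raster order.

    `frame` is a 2-D list of pixel signals; `kind_map` is a 2-D list of
    'N' (normal/imaging), 'R', or 'L' markers of the same shape.
    """
    second_row, third_row = [], []
    for r, row in enumerate(frame):
        for c, signal in enumerate(row):
            kind = kind_map[r][c]
            if kind == "R":
                second_row.append(signal)
            elif kind == "L":
                third_row.append(signal)
    return second_row, third_row

# Demo: a 3x4 frame with one adjacent R/L pair of focusing units.
frame = [[10, 11, 12, 13],
         [20,  5,  6, 23],
         [30, 31, 32, 33]]
kinds = [["N", "N", "N", "N"],
         ["N", "R", "L", "N"],
         ["N", "N", "N", "N"]]
second, third = extract_focus_rows(frame, kinds)
```

The two returned sequences correspond to the second pixel unit row 127 and the third pixel unit row 129 whose luminance distributions are compared during focusing.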
Moreover, since no metal cover layer needs to be provided, the adverse effects of reflected or scattered light on the photosensitive elements 122 of the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 are reduced or eliminated, so the back-end processing circuit produces more accurate results (for both imaging and focusing) from the output signals of the photosensitive elements 122.
In addition, "physically arranged close to each other" above refers to closeness in the design sense; it does not exclude the possibility that, at some positions of the image sensor 10, one or more other pixel units 12 lie between a second pixel unit 123 and a third pixel unit 125.
In summary, the image sensor 10 according to the embodiment of the present application can achieve fast focusing and prevent the reflected light from interfering with the surrounding first pixel unit 121. The image sensor 10 of the embodiment of the application can be widely applied to the fields of consumer electronics, security monitoring, industrial automation, artificial intelligence, the Internet of things and the like, is used for collecting and sorting image data information, and provides an information source for subsequent processing and application.
In some embodiments, the second pixel unit 123 and the third pixel unit 125 are adjacently staggered.
In this way, the second pixel unit 123 and the third pixel unit 125 are physically arranged close to each other. Specifically, "arranged adjacently and alternately" means that at least one third pixel unit 125 is located adjacent to each second pixel unit 123 and at least one second pixel unit 123 is located adjacent to each third pixel unit 125, while one or more pixel units 12 may still lie between a second pixel unit 123 and a third pixel unit 125. The second pixel unit 123 and the third pixel unit 125 may be arranged in the same row or in different rows; the arrangement is not particularly limited. For example, in a 3 × 3 array of pixel units 12 whose middle unit is a second pixel unit 123, at least one of the other eight units (upper, lower, left, right, upper-left, upper-right, lower-left, and lower-right) is a third pixel unit 125.
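The adjacency rule above can be checked mechanically. The sketch below is an illustrative assumption (the function name and the 'R'/'L' markers are not from the patent):

```python
def adjacently_interleaved(kind_map):
    """Return True if every 'R' (second) pixel unit has at least one 'L'
    (third) unit among its eight neighbors, and every 'L' unit has at
    least one neighboring 'R' unit."""
    rows, cols = len(kind_map), len(kind_map[0])

    def has_neighbor(r, c, wanted):
        # Scan the 8-neighborhood of (r, c) for a unit of the wanted kind.
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and kind_map[nr][nc] == wanted:
                    return True
        return False

    for r in range(rows):
        for c in range(cols):
            if kind_map[r][c] == "R" and not has_neighbor(r, c, "L"):
                return False
            if kind_map[r][c] == "L" and not has_neighbor(r, c, "R"):
                return False
    return True

# The 3x3 example from the text: the middle unit is 'R', one neighbor is 'L'.
ok_map = [["N", "N", "N"],
          ["N", "R", "L"],
          ["N", "N", "N"]]
bad_map = [["N", "N", "N"],
           ["N", "R", "N"],
           ["N", "N", "N"]]
```

Here `ok_map` satisfies the rule and `bad_map` violates it, since its lone second pixel unit has no third pixel unit among its neighbors.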
In some embodiments, the size of the photosensitive element 122 of the second pixel unit 123 is the same as the size of the photosensitive element 122 of the third pixel unit 125. In this way, when the second pixel unit 123 and the third pixel unit 125 are at the same position, the intensity of the signals collected by the photosensitive element 122 of the second pixel unit 123 and the photosensitive element 122 of the third pixel unit 125 is the same.
Further, the photosensitive element 122 of the second pixel unit 123 is half the size of the photosensitive element 122 of the first pixel unit 121. In this way, the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing are both smaller than the photosensitive element 122 of the first pixel unit 121 used for imaging.
In one embodiment, referring to fig. 6, the planar shape of the photosensitive element 122 is substantially rectangular. The lengths of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 are both half the length of the photosensitive element 122 of the first pixel unit 121, while their widths are equal to the width of the photosensitive element 122 of the first pixel unit 121. In other embodiments, the widths of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 are both half the width of the photosensitive element 122 of the first pixel unit 121, and their lengths are equal to the length of the photosensitive element 122 of the first pixel unit 121. In either case, the areas of the photosensitive surfaces 1222 of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 are both smaller than the area of the photosensitive surface 1222 of the photosensitive element 122 of the first pixel unit 121.
In some embodiments, referring to fig. 5, the pixel unit 12 has a middle axis a. The photosensitive element 122 of the second pixel unit 123 is located to the right of the middle axis a of the second pixel unit 123, or more than one-half of the photosensitive elements 122 of the second pixel unit 123 are located to the right of the middle axis a of the second pixel unit 123.
It is understood that the photosensitive element 122 of the second pixel unit 123 is located on the right side of the silicon layer 124 of the second pixel unit 123. When the photosensitive element 122 of the second pixel unit 123 is smaller than half the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the second pixel unit 123 lies entirely on the right of the middle axis a of the second pixel unit 123. When it is equal to half that size, it lies on the right of the middle axis a with its left edge coinciding with the middle axis a. When it is greater than half that size, more than one-half of the photosensitive element 122 of the second pixel unit 123 lies on the right of the middle axis a of the second pixel unit 123.
In some embodiments, the pixel unit 12 has a middle axis a. The photosensitive element 122 of the third pixel unit 125 is located to the left of the middle axis a of the third pixel unit 125, or more than one-half of the photosensitive element 122 of the third pixel unit 125 is located to the left of the middle axis a of the third pixel unit 125.
It is understood that the photosensitive element 122 of the third pixel unit 125 is located on the left side of the silicon layer 124 of the third pixel unit 125. When the size of the photosensitive element 122 of the third pixel unit 125 is smaller than half of the size of the photosensitive element 122 of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located entirely on the left side of the middle axis a of the third pixel unit 125. When the size is equal to half of that of the first pixel unit 121, the photosensitive element 122 of the third pixel unit 125 is located on the left side of the middle axis a and its right edge coincides with the middle axis a. When the size is greater than half of that of the first pixel unit 121, more than one-half of the photosensitive element 122 of the third pixel unit 125 is located on the left side of the middle axis a of the third pixel unit 125.
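The three-case placement rule above, which applies symmetrically to both focusing pixel units, can be summarized in a short sketch; the function and its example sizes are an illustration of the stated rule, not part of the patent:

```python
def placement_rule(focus_pd_size: float, imaging_pd_size: float) -> str:
    """Describe where a focusing photodiode must sit relative to the
    middle axis a, following the three cases in the text. Sizes are in
    arbitrary consistent units (an illustrative assumption)."""
    half = imaging_pd_size / 2
    if focus_pd_size < half:
        return "entirely on one side of the middle axis"
    if focus_pd_size == half:
        return "on one side, inner edge coinciding with the middle axis"
    return "more than one-half on one side of the middle axis"

print(placement_rule(1.0, 4.0))  # smaller than half the imaging photodiode
print(placement_rule(2.0, 4.0))  # exactly half
print(placement_rule(3.0, 4.0))  # greater than half
```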
In some embodiments, the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for less than 6% of the total photosensitive area of the pixel units 12.
In this way, the photosensitive area of the pixel units 12 used for imaging remains sufficiently large, ensuring the image quality of the image sensor 10. In one embodiment, the sum of the photosensitive area of the photosensitive element 122 of the second pixel unit 123 and the photosensitive area of the photosensitive element 122 of the third pixel unit 125 accounts for 5% of the total photosensitive area of the pixel units 12. It should be noted that, in the present application, the photosensitive area of the photosensitive element 122 is proportional to its size: the larger the photosensitive element 122, the larger its photosensitive area.
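A quick numeric sketch of this area budget (all pixel counts and areas below are assumptions for illustration, not figures from the patent):

```python
# Check that the focusing photodiodes stay under the 6% area budget.
# Pixel counts and areas are illustrative assumptions only.
n_imaging = 95       # first pixel units (imaging)
n_focus_pairs = 5    # pairs of second + third pixel units (focusing)
a_imaging = 1.0      # photosensitive area of one imaging photodiode
a_focus = 0.5        # focusing photodiode at the half-size variant

focus_area = n_focus_pairs * 2 * a_focus
total_area = n_imaging * a_imaging + focus_area
ratio = focus_area / total_area
print(f"{ratio:.3f}")   # 0.050, i.e. the 5% example, under the 6% bound
assert ratio < 0.06
```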
Referring to fig. 8, in some embodiments, the pixel unit 12 includes a switch assembly 126. The switch assembly 126 is connected to the photosensitive element 122 and is configured to selectively empty or output the charge of the photosensitive element 122.
Specifically, the switch assembly 126 includes a transmission tube TX, a reset tube RST, a source follower tube SF, and a gate tube SEL, all of which are triodes. The photosensitive element 122 is a photodiode PD. The emitter of the transmission tube TX is connected to the photodiode PD, and the collector of the transmission tube TX is connected to the emitter of the reset tube RST. The base of the source follower tube SF is connected to the collector of the transmission tube TX and the emitter of the reset tube RST, and the collector of the source follower tube SF is connected to the emitter of the gate tube SEL. The collector of the gate tube SEL is connected to the signal line 1262.
In the example of fig. 8, the floating diffusion FD is the node where the transmission tube TX, the reset tube RST, and the source follower tube SF are electrically connected. The source follower tube SF acts as a voltage follower: it translates the voltage at the FD node to the PXD node, so that the PXD voltage tracks the FD voltage in a fixed correspondence. The signal line 1262 is connected to a back-end processing circuit, which receives and processes the output signal of the image sensor 10.
In the present application, the first pixel unit 121, the second pixel unit 123, and the third pixel unit 125 differ in the size of their photosensitive elements 122.
Referring to fig. 9, in some embodiments, the operation of the image sensor 10 includes a reset phase, an exposure phase, a reference voltage readout phase, and a signal voltage readout phase performed in time sequence. The switch assembly 126 is configured to empty the charge of the photosensitive element 122 during the reset phase, to let the photosensitive element 122 generate and accumulate charge during the exposure phase, to output a reference voltage during the reference voltage readout phase, and to output the charge of the photosensitive element 122 during the signal voltage readout phase to form a signal voltage.
Specifically, in the reset phase, the reset tube RST and the transmission tube TX are set to a high potential and the gate tube SEL to a low potential, so that the charge in the photodiode PD is completely reset and emptied. In the exposure phase, the transmission tube TX is set to a low potential, the reset tube RST remains at a high potential, the gate tube SEL remains at a low potential, and photo-generated electrons produced by illumination accumulate in the photodiode PD. In the reference voltage readout phase, the gate tube SEL is set to a high potential and the reset tube RST to a low potential; the source terminal of the gate tube SEL then carries a voltage Vref, which is quantized and read out by a subsequent analog-to-digital conversion circuit. In the signal voltage readout phase, the gate tube SEL remains at a high potential and the reset tube RST at a low potential; the transmission tube TX is turned on (high potential) so that electrons in the photodiode PD enter the floating diffusion FD and lower its potential, the transmission tube TX is then turned off (low potential), and a voltage Vsig is obtained at the source terminal of the gate tube SEL and quantized and read out by the analog-to-digital conversion circuit. After processing by the back-end processing circuit, Vref - Vsig is the final image signal. Here Vref is the voltage sampled at the PXD node shown in fig. 8 during the reference voltage readout, and Vsig is the voltage sampled at the PXD node during the signal voltage readout.
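The four phases and the Vref - Vsig (correlated double sampling) readout can be sketched behaviorally as follows; the reset level and conversion gain are assumed illustrative numbers, and the source-follower translation from FD to PXD is modeled as a direct copy:

```python
# Behavioral sketch of the four readout phases described above.
# V_RESET and CONV_GAIN are assumed values, not device parameters.
V_RESET = 3.0        # FD potential after the reset phase (assumed)
CONV_GAIN = 0.005    # volts per photo-generated electron on FD (assumed)

def read_pixel(photo_electrons: int) -> float:
    fd = V_RESET                        # reset phase: PD and FD emptied
    # exposure phase: charge accumulates in the photodiode, not on FD
    vref = fd                           # reference voltage readout (TX off)
    fd -= photo_electrons * CONV_GAIN   # TX pulsed high: charge dumped onto FD
    vsig = fd                           # signal voltage readout (TX off again)
    return vref - vsig                  # back-end result: Vref - Vsig

print(read_pixel(0))     # 0.0  (dark pixel)
print(read_pixel(200))   # 1.0  (illuminated pixel)
```

The returned difference cancels the reset level, which is the point of reading Vref before dumping the charge.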
In some embodiments, the photosensitive element 122 is doped N-type.
It is understood that the photosensitive element 122 is fabricated by ion implantation, a standard semiconductor process. In one embodiment, the photosensitive element 122 is an N-type doped photodiode.
Referring to fig. 10 and 11, an imaging device 100 according to an embodiment of the present application includes the image sensor 10 according to the above embodiment.
In the imaging device 100 according to the embodiment of the present application, the sizes of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing are reduced, so that the additional metal light-shielding layer can be omitted. This reduces interference from extra reflected light on the surrounding first pixel units 121 used for imaging, and keeps the quality of the image formed by the imaging pixel units 12 surrounding the focusing pixel units 12 stable and consistent.
In some embodiments, the imaging device 100 includes processing circuitry 20, the processing circuitry 20 being coupled to the image sensor 10. The processing circuit 20 is used for determining imaging information of the image sensor 10 according to the output signal of the first pixel unit 121 and for determining focusing information according to the output signals of the second pixel unit 123 and the third pixel unit 125.
Thus, the imaging device 100 can image with the first pixel units 121, calculate from the offset between the brightness distributions of the second pixel unit row 127 and the third pixel unit row 129 the distance the lens of the imaging device 100 needs to move for accurate focusing, and control a motor to move the lens accordingly, which greatly increases the focusing speed.
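One simple way to turn the brightness-distribution offset of the two focusing rows into a lens-move command is a shift search, sketched below; the row profiles and the search routine are illustrative assumptions, not the patent's algorithm:

```python
# Estimate the offset between the second-pixel-row and third-pixel-row
# brightness profiles by minimizing the squared difference over shifts.
def estimate_shift(left, right, max_shift=3):
    """Return the integer shift that best aligns the two profiles."""
    best_shift, best_err = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
        err = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

row2 = [0, 0, 1, 3, 1, 0, 0, 0]    # second pixel unit row profile (assumed)
row3 = [0, 0, 0, 0, 1, 3, 1, 0]    # third row: same shape shifted by 2
print(estimate_shift(row2, row3))  # 2
```

In a real device the estimated shift would be mapped, via calibration, to the distance the driving module must move the lens.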
Further, in the embodiment of fig. 11, the imaging device 100 further includes a lens module 30 and a circuit board 40. The lens module 30 includes a lens barrel 32, a lens holder 34, and a lens 36. The image sensor 10 and the processing circuit 20 are disposed on the circuit board 40, the lens holder 34 is disposed on the circuit board 40, the lens barrel 32 is connected to the lens holder 34, and the lens 36 is disposed in the lens barrel 32. The number of lenses 36 may be one, two, or more. The imaging device 100 may also include a driving module 50 (e.g., a motor) for driving the lens barrel 32 and/or the lens 36 to move back and forth along the optical axis OP of the lens module 30, thereby achieving accurate focusing.
The mobile platform of the embodiment of the present application includes the imaging device 100 of the above embodiment.
In the mobile platform according to the embodiment of the present application, reducing the size of the photosensitive elements 122 of the second pixel unit 123 and the third pixel unit 125 used for focusing allows the additional metal light-shielding layer to be omitted, which reduces interference from extra reflected light on the surrounding first pixel units 121 used for imaging and keeps the quality of the image formed by the imaging pixel units 12 surrounding the focusing pixel units 12 stable and consistent.
It is understood that the mobile platform may be a drone, an unmanned vehicle, a mobile robot, or another mobile platform on which the imaging device 100 is mounted.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: numerous changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
Claims (17)
1. An image sensor, comprising a plurality of pixel units, each of the pixel units including a photosensitive element located in the pixel unit, the plurality of pixel units including a first pixel unit, a second pixel unit and a third pixel unit, the first pixel unit being used for imaging, the second pixel unit and the third pixel unit being used for focusing, the size of the photosensitive element of the first pixel unit being larger than the size of the photosensitive element of the second pixel unit and larger than the size of the photosensitive element of the third pixel unit, the photosensitive element of the second pixel unit being located on the right side of the second pixel unit, and the photosensitive element of the third pixel unit being located on the left side of the third pixel unit.
2. The image sensor of claim 1, wherein the second pixel units and the third pixel units are adjacently staggered.
3. The image sensor of claim 1, wherein the size of the photosensitive element of the second pixel unit is the same as the size of the photosensitive element of the third pixel unit.
4. The image sensor of claim 3, wherein the size of the photosensitive element of the second pixel unit is half the size of the photosensitive element of the first pixel unit.
5. The image sensor of claim 1, wherein the pixel unit has a middle axis,
the photosensitive element of the second pixel unit is located on the right side of the middle axis of the second pixel unit, or more than one half of the photosensitive element of the second pixel unit is located on the right side of the middle axis of the second pixel unit.
6. The image sensor of claim 1, wherein the pixel unit has a middle axis,
the photosensitive element of the third pixel unit is located on the left side of the middle axis of the third pixel unit, or more than one half of the photosensitive element of the third pixel unit is located on the left side of the middle axis of the third pixel unit.
7. The image sensor of claim 1, wherein the image sensor comprises a lens layer, the lens layer comprising a plurality of lenses, each lens corresponding to and located above one of the pixel units.
8. The image sensor of claim 7, wherein the image sensor comprises a filter layer between the pixel units and the lens layer.
9. The image sensor of claim 8, wherein the image sensor comprises a buffer layer between the pixel units and the filter layer.
10. The image sensor of claim 1, wherein the sum of the photosensitive area of the photosensitive element of the second pixel unit and the photosensitive area of the photosensitive element of the third pixel unit accounts for less than 6% of the total photosensitive area of the pixel units.
11. The image sensor of claim 1, wherein the pixel unit comprises a switching component connected to the photosensitive element, the switching component being configured to selectively empty or output the charge of the photosensitive element.
12. The image sensor of claim 11, wherein the image sensor comprises a reset phase, an exposure phase, a reference voltage readout phase, and a signal voltage readout phase performed in time sequence,
the switching component is configured to empty the charge of the photosensitive element during the reset phase;
the switching component is configured to cause the photosensitive element to generate charge and accumulate the charge in the photosensitive element during the exposure phase;
the switching component is configured to output a reference voltage during the reference voltage readout phase;
the switching component is configured to cause the charge of the photosensitive element to be output during the signal voltage readout phase to form a signal voltage.
13. The image sensor of claim 11, wherein the switch assembly comprises a transmission tube, a reset tube, a source follower tube and a gate tube, the transmission tube, the reset tube, the source follower tube and the gate tube are all triodes, an emitter of the transmission tube is connected to the photodiode, a collector of the transmission tube is connected to an emitter of the reset tube, a base of the source follower tube is connected to a collector of the transmission tube and an emitter of the reset tube, a collector of the source follower tube is connected to an emitter of the gate tube, and a collector of the gate tube is connected to a signal line.
14. The image sensor of claim 1, wherein the photosensitive element is N-type doped.
15. An imaging apparatus comprising the image sensor according to any one of claims 1 to 14.
16. The imaging device of claim 15, comprising a processing circuit coupled to the image sensor, the processing circuit configured to determine imaging information of the image sensor according to the output signals of the first pixel unit and configured to determine focusing information according to the output signals of the second pixel unit and the third pixel unit.
17. A mobile platform comprising the imaging apparatus of claim 15 or 16.
Applications Claiming Priority (1)
- PCT/CN2019/113097 (WO2021077374A1), priority and filing date 2019-10-24: Image sensor, imaging apparatus, and mobile platform
Publications (1)
- CN112119624A, published 2020-12-22
Also Published As
- WO2021077374A1, published 2021-04-29
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- RJ01: Rejection of invention patent application after publication (application publication date: 2020-12-22)