CN112738385A - Sensor and shooting module - Google Patents
- Publication number
- CN112738385A (application CN202110337172.8A)
- Authority
- CN
- China
- Prior art keywords
- light
- area
- shooting
- depth
- filter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
- H04N13/128—Adjusting depth or disparity
- H04N13/156—Mixing image signals
Abstract
The application relates to the technical field of sensors and discloses a sensor comprising a photosensitive area. The photosensitive area comprises a plurality of pixel areas, at least one of which comprises a shooting area and a depth-of-field measurement area. The shooting area is configured to receive shooting light and to generate, based on it, shot image information of a local area of a target object; the depth-of-field measurement area is configured to receive depth-of-field measurement light and to generate, based on it, depth-of-field image information of the same local area. At least two shooting photosensitive layers are arranged in the shooting area. By arranging the shooting area and the depth-of-field measurement area in one sensor, the space occupied by the sensor can be reduced. The application also discloses a shooting module.
Description
Technical Field
The application relates to the technical field of sensors, for example to a sensor and shoot module.
Background
Currently, a 3D camera requires a separate photographing sensor for obtaining a 2D image and a separate depth sensor for obtaining a depth image; obtaining a 3D image then requires the 2D image and the depth image to be compared and fused algorithmically.
In the process of implementing the embodiments of the present disclosure, at least the following problems were found in the related art: obtaining the 2D image and the depth image requires two or more sensors, which occupy a large space; and the 2D image and the depth image must be compared and fused to obtain the 3D image, which reduces data processing speed.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
Embodiments of the present disclosure provide a sensor and a shooting module to address the technical problems that existing solutions require sensors occupying a large space and that comparing and fusing the 2D image with the depth image reduces data processing speed.
In some embodiments, the sensor comprises: a light sensing area;
the photosensitive area comprises a plurality of pixel areas;
at least one of the plurality of pixel areas comprises a shooting area and a depth of field measuring area;
wherein the photographing region is configured to receive photographing light and to generate, based on the photographing light, photographed image information of a local area of a target object;
the depth measurement area is configured to receive depth measurement light and generate depth image information of a local area of the target object based on the depth measurement light;
the shooting area is internally provided with at least two shooting photosensitive layers; different layers among the at least two shooting photosensitive layers are configured to absorb light of different wavelength bands and to generate, based on the absorbed light, the shot image information corresponding to each layer for the local area of the target object;
wherein the shot image information corresponding to the different shooting photosensitive layers together forms the shot image information of the local area of the target object.
In some embodiments, three shooting photosensitive layers are disposed within the shooting area: a first shooting photosensitive layer, a second shooting photosensitive layer, and a third shooting photosensitive layer;
the first shooting photosensitive layer is configured to absorb light of a first wavelength band and to generate, based on it, first shot image information of a local area of the target object;
the second shooting photosensitive layer is configured to absorb light of a second wavelength band and to generate, based on it, second shot image information of the local area of the target object;
the third shooting photosensitive layer is configured to absorb light of a third wavelength band and to generate, based on it, third shot image information of the local area of the target object;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
In some embodiments, the sensor further comprises a light filtering assembly; the light filtering assembly comprises a shooting light transmission area and a depth-of-field measurement light transmission area;
the shooting light transmission area is configured to filter light to be transmitted to the shooting area so as to obtain shooting light required by the shooting area;
the depth measurement light transmission area is configured to filter light to be transmitted to the depth measurement area to obtain depth measurement light required by the depth measurement area.
In some embodiments, the shooting light transmission area includes a first filter film disposed between the target object and the first shooting photosensitive layer;
the first filter film is configured to filter light to be transmitted to the three shooting photosensitive layers so as to obtain light of the wavelength bands they require.
In some embodiments, the capture light transmissive region includes a second filter disposed between the first capture photosensitive layer and the second capture photosensitive layer;
the second filter film is configured to filter light to be transmitted to the second and third photographing photosensitive layers to obtain light of wavelength bands required by the second and third photographing photosensitive layers.
In some embodiments, the photographing light transmitting region includes a third filter disposed between the second photographing photosensitive layer and the third photographing photosensitive layer;
the third filter film is configured to filter light to be transmitted to the third photographic sensitive layer to obtain light of a wavelength band required by the third photographic sensitive layer.
In some embodiments, a depth measurement sensitive layer is disposed within the depth measurement zone.
In some embodiments, the depth of field measurement sensitive layer comprises a depth of field measurement sensitive region.
In some embodiments, the depth measurement light transmissive region includes a fourth filter disposed between the target and the depth measurement sensitive layer;
the fourth filter film is configured to filter light to be transmitted to the depth measurement photosensitive layer to obtain depth measurement light required by the depth measurement photosensitive layer.
In some embodiments, a shooting module comprises the sensor described above.
The sensor and the shooting module provided by the embodiment of the disclosure can realize the following technical effects:
by arranging the photographing region and the depth-of-field measuring region in one sensor, the space occupied by the sensor can be reduced. By arranging the shooting area and the depth-of-field measurement area in one pixel area, the shot image information and the depth-of-field image information generated by the shooting area and the depth-of-field measurement area respectively can be directly recorded by taking pixels as units, subsequent image comparison and data fusion processing are not needed, and the data processing speed can be increased.
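A minimal sketch of the effect described above (hypothetical names; this is not the patent's implementation): because each pixel region contains both a shooting area and a depth-of-field measurement area, a combined color-plus-depth record can be assembled per pixel with no separate registration or fusion pass:

```python
def record_pixel(shot_info, depth_info):
    """Combine the shot image information (r, g, b) and the depth-of-field
    image information produced by the two co-located areas of one pixel
    region into a single per-pixel record, with no cross-sensor
    alignment step."""
    r, g, b = shot_info
    return {"r": r, "g": g, "b": b, "depth": depth_info}

pixel = record_pixel((255, 128, 0), 1.25)
print(pixel)  # {'r': 255, 'g': 128, 'b': 0, 'depth': 1.25}
```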
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
At least one embodiment is illustrated by way of the accompanying drawings, which do not limit the embodiments; elements having the same reference numerals are shown as similar elements, and the drawings are not to scale. In the drawings:
FIG. 1 is a schematic diagram of a sensor provided by an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a photosensitive region provided by an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a pixel region according to an embodiment of the disclosure;
fig. 4 is a schematic structural diagram of a shooting area provided by the embodiment of the present disclosure;
FIG. 5 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 6 is a schematic structural diagram of a filter assembly provided by an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 8 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 9 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a depth-of-field measurement area provided by an embodiment of the present disclosure;
FIG. 11 is a schematic view of a structure of a depth of field measurement sensitive layer provided by an embodiment of the present disclosure;
FIG. 12 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a shooting module according to an embodiment of the present disclosure.
Reference numerals:
100: a sensor; 101: a light sensing area; 102: a pixel region; 103: a shooting area; 104: a depth of field measurement zone; 105: a filter assembly; 106: a photographing light transmission region; 107: a depth-of-field measuring light transmission region; 109: a first photographing photosensitive layer; 110: a second photographing photosensitive layer; 111: a third photographing sensitive layer; 112: a first light filter film; 113: a second light filter film; 114: a third light filter film; 115: a fourth light filter film; 116: a depth of field measuring photosensitive layer; 117: a depth-of-field measurement photosensitive area; 118: depth of field measuring the non-photosensitive area; 200: shooting module.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, at least one embodiment may be practiced without these specific details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
As shown in fig. 1, 2, and 3, embodiments of the present disclosure provide a sensor 100, the sensor 100 may include: a photosensitive region 101;
the photosensitive region 101 includes a plurality of pixel regions 102;
at least one pixel area 102 of the plurality of pixel areas 102 includes a shooting area 103 and a depth measurement area 104;
wherein the photographing region 103 is configured to receive photographing light and to generate, based on the photographing light, photographed image information of a local area of the target object;
a depth-of-field measurement area 104 configured to receive the depth-of-field measurement light, and generate depth-of-field image information of a local area of the target object based on the depth-of-field measurement light;
at least two shooting photosensitive layers are arranged in the shooting area 103; different layers among the at least two shooting photosensitive layers are configured to absorb light of different wavelength bands and to generate, based on the absorbed light, the shot image information corresponding to each layer for the local area of the target object;
wherein the shot image information corresponding to the different shooting photosensitive layers together forms the shot image information of the local area of the target object.
In some embodiments, the shooting area 103 and the depth measurement area 104 may be located in the same plane. Alternatively, the shooting area 103 and the depth measurement area 104 do not overlap in a direction perpendicular to the shooting area 103.
In some embodiments, the at least two photographic sensitive layers are provided in a stack. That is, at least two photographing photosensitive layers overlap each other with an overlapping area in a direction perpendicular to the photographing photosensitive layers.
As shown in fig. 4, in some embodiments, three photosensitive layers may be disposed in the capturing area 103, wherein the three photosensitive layers are: a first photographing photosensitive layer 109, a second photographing photosensitive layer 110, and a third photographing photosensitive layer 111. Alternatively, the first photographing photosensitive layer 109 may be used to absorb blue light. Alternatively, the second photographing photosensitive layer 110 may be used to absorb green light. Alternatively, the third photographing photosensitive layer 111 may be used to absorb red light.
In some embodiments, three photographic sensitive layers are provided in a stack.
In some embodiments, seven photosensitive layers may be disposed in the capturing area: a red, an orange, a yellow, a green, a blue, an indigo, and a violet photosensitive layer. Each of the seven layers may be used to absorb light of its corresponding color.
In some embodiments, each of the at least two shooting photosensitive layers may absorb light in one or more wavelength bands. For example, a layer may absorb a single band, such as red, green, or blue light; or two bands, such as red and green light, or blue and red light.
As shown in fig. 4, in some embodiments, three photographic sensitive layers are disposed within the photographic zone 103; the three shooting photosensitive layers are respectively: a first photographing photosensitive layer 109, a second photographing photosensitive layer 110, and a third photographing photosensitive layer 111;
a first photographic photosensitive layer 109 configured to absorb light of a first wavelength band, and generate first photographic image information of a local area of the subject based on the light of the first wavelength band;
a second photographic photosensitive layer 110 configured to absorb light of a second wavelength band, and generate second photographic image information of a local area of the object based on the light of the second wavelength band;
a third photosensitive layer 111 configured to absorb the third wavelength band light and generate third captured image information of a local region of the target based on the third wavelength band light;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
In some embodiments, each of the first, second, and third wavelength bands may comprise one or more wavelength bands. For example, any of them may be a single band, such as red, green, or blue light; or two bands, such as red and green light, or blue and red light.
As shown in fig. 2, in some embodiments, photosensitive region 101 is a region in sensor 100 that is used for light sensing. Optionally, the photosensitive area 101 includes a plurality of pixel areas 102, and each pixel area 102 of the plurality of pixel areas 102 is an area occupied by a pixel in the photosensitive area 101.
As shown in fig. 3, in some embodiments, at least one pixel region 102 of the plurality of pixel regions 102 includes a photographing region 103 and a depth-of-field measurement region 104. Optionally, each of the plurality of pixel regions 102 includes a photographing region 103 and a depth-of-field measurement region 104. Alternatively, each of the remaining pixel regions 102 (those not containing both a photographing region 103 and a depth-of-field measurement region 104) may independently contain only a photographing region 103, only a depth-of-field measurement region 104, or neither.
In some embodiments, the captured image information of the local region of the target object may be an analog signal. Alternatively, the 2D image may be generated based on captured image information of a local area of the target object.
In some embodiments, the range image information of the local region of the object may be an analog signal. Alternatively, the range image may be generated based on range image information of a local region of the target object. Alternatively, the analog signal corresponding to the captured image information of the local area of the target object may be different from the analog signal corresponding to the range image information of the local area of the target object.
In some embodiments, the shooting area 103 and the depth measurement area 104 may be provided in one pixel area 102, and the shot image information of the local area of the object generated by the shooting area 103 and the depth image information of the local area of the object generated by the depth measurement area 104 are directly recorded in units of pixels.
In some embodiments, the captured image information of the local area of the target object and the depth image information of the local area of the target object generated by the capturing area 103 and the depth measurement area 104 located in one pixel area 102 may be collected, respectively, a corresponding captured image may be generated based on the captured image information of the local area of the target object, a corresponding depth image may be generated based on the depth image information of the local area of the target object, the captured image and the depth image may be fused, and the fused image may be recorded in units of pixels.
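The alternative described above — generating a shot image and a depth image from the co-located areas and then fusing them — can be sketched as follows (illustrative only; the patent does not specify a fusion algorithm):

```python
def fuse_images(shot_pixels, depth_pixels):
    """Fuse a shot image (list of (r, g, b) tuples) with a depth image
    (list of depth values) pixel by pixel. Both lists come from the
    shooting and depth-of-field measurement areas of the same pixel
    regions, so no spatial alignment step is required."""
    return [(r, g, b, d) for (r, g, b), d in zip(shot_pixels, depth_pixels)]

fused = fuse_images([(255, 0, 0), (0, 255, 0)], [2.0, 3.5])
print(fused)  # [(255, 0, 0, 2.0), (0, 255, 0, 3.5)]
```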
In some embodiments, the photographic light may be visible light.
In some embodiments, the depth of field measurement light may be invisible light. Alternatively, the invisible light may be infrared or ultraviolet. Table 1 exemplarily illustrates wavelength distributions of visible light and invisible light.
TABLE 1
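The visible/invisible distinction referenced in Table 1 can be sketched with commonly cited approximate boundaries; these specific numbers are general optics figures, not values taken from the patent:

```python
def classify_light(wavelength_nm: float) -> str:
    """Classify light by wavelength. Visible light spans roughly
    380-780 nm; shorter wavelengths are ultraviolet and longer ones
    infrared. Boundary values are common approximations, assumed here."""
    if wavelength_nm < 380:
        return "ultraviolet"
    if wavelength_nm <= 780:
        return "visible"
    return "infrared"

print(classify_light(550))  # visible (green)
print(classify_light(940))  # infrared, a band often used for depth illumination
```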
As shown in fig. 5 and 6, in some embodiments, sensor 100 further includes a filter assembly 105; the filter assembly 105 includes: a photographing light transmission region 106 and a depth-of-field measurement light transmission region 107;
a photographing light transmitting area 106 configured to filter light to be transmitted to the photographing area 103 to obtain photographing light required for the photographing area 103;
the depth-of-field measurement light transmission region 107 is configured to filter the light to be transmitted to the depth-of-field measurement region 104 to obtain the depth-of-field measurement light required by the depth-of-field measurement region 104.
As shown in fig. 5, in some embodiments, the filter assembly 105 includes a photographing light transmitting region 106 and a depth measurement light transmitting region 107. Optionally, the photographing light transmitting area 106 is disposed corresponding to the photographing area 103, and the photographing light transmitting area 106 is configured to: the light to be transmitted to the photographing region 103 is filtered to obtain photographing light required for the photographing region 103. Optionally, the depth of field measurement light transmission area 107 is disposed corresponding to the depth of field measurement area 104, and the depth of field measurement light transmission area 107 is configured to: the light to be transmitted to the depth of field measurement area 104 is filtered to obtain the depth of field measurement light required by the depth of field measurement area 104.
As shown in fig. 7, in some embodiments, the photographing light transmitting region 106 includes a first filter 112 disposed between the target object and the first photographing photosensitive layer 109;
the first filter 112 is configured to filter light to be transmitted to the three shooting photosensitive layers so as to obtain light of the wavelength bands they require.
As shown in fig. 7, in some embodiments, the first shooting photosensitive layer 109 can absorb blue light; when the light directed to the first shooting photosensitive layer 109 includes ultraviolet light, the first filter 112 can filter out the ultraviolet light to prevent it from interfering with the first shooting photosensitive layer 109.
As shown in fig. 8, in some embodiments, the capture light transmissive region 106 includes a second filter 113 disposed between the first capture photosensitive layer 109 and the second capture photosensitive layer 110;
the second filter 113 is configured to filter light to be transmitted to the second shooting photosensitive layer 110 and the third shooting photosensitive layer 111 so as to obtain light of the wavelength bands they require.
In some embodiments, the second filter 113 can filter out light that was not completely absorbed by the first shooting photosensitive layer 109. Optionally, the second filter 113 can filter out light that may interfere with the second shooting photosensitive layer 110. The shooting light transmission region 106 may include the second filter 113 alone, or both the first filter 112 and the second filter 113. Fig. 8 exemplarily shows the case where the shooting light transmission region 106 includes the second filter 113.
As shown in fig. 9, in some embodiments, the photographing light transmitting region 106 includes a third filter 114 disposed between the second photographing photosensitive layer 110 and the third photographing photosensitive layer 111;
the third filter 114 is configured to filter light to be transmitted to the third shooting photosensitive layer 111 so as to obtain light of the wavelength band it requires.
In some embodiments, the third filter 114 can filter out light that was not completely absorbed by the second shooting photosensitive layer 110. Optionally, the third filter 114 can filter out light that may interfere with the third shooting photosensitive layer 111. The shooting light transmission region 106 may include the third filter 114 alone, or the third filter 114 together with the first filter 112, the second filter 113, or both. Fig. 9 exemplarily shows the case where the shooting light transmission region 106 includes the third filter 114.
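The cascaded filtering described in these embodiments can be sketched as a pipeline. Band names and the blue/green/red ordering follow the earlier example; modeling filters and absorbing layers as set subtraction is an illustrative simplification, not the patent's physics:

```python
def remove_bands(spectrum: set, bands: set) -> set:
    """Model a filter film or an absorbing layer as removing a set of
    wavelength bands from the incoming light."""
    return spectrum - bands

light = {"uv", "blue", "green", "red", "ir"}
light = remove_bands(light, {"uv"})           # first filter blocks interfering UV
light = remove_bands(light, {"blue"})         # first layer absorbs blue
light = remove_bands(light, {"blue"})         # second filter strips residual blue
light = remove_bands(light, {"green"})        # second layer absorbs green
light = remove_bands(light, {"green", "ir"})  # third filter strips residual green/IR
print(light)  # {'red'} — only the band the third layer needs remains
```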
As shown in fig. 10, in some embodiments, when the sensor 100 includes the photosensitive region 101, one or more depth-of-field measurement photosensitive layers 116 may be disposed within the depth-of-field measurement region 104. Fig. 10 illustrates the case of a single depth-of-field measurement photosensitive layer 116 disposed within the depth-of-field measurement region 104.
As further shown in fig. 10, in some embodiments, when the sensor 100 includes both the photosensitive region 101 and a filter assembly, one or more depth-of-field measurement photosensitive layers 116 may likewise be disposed within the depth-of-field measurement region 104; fig. 10 again illustrates the single-layer case.
As shown in fig. 11, in some embodiments, when one depth-of-field measurement photosensitive layer 116 is disposed within the depth-of-field measurement region 104, the depth-of-field measurement photosensitive layer 116 may include one or more depth-of-field measurement photosensitive regions 117. Optionally, a depth-of-field measurement photosensitive region 117 is configured to receive the depth-of-field measurement light and to generate, based on it, depth-of-field image information of a local area of the target object. Optionally, the depth-of-field measurement photosensitive layer 116 may also include a depth-of-field measurement non-photosensitive region 118. The depth-of-field measurement photosensitive region 117 may be located anywhere within the depth-of-field measurement photosensitive layer 116; fig. 11 schematically illustrates a layer with one depth-of-field measurement photosensitive region 117 disposed at its middle position.
As shown in fig. 12, in some embodiments, the depth of field measurement light transmitting region 107 includes a fourth filter 115 disposed between the target and the depth of field measurement photosensitive layer 116;
the fourth filter 115 is configured to filter the light to be transmitted to the depth measurement photosensitive layer 116 to obtain the depth measurement light required by the depth measurement photosensitive layer 116.
In some embodiments, when the depth of field measurement sensitive layer 116 needs to absorb invisible light, the fourth filter 115 can be used to: the light to be transmitted to the sensitive depth of field measurement layer 116 is filtered to obtain the invisible light required by the sensitive depth of field measurement layer 116.
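A sketch of the fourth filter's role, assuming an infrared-based depth-of-field measurement; the ~780 nm cutoff is an assumed boundary, not a value specified by the patent:

```python
def fourth_filter(wavelengths_nm):
    """Pass only wavelengths above ~780 nm, i.e. the invisible infrared
    component assumed to be required by the depth-of-field measurement
    photosensitive layer; visible light is filtered out."""
    return [w for w in wavelengths_nm if w > 780]

print(fourth_filter([450, 550, 650, 940]))  # [940]
```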
As shown in fig. 13, an embodiment of the present disclosure provides a camera module 200, and the camera module 200 includes the sensor 100.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims. As used in this application, although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, unless the meaning of the description changes, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed. 
Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that comprises that element. In this document, each embodiment may be described with emphasis on its differences from other embodiments, and for the same and similar parts the respective embodiments may be referred to one another. For the methods, products, etc. of the disclosed embodiments, where they correspond to the method parts of the disclosed embodiments, reference may be made to the description of those method parts.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or as combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, for the working processes of the above-described systems, apparatuses, and units, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described again here.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be merely a division of a logical function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the drawings, the width, length, thickness, etc. of structures such as elements or layers may be exaggerated for clarity and descriptive purposes. When an element or layer is referred to as being "disposed on" (or "mounted on," "laid on," "attached to," "coated on," or the like) another element or layer, the element or layer may be directly "disposed on" or "over" the other element or layer, or intervening elements or layers may be present, or even partially embedded in the other element or layer.
Claims (10)
1. A sensor, comprising: a light sensing area;
the photosensitive area comprises a plurality of pixel areas;
at least one of the plurality of pixel areas comprises a shooting area and a depth measurement area;
wherein the shooting area is configured to receive shooting light and to generate, based on the shooting light, shooting image information of a local area of a target object;
the depth measurement area is configured to receive depth measurement light and to generate, based on the depth measurement light, depth image information of the local area of the target object;
at least two shooting photosensitive layers are disposed in the shooting area, different ones of the at least two shooting photosensitive layers being configured to absorb light of different wavebands and to generate, based on the absorbed waveband light, shooting image information corresponding to the respective shooting photosensitive layers for the local area of the target object;
wherein the shooting image information corresponding to the different shooting photosensitive layers together forms the shooting image information of the local area of the target object.
2. The sensor of claim 1, wherein three shooting photosensitive layers are disposed within the shooting area, the three shooting photosensitive layers being respectively: a first shooting photosensitive layer, a second shooting photosensitive layer, and a third shooting photosensitive layer;
the first shooting photosensitive layer is configured to absorb light of a first waveband and to generate first shooting image information of the local area of the target object based on the light of the first waveband;
the second shooting photosensitive layer is configured to absorb light of a second waveband and to generate second shooting image information of the local area of the target object based on the light of the second waveband;
the third shooting photosensitive layer is configured to absorb light of a third waveband and to generate third shooting image information of the local area of the target object based on the light of the third waveband;
the first shooting image information, the second shooting image information, and the third shooting image information form the shooting image information of the local area of the target object.
3. The sensor of claim 2, further comprising a filter assembly; the filter assembly comprises: a shooting light transmission area and a depth measurement light transmission area;
the shooting light transmission area is configured to filter light to be transmitted to the shooting area, so as to obtain the shooting light required by the shooting area;
the depth measurement light transmission area is configured to filter light to be transmitted to the depth measurement area, so as to obtain the depth measurement light required by the depth measurement area.
4. The sensor of claim 3, wherein the shooting light transmission area comprises a first filter disposed between the target object and the first shooting photosensitive layer;
the first filter is configured to filter light to be transmitted to the three shooting photosensitive layers, so as to obtain light of the wavebands required by the three shooting photosensitive layers.
5. The sensor of claim 3, wherein the shooting light transmission area comprises a second filter disposed between the first shooting photosensitive layer and the second shooting photosensitive layer;
the second filter is configured to filter light to be transmitted to the second and third shooting photosensitive layers, so as to obtain light of the wavebands required by the second and third shooting photosensitive layers.
6. The sensor of claim 3, wherein the shooting light transmission area comprises a third filter disposed between the second shooting photosensitive layer and the third shooting photosensitive layer;
the third filter is configured to filter light to be transmitted to the third shooting photosensitive layer, so as to obtain light of the waveband required by the third shooting photosensitive layer.
7. The sensor of claim 3, wherein a depth measurement photosensitive layer is disposed within the depth measurement area.
8. The sensor of claim 7, wherein the depth measurement photosensitive layer comprises a depth measurement photosensitive area.
9. The sensor of claim 7, wherein the depth measurement light transmission area comprises a fourth filter disposed between the target object and the depth measurement photosensitive layer;
the fourth filter is configured to filter light to be transmitted to the depth measurement photosensitive layer, so as to obtain the depth measurement light required by the depth measurement photosensitive layer.
10. A camera module, characterized in that it comprises a sensor according to any one of claims 1 to 9.
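The filter arrangement of claims 3–9 can be read as a cascade: each filter in the shooting light transmission area passes only the wavebands still needed by the photosensitive layers beneath it, while the fourth filter in the depth measurement light transmission area passes only the depth measurement light. A minimal sketch of that reading follows; the band names are purely illustrative assumptions, since the claims do not fix which wavebands the layers absorb.

```python
# Model each filter as the set of wavebands it transmits; passing light
# through a filter is then a set intersection.
def transmit(bands, *filters):
    """Return the wavebands that survive the given filters, applied in order."""
    for f in filters:
        bands = bands & f
    return bands

AMBIENT = {"red", "green", "blue", "infrared", "ultraviolet"}

FIRST = {"red", "green", "blue"}   # wavebands needed by all three shooting layers
SECOND = {"green", "blue"}         # wavebands needed by the second and third layers
THIRD = {"blue"}                   # waveband needed by the third layer only
FOURTH = {"infrared"}              # depth measurement light (an invisible band)

# Shooting path: each successive filter strips the band absorbed above it.
print(sorted(transmit(AMBIENT, FIRST)))                 # ['blue', 'green', 'red']
print(sorted(transmit(AMBIENT, FIRST, SECOND)))         # ['blue', 'green']
print(sorted(transmit(AMBIENT, FIRST, SECOND, THIRD)))  # ['blue']

# Depth path: the fourth filter keeps only the invisible light.
print(sorted(transmit(AMBIENT, FOURTH)))                # ['infrared']
```

Modeling transmission as intersection also makes the ordering constraint of claims 4–6 visible: a deeper filter can only narrow what an earlier filter has already passed, never restore a band it removed.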
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110337172.8A CN112738385A (en) | 2021-03-30 | 2021-03-30 | Sensor and shooting module |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112738385A | 2021-04-30 |
Family
ID=75596032
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110337172.8A Pending CN112738385A (en) | 2021-03-30 | 2021-03-30 | Sensor and shooting module |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112738385A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101064848A (en) * | 2006-04-27 | 2007-10-31 | Sony Corp | Image data processing apparatus and method |
US20140263980A1 (en) * | 2011-01-28 | 2014-09-18 | Aptina Imaging Corporation | Imagers with depth sensing capabilities |
CN104184935A (en) * | 2013-05-27 | 2014-12-03 | Hon Hai Precision Industry (Shenzhen) Co., Ltd. | Image shooting device and method |
CN106340527A (en) * | 2016-11-15 | 2017-01-18 | OmniVision Technologies (Shanghai) Co., Ltd. | Back-illuminated image sensor chip |
CN107465849A (en) * | 2016-06-03 | 2017-12-12 | Primax Electronics Ltd. | Image depth of field measuring method and image capture unit applying the method |
CN110299375A (en) * | 2019-07-08 | 2019-10-01 | ICLeague Technology Co., Ltd. | Semiconductor structure and forming method thereof |
CN110740236A (en) * | 2019-10-16 | 2020-01-31 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Image sensor, electronic device, image processing method, and storage medium |
CN111314634A (en) * | 2018-12-11 | 2020-06-19 | Jiangxi Holitech Technology Co., Ltd. | Manufacturing method of image sensor, sensor and pixel reading method |
CN112363180A (en) * | 2020-10-28 | 2021-02-12 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Imaging distance measuring sensor, method, system and storage medium |
Similar Documents
Publication | Title |
---|---|
EP2380345B1 | Improving the depth of field in an imaging system |
EP2083446B1 | Image pickup apparatus |
US9793308B2 | Imager integrated circuit and stereoscopic image capture device |
US9118850B2 | Camera system with multiple pixel arrays on a chip |
US8786741B2 | Digital multi-spectral camera system having at least two independent digital cameras |
EP3414763B1 | Mobile device for HDR video capture |
EP2757790A1 | Photographing device and photographing method for taking picture by using a plurality of microlenses |
EP3439038A1 | Image capturing element and image capturing device |
US20170111557A1 | Camera assembly with filter providing different effective entrance pupil sizes based on light type |
CN105306786A | Image processing methods for image sensors with phase detection pixels |
WO2014160819A1 | Multi field-of-view multi sensor electro-optical fusion-zoom camera |
CN211481355U | Multispectral sensing structure, sensor and camera |
WO2005089369B1 | Apparatus for imaging using an array of lenses |
CN103842877A | Imaging device and focus parameter value calculation method |
CN108900763A | Filming apparatus, electronic equipment and image acquiring method |
CN110266922A | Camera module and electronic equipment |
US20180188502A1 | Panorama image capturing device having at least two camera lenses and panorama image capturing module thereof |
JP5532766B2 | Imaging device and imaging apparatus |
TWI584643B | Camera devices and systems based on a single imaging sensor and methods for manufacturing the same |
JP5476731B2 | Image sensor |
CN112738385A | Sensor and shooting module |
CN112804438A | Sensor and shooting module |
WO2017217053A1 | Imaging device and filter |
CN112738386A | Sensor, shooting module and image acquisition method |
CN209472639U | Photographic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2021-04-30 |