CN107124536B - Dual-core focusing image sensor, focusing control method thereof and imaging device - Google Patents

Dual-core focusing image sensor, focusing control method thereof and imaging device

Info

Publication number
CN107124536B
CN107124536B (application CN201710296855.7A)
Authority
CN
China
Prior art keywords
focusing
photosensitive
dual
core
output value
Prior art date
Legal status
Active
Application number
CN201710296855.7A
Other languages
Chinese (zh)
Other versions
CN107124536A (en)
Inventor
曾元清
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201710296855.7A priority Critical patent/CN107124536B/en
Publication of CN107124536A publication Critical patent/CN107124536A/en
Application granted granted Critical
Publication of CN107124536B publication Critical patent/CN107124536B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Automatic Focus Adjustment (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a dual-core focusing image sensor, a focusing control method thereof and an imaging device. The dual-core focusing image sensor comprises a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located on the filter unit array. The microlens array comprises first microlenses and second microlenses: one first microlens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the white filter unit consists of N × N adjacent right-angled triangles, the area of one white filter unit is half of that of one focusing photosensitive unit, and one second microlens covers one dual-core focusing photosensitive pixel. The dual-core focusing image sensor of the embodiments of the invention can increase the light flux of the focusing pixels, providing a hardware basis for increasing the focusing speed in a low-light environment and improving the accuracy of color restoration.

Description

Dual-core focusing image sensor, focusing control method thereof and imaging device
Technical Field
The invention relates to the technical field of imaging equipment, and in particular to a dual-core focusing image sensor, a focusing control method thereof and an imaging device.
Background
Among related focusing technologies, dual-core full-pixel focusing has become the most advanced focusing technology on the market. Compared with contrast focusing, laser focusing and phase-detection focusing, it offers a faster focusing speed and a wider focusing range. In addition, in the dual-core full-pixel focusing technology the two photodiodes of each pixel are combined into one pixel output during imaging, so the focusing performance is ensured without affecting image quality.
However, when focusing with the dual-core full-pixel technology, the photodiode of each pixel is divided into two parts, which reduces the amount of light each part receives and makes dual-core focusing difficult in a low-light environment.
Disclosure of Invention
The present invention aims to solve, at least to some extent, one of the above technical problems.
Therefore, a first objective of the present invention is to provide a method for controlling focusing of a dual-core focusing image sensor, which can increase the light flux of a focusing pixel, effectively increase the focusing speed in a low-light environment, and simultaneously improve the accuracy of color restoration.
A second objective of the present invention is to provide a dual-core focusing image sensor.
A third object of the present invention is to provide an imaging device.
A fourth object of the present invention is to provide a mobile terminal.
In order to achieve the above object, an embodiment of the first aspect of the present invention provides a focusing control method for a dual-core focusing image sensor. The dual-core focusing image sensor comprises a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located on the filter unit array, wherein the microlens array comprises first microlenses and second microlenses, one first microlens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the white filter unit consists of N × N adjacent right-angled triangles, the area of one white filter unit is half of that of one focusing photosensitive unit, and one second microlens covers one dual-core focusing photosensitive pixel. The method comprises the following steps:
controlling the photosensitive unit array to enter a focusing mode;
reading first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels;
and carrying out focusing control according to the first phase difference information and the second phase difference information.
In the focusing control method of the dual-core focusing image sensor according to the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and, at the same time, the accuracy of color restoration is improved.
In order to achieve the above object, an embodiment of the second aspect of the present invention provides a dual-core focusing image sensor, comprising a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located on the filter unit array, wherein the microlens array comprises first microlenses and second microlenses, one first microlens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the white filter unit consists of N × N adjacent right-angled triangles, the area of one white filter unit is half of that of one focusing photosensitive unit, and one second microlens covers one dual-core focusing photosensitive pixel.
In the dual-core focusing image sensor of the embodiment of the invention, a microlens array comprising first microlenses and second microlenses is provided, one first microlens covers one square white filter unit, the white filter unit covers the middle part of one focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and one second microlens covers one dual-core focusing photosensitive pixel. The light flux of the focusing pixels can thus be increased, providing a hardware basis for improving the focusing speed in a low-light environment and improving the accuracy of color restoration.
In order to achieve the above object, an embodiment of the third aspect of the present invention provides an imaging device, comprising: the dual-core focusing image sensor of the second-aspect embodiment; and a control module, which controls the photosensitive unit array to enter a focusing mode, reads first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels, and performs focusing control according to the first phase difference information and the second phase difference information.
In the imaging device of the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and the accuracy of color restoration is improved.
In order to achieve the above object, a fourth aspect of the present invention further provides a mobile terminal, which includes a housing, a processor, a memory, a circuit board, and a power circuit, wherein the circuit board is disposed inside a space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the mobile terminal; the memory is used for storing executable program codes; the processor executes a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to execute the focusing control method of the dual-core focusing image sensor proposed in the embodiment of the first aspect.
In the mobile terminal of the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and the accuracy of color restoration is improved.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic diagram of a conventional dual-core focusing image sensor;
FIG. 2 is a cross-sectional view of a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 3 is a top view of a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 4 is a density distribution diagram of the first microlenses;
FIG. 5 is a flowchart of a focusing control method of a dual-core focusing image sensor according to an embodiment of the present invention;
FIG. 6 is a schematic diagram illustrating ways of dividing the portion of the 2 × 2 photosensitive pixels of the focusing photosensitive unit covered by the white filter unit according to an embodiment of the present invention;
FIG. 7 is a flowchart of a focusing control method of a dual-core focusing image sensor according to another embodiment of the present invention;
FIG. 8 is a schematic diagram of an interpolation algorithm for obtaining pixel values of a focusing photosensitive unit;
FIG. 9 is a schematic structural diagram of an imaging device according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar functions throughout. The embodiments described below with reference to the drawings are exemplary and intended to explain the invention, and are not to be construed as limiting the invention.
A dual-core focusing image sensor, a focusing control method thereof, and an imaging device according to embodiments of the present invention are described below with reference to the accompanying drawings.
The dual-core full-pixel focusing technology is the most advanced focusing technology on the current market. The dual-core focusing sensor structure used by this technology is shown in fig. 1, where each microlens (each circle in fig. 1 represents a microlens) corresponds to two photodiodes. During imaging, the values of "1" and "2" are added to obtain a single pixel value. During focusing, the values of "1" and "2" are read out separately, and the driving amount and driving direction of the lens can be calculated from the phase difference between the two values.
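For reference, this conventional read-out can be written as a minimal sketch (Python; the variable names and the lens-drive conversion factor k_lens are illustrative assumptions, not values from the patent):

```python
def imaging_value(left: float, right: float) -> float:
    """Imaging mode: the two photodiode outputs "1" and "2" are combined
    into a single pixel value."""
    return left + right


def focus_drive(left: float, right: float, k_lens: float = 1.0) -> tuple[float, str]:
    """Focusing mode: the phase difference between "1" and "2" determines
    the lens driving amount, and its sign determines the driving direction.
    k_lens is a hypothetical calibration factor."""
    phase_diff = left - right
    direction = "forward" if phase_diff > 0 else "backward"
    return abs(phase_diff) * k_lens, direction
```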
It can be understood that, as the total number of pixels increases, the photosensitive areas corresponding to "1" and "2" become smaller, which reduces the light throughput; as a result, the phase information in a low-light environment is easily drowned out by noise, making focusing difficult.
Therefore, in order to solve the problem that the conventional dual-core full-pixel focusing technology has difficulty focusing in a low-light environment, the invention provides a focusing control method for a dual-core focusing image sensor, which can increase the light flux of the focusing pixels and effectively improve the focusing speed in a low-light environment.
The dual-core focusing image sensor on which the focusing control method provided by the invention relies is introduced first.
Fig. 2 is a cross-sectional view of a dual-core focusing image sensor according to an embodiment of the present invention, and fig. 3 is a top view of the dual-core focusing image sensor according to an embodiment of the present invention.
As shown in fig. 2 and 3, the dual-core focusing image sensor 100 includes a photosensitive unit array 10, a filter unit array 20, and a microlens array 30.
The filter unit array 20 is disposed on the photosensitive unit array 10, and the microlens array 30 is disposed on the filter unit array 20. The microlens array 30 includes first microlenses 31 and second microlenses 32. One first microlens 31 covers one white filter unit 21, and one white filter unit 21 covers one focusing photosensitive unit 11; the white filter unit 21 is composed of N × N adjacent right triangles, and the area of one white filter unit 21 is half of that of one focusing photosensitive unit 11. One second microlens 32 covers one filter unit 22, and one filter unit 22 covers one dual-core focusing photosensitive pixel 12.
In the embodiment of the present invention, the dual-core focusing photosensitive pixels 12 are arranged in a Bayer pattern. With the Bayer structure, the image signal can be processed with conventional algorithms designed for the Bayer structure, without large adjustments to the hardware. The dual-core focusing photosensitive pixel 12 has two photodiodes, a first photodiode 121 and a second photodiode 122, corresponding respectively to "1" and "2" of each dual-core focusing photosensitive pixel 12 in fig. 3.
In the embodiment of the present invention, the focusing photosensitive unit 11 includes N × N photosensitive pixels 110, and the white filter unit 21 covers the lower right half of the upper left photosensitive pixel, the upper right half of the lower left photosensitive pixel, the lower left half of the upper right photosensitive pixel, and the upper left half of the lower right photosensitive pixel in the focusing photosensitive unit 11. In the dual-core focusing image sensor structure shown in fig. 3, the focusing photosensitive unit 11 (the dotted-line portion in the figure) includes 2 × 2 photosensitive pixels 110. One white filter unit 21, i.e. W in the figure, covers the lower right half of the upper left photosensitive pixel, the upper right half of the lower left photosensitive pixel, the lower left half of the upper right photosensitive pixel, and the upper left half of the lower right photosensitive pixel in one focusing photosensitive unit 11, and the area of the white filter unit 21 is half of that of the focusing photosensitive unit 11. The remaining portions of the focusing photosensitive unit 11, i.e. regions a, b, c and d in fig. 3, are ordinary filter units covered by semicircular filters, so that these portions can provide the corresponding RGB pixel values for the interpolation restoration calculation during subsequent imaging, which improves the accuracy of color restoration.
In summary, in the dual-core focusing image sensor 100 according to the embodiment of the present invention, every N × N photosensitive pixels 110 form a group and share one first microlens 31, the white filter unit 21 covers the focusing photosensitive unit 11, and the area of one white filter unit 21 is half of that of one focusing photosensitive unit 11.
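The geometry described above can be visualised with a small sketch (Python/NumPy). It rasterises one 2 × 2 focusing photosensitive unit on an n × n grid and marks the central diamond formed by the four right triangles; the raster resolution n is an illustrative assumption:

```python
import numpy as np

# One focusing photosensitive unit = 2 x 2 photosensitive pixels, rasterised
# on an n x n grid. The white filter W is the central diamond made of four
# right triangles, each covering half of one photosensitive pixel.
n = 200  # illustrative raster resolution for the whole 2 x 2 unit
ys, xs = np.mgrid[0:n, 0:n]
cx = cy = (n - 1) / 2.0
# |x - cx| + |y - cy| <= n/2 describes the diamond whose vertices are the
# midpoints of the unit's four outer edges.
w_mask = (np.abs(xs - cx) + np.abs(ys - cy)) <= n / 2.0

# The diamond's area is half of the unit's area, matching the statement that
# one white filter unit is half the area of one focusing photosensitive unit.
print(f"covered fraction: {w_mask.mean():.3f}")  # approaches 0.5 for large n
```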
In one embodiment of the present invention, the microlens array 30 includes a horizontal center line and a vertical center line, and there are a plurality of first microlenses 31. The plurality of first microlenses 31 includes a first group of first microlenses 31 disposed on the horizontal center line and a second group of first microlenses 31 disposed on the vertical center line.
In an embodiment of the present invention, the microlens array 30 may further include four edge lines, in which case the plurality of first microlenses 31 further includes a third group of first microlenses 31 disposed on the four edge lines.
When the microlens array 30 includes the horizontal center line, the vertical center line and the four edge lines, the lens density of the first group of first microlenses 31 and the second group of first microlenses 31 is greater than the lens density of the third group of first microlenses 31.
For ease of understanding, the arrangement of the first microlenses 31 in the microlens array 30 is described below with reference to the drawings. Fig. 4 shows the density distribution of the first microlenses. As shown in fig. 4, the white filter units 21 covered by the first microlenses 31, i.e. W in the figure, are scattered across the entire dual-core focusing image sensor and account for 3% to 5% of the total number of pixels. They are distributed more densely on the horizontal center line and the vertical center line of the microlens array 30 and more sparsely on the four edge lines, so that the focusing accuracy and speed in the middle area of the picture are given priority, and the focusing speed is effectively increased without affecting the image quality.
In fig. 3 and 4, W indicates that the filter unit covered by the first microlens 31 in the dual-core focusing image sensor is the white filter unit 21; a larger amount of light can be obtained with the white filter unit 21. The filter unit covered by the first microlens 31 may also be a green filter unit, i.e. W in fig. 3 and 4 may be replaced by G; when a green filter unit is used, more information is available during imaging. It should be understood that the embodiment of the present invention is described by taking the white filter unit as an example only, which should not be taken as a limitation of the present invention.
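Purely as an illustration of such a layout (the actual arrangement is the one shown in fig. 4), a map of first-microlens positions that is sparse over the whole sensor and denser in a band around the two center lines could be generated as follows; the step sizes and band width are assumptions chosen only to give a density of a few percent:

```python
import numpy as np

def first_microlens_map(h_units: int, w_units: int,
                        base_step: int = 10, center_step: int = 4,
                        band: int = 2) -> np.ndarray:
    """Illustrative boolean map of which unit positions carry a first
    microlens (i.e. a W filter unit). A sparse grid covers the whole sensor
    and a denser grid covers a band around the horizontal and vertical
    center lines. All numbers are assumptions, not values from the patent."""
    mask = np.zeros((h_units, w_units), dtype=bool)
    mask[::base_step, ::base_step] = True                 # sparse, sensor-wide
    rc, cc = h_units // 2, w_units // 2
    mask[rc - band:rc + band, ::center_step] = True       # denser horizontal band
    mask[::center_step, cc - band:cc + band] = True       # denser vertical band
    return mask

m = first_microlens_map(60, 80)
print(f"fraction of positions used for focusing: {m.mean():.3f}")
```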
Based on the structure of the dual-core focusing image sensor in fig. 2-4, the following describes a focusing control method of the dual-core focusing image sensor according to an embodiment of the present invention. Fig. 5 is a flowchart of a focus control method of a dual-core focus image sensor according to an embodiment of the present invention, as shown in fig. 5, the method includes the following steps:
and S51, controlling the photosensitive unit array to enter a focusing mode.
When shooting with a camera, if the preview picture is not sharp enough, the photosensitive unit array can be controlled to enter the focusing mode, so that the sharpness of the picture is improved through focusing.
S52, the first phase difference information of the focus photosensitive unit and the second phase difference information of the dual-core focus photosensitive pixels are read.
In an embodiment of the present invention, after entering the focusing mode, the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel may be further read.
Optionally, in an embodiment of the present invention, reading the first phase difference information of the focusing photosensitive unit may include: reading output values of a part of photosensitive pixels in the focusing photosensitive unit as first output values; reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value; and acquiring first phase difference information according to the first output value and the second output value.
It should be noted that, in the embodiment of the present invention, reading the first phase difference information of the focusing photosensitive unit specifically means reading the first phase difference information of the portion of the focusing photosensitive unit covered by the white filter unit.
In the following, with reference to fig. 3 and fig. 6, the focusing photosensitive unit includes 2 × 2 photosensitive pixels, and the white filter unit covers the lower right half of the upper left photosensitive pixel, the upper right half of the lower left photosensitive pixel, the lower left half of the upper right photosensitive pixel, and the upper left half of the lower right photosensitive pixel in the focusing photosensitive unit. The region covered by the white filter unit in the focusing photosensitive unit (the white portion in fig. 3, denoted by W) can be divided in different ways, as shown in fig. 6; the three diagrams in fig. 6 show the division of W into left and right parts, upper and lower parts, and diagonal parts, which are described below:
example one, W is divided from the left and right sides.
In this example, W is divided into two parts, left and right, and the output values of a part of the photosensitive pixels in the focus photosensitive unit may be the output values of two "1" s on the left side of W as the first output value, and the output values of two "2" s on the right side of W as the second output value.
Example two, W is divided from the upper and lower sides.
In this example, W is divided into two upper and lower portions, and the output values of a part of the photosensitive pixels in the focus photosensitive unit may be the output values of two "1" s on the upper side of W as the first output value, and the output values of two "2" s on the lower side of W as the second output value.
Example three, W is divided from the diagonal side.
In this example, W is divided into two parts according to two diagonal lines, that is, two "1" output values at the top left corner and the bottom right corner of W are used as the first output values, and two "2" output values at the bottom left corner and the top right corner are used as the second output values.
In an embodiment of the present invention, after the first output value and the second output value are read, the first phase difference information may be acquired according to the first output value and the second output value.
For example, taking the case where the output values of the two "1"s on the left side of W are read as the first output value and the output values of the two "2"s on the right side are read as the second output value: the sum of the two "1" output values on the left may be calculated as the first phase information, the sum of the two "2" output values on the right as the second phase information, and finally the difference between the first phase information and the second phase information as the first phase difference information.
In the embodiment of the present invention, taking the output values of the left and right sides of the portion covered by the white filter unit in the focusing photosensitive unit as the first and second output values respectively allows the first phase difference information in the left-right direction to be detected; taking the output values of the upper and lower sides of that portion as the first and second output values allows the first phase difference information in the up-down direction to be detected; and taking the output values along the two diagonals of that portion as the first and second output values allows the oblique first phase difference information to be detected.
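The three divisions can be expressed in one small helper (Python). The four W sub-regions are assumed to be read out as a dictionary keyed "tl", "tr", "bl", "br" for the triangles lying in the upper left, upper right, lower left and lower right photosensitive pixels; these names and the function itself are illustrative, not from the patent:

```python
def first_phase_difference(w: dict, direction: str = "horizontal") -> float:
    """First phase difference of the W-covered portion of one focusing unit.

    w maps "tl", "tr", "bl", "br" to the outputs of the four W sub-regions.
    For each direction, the "1" group and the "2" group are summed and the
    difference of the two sums is returned.
    """
    groups = {
        "horizontal": (("tl", "bl"), ("tr", "br")),  # left "1"s vs right "2"s
        "vertical":   (("tl", "tr"), ("bl", "br")),  # upper "1"s vs lower "2"s
        "diagonal":   (("tl", "br"), ("bl", "tr")),  # the two diagonals
    }
    ones, twos = groups[direction]
    first_phase = sum(w[k] for k in ones)    # first phase information
    second_phase = sum(w[k] for k in twos)   # second phase information
    return first_phase - second_phase        # first phase difference information

# Illustrative readings only
print(first_phase_difference({"tl": 10.0, "tr": 8.0, "bl": 11.0, "br": 7.5}))
```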
Optionally, in an embodiment of the present invention, reading the second phase difference information of the dual-core focusing photosensitive pixels may include: reading an output value of the first photodiode as a third output value; reading an output value of the second photodiode as a fourth output value; and acquiring second phase difference information according to the third output value and the fourth output value.
Still taking fig. 3 as an example, the second phase difference information of all the dual-core focusing photosensitive pixels is calculated in the same manner, so only the pixel at Gr in fig. 3 is described. First, the output value of "1" at Gr is read as the third output value; then the output value of "2" at Gr is read as the fourth output value; and the second phase difference information is obtained from the third output value and the fourth output value, for example by calculating the difference between the third output value and the fourth output value.
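For an array of dual-core focusing photosensitive pixels this reduces to a per-pixel subtraction; the sketch below assumes the first- and second-photodiode outputs are available as two NumPy arrays (the array names are ours):

```python
import numpy as np

def second_phase_differences(pd1: np.ndarray, pd2: np.ndarray) -> np.ndarray:
    """Second phase difference for each dual-core focusing photosensitive
    pixel: the third output value (photodiode "1") minus the fourth output
    value (photodiode "2")."""
    return pd1 - pd2

# Illustrative 2 x 2 focus window of Gr pixels
gr1 = np.array([[100.0, 96.0], [98.0, 97.0]])
gr2 = np.array([[ 95.0, 94.0], [97.0, 95.5]])
print(second_phase_differences(gr1, gr2))
```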
And S53, performing focusing control according to the first phase difference information and the second phase difference information.
In the embodiment of the invention, after the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel are read, focusing control can be performed according to the first phase difference information and the second phase difference information.
In the related dual-core focusing technology, a phase difference is usually calculated according to output values of two photodiodes in a dual-core focusing photosensitive pixel, so as to calculate a driving amount and a driving direction of a lens, thereby realizing focusing. In a low light environment, the focusing speed is slow.
In the embodiment of the invention, since a first microlens covers one white filter unit and one white filter unit covers one focusing photosensitive unit, the white filter unit makes it possible to obtain first phase difference information with a larger light throughput in a low-light environment for the focusing processing, which further improves the focusing speed in a low-light environment.
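The patent does not prescribe how the two kinds of phase difference are merged, so the following is only one plausible scheme stated as an assumption: weight the W-based first phase difference more heavily as the scene gets darker, since it comes from the larger light throughput:

```python
def combined_phase_difference(first_pd: float, second_pd: float,
                              scene_luma: float,
                              low_light_luma: float = 50.0) -> float:
    """Hypothetical fusion of the two phase-difference signals.

    scene_luma is an assumed brightness estimate on a 0-255 scale; below
    low_light_luma the first phase difference (from the white filter unit)
    dominates. This weighting is not taken from the patent.
    """
    w = min(max(scene_luma / low_light_luma, 0.0), 1.0)  # 0 in darkness, 1 when bright
    return (1.0 - w) * first_pd + w * second_pd
```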
In the focusing control method of the dual-core focusing image sensor according to the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and, at the same time, the accuracy of color restoration is improved.
It should be understood that the purpose of focusing is to obtain a picture with higher definition. In practical applications, the focusing process is usually followed by a further imaging process; therefore, as shown in fig. 7, on the basis of fig. 5, step S53 is followed by:
and S71, controlling the photosensitive unit array to enter an imaging mode.
In an embodiment of the present invention, after the focus control is completed, the array of photosensitive cells is further controlled to enter an imaging mode.
S72, the photosensitive unit array is controlled to perform exposure, and the output values of the photosensitive unit array are read to obtain the pixel values of the photosensitive unit array so as to generate an image.
The pixel value of the part of the focusing photosensitive unit covered by the white filter unit is obtained by an interpolation restoration algorithm.
In the embodiment of the invention, after the photosensitive unit array enters the imaging mode, the photosensitive unit array is controlled to be exposed, and the output values of the photosensitive unit array are read, so that the pixel values of the photosensitive unit array are obtained to generate an image.
In an embodiment of the present invention, reading the output values of the photosensitive unit array to obtain the pixel values of the photosensitive unit array may include: after the output values of the two photodiodes in a dual-core focusing photosensitive pixel are read, adding the output values of the two photodiodes to obtain the pixel value of the dual-core focusing photosensitive pixel; and, for the part of the focusing photosensitive unit covered by the white filter unit, obtaining its pixel value with an interpolation restoration algorithm, where the interpolation restoration algorithm can be any one of a nearest-neighbor interpolation algorithm, a bilinear interpolation algorithm and a cubic convolution interpolation algorithm.
For simplicity, a nearest neighbor interpolation algorithm may be used to obtain the pixel value of the focusing photosensitive unit, i.e. the gray value of the input pixel closest to the position to which the focusing photosensitive unit is mapped is selected as the interpolation result, i.e. the pixel value of the focusing photosensitive unit.
FIG. 8 is a schematic diagram of an interpolation algorithm for obtaining pixel values of a focus sensing unit.
As shown in fig. 8, in a focusing photosensitive unit including 2 × 2 photosensitive pixels, the white filter unit (the white region in the figure) covers the lower right half of the upper left photosensitive pixel, the upper right half of the lower left photosensitive pixel, the lower left half of the upper right photosensitive pixel, and the upper left half of the lower right photosensitive pixel. In order to output an image of good quality, the output values of the covered portion of each photosensitive pixel need to be restored by interpolation, i.e. the RGB values of the covered portion of each photosensitive pixel need to be obtained by calculation. The average value of the neighboring pixels may be taken as the pixel value of the covered portion of a photosensitive pixel. Taking the calculation of the RGB values at the upper left corner "1" of the white filter unit as an example, and denoting its R pixel value by R10, its G pixel value by G10 and its B pixel value by B10, the calculation formulas are respectively:
R10 = Ra
with G10 and B10 obtained in the same way by averaging the neighboring green and blue pixel values, respectively (the corresponding formulas are given in the original only as equation images).
it should be noted that, the interpolation reduction method for RGB values at the bottom left corner "1", the top right corner "2" and the bottom right corner "2" in the white filter unit is similar to the RGB value reduction method at the top left corner "1", and adjacent pixel points are selected for interpolation reduction, and no example is given here to avoid redundancy.
It should be noted that the above description of the algorithm for obtaining the pixel values of the focusing photosensitive unit is only used to explain the present invention and should not be taken as limiting it. In actual processing, to obtain more accurate pixel values, the pixel values of several nearby pixels can be used for the interpolation restoration rather than only those of the immediately adjacent pixels, with a higher weight assigned to closer pixels and a lower weight to farther ones; that is, the weight of a pixel value in the interpolation restoration algorithm is inversely proportional to its distance from the pixel being restored.
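A minimal sketch of such a distance-weighted restoration, with an illustrative helper and inputs that are not part of the patent text:

```python
def restore_pixel(neighbours: list[tuple[float, float]]) -> float:
    """Inverse-distance-weighted restoration of one covered sub-pixel value.

    neighbours is a list of (pixel_value, distance) pairs taken from nearby
    pixels of the colour channel being restored; closer pixels receive a
    larger weight, i.e. the weight is inversely proportional to the distance.
    """
    weights = [1.0 / max(d, 1e-6) for _, d in neighbours]
    total = sum(weights)
    return sum(w * v for w, (v, _) in zip(weights, neighbours)) / total

# Example: restore an R value from three nearby R pixels at distances 1, 2 and 2
print(restore_pixel([(200.0, 1.0), (190.0, 2.0), (210.0, 2.0)]))
```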
In the embodiment of the invention, after the pixel values of the focusing photosensitive units are restored, the image can be generated according to the pixel values of all the pixel points in the photosensitive unit array.
According to the focusing control method of the dual-core focusing image sensor of the embodiment of the invention, after the focusing control is finished, the photosensitive unit array is controlled to enter the imaging mode and to perform exposure, and the output values of the photosensitive unit array are read to obtain the pixel values of the photosensitive unit array so as to generate an image, which can improve the image quality.
In order to implement the above embodiments, the present invention further provides a dual-core focusing image sensor. Fig. 2 is a cross-sectional view of the dual-core focusing image sensor according to an embodiment of the present invention, and fig. 3 is a top view of the dual-core focusing image sensor according to an embodiment of the present invention.
It should be noted that, the explanation about the dual-core focusing image sensor in the foregoing focusing control method embodiment of the dual-core focusing image sensor is also applicable to the dual-core focusing image sensor in the embodiment of the present invention, and the implementation principle is similar, and details are not described here.
In the dual-core focusing image sensor of the embodiment of the invention, a microlens array comprising first microlenses and second microlenses is provided, one first microlens covers one square white filter unit, the white filter unit covers the middle part of one focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and one second microlens covers one dual-core focusing photosensitive pixel. The light flux of the focusing pixels can thus be increased, providing a hardware basis for improving the focusing speed in a low-light environment and improving the accuracy of color restoration.
In order to implement the above embodiment, the present invention further provides an imaging device, and fig. 9 is a schematic structural diagram of an imaging device according to an embodiment of the present invention.
As shown in fig. 9, the imaging device 900 includes the dual-core focusing image sensor 100 of the above embodiment and a control module 910.
the control module 910 controls the photosensitive cell array to enter a focusing mode, reads first phase difference information of the focusing photosensitive cells and second phase difference information of the dual-core focusing photosensitive pixels, and performs focusing control according to the first phase difference information and the second phase difference information.
Optionally, in an embodiment of the present invention, the control module 910 is configured to read output values of a part of photosensitive pixels in the focusing photosensitive unit as a first output value, read output values of another part of photosensitive pixels in the focusing photosensitive unit as a second output value, and obtain the first phase difference information according to the first output value and the second output value.
It should be noted that, in the embodiment of the present invention, reading the first phase difference information of the focusing photosensitive unit specifically means reading the first phase difference information of the portion of the focusing photosensitive unit covered by the white filter unit.
In an embodiment of the present invention, a dual-core focusing photosensitive pixel in the dual-core focusing image sensor 100 has two photodiodes, a first photodiode and a second photodiode. Therefore, the control module 910 is further configured to read the output value of the first photodiode as a third output value, read the output value of the second photodiode as a fourth output value, and obtain the second phase difference information according to the third output value and the fourth output value.
It should be understood that the purpose of focusing is to obtain a picture with higher definition. In practical applications, the focusing process is usually followed by a further imaging process. Therefore, in an embodiment of the present invention, the control module 910 is further configured to control the photosensitive unit array to enter an imaging mode, control the photosensitive unit array to perform exposure, and read the output values of the photosensitive unit array to obtain the pixel values of the photosensitive unit array so as to generate an image, wherein the pixel value of the portion of the focusing photosensitive unit covered by the white filter unit is obtained through an interpolation restoration algorithm.
In the imaging device of the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and the accuracy of color restoration is improved.
In order to implement the above embodiments, the present invention further provides a mobile terminal, and fig. 10 is a schematic structural diagram of the mobile terminal according to an embodiment of the present invention.
As shown in fig. 10, the mobile terminal 1000 includes a housing 1001, a processor 1002, a memory 1003, a circuit board 1004, and a power supply circuit 1005, wherein the circuit board 1004 is disposed inside a space surrounded by the housing 1001, and the processor 1002 and the memory 1003 are disposed on the circuit board 1004; a power supply circuit 1005 for supplying power to each circuit or device of the mobile terminal; the memory 1003 is used for storing executable program codes; the processor 1002 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 1003 for executing the focus control method of the dual-core focus image sensor in the above-described embodiment.
In the mobile terminal of the embodiment of the invention, a first microlens covers a square white filter unit, the white filter unit covers the middle part of a focusing photosensitive unit with a coverage area of half of the focusing photosensitive unit, and a second microlens covers a dual-core focusing photosensitive pixel. By reading the first phase difference information of the focusing photosensitive unit and the second phase difference information of the dual-core focusing photosensitive pixel and performing focusing control according to the first phase difference information and the second phase difference information, the light flux of the focusing pixels can be increased, so that the focusing speed in a low-light environment is effectively improved and the accuracy of color restoration is improved.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It should be noted that in the description of the present specification, reference to the description of the term "one embodiment", "some embodiments", "an example", "a specific example", or "some examples", etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (19)

1. A focusing control method of a dual-core focusing image sensor, characterized in that the dual-core focusing image sensor comprises a photosensitive unit array, a filter unit array arranged on the photosensitive unit array, and a microlens array located above the filter unit array, wherein the microlens array comprises first microlenses and second microlenses, one first microlens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N × N photosensitive pixels, the N × N photosensitive pixels correspond to one white filter unit, the white filter unit consists of N × N adjacent right-angled triangles, the area of one white filter unit is half of that of one focusing photosensitive unit, one second microlens covers one dual-core focusing photosensitive pixel, the microlens array comprises a horizontal center line, a vertical center line and four edge lines, there are a plurality of first microlenses, and the lens density of the first microlenses on the horizontal center line and the vertical center line is greater than the lens density of the first microlenses on the four edge lines, the method comprising the steps of:
controlling the photosensitive unit array to enter a focusing mode;
reading first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels;
and carrying out focusing control according to the first phase difference information and the second phase difference information.
2. The method of claim 1, wherein reading the first phase difference information of the focus photosensitive unit comprises:
reading output values of a part of photosensitive pixels in the focusing photosensitive unit and taking the output values as first output values;
reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value;
and acquiring the first phase difference information according to the first output value and the second output value.
3. The method of claim 1, wherein the dual-core focusing photosensitive pixel has two photodiodes, a first photodiode and a second photodiode, and reading the second phase difference information of the dual-core focusing photosensitive pixel comprises:
reading an output value of the first photodiode as a third output value;
reading an output value of the second photodiode as a fourth output value;
and acquiring the second phase difference information according to the third output value and the fourth output value.
4. The method of claim 1, wherein the dual-core focusing photosensitive pixels are arranged in a Bayer array.
5. The method of claim 1, wherein the plurality of first microlenses comprises:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
6. The method of claim 5, wherein the plurality of first microlenses further comprises:
and a third group of first microlenses arranged on the four edge lines.
7. The method of claim 6, wherein the first set of first microlenses and the second set of first microlenses have a lens density greater than a lens density of the third set of first microlenses.
8. The method of claim 1, wherein the white filter unit covers a lower right half of an upper left photosensitive pixel, an upper right half of a lower left photosensitive pixel, a lower left half of an upper right photosensitive pixel, and an upper left half of a lower right photosensitive pixel in the focusing photosensitive unit, the method further comprising:
controlling the photosensitive unit array to enter an imaging mode;
and controlling the photosensitive unit array to perform exposure, and reading an output value of the photosensitive unit array to obtain a pixel value of the photosensitive unit array so as to generate an image, wherein the pixel value of the part of the focusing photosensitive unit covered by the white filter unit is obtained by an interpolation restoration algorithm.
9. A dual-core focusing image sensor, comprising:
an array of photosensitive cells;
the light filtering unit array is arranged on the photosensitive unit array;
a micro lens array positioned above the filter unit array;
wherein the microlens array comprises first microlenses and second microlenses, one first microlens covers one white filter unit, one white filter unit covers one focusing photosensitive unit, the focusing photosensitive unit comprises N × N photosensitive pixels, the N × N photosensitive pixels correspond to one white filter unit, the white filter unit is composed of N × N adjacent right-angled triangles, the area of one white filter unit is half of that of one focusing photosensitive unit, one second microlens covers one dual-core focusing photosensitive pixel, the microlens array comprises a horizontal center line, a vertical center line and four edge lines, there are a plurality of first microlenses, and the lens density of the first microlenses on the horizontal center line and the vertical center line is greater than the lens density of the first microlenses on the four edge lines.
10. The dual-core focusing image sensor of claim 9, wherein the dual-core focusing photosensitive pixels are arranged in a Bayer array.
11. The dual-core focusing image sensor as claimed in claim 9, wherein the plurality of first microlenses includes:
a first set of first microlenses disposed at the horizontal centerline; and
a second set of first microlenses disposed at the vertical centerline.
12. The dual-core focusing image sensor as claimed in claim 11, wherein the plurality of first microlenses further includes:
and a third group of first microlenses arranged on the four edge lines.
13. The dual-core focusing image sensor as claimed in claim 12, wherein the lens density of the first group of first microlenses and the second group of first microlenses is greater than the lens density of the third group of first microlenses.
14. The dual-core focusing image sensor as claimed in claim 9, wherein the white filter unit covers a lower right half of an upper left photosensitive pixel, an upper right half of a lower left photosensitive pixel, a lower left half of an upper right photosensitive pixel, and an upper left half of a lower right photosensitive pixel in the focusing photosensitive unit.
15. An imaging device, comprising:
the dual-core focusing image sensor of any one of claims 9 to 14; and
a control module configured to: control the photosensitive unit array to enter a focusing mode;
read first phase difference information of the focusing photosensitive unit and second phase difference information of the dual-core focusing photosensitive pixels;
and perform focusing control according to the first phase difference information and the second phase difference information.
16. The imaging device of claim 15, wherein the control module is specifically configured to:
reading output values of a part of photosensitive pixels in the focusing photosensitive unit and taking the output values as first output values;
reading the output value of the other part of photosensitive pixels in the focusing photosensitive unit as a second output value;
and acquiring the first phase difference information according to the first output value and the second output value.
17. The imaging device of claim 15, wherein the dual-core focusing photosensitive pixel has two photodiodes, a first photodiode and a second photodiode, and the control module is specifically configured to:
reading an output value of the first photodiode as a third output value;
reading an output value of the second photodiode as a fourth output value;
and acquiring the second phase difference information according to the third output value and the fourth output value.
18. The imaging device of claim 15, wherein the control module is further configured to:
controlling the photosensitive unit array to enter an imaging mode;
and controlling the photosensitive unit array to perform exposure, and reading an output value of the photosensitive unit array to obtain a pixel value of the photosensitive unit array so as to generate an image, wherein the pixel value of the part of the focusing photosensitive unit covered by the white filter unit is obtained by an interpolation restoration algorithm.
19. A mobile terminal, comprising a housing, a processor, a memory, a circuit board and a power supply circuit, wherein the circuit board is disposed inside a space enclosed by the housing, and the processor and the memory are disposed on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the mobile terminal; the memory is used for storing executable program code; and the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory, so as to perform the focusing control method of the dual-core focusing image sensor according to any one of claims 1 to 8.
CN201710296855.7A 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device Active CN107124536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710296855.7A CN107124536B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710296855.7A CN107124536B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Publications (2)

Publication Number Publication Date
CN107124536A CN107124536A (en) 2017-09-01
CN107124536B true CN107124536B (en) 2020-05-08

Family

ID=59726471

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710296855.7A Active CN107124536B (en) 2017-04-28 2017-04-28 Dual-core focusing image sensor, focusing control method thereof and imaging device

Country Status (1)

Country Link
CN (1) CN107124536B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108600712B (en) 2018-07-19 2020-03-31 维沃移动通信有限公司 Image sensor, mobile terminal and image shooting method
CN109922270A (en) * 2019-04-17 2019-06-21 德淮半导体有限公司 Phase focus image sensor chip
CN111586323A (en) * 2020-05-07 2020-08-25 Oppo广东移动通信有限公司 Image sensor, control method, camera assembly and mobile terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104241310A (en) * 2014-09-23 2014-12-24 上海集成电路研发中心有限公司 CMOS image pixel array with two-lenticule layer
CN106358026A (en) * 2015-07-15 2017-01-25 三星电子株式会社 Image sensor including auto-focusing pixel and image processing system including the same

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10044959B2 (en) * 2015-09-24 2018-08-07 Qualcomm Incorporated Mask-less phase detection autofocus

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104241310A (en) * 2014-09-23 2014-12-24 上海集成电路研发中心有限公司 CMOS image pixel array with two-lenticule layer
CN106358026A (en) * 2015-07-15 2017-01-25 三星电子株式会社 Image sensor including auto-focusing pixel and image processing system including the same

Also Published As

Publication number Publication date
CN107124536A (en)

Similar Documents

Publication Publication Date Title
CN107040724B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN106982328B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN107105140B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN107146797B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
EP3396942B1 (en) Image sensor, imaging method and electronic device
US11082605B2 (en) Method of photography processing for camera module, terminal, using same and storage medium implementing same
US10397465B2 (en) Extended or full-density phase-detection autofocus control
CN106982329B (en) Image sensor, focusing control method, imaging device and mobile terminal
CN107040702B (en) Image sensor, focusing control method, imaging device and mobile terminal
WO2018196703A1 (en) Image sensor, focusing control method, imaging device and mobile terminal
CN107124536B (en) Dual-core focusing image sensor, focusing control method thereof and imaging device
CN106921823A (en) Imageing sensor, camera module and terminal device
WO2018137773A1 (en) Method and device for blind correction of lateral chromatic aberration in color images
JP5526984B2 (en) Image processing apparatus, image processing method, computer program for image processing, and imaging apparatus
JP2018064249A (en) Imaging apparatus and control method thereof
CN116506745A (en) Image forming apparatus and method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: GUANGDONG OPPO MOBILE TELECOMMUNICATIONS CORP., Ltd.

GR01 Patent grant
GR01 Patent grant