WO2023020527A1 - Image processing method and apparatus, electronic device, and readable storage medium - Google Patents

Image processing method and apparatus, electronic device, and readable storage medium

Info

Publication number
WO2023020527A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
exposure
pixels
pixel
control path
Prior art date
Application number
PCT/CN2022/112970
Other languages
English (en)
French (fr)
Inventor
黄春成
Original Assignee
维沃移动通信(杭州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信(杭州)有限公司 filed Critical 维沃移动通信(杭州)有限公司
Publication of WO2023020527A1 publication Critical patent/WO2023020527A1/zh

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/67 Focus control based on electronic image sensor signals
    • H04N 23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • The present application belongs to the technical field of image processing, and in particular relates to an image processing method and apparatus, an electronic device, and a readable storage medium.
  • With the rapid development of focusing technology, electronic devices can perform phase detection (PD) focusing, which is achieved through phase detection.
  • The imaging pixels are RGB pixels, that is, red, green and blue pixels. As shown in FIG. 1, the imaging pixels include GR pixels 61, R pixels 62, B pixels 63, and GB pixels 64.
  • For example, the image sensor can be applied to a scene of shooting a human face. Since a human face is more sensitive to G pixels, PD pixels can be added on the G channel, that is, at the G pixels of the red-green channel (the GR pixels 61) and the G pixels of the blue-green channel (the GB pixels 64), wherein the PD pixels may include left (L) pixels and right (R) pixels.
  • As can be seen from FIG. 2, the image sensor here adds PD pixels (indicated by L and R) on the basis of the imaging pixels in FIG. 1. The same reference numerals in FIG. 1 and FIG. 2 represent the same objects.
  • The arrangement of the PD pixels is not limited to that shown in FIG. 2; the PD pixels are used to assist focusing. Based on the outputs of the L pixels and R pixels, the phase difference (phase diff) of the focus area can be calculated to achieve phase focusing.
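  • The following is a minimal, illustrative sketch (not taken from the patent) of one common way to estimate such a phase difference: a sum-of-absolute-differences search over candidate shifts between the L-pixel and R-pixel intensity profiles of the focus area. The function name and search range are assumptions made for illustration.

```python
import numpy as np

def phase_difference(left: np.ndarray, right: np.ndarray, max_shift: int = 8) -> int:
    """Estimate the shift (phase difference) between L-pixel and R-pixel profiles.

    A sum-of-absolute-differences (SAD) search: the candidate shift that best
    aligns the two 1-D intensity profiles is returned as the phase difference.
    """
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        l = left[max(0, shift):len(left) + min(0, shift)]
        r = right[max(0, -shift):len(right) + min(0, -shift)]
        if l.size == 0:
            continue
        cost = np.abs(l.astype(np.int32) - r.astype(np.int32)).mean()
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift

# A defocused edge appears displaced between the L and R profiles.
left = np.array([10, 10, 10, 200, 200, 200, 10, 10, 10, 10], dtype=np.uint16)
right = np.roll(left, 2)                  # simulate a 2-pixel disparity
print(phase_difference(left, right))      # -> -2 (sign depends on the chosen convention)
```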
  • In the related art, the imaging pixels and the PD pixels of the sensor are configured with the same set of exposure parameters, so adjusting the exposure parameters only from the dimension of the PD focusing effect or only from the dimension of the imaging effect may degrade performance in the other dimension.
  • The mainstream image control method mainly adjusts the unified exposure parameter with reference to the imaging effect of the imaging pixels in the Sensor. For example, in a backlit scene, in order to ensure better imaging quality of the background, the uniform exposure parameters of the imaging pixels and PD pixels are adjusted to be smaller, which makes the face in the focus area relatively dark.
  • Post-processing of the image can improve the image quality of the face, but the phase values in the focus area remain very low and the signal-to-noise ratio is poor, which degrades the focusing quality of the face area containing the PD pixels.
  • Therefore, the image processing method in the related art has the problem that the focusing quality based on PD focusing is degraded.
  • the purpose of the embodiments of the present application is to provide an image processing method, device, electronic device, and readable storage medium, which can solve the problem of degradation of focusing quality based on PD focusing in image processing methods in the related art.
  • the embodiment of the present application provides an image processing method, the method comprising:
  • the candidate PD pixels are pixels used for phase focusing in the image sensor
  • the candidate PD pixels include target PD pixels
  • the first exposure parameter is controlled by the first control path
  • the second exposure parameter of the imaging pixel in the image sensor is controlled by a second control path
  • the first control path is different from the second control path
  • the image sensor is connected to the first control path and the second control path, respectively
  • an image processing device which includes:
  • the first determination module is configured to determine the first exposure parameter according to the shooting scene and the brightness information of the target PD pixel in the focus area of the image sensor;
  • a first generating module configured to perform a first exposure on candidate PD pixels in the image sensor by using the first exposure parameters configured by the first control path, and generate a first image, wherein the candidate PD pixels are pixels used for phase focusing in the image sensor, the candidate PD pixels include the target PD pixel, and the first exposure parameter is controlled by the first control path;
  • a focusing module configured to perform phase focusing on the shooting scene based on the first image
  • the second exposure parameter of the imaging pixel in the image sensor is controlled by a second control path
  • the first control path is different from the second control path
  • the image sensor is connected to the first control path and the second control path, respectively
  • An embodiment of the present application provides an electronic device. The electronic device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor, and when the program or instruction is executed by the processor, the steps of the method described in the first aspect are implemented.
  • An embodiment of the present application provides a readable storage medium on which a program or instruction is stored, and when the program or instruction is executed by a processor, the steps of the method described in the first aspect are implemented.
  • An embodiment of the present application provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run programs or instructions so as to implement the method described in the first aspect.
  • In the embodiment of the present application, the parameter control manner of the imaging pixels and the PD pixels in the image sensor is modified. By independently setting the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels in the image sensor, the exposure parameters of the PD pixels can be used to expose the PD pixels in the image sensor to generate the first image, and the first image is used to perform phase focusing. In this way, while the exposure parameters of the imaging pixels ensure the imaging quality, the PD focusing effect can be improved and the PD performance optimized, which solves the problem in the related art that the configurations of the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels are not independent, so that the imaging effect and the PD focusing effect cannot both be optimal.
  • FIG. 1 is a first schematic diagram of a pixel layout of an image sensor in the prior art;
  • FIG. 2 is a second schematic diagram of a pixel layout of an image sensor in the prior art;
  • FIG. 3 is a third schematic diagram of a pixel layout of an image sensor in the prior art;
  • FIG. 4 is a fourth schematic diagram of a pixel layout of an image sensor in the prior art;
  • FIG. 5 is a schematic diagram of a pixel layout of an image sensor according to an embodiment of the present application;
  • FIG. 6 is a flowchart of an image processing method according to an embodiment of the present application;
  • FIG. 7 is a first schematic diagram of a pixel output mode of an image sensor in the prior art;
  • FIG. 8 is a second schematic diagram of a pixel output mode of an image sensor in the prior art;
  • FIG. 9 is a first schematic diagram of a pixel output mode of an image sensor according to an embodiment of the present application;
  • FIG. 10 is a second schematic diagram of a pixel output mode of an image sensor according to an embodiment of the present application;
  • FIG. 11 is a block diagram of an image processing device according to an embodiment of the present application;
  • FIG. 12 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application;
  • FIG. 13 is a schematic diagram of a hardware structure of an electronic device according to another embodiment of the present application.
  • In the related art, the exposure parameters configured for the imaging pixels and the PD pixels of the Sensor are the same set of parameters, that is, the exposure parameters of the PD pixels are consistent with the exposure parameters of the imaging pixels. Compared with FIG. 1, FIG. 3 shows the setting of the exposure parameters of each imaging pixel in a Sensor without PD pixels; the vertical line 11 in each pixel grid in FIG. 3 represents the exposure parameter of the pixel in that grid, and not every exposure parameter is marked with a reference sign.
  • The pixel layout in FIG. 3 is the same as that in FIG. 1, and only the schematic lines of the exposure parameters are additionally shown in FIG. 3. It can be seen from FIG. 3 that the exposure parameters of the imaging pixels are all the same.
  • FIG. 4 shows the settings of the exposure parameters of the imaging pixels and the PD pixels in a Sensor with PD pixels. In FIG. 4, the vertical line 21 in each pixel grid represents the exposure parameter of the pixel in that grid, and the exposure parameters of the individual imaging pixels and PD pixels are not all marked here.
  • The pixel layout of FIG. 4 is the same as that of FIG. 2; likewise, only the schematic lines for the exposure parameters are additionally shown in FIG. 4. It can be seen from FIG. 4 that the exposure parameters of the imaging pixels and the PD pixels in the Sensor are the same.
  • an embodiment of the present application provides an image processing method, and the image processing method of each embodiment will be described in detail below with reference to the accompanying drawings.
  • FIG. 6 shows a flowchart of an image processing method according to an embodiment of the present application, and the method may specifically include the following steps:
  • Step 101: determining a first exposure parameter according to the shooting scene and the brightness information of the target phase pixel (that is, the target PD pixel) in the focus area of the image sensor.
  • The PD pixels can include L pixels and R pixels. For a pixel in the image sensor, if a metal covering is added over half of the pixel, the pixel whose left half is covered can only receive light from the left and is called an L pixel; similarly, the pixel whose right half is covered can only receive light from the right and is called an R pixel. In the image sensor, L pixels and R pixels appear in pairs at adjacent positions, as shown in FIG. 2, FIG. 4 and FIG. 5.
  • The exposure parameters may include, but are not limited to, integration time (INT), analog gain, digital gain, and so on.
  • The integration time represents the exposure time in units of lines.
  • For example, an INT of 159 means that the image sensor (Sensor) has an exposure time of 159 lines.
  • The integration time and the exposure time both represent how long the Sensor is exposed, but the integration time is a relative quantity, that is, it is expressed in lines, and the absolute time occupied by each line depends on the clock frequency and on how many pclks each line contains (i.e. the line length), whereas the exposure time refers to the absolute exposure time of the Sensor.
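  • As a concrete illustration of this relationship, the absolute exposure time can be recovered from the integration time as follows; the pixel clock and line length values are made-up numbers used only for the example.

```python
def exposure_time_seconds(integration_lines: int, line_length_pclk: int, pclk_hz: float) -> float:
    """Convert an integration time given in lines into an absolute exposure time.

    integration_lines : INT value, i.e. exposure time in line units
    line_length_pclk  : number of pixel clocks (pclk) per line (line length)
    pclk_hz           : pixel clock frequency in Hz
    """
    line_time_s = line_length_pclk / pclk_hz      # absolute duration of one line
    return integration_lines * line_time_s

# Illustrative numbers only: INT = 159 lines, 4000 pclk per line, 200 MHz pixel clock.
print(exposure_time_seconds(159, 4000, 200e6))    # -> 0.00318 s, i.e. about 3.2 ms
```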
  • the value of the second exposure parameter is different from the value of the first exposure parameter.
  • For example, when the exposure parameter is the integration time, the value of the integration time for exposing the PD pixels is different from the value of the integration time for exposing the imaging pixels.
  • The second exposure parameter is the exposure parameter used for the second exposure of the imaging pixels in the image sensor; that is to say, in the embodiment of the present application, the imaging pixels and the PD pixels in the Sensor can be configured with independent exposure parameters.
  • Compared with FIG. 4, in which the imaging pixels and the PD pixels share the same exposure parameter 21, refer to FIG. 5 of the embodiment of the present application. The pixel layout in FIG. 5 is the same as that in FIG. 2 and FIG. 4; in order to show the lines representing the exposure parameters more clearly, the reference numerals of the imaging pixels and the PD pixels (the L pixels and R pixels in FIG. 5) are not repeated.
  • The pixel layout can be understood with reference to FIG. 2 and FIG. 4; that is, in FIG. 5, each pixel grid other than the L pixels and R pixels represents an imaging pixel.
  • The exposure parameter 41 can be used to control the exposure of the L pixels and R pixels among the PD pixels in the Sensor, and the vertical line in each pixel grid containing an L pixel or an R pixel in FIG. 5 represents the exposure parameter 41.
  • Each imaging pixel shown in FIG. 5 is exposed using the exposure parameter 42, and the vertical line in the pixel grid of each imaging pixel in FIG. 5 represents the exposure parameter 42.
  • Not every imaging pixel's exposure parameter in FIG. 5 is marked with a reference sign; the exposure parameters of the imaging pixels in the second to eighth rows can refer to the exposure parameter 42 shown for the imaging pixels in the first row.
  • The arrows in FIG. 5 indicate the pixels of each row in the Sensor that are not shown.
  • The value of the second exposure parameter used to control the exposure of the imaging pixels may be the value that optimizes the imaging effect, while the value of the first exposure parameter used to control the exposure of the PD pixels may be the value that makes the phase focusing effect of the first image corresponding to the exposed PD pixels optimal; that is, the first image generated after exposing the PD pixels with the value of the first exposure parameter can be used to improve and optimize the PD focusing effect and thus the PD performance.
  • The value of the first exposure parameter that can improve and optimize the PD focusing effect can be determined by combining the current shooting scene of the Sensor and the brightness information of the target PD pixels in the focus area of the Sensor, wherein the target PD pixels are the PD pixels within the focus area.
  • In order to determine a value of the first exposure parameter that improves and optimizes the PD focusing effect and the PD performance, step 101 may be implemented as follows: determine a preset brightness condition that matches the shooting scene; then, if the brightness information of the target PD pixels in the focus area of the image sensor does not meet the preset brightness condition, determine the value of the first exposure parameter that matches the preset brightness condition.
  • Many experiments may be done in advance to determine, for various shooting scenes, the preset brightness conditions of the PD pixels in the focus area that optimize the PD focusing effect and improve the PD performance.
  • The preset shooting scenes may include, but are not limited to, backlight scenes, point light source scenes, portrait scenes, and so on, and a corresponding preset brightness condition is configured for each preset shooting scene.
  • The preset brightness condition may be a preset brightness range, or a first threshold indicating the allowed number of overexposed PD pixels in the focus area together with a second threshold indicating the allowed number of underexposed PD pixels in the focus area.
  • For example, when the current shooting scene is a backlight scene, the adjusted value of the exposure parameter can be determined based on the preset brightness range: because there is a known relationship between the exposure parameter and the brightness values of the PD pixels in the focus area, the value of the first exposure parameter that brings the brightness into the preset range can be derived inversely and used as the value of the first exposure parameter for exposing the PD pixels of the shooting scene during this focusing.
  • Alternatively, a histogram of the brightness values of the PD pixels in the focus area can be obtained, and the histogram is used to determine whether the number of underexposed PD pixels in the focus area is less than or equal to the second threshold and whether the number of overexposed PD pixels in the focus area is less than or equal to the first threshold. If both conditions hold, the brightness information of the PD pixels in the focus area meets the preset brightness condition, and there is no need to adjust the exposure parameter of the PD pixels in the Sensor. If the number of underexposed PD pixels is greater than the second threshold, or the number of overexposed PD pixels is greater than the first threshold, the brightness information of the target PD pixels does not meet the preset brightness condition and the PD performance is not optimal, so the exposure parameter of the PD pixels is adjusted. The adjusted value can be determined based on the first threshold and the second threshold: because of the relationship between the exposure parameter and the brightness values of the PD pixels in the focus area, the value of the first exposure parameter that satisfies the first threshold and the second threshold can be derived inversely, and the derived value is used as the value for exposing the PD pixels of the shooting scene during this focusing.
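  • A minimal sketch of this threshold check follows; the brightness levels, bit depth and threshold values are illustrative assumptions, and in practice the counts would typically be read off a brightness histogram of the focus area.

```python
import numpy as np

def pd_exposure_ok(pd_brightness: np.ndarray,
                   over_level: int, under_level: int,
                   max_over: int, max_under: int) -> bool:
    """Return True if the target PD pixels meet the preset brightness condition.

    over_level / under_level : brightness levels counting a PD pixel as over-
                               or underexposed (assumed 10-bit values here)
    max_over  : first threshold  - allowed number of overexposed PD pixels
    max_under : second threshold - allowed number of underexposed PD pixels
    """
    n_over = int((pd_brightness >= over_level).sum())    # overexposed count
    n_under = int((pd_brightness <= under_level).sum())  # underexposed count
    return n_over <= max_over and n_under <= max_under

# If the check fails, a new first exposure parameter is derived (from the known
# relation between exposure and brightness) so that the next first exposure
# brings both counts back under the thresholds.
focus_area_pd = np.random.default_rng(0).integers(0, 1024, size=500)
print(pd_exposure_ok(focus_area_pd, over_level=1000, under_level=16,
                     max_over=20, max_under=20))
```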
  • In this way, the preset brightness condition of the PD pixels in the focus area that matches the shooting scene can be determined flexibly in combination with the shooting scene. If the brightness information of the target PD pixels in the focus area of the image sensor (that is, the PD pixels in the current focus area) does not meet the preset brightness condition, the value of the exposure parameter currently used for PD pixel exposure cannot make the PD focusing performance optimal, and it is necessary to determine the value of the first exposure parameter that matches the preset brightness condition, where this value is such that the brightness information of the target PD pixels after the first exposure conforms to the preset brightness condition. Therefore, the embodiment of the present application can accurately determine, in combination with the shooting scene, a value of the first exposure parameter for exposing the PD pixels that suits the shooting scene and optimizes the PD focusing effect. This not only improves the accuracy of the exposure parameter value used for PD pixel exposure, but also allows the value to be adjusted reasonably and flexibly as the shooting scene changes, so that the PD focusing performance can be optimized in various shooting scenes.
  • Step 102: performing a first exposure on candidate phase pixels in the image sensor by using the first exposure parameter configured through the first control path to generate a first image, wherein the candidate phase pixels are pixels used for phase focusing in the image sensor, the candidate phase pixels include the target phase pixels, and the first exposure parameter is controlled through the first control path.
  • The candidate phase pixels are, in other words, the candidate PD pixels.
  • The candidate PD pixels can be all the PD pixels in the Sensor, or only the PD pixels in the focus area of the Sensor; in either case, the candidate PD pixels include the target PD pixels.
  • Because the focus area can change flexibly, instead of exposing only the target PD pixels in the current focus area of the Sensor with the first exposure parameter, every PD pixel in the Sensor (in this case the candidate PD pixels are all the PD pixels in the Sensor) can be uniformly exposed to generate one frame of the first image. In this way, even if the focus area changes, the image information of the PD pixels corresponding to the changed focus area can still be determined from the first image.
  • Since the Sensor includes not only PD pixels but also imaging pixels, it should be understood that only the PD pixels in the Sensor are exposed here to generate the first image.
  • In the following, the case where the first image is generated by exposing the PD pixels in the focus area is taken as an example for description.
  • For example, when the shooting scene contains a point light source, the focus area is a human face, and the brightness of the face area is relatively dark, the related art would configure the imaging pixels with a higher exposure parameter, which causes the area of the point light source to be overexposed.
  • The method of the embodiment of the present application can instead reduce the exposure parameter of the PD pixels in the focus area, that is, the point light source area, rather than using the higher exposure parameter of the imaging pixels, thereby alleviating the overexposure problem of the point light source area.
  • In other words, the exposure parameters of the PD pixels can be configured individually so that the exposed PD pixels perform optimally during PD focusing.
  • The second exposure parameter of the imaging pixels in the image sensor is controlled through a second control path, wherein the first control path is different from the second control path, and the image sensor is connected to the first control path and the second control path, respectively.
  • When the exposure parameters of the imaging pixels and the PD pixels are configured separately, this may be realized by separating the semiconductor hardware paths, that is, by connecting the image sensor to different semiconductor hardware paths, through which the image sensor can communicate with the back-end controller.
  • The first exposure parameter can be configured for the first control path between the image sensor and the controller, and the second exposure parameter can be configured for the second control path between the image sensor and the controller. Considering that both the imaging effect and the focusing effect need to be taken into account, the values of these two sets of exposure parameters are different in most cases.
  • In this way, the exposure parameters of the PD pixels and of the imaging pixels can be controlled separately through the first control path and the second control path, so as to ensure the imaging effect while optimizing the PD focusing performance.
  • In addition, the imaging pixels in the image sensor may be exposed using the second exposure parameter to generate the imaging image.
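  • The sketch below illustrates the idea of two independently configured control paths; the register addresses, the ControlPath wrapper and the dummy bus writer are invented for illustration only and do not correspond to any real sensor's register map.

```python
from dataclasses import dataclass

@dataclass
class ExposureParams:
    integration_lines: int   # INT, exposure time in line units
    analog_gain: float
    digital_gain: float

class ControlPath:
    """Hypothetical wrapper around one hardware control path to the sensor.

    `bus_writer` stands in for whatever bus transaction (e.g. I2C/I3C) the real
    sensor uses; the register offsets below are made up for illustration.
    """
    def __init__(self, bus_writer, base_addr: int):
        self.write_register = bus_writer
        self.base = base_addr

    def apply(self, params: ExposureParams) -> None:
        self.write_register(self.base + 0x00, params.integration_lines)
        self.write_register(self.base + 0x02, int(params.analog_gain * 256))
        self.write_register(self.base + 0x04, int(params.digital_gain * 256))

def dummy_writer(addr, value):          # placeholder for a real bus write
    print(f"reg 0x{addr:04x} <- {value}")

pd_path = ControlPath(dummy_writer, base_addr=0x3500)       # first control path (PD pixels)
imaging_path = ControlPath(dummy_writer, base_addr=0x3600)  # second control path (imaging pixels)

# Different values for the two paths: focusing quality vs. imaging quality.
pd_path.apply(ExposureParams(integration_lines=80, analog_gain=2.0, digital_gain=1.0))
imaging_path.apply(ExposureParams(integration_lines=159, analog_gain=1.0, digital_gain=1.0))
```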
  • Step 103: performing phase focusing on the shooting scene based on the first image.
  • When the first image is the exposure image of all PD pixels in the Sensor, a partial image belonging to the focus area can be determined within the first image, and phase focusing on the shooting scene is performed using this partial image. Since the focus area may change, this approach can accurately obtain the exposure image of the PD pixels corresponding to the latest focus area to assist phase focusing.
  • When the first image is the exposure image of only the PD pixels in the focus area of the Sensor, the first image can be directly used for phase focusing on the shooting scene.
  • In the embodiment of the present application, the parameter control manner of the imaging pixels and the PD pixels in the image sensor is modified. By independently setting the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels in the image sensor, the exposure parameters of the PD pixels can be used to expose the PD pixels in the image sensor to generate the first image, and the first image is used for phase focusing. In this way, while the exposure parameters of the imaging pixels ensure the imaging quality, the PD focusing effect can be improved and the PD performance optimized, which solves the problem in the related art that the configurations of the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels are not independent, so that the imaging effect and the PD focusing effect cannot both be optimal.
  • the method according to the embodiment of the present application may further include:
  • Step 100: determining, according to the first exposure time in the first exposure parameter, the first frame rate corresponding to the first exposure.
  • The embodiment of the present application does not limit the execution sequence of step 100 and step 101.
  • the first frame rate is higher than the second frame rate corresponding to the second exposure
  • the second frame rate corresponding to the imaging pixels may be separated from the first frame rate corresponding to the PD pixels, instead of using a uniform frame rate for pixel exposure.
  • The frame rates can be controlled separately, so that the first frame rate for exposing the PD pixels can be set higher than the second frame rate for exposing the imaging pixels.
  • When the first frame rate corresponding to the exposure of the PD pixels is higher than the second frame rate corresponding to the exposure of the imaging pixels used for display, the focusing speed can be increased.
  • The first frame rate and the first exposure parameter can be configured for the first control path; then, using the first frame rate and the first exposure parameter configured through the first control path, the first exposure is performed on the candidate PD pixels in the image sensor to generate multiple frames of the first image.
  • phase focusing may be performed multiple times on the shooting scene based on each frame of the first image in the multiple frames of the first image.
  • For example, if the first frame rate of the PD pixels is 50 frames/s, 50 exposures can be performed on the PD pixels within one second; each time the PD pixels complete one exposure, one frame of the first image is output, and this first image can be used for phase focusing.
  • For another example, if the imaging pixels output one frame of image every 30 ms, and the frame rate of the PD pixels is increased so that, for example, one frame is output every 15 ms, then the PD pixels can be exposed twice within 30 ms and output two frames of the first image, so that two focusing operations can be completed within 30 ms, which makes phase focusing using PD pixels faster.
  • In the embodiment of the present application, the frame rate of the imaging pixels and the frame rate of the PD pixels can be controlled independently. By setting the two frame rates to different values, specifically by setting the first frame rate corresponding to the PD pixels higher than the second frame rate of the imaging pixels, the PD pixels can be exposed multiple times at the first frame rate during the time in which the imaging pixels are exposed once at the lower second frame rate, so that each output frame of the first image of the PD pixels completes one phase focusing, thereby completing multiple phase focusing operations.
  • That is, because the frame rate of the imaging pixels can differ from the frame rate of the PD pixels, a higher frame rate can be set specifically for the PD pixels, and multiple PD focusing operations can be completed quickly through the high-speed frame rate of the PD pixels within the duration of outputting one frame of the image for display (the image generated by exposing the imaging pixels).
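  • Using the numbers from the example above, the relationship between the two frame rates and the number of focusing operations per display frame works out as follows (a simple arithmetic sketch, not production code):

```python
def pd_focus_ops_per_imaging_frame(imaging_frame_ms: float, pd_frame_ms: float) -> int:
    """How many PD exposures (and therefore phase-focus attempts) fit into the
    time taken to expose and output one imaging frame used for display."""
    return int(imaging_frame_ms // pd_frame_ms)

# Numbers from the example above: 30 ms per imaging frame, 15 ms per PD frame.
print(pd_focus_ops_per_imaging_frame(30.0, 15.0))   # -> 2 focusing operations per imaging frame

# Equivalent in frame-rate terms: a 50 frame/s first frame rate means the PD
# pixels can be exposed 50 times per second, one phase focus per PD frame.
```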
  • the method according to the embodiment of the present application may further include:
  • Not only can the first frame rate of the PD pixels used for focusing be set higher than the second frame rate of the imaging pixels used for display so as to increase the focusing speed; the frame rates can also be separated and the timing of exposing the PD pixels configured to lie before the timing of exposing the imaging pixels, so as to alleviate the image distortion or picture stretching caused by focusing during sensor exposure.
  • The reason for the above image distortion and stretching caused by focusing is that the lens needs to be moved for focusing, and if the image exposure of the focus area is performed during the focusing process, the moving lens distorts or stretches the image being exposed. To solve this problem, since the duration of one phase focusing on the shooting scene based on the first image can be determined, that is, the duration of each focusing is known, the time at which the PD pixels are exposed can be controlled: before the time at which the imaging pixels are exposed this time, the PD pixels used for focusing are exposed, and after the exposure of the PD pixels is completed, the focal length is determined based on the first image generated by the exposure.
  • Thus, before the lens moves, the PD pixels have already completed their exposure.
  • After the lens has moved, the imaging pixels complete their exposure, so that no image is being exposed while the lens is moving. The exposure of both the PD pixels and the imaging pixels is therefore not affected by the moving lens, and no image stretching or deformation occurs during focusing.
  • The second image in this embodiment is used for display rather than for focusing. Since not all pixels in the Sensor are imaging pixels (the Sensor contains both imaging pixels and PD pixels), the PD pixels can also be used for display when generating the second image; that is, the second image can be generated by exposing all the imaging pixels and all the PD pixels in the Sensor. However, because the PD pixels have low definition when used for display, the image quality at the PD pixel positions in the second image will be poor.
  • Alternatively, the PD pixels at the pixel positions that originally contained PD pixels in the Sensor (referred to as target positions) can be removed, and the pixel value of each target position can be generated from the pixel values of the imaging pixels around that target position; the newly generated pixel value of each target position is used as the pixel value of an imaging pixel at that position to fill in the target position. Then, after the pixel values of the target positions have been filled in, all the imaging pixels in the Sensor, together with the supplemented pixel values at the target positions, are used to generate the second image.
  • Since the output second image is used for display, the second image generated in this way has higher definition and better image quality at each target position.
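  • A minimal sketch of filling the target positions for the display image is shown below; it assumes a single-channel raw frame and a boolean PD-pixel mask, and uses a plain 4-neighbour average purely for illustration (a real pipeline would interpolate within the same colour plane of the Bayer pattern).

```python
import numpy as np

def fill_pd_positions(raw: np.ndarray, pd_mask: np.ndarray) -> np.ndarray:
    """Replace the values at PD-pixel positions (target positions) with values
    interpolated from the surrounding imaging pixels, for the display image only.

    raw     : single-channel raw frame containing imaging and PD pixel values
    pd_mask : boolean array, True where the Sensor has a PD pixel
    """
    out = raw.astype(np.float32).copy()
    h, w = raw.shape
    for y, x in zip(*np.nonzero(pd_mask)):
        neighbours = []
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not pd_mask[ny, nx]:
                neighbours.append(out[ny, nx])
        if neighbours:                       # fall back to the original value otherwise
            out[y, x] = sum(neighbours) / len(neighbours)
    return out.astype(raw.dtype)
```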
  • the generated and output first image is used to determine information such as the focal length when the camera focuses.
  • In the related art, the PD pixels and the imaging pixels can only be exposed at the same time.
  • In this embodiment, the respective frame rates of the PD pixels and the imaging pixels are configured through different control paths, enabling frame rate separation so that the PD pixels can be exposed before or after the imaging pixels.
  • The output order can therefore be controlled by adjusting the exposure timing, that is, the sequence, of the PD pixels.
  • Here, the timing is controlled within the field blanking period between the current frame (the second image of the current frame) and the previous frame (the second image of the previous frame) of the imaging pixels, so that the lens-moving process falls before the exposure timing of the imaging pixels. The PD pixels then complete their exposure before the lens moves, and the imaging pixels complete their exposure after the lens has moved, so no image is being exposed while the lens is moving. The exposure of both the PD pixels and the imaging pixels is therefore not affected by the moving lens, and there is no problem of image stretching and deformation during focusing.
  • After generating the first image, the Sensor can output the first image; similarly, after generating the second image, the Sensor can output the second image. The Sensor communicates with the background processing end through a data transmission path and can output the first image and the second image to the background processing end through this path.
  • The method of the embodiment of the present application can separate the output time points of the image frames of the imaging pixels and the image frames of the PD pixels. After the output time points are separated, the Sensor can perform pixel output as shown in FIG. 9 and FIG. 10, although it is not limited to the pixel output modes of FIG. 9 and FIG. 10.
  • The output time points of the PD pixel rows do not need to be consistent with the output time points of the imaging pixel rows, and the output time point of a PD pixel row may precede the output time point of the imaging pixel rows.
  • The Sensor may also advance the output time point of the PD pixels into the vertical blanking period (VBLT, vertical blank time) between the current frame (the image frame of the imaging pixels, for example the second image of the current frame corresponding to the first image) and the previous frame (the image frame of the imaging pixels, for example the second image of the previous frame).
  • Further, FIG. 10 shows an improved output mode: within the vertical blanking period between the image frames of the imaging pixels of the current frame and the previous frame, the PD pixels of all rows (for example, 5 rows of PD pixels) are output first; then the first row of imaging pixels of the current frame is output; and then the imaging pixels of the other rows are output in sequence.
  • That is, the output time point of the PD pixels can be advanced into the vertical blanking period between the imaging image of the current frame and the imaging image of the previous frame. Then, by performing phase focusing with the output first image of all PD pixel rows, the phase focusing can be completed before the imaging image of the current frame is output, without affecting the imaging image of the next frame.
  • The first frame of the output first image corresponds to the first frame of the output second image.
  • Through control of the blanking period, after the exposure of a frame of the first image is completed, the step of focusing using that first image can be performed before the second image corresponding to that frame of the first image is output.
  • That is, the steps of performing the first exposure on the candidate PD pixels in the image sensor by using the first frame rate and the first exposure parameter configured through the first control path to generate multiple frames of the first image, and of performing phase focusing on the shooting scene multiple times based on each frame of the first image, can be completed, for each frame of the first image, within the vertical blanking period between the second image corresponding to that frame of the first image (that is, the second image of the current frame) and the second image of the previous frame.
  • The specific pixel output proceeds as follows: before the first row of imaging pixels is exposed and output, the PD pixels are read first and all rows of PD pixels are fully exposed and output; since the first frame rate corresponding to the PD pixels is higher than the second frame rate of the imaging pixels, the Sensor can transmit many rows of PD pixels at a time while still meeting the frame rate of the PD pixels. Then the motor is driven to move the lens for focusing, and after that the first row of imaging pixels is exposed and output.
  • In this way, neither the image frame of the PD pixels (such as the first image) nor the image frame of the imaging pixels (such as the second image) captures the picture of the moving lens, which solves the problem of picture stretching and distortion when the lens is moved for focusing.
  • All of the exposure and output of the PD pixel rows and the subsequent operation of pushing the motor to focus are completed within the vertical blanking period between the second image of the current frame and the second image of the previous frame.
  • That is, the Sensor can advance the output time point of the PD pixels into the vertical blanking period between the current frame (the image frame of the imaging pixels, such as the second image of the current frame corresponding to a frame of the first image) and the previous frame (the image frame of the imaging pixels, such as the second image of the previous frame), so that the PD pixel information can be obtained earlier and PD focusing can be performed at an earlier point in time; PD focusing can thus be completed within the current frame without affecting the next frame of image (that is, the second image of the next frame).
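  • The per-frame sequencing described above can be sketched as follows; `sensor`, `lens` and `focus` are placeholder objects and every method name is hypothetical, standing in for the corresponding hardware and driver operations.

```python
def output_one_frame(sensor, lens, focus):
    """Sketch of the sequencing described above: read all PD rows inside the
    vertical blanking period, focus while nothing is being exposed, then
    expose and output the imaging rows of the current frame.
    """
    # --- vertical blanking period (VBLT) between previous and current frame ---
    first_image = sensor.read_all_pd_rows()               # e.g. 5 rows of PD pixels
    phase_diff = focus.compute_phase_difference(first_image)
    lens.move_by(focus.phase_to_lens_shift(phase_diff))   # motor push; lens settles

    # --- active period of the current frame: lens is no longer moving ---
    for row in range(sensor.imaging_row_count):
        sensor.expose_and_output_imaging_row(row)         # builds the second image
```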
  • In an implementation manner, the method according to the embodiment of the present application may further include: determining a first data amount of the candidate PD pixels according to the bandwidth of the first control path; configuring the first data amount for the first control path; determining a second data amount of the imaging pixels according to the bandwidth of the second control path; and configuring the second data amount for the second control path, wherein the second data amount is the data amount of the imaging pixels in the image sensor.
  • That is, the respective byte sizes of the PD pixels and the imaging pixels in the Sensor can also be separated.
  • The so-called byte size refers to the data amount of each PD pixel and the data amount of each imaging pixel.
  • The data amount is the same across different PD pixels, and the data amount is the same across different imaging pixels.
  • In the related art, the byte size of each PD pixel and each imaging pixel in the Sensor is uniform, but this application outputs the second image for display and the first image for focusing generated by the Sensor through different control paths respectively. Therefore, the data amount of the pixels of the image to be transmitted can be configured separately for the different control paths, so that image data of different data amounts can be transmitted.
  • The data amount of the pixels to be transmitted can be reasonably determined according to the actual bandwidth of the data transmission paths (here including the first control path and the second control path) through which the Sensor outputs the image data generated by exposure.
  • the determination of the data volume of the PD pixels is taken as an example for illustration, and the manner of determining the data volume of the imaging pixels is the same.
  • When the bandwidth of the first control path is tight, the data amount of the PD pixels, which would otherwise equal that of the imaging pixels, can be reduced: for example, if the data amount of each imaging pixel (that is, the size of the pixel value data) is 10 bytes, the data amount of each PD pixel can be configured as 8 bytes. When the bandwidth of the first control path is idle, the size of the PD pixel values can be increased in order to improve the image accuracy of the PD pixels: for example, the data amount of each imaging pixel is 10 bytes and the data amount of each PD pixel is 12 bytes.
  • The data amount of each PD pixel in the generated first image is then the data amount adjusted in this step.
  • That is, the data amount of each PD pixel may be different from the data amount of each imaging pixel; the two can be configured to be different, so that the data amount of the PD pixels can be flexibly adjusted according to the bandwidth of the image sensor and the image accuracy requirements of the PD pixels.
  • In the embodiment of the present application, the respective data amounts of the PD pixels and the imaging pixels in the Sensor can be controlled independently, so that the data amounts of the PD pixels and the imaging pixels to be transmitted can be flexibly determined according to the bandwidths of the first control path and the second control path of the image sensor.
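  • A small sketch of this bandwidth-driven choice follows; the byte sizes, pixel count and bandwidth figure are illustrative assumptions, and the same selection logic would apply to the imaging-pixel path.

```python
def choose_pd_pixel_bytes(path_bandwidth_bytes_per_s: float,
                          pd_pixels_per_frame: int,
                          pd_frame_rate: float,
                          candidates=(12, 10, 8)) -> int:
    """Pick the largest per-PD-pixel data amount that still fits the bandwidth
    of the first control path; the second control path is handled the same way."""
    for nbytes in candidates:                       # try the most precise size first
        if pd_pixels_per_frame * pd_frame_rate * nbytes <= path_bandwidth_bytes_per_s:
            return nbytes
    return min(candidates)

# Illustrative numbers only: 200 000 candidate PD pixels per first image at 50 frame/s.
print(choose_pd_pixel_bytes(150e6, 200_000, 50.0))   # -> 12 when the bandwidth allows it
```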
  • The image processing method provided in the embodiment of the present application may be executed by an image processing device, or by a control module in the image processing device for executing the image processing method.
  • In the embodiment of the present application, the image processing method being executed by an image processing device is taken as an example to describe the image processing device provided in the embodiment of the present application.
  • FIG. 11 shows a block diagram of an image processing device according to an embodiment of the present application.
  • the image processing device includes:
  • the first determination module 201 is configured to determine a first exposure parameter according to the shooting scene and the brightness information of the target phase PD pixel in the focus area of the image sensor;
  • the first generation module 202 is configured to perform a first exposure on candidate PD pixels in the image sensor by using the first exposure parameters configured by the first control path, and generate a first image, wherein the candidate PD pixels are pixels used for phase focusing in the image sensor, the candidate PD pixels include the target PD pixel, and the first exposure parameter is controlled by the first control path;
  • a focusing module 203 configured to perform phase focusing on the shooting scene based on the first image
  • the second exposure parameter of the imaging pixel in the image sensor is controlled by a second control path
  • the first control path is different from the second control path
  • the image sensor is connected to the first control path and the second control path, respectively
  • the first determination module 201 includes:
  • the first determination sub-module is used to determine the preset brightness condition matching the shooting scene
  • a second determining submodule configured to determine a first exposure parameter that matches the preset brightness condition when the brightness information of the target PD pixel in the focus area of the image sensor does not meet the preset brightness condition
  • the value of the first exposure parameter is used to make the luminance information of the target PD pixel after the first exposure comply with the preset luminance condition.
  • the device also includes:
  • a second determining module configured to determine a first frame rate corresponding to the first exposure according to the first exposure time in the first exposure parameter
  • the first frame rate is higher than the second frame rate corresponding to the second exposure
  • the first generation module 202 includes:
  • a configuration submodule configured to configure the first frame rate and the first exposure parameter for the first control path
  • a generating submodule configured to perform first exposure on candidate PD pixels in the image sensor by using the first frame rate and the first exposure parameters configured by the first control path, and generate multiple frames of first images ;
  • the focusing module 203 is further configured to perform multiple phase focusing on the shooting scene based on each frame of the first image in the multiple frames of the first image.
  • the device also includes:
  • a second generation module configured to use the second exposure parameters and the second frame rate configured by the second control path to perform a second exposure on the imaging pixels in the image sensor to generate a second image .
  • the device also includes:
  • a second determining module configured to determine the first data amount of the candidate PD pixels according to the bandwidth of the first control path
  • a first configuration module configured to configure the first data amount for the first control path
  • a third determining module configured to determine a second data amount of the imaging pixel according to the bandwidth of the second control path
  • a second configuration module configured to configure the second data amount for the second control path, wherein the second data amount is the data amount of an imaging pixel in the image sensor.
  • In the embodiment of the present application, the parameter control manner of the imaging pixels and the PD pixels in the image sensor is modified. By independently setting the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels in the image sensor, the exposure parameters of the PD pixels can be used to expose the PD pixels in the image sensor to generate the first image, and the first image is used to perform phase focusing. In this way, while the exposure parameters of the imaging pixels ensure the imaging quality, the PD focusing effect can be improved and the PD performance optimized, which solves the problem in the related art that the configurations of the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels are not independent, so that the imaging effect and the PD focusing effect cannot both be optimal.
  • the image processing apparatus in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the device may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a personal digital assistant (PDA), etc.
  • the non-mobile electronic device may be a personal computer (personal computer, PC), television (television, TV), teller machine or self-service machine, etc., which are not specifically limited in this embodiment of the present application.
  • the image processing device in the embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in this embodiment of the present application.
  • the image processing apparatus provided in the embodiments of the present application can implement the various processes implemented in the foregoing method embodiments, and details are not repeated here to avoid repetition.
  • The embodiment of the present application further provides an electronic device 2000, including a processor 2002, a memory 2001, and programs or instructions stored in the memory 2001 and executable on the processor 2002. When the program or instruction is executed by the processor 2002, each process of the above image processing method embodiment is implemented and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the electronic devices in the embodiments of the present application include the above-mentioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 13 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • The electronic device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and other components.
  • The electronic device 1000 may also include a power supply (such as a battery) for supplying power to the various components, and the power supply can be logically connected to the processor 1010 through a power management system, so that functions such as charging management, discharging management, and power consumption management are realized through the power management system.
  • The structure of the electronic device shown in FIG. 13 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine some components, or arrange the components differently, and details are not repeated here.
  • The sensor 1005 is configured to determine the first exposure parameter according to the shooting scene and the brightness information of the target phase (PD) pixel in the focus area of the image sensor; to perform the first exposure on the candidate PD pixels in the image sensor by using the first exposure parameter configured through the first control path to generate a first image, wherein the candidate PD pixels are pixels used for phase focusing in the image sensor, the candidate PD pixels include the target PD pixels, and the first exposure parameter is controlled through the first control path; and to perform phase focusing on the shooting scene based on the first image;
  • wherein the second exposure parameter of the imaging pixels in the image sensor is controlled through a second control path, the first control path is different from the second control path, and the image sensor is connected to the first control path and the second control path, respectively.
  • In the embodiment of the present application, the parameter control manner of the imaging pixels and the PD pixels in the image sensor is modified. By independently setting the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels in the image sensor, the exposure parameters of the PD pixels can be used to expose the PD pixels in the image sensor to generate the first image, and the first image is used to perform phase focusing. In this way, while the exposure parameters of the imaging pixels ensure the imaging quality, the PD focusing effect can be improved and the PD performance optimized, which solves the problem in the related art that the configurations of the exposure parameters of the imaging pixels and the exposure parameters of the PD pixels are not independent, so that the imaging effect and the PD focusing effect cannot both be optimal.
  • The sensor 1005 is configured to determine a preset brightness condition that matches the shooting scene, and, when the brightness information of the target PD pixel in the focus area of the image sensor does not meet the preset brightness condition, determine the value of the first exposure parameter that matches the preset brightness condition, wherein the value of the first exposure parameter is used to make the brightness information of the target PD pixel after the first exposure conform to the preset brightness condition.
  • In this way, the preset brightness condition of the PD pixels in the focus area that matches the shooting scene can be determined flexibly in combination with the shooting scene. If the brightness information of the target PD pixels in the focus area of the image sensor (that is, the PD pixels in the current focus area of the Sensor) does not meet the preset brightness condition, the value of the exposure parameter currently used for PD pixel exposure cannot make the PD focusing performance optimal, and it is necessary to determine the value of the first exposure parameter that matches the preset brightness condition, where this value is such that the brightness information of the target PD pixels after the first exposure conforms to the preset brightness condition. Therefore, the embodiment can accurately determine, in combination with the shooting scene, a value of the first exposure parameter for exposing the PD pixels that suits the shooting scene and optimizes the PD focusing effect. This not only improves the accuracy of the exposure parameter value used for PD pixel exposure, but also allows the value to be adjusted reasonably and flexibly as the shooting scene changes, so that the PD focusing performance can be optimized in various shooting scenes.
  • the sensor 1005 is configured to determine a first frame rate corresponding to the first exposure according to the first exposure time in the first exposure parameter; wherein, the first frame rate is higher than that corresponding to the second exposure the second frame rate; configure the first frame rate and the first exposure parameter for the first control path; adopt the first frame rate and the first exposure parameter configured by the first control path performing first exposure on the candidate PD pixels in the image sensor to generate multiple frames of first images; and performing phase focusing on the shooting scene multiple times based on each frame of the first images in the multiple frames of first images.
  • In the embodiment of the present application, the frame rate of the imaging pixels and the frame rate of the PD pixels can be controlled independently. By setting the two frame rates to different values, specifically by setting the first frame rate corresponding to the PD pixels higher than the second frame rate of the imaging pixels, the PD pixels can be exposed multiple times at the first frame rate during the time in which the imaging pixels are exposed once at the lower second frame rate, so that each output frame of the first image of the PD pixels completes one phase focusing, thereby completing multiple phase focusing operations.
  • That is, a higher frame rate can be set for the PD pixels, so that multiple PD focusing operations can be completed quickly through the high-speed frame rate of the PD pixels within the duration of outputting one frame of the image for display (the image generated by exposing the imaging pixels).
  • The sensor 1005 is configured to use the second exposure parameter and the second frame rate configured through the second control path to perform a second exposure on the imaging pixels in the image sensor to generate a second image.
  • In the embodiment of the present application, the respective frame rates of the PD pixels and the imaging pixels are separated, so that the PD pixels can be exposed before or after the imaging pixels. To solve the problem of image stretching and deformation when using PD pixels for phase focusing, the exposure timing of the PD pixels can be adjusted so that the output timing of the PD pixels is controlled within the gaps between the images of the imaging pixels of different frames. Here, the timing is controlled within the field blanking period between the current frame (the second image of the current frame) and the previous frame (the second image of the previous frame) of the imaging pixels, so that the lens-moving process falls before the exposure timing of the imaging pixels. The PD pixels then complete their exposure before the lens moves, and the imaging pixels complete their exposure after the lens has moved, so no image is being exposed while the lens is moving. The exposure of both the PD pixels and the imaging pixels is therefore not affected by the moving lens, and there is no problem of image stretching and deformation during focusing.
  • the sensor 1005 is configured to determine a first data amount for the candidate PD pixels according to the bandwidth of the first control path, and to configure the first data amount for the first control path; and to determine a second data amount for the imaging pixels according to the bandwidth of the second control path, and to configure the second data amount for the second control path, where the second data amount is the data size of the imaging pixels in the image sensor.
  • the respective data amounts of the PD pixels and the imaging pixels in the Sensor can be separated and controlled independently, so that the data amounts of the PD pixels and imaging pixels to be transmitted can be determined flexibly according to the bandwidths of the first control path and the second control path of the image sensor: when bandwidth is tight, the data amount of the pixels (including PD pixels and imaging pixels) is reduced to lower the transmission bandwidth; when bandwidth is idle, the data amount of the pixels (PD pixels, imaging pixels) is increased, which improves the image accuracy of the first image generated from the PD pixels, so that focusing accuracy and the focusing effect improve when the first image is used for phase focusing, and the image quality of the second image of the imaging pixels is also improved (a sketch of this bandwidth-driven choice is given after this list).
  • the input unit 1004 may include a graphics processor (Graphics Processing Unit, GPU) 10041 and a microphone 10042, and the graphics processor 10041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the display unit 1006 may include a display panel 10061, and the display panel 10061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 1007 includes a touch panel 10071 and other input devices 10072 .
  • the touch panel 10071 is also called a touch screen.
  • the touch panel 10071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 10072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail here.
  • the memory 1009 can be used to store software programs as well as various data, including but not limited to application programs and operating systems.
  • Processor 1010 may integrate an application processor and a modem processor, wherein the application processor mainly processes operating systems, user interfaces, and application programs, and the modem processor mainly processes wireless communications. It can be understood that the foregoing modem processor may not be integrated into the processor 1010 .
  • the embodiment of the present application also provides a readable storage medium. The readable storage medium stores a program or instruction, and when the program or instruction is executed by a processor, each process of the above image processing method embodiment is implemented and the same technical effects can be achieved; to avoid repetition, they are not described again here.
  • the processor is the processor in the electronic device described in the above embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
  • the embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run programs or instructions to implement each process of the above image processing method embodiment and achieve the same technical effects; to avoid repetition, they are not described again here.
  • the chips mentioned in the embodiments of the present application may also be called system-level chips, system chips, chip systems, or system-on-chip chips.
  • the term “comprising”, “including”, or any other variation thereof is intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus comprising a set of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase “comprising a …” does not preclude the presence of additional identical elements in the process, method, article, or apparatus comprising that element.
  • the scope of the methods and devices in the embodiments of the present application is not limited to performing functions in the order shown or discussed, and may also include performing functions in a substantially simultaneous manner or in the reverse order, depending on the functions involved; for example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
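The exposure-parameter selection described in the first bullet above (matching the brightness of the focus-region PD pixels to a scene-dependent preset brightness condition, and working backwards to an exposure value when the condition is not met) can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the application's implementation: the scene names, brightness ranges, and the linear brightness-to-exposure scaling rule are all assumptions.

```python
# Illustrative sketch: pick a first exposure parameter (exposure time) for the PD pixels so
# that the mean brightness of the focus-region (target) PD pixels falls inside a preset range
# that depends on the shooting scene. Scene names, ranges, and the linear scaling rule are
# assumptions for illustration only.

PRESET_BRIGHTNESS_RANGES = {
    "backlight": (90, 140),    # assumed target range for a backlit focus region (e.g. a face)
    "point_light": (40, 90),   # assumed range that avoids over-exposing a point light source
    "portrait": (110, 160),    # assumed range for a normally lit face
}

def first_exposure_time(scene, target_pd_brightness, current_exposure_ms):
    """Return an exposure time (ms) for the PD pixels, given focus-region PD brightness values."""
    low, high = PRESET_BRIGHTNESS_RANGES[scene]
    mean = sum(target_pd_brightness) / len(target_pd_brightness)
    if low <= mean <= high:
        return current_exposure_ms            # brightness already meets the preset condition
    target = (low + high) / 2.0
    # Assume brightness scales roughly linearly with exposure time and work backwards.
    return current_exposure_ms * target / max(mean, 1e-6)

# e.g. first_exposure_time("backlight", [60, 70, 65], 20.0) -> about 35 ms (brightens the face)
```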
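The frame-rate separation described in the sensor-1005 bullets above can be sketched as follows, assuming the maximum PD frame rate is simply the reciprocal of the PD exposure time (as in the 20 ms / 50 frames-per-second example in the description). The SensorPath class and its configure method are hypothetical stand-ins for whatever register or driver interface the sensor actually exposes; the 30 fps display rate and the imaging exposure time are assumed values.

```python
# Illustrative sketch: derive the first (PD) frame rate from the PD exposure time and configure
# the two control paths independently. SensorPath is a hypothetical placeholder, not a real API.

class SensorPath:
    def __init__(self, name):
        self.name = name
        self.settings = {}

    def configure(self, **kwargs):
        self.settings.update(kwargs)   # a real driver would write sensor registers here

def max_frame_rate(exposure_ms):
    # e.g. a 20 ms PD exposure allows at most 1000 / 20 = 50 frames per second
    return 1000.0 / exposure_ms

pd_path = SensorPath("first_control_path")        # controls the PD (phase-detection) pixels
imaging_path = SensorPath("second_control_path")  # controls the imaging (RGB) pixels

pd_exposure_ms = 20.0
first_frame_rate = max_frame_rate(pd_exposure_ms)  # 50 fps for the PD pixels
second_frame_rate = 30.0                           # assumed, lower display frame rate

pd_path.configure(frame_rate=first_frame_rate, exposure_ms=pd_exposure_ms)
imaging_path.configure(frame_rate=second_frame_rate, exposure_ms=25.0)  # assumed imaging exposure
```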
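The exposure ordering described above (PD exposure and lens movement placed in the vertical blanking period, imaging exposure only after the lens has stopped) might be sequenced as in the following sketch. The Sensor and Lens classes and the phase-difference handling are simplified placeholders, not a real camera driver API.

```python
# Illustrative sketch of the per-frame sequencing: within the vertical blanking period before
# the current imaging frame, the PD pixels are exposed and the result drives the focus motor;
# only afterwards are the imaging pixels exposed, so the lens never moves during either exposure.

class Lens:
    def __init__(self):
        self.position = 0.0

    def move_by(self, step):
        self.position += step          # the lens moves only between the two exposures


class Sensor:
    def expose_pd_pixels(self):
        return [0.10, -0.20, 0.05]     # placeholder L/R phase samples (first image)

    def expose_imaging_pixels(self):
        return "second image"          # placeholder for the display frame


def phase_difference(pd_samples):
    return sum(pd_samples) / len(pd_samples)


def run_one_display_frame(sensor, lens):
    # --- vertical blanking period between the previous and the current imaging frame ---
    pd_image = sensor.expose_pd_pixels()        # first exposure (first control path)
    lens.move_by(phase_difference(pd_image))    # phase focusing: the motor moves the lens
    # --- the lens is stationary again; expose the imaging pixels for display ---
    return sensor.expose_imaging_pixels()       # second exposure (second control path)


frame = run_one_display_frame(Sensor(), Lens())
```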
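The bandwidth-driven choice of per-pixel data size described above can be illustrated as follows. The 8-, 10-, and 12-byte values mirror the example given in the description (10-byte imaging pixels, 8-byte PD pixels when bandwidth is tight, 12-byte PD pixels when it is idle); the utilization thresholds are assumptions.

```python
# Illustrative sketch: choose the per-pixel data size for the PD (first) control path from its
# bandwidth utilization. Thresholds are assumed; byte sizes follow the description's example.

IMAGING_PIXEL_BYTES = 10   # per-pixel data size on the second (imaging) control path

def pd_pixel_bytes(path_bandwidth_bps, required_bps):
    utilization = required_bps / path_bandwidth_bps
    if utilization > 0.9:       # bandwidth is tight: shrink PD pixels below the imaging pixels
        return 8
    if utilization < 0.5:       # bandwidth is idle: enlarge PD pixels for better PD image accuracy
        return 12
    return IMAGING_PIXEL_BYTES  # otherwise keep the same size as the imaging pixels

# e.g. pd_pixel_bytes(1.0e9, 0.95e9) -> 8   (tight bandwidth: reduce the PD data amount)
```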

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present application discloses an image processing method and apparatus, an electronic device, and a readable storage medium, and belongs to the technical field of image processing. The method includes: determining a first exposure parameter according to a shooting scene and brightness information of target PD pixels within a focus area of an image sensor; performing a first exposure on candidate PD pixels in the image sensor using the first exposure parameter configured for a first control path to generate a first image, where the candidate PD pixels are pixels in the image sensor used for phase focusing, the candidate PD pixels include the target PD pixels, and the first exposure parameter is controlled by the first control path; and performing phase focusing on the shooting scene based on the first image. A second exposure parameter of imaging pixels in the image sensor is controlled by a second control path, the first control path is different from the second control path, and the image sensor is connected to both the first control path and the second control path.

Description

图像处理方法、装置、电子设备及可读存储介质
相关申请的交叉引用
本申请要求在2021年08月19日提交中国专利局、申请号为202110953703.6、名称为“图像处理方法、装置、电子设备及可读存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请属于图像处理技术领域,具体涉及一种图像处理方法、装置、电子设备及可读存储介质。
背景技术
随着对焦技术的快速发展,电子设备可以进行相位(PD,phase detection)对焦,PD对焦是通过相位检测实现对焦。为了实现相位对焦,需要图像传感器(Sensor)的成像像素(RGB像素,红绿蓝像素)的基础上(如图1所示的图像传感器的成像像素的示意图,成像像素的布局方式不限于图1,成像像素包括GR像素61,R像素62、B像素63、GB像素64,本例中,图像传感器可以应用于拍摄人脸的场景,而人脸对G像素较为敏感,因此可以对G通道设置红绿通道的G像素,即GR像素61,以及蓝绿通道的G像素,即GB像素64),增加一种PD像素,其中,PD像素可以包括左(L)像素和右(R)像素。例如,对比于图1,参照图2可以看出,这里的图像传感器在图1的成像像素的基础上,增加了PD像素(以L、R示出)。对于图1和图2,相同的附图标记表示相同的对象,这里不再对图2的附图标记做一一赘述,参照图1的解释即可。其中,PD像素的排列形式不限于图2,从而可以利用PD像素辅助对焦。通过PD像素中的L像素和R像素的像素值,可以计算对焦区域的相位差(Phase diff),从而实现相位对焦。
目前,Sensor的成像像素和PD像素所配置的曝光参数是同一组参数,那么单纯从PD对焦效果或者是成像对焦的一个维度去调整该曝光参数,可能存在导致另一个维度的性能下降。目前,主流的图像控制方式主要是参照Sensor 中成像像素的成像效果来调整该统一的曝光参数。例如,在逆光场景下,为了保证背景较好的成像质量,将成像像素和PD像素的统一的曝光参数调整为较小,这样将导致对焦区域内的人脸对象相对较暗,当然,可以通过后期的图像处理方式来对人脸成像质量进行提升,但是,这还是会使得对焦区域内的相位值很低,信噪比差,从而导致包括PD像素的人脸区域的对焦质量下降。
因此,相关技术中的图像处理方式存在着基于PD对焦的对焦质量下降的问题。
发明内容
本申请实施例的目的是提供一种图像处理方法、装置、电子设备及可读存储介质,能够解决相关技术中的图像处理方式所存在的基于PD对焦的对焦质量下降的问题。
第一方面,本申请实施例提供了一种图像处理方法,该方法包括:
根据拍摄场景和图像传感器中对焦区域内目标相位PD像素的亮度信息,确定第一曝光参数;
采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括目标PD像素,所述第一曝光参数由所述第一控制通路控制;
基于所述第一图像对所述拍摄场景进行相位对焦;
其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、所述第二控制通路均连接。
第二方面,本申请实施例提供了一种图像处理装置,该装置包括:
第一确定模块,用于根据拍摄场景和图像传感器中对焦区域内目标PD像素的亮度信息,确定第一曝光参数;
第一生成模块,用于采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候 选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括所述目标PD像素,所述第一曝光参数由所述第一控制通路控制;
对焦模块,用于基于所述第一图像对所述拍摄场景进行相位对焦;
其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、所述第二控制通路均连接。
第三方面,本申请实施例提供了一种电子设备,该电子设备包括处理器、存储器及存储在所述存储器上并可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如第一方面所述的方法的步骤。
第四方面,本申请实施例提供了一种可读存储介质,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如第一方面所述的方法的步骤。
第五方面,本申请实施例提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如第一方面所述的方法。
在本申请实施例中,可以修改图像传感器中成像像素与PD像素的参数控制方式,通过独立设置图像传感器中成像像素的曝光参数和PD像素的曝光参数,从而可以利用PD像素的曝光参数来对图像传感器中的PD像素进行曝光生成第一图像,并利用第一图像进行相位对焦,从而可以在利用成像像素的曝光参数确保成像质量的同时,提升PD对焦效果,优化PD性能,改善因成像像素的曝光参数与PD像素的曝光参数的配置不独立导致成像效果与PD对焦效果无法得到最优效果的问题。
附图说明
图1是现有技术中的一种图像传感器的像素布局示意图之一;
图2是现有技术中的一种图像传感器的像素布局示意图之二;
图3是现有技术中的一种图像传感器的像素布局示意图之三;
图4是现有技术中的一种图像传感器的像素布局示意图之四;
图5是本申请一个实施例的一种图像传感器的像素布局示意图之一;
图6是本申请一个实施例的图像处理方法的流程图;
图7是现有技术中的一种图像传感器的像素输出方式示意图之一;
图8是现有技术中的一种图像传感器的像素输出方式示意图之二;
图9是本申请一个实施例的一种图像传感器的像素输出方式示意图之一;
图10是本申请一个实施例的一种图像传感器的像素输出方式示意图之二;
图11是本申请一个实施例的图像处理装置的框图;
图12是本申请一个实施例的电子设备的硬件结构示意图;
图13是本申请另一个实施例的电子设备的硬件结构示意图。
具体实施例
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书中的术语“第一”、“第二”等是用于区别类似的对象,而不用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施,且“第一”、“第二”等所区分的对象通常为一类,并不限定对象的个数,例如第一对象可以是一个,也可以是多个。此外,说明书以及权利要求中“和/或”表示所连接对象的至少其中之一,字符“/”,一般表示前后关联对象是一种“或”的关系。
发明人在实现本申请的过程中发现,Sensor的成像像素和PD像素所配置的曝光参数是同一组参数,即PD像素的曝光参数保持与成像像素的曝光参数一致,对比于图1,如图3所示,在图3中示出了不存在PD像素的Sensor中各成像像素的曝光参数的设置,以图3中像素格内的竖线11表示曝光参数,这里未对每个成像像素的曝光参数做附图标记,图3的像素布局方式与图1相同,这里只是在图3中附加示出了曝光参数的示意线条。从图3可以看出每个成像像素的曝光参数是相同的。那么在增加了PD像素的Sensor中,对 比于图2,如图4所示,在图4中示出了存在PD像素的Sensor中成像像素和PD像素的曝光参数的设置,在图4中,以像素格内的竖线21表示该像素格的像素的曝光参数,这里未对每个成像像素的曝光参数和每个PD像素的曝光参数做附图标记,图4的像素布局方式与图2相同,这里只是在图4中附加示出了曝光参数的示意线条。从图4可以看出Sensor中成像像素和PD像素的曝光参数相同。那么由于Sensor中成像像素和PD像素采用统一的曝光参数设置,并以成像效果为主,来调整该统一的曝光参数,使得利用PD像素进行PD对焦时的对焦效果下降,影响PD性能。为了提升PD对焦效果和PD性能,本申请实施例提供了一种图像处理方法,下面结合附图来对各个实施例的图像处理方法做详细阐述。
参照图6,示出了本申请一个实施例的图像处理方法的流程图,所述方法具体可以包括如下步骤:
步骤101,根据拍摄场景和图像传感器中对焦区域内目标相位像素的亮度信息,确定第一曝光参数;
其中,目标相位像素,也即目标PD像素;PD像素可以包括L像素和R像素,对于PD像素中的L像素和R像素,图像传感器中的一个像素点,若将该像素点的一半位置加了金属遮盖,使得被遮住左边一半的像素点只能接受左边来的光,该被遮住左边一半的像素点称之为L像素;同理,被遮住右边一半的像素点就只能接受右边来的光,该被遮住右边一半的像素点称之为R像素;在图像传感器中,L像素和R像素是在相邻位置成对出现,如图2、图4、图5所示。
其中,曝光参数可以包括但不限于积分时间(integration time,INT),模拟增益(gain)以及数字gain等。
其中,积分时间是以行为单位表示曝光时间(exposure time)的,比如说INT为159,就是指图像传感器(Sensor)曝光时间为159行,积分时间和曝光时间两者所代表的意思是相同的,都是表示Sensor的曝光时间,但是integration time是一个相对的概念,即以行为单位,而每行所占的绝对时间与时钟频率和每一行包含多少pclk(即行长)有关;而exposure time则是指Sensor曝光的绝对时间。
其中,第二曝光参数的取值与所述第一曝光参数的取值不同,例如曝光 参数为积分时间,则对PD像素进行曝光的积分时间的取值,与对成像像素进行曝光的积分时间的取值不同。
其中,所述第二曝光参数为用于对所述图像传感器中成像像素进行第二曝光的曝光参数,也就是说,在本申请实施例中,可以对Sensor中的成像像素和PD像素设置独立的曝光参数。
对比于图4中对成像像素和PD像素采用相同的曝光参数21进行曝光,参照本申请实施例的图5(其中,图5中的像素布局方式与图2、图4的像素布局方式相同,为了更加清楚的示出表示曝光参数的线条,这里未对图5中的成像像素以及PD像素(图5中的L像素、R像素)以不同的灰度底色做像素区分,图5中具体像素布局参照图2、图4的解释即可,也就是说,在图5中除L像素和R像素之外的每个像素格表示的都是成像像素)可以看出,在本申请实施例中,可以采用曝光参数41控制对Sensor中PD像素中的L像素和R像素进行曝光,图5中每个包括L像素或R像素的像素格内的竖线用于表示曝光参数41;而对图5中示出的每个成像像素采用曝光参数42进行曝光,图5中每个成像像素的像素格内的竖线用于表示曝光参数42,为了附图标记的清楚和简要,这里未对图5中每个成像像素的曝光参数做附图标记,第2~8行的成像像素的曝光参数参照第一行示出的成像像素的曝光参数42即可。图5中的箭头用于表示Sensor中每行未示出的像素点。
其中,用于控制成像像素曝光的第二曝光参数的取值可以是使得成像效果最优的曝光参数的取值;而控制PD像素曝光的第一曝光参数的取值则可以是使得曝光后的PD像素对应的第一图像,能够使得进行相位对焦效果最优的取值,即采用第一曝光参数的取值来对PD像素进行曝光后,所生成的第一图像,能够用于改善和优化PD对焦效果,使得PD性能提升。
本实施例中,可以结合Sensor的当前拍摄场景和Sensor中对焦区域内目标PD像素的亮度信息,来确定能够改善和优化PD对焦效果的第一曝光参数的取值,其中,目标PD像素表示对焦区域内的每个PD像素。
可选地,在一个实施例中,为了确定能够改善和优化PD对焦效果,使得PD性能提升的第一曝光参数的取值,可以通过以下方式来实现步骤101,从而确定该第一曝光参数的取值,具体而言:确定与拍摄场景匹配的预设亮度条件;那么在所述图像传感器中对焦区域内的目标PD像素的亮度信息不符合 所述预设亮度条件的情况下,确定与所述预设亮度条件匹配的第一曝光参数的取值,其中,所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件。
示例地,预先可以经过大量实验来确定各种拍摄场景下,能够使得PD对焦效果最优、PD性能提升的对焦区域内PD像素的预设亮度条件。
预设拍摄场景可以包括但不限于逆光场景、点光源场景、人像场景等,针对每个预设拍摄场景分别配置对应的预设亮度条件。
在不同实施例中,该预设亮度条件可以是预设亮度范围,还可以是表示对焦区域内过曝的PD像素的数量的第一阈值,对焦区域内欠曝的PD像素的数量的第二阈值,考虑到用于确定第一曝光参数的取值的策略的区别,预设亮度条件也存在区别。
例如当前的拍摄场景为逆光场景,则可以预先配置的与逆光场景匹配的预设亮度条件,例如预设亮度范围,然后,获取Sensor中对焦区域内的每个PD像素(上文称之为目标PD像素)的亮度值,对该亮度值求平均值,如果该平均值在该预设亮度范围内,则说明目标PD像素的亮度信息符合该预设亮度条件,不需要对PD像素的曝光参数进行调整;如果该平均值不在该预设亮度范围内,则说明目标PD像素的亮度信息不符合该预设亮度条件,PD性能不是最优,需要对用于PD曝光的PD像素的曝光参数进行调整,对于曝光参数调整后的取值,则可以基于该预设亮度范围来确定,因为曝光参数与对焦区域的PD像素的亮度值存在着逻辑关系,因此,可以反推确定符合该预设亮度范围的第一曝光参数的取值,将反推得到的该取值作为用于本次对焦时对拍摄场景的PD像素进行曝光时所采用的第一曝光参数的取值。
同理,如果与当前拍摄场景匹配的预设亮度条件为表示对焦区域内过曝的PD像素的数量的第一阈值,对焦区域内欠曝的PD像素的数量的第二阈值,则可以获取对焦区域内PD像素的亮度值的直方图,利用该直方图确定对焦区域内欠曝的PD像素的数量是否小于或等于第二阈值,以及该对焦区域内过曝的PD像素的数量是否小于或等于第一阈值,如果是,则说明对焦区域内PD像素的亮度信息符合该预设亮度条件,不需要对Sensor中PD像素的曝光参数进行调整;如果对焦区域内欠曝的PD像素的数量大于第二阈值,或者,该对焦区域内过曝的PD像素的数量大于第一阈值,则说明目标PD像素的亮度 信息不符合该预设亮度条件,PD性能不是最优,需要对用于PD曝光的PD像素的曝光参数进行调整,对于曝光参数调整后的取值,则可以基于上述第一阈值和第二阈值来确定,因为曝光参数与对焦区域的PD像素的亮度值存在着逻辑关系,因此,可以反推确定符合满足上述第一阈值和第二阈值的条件的第一曝光参数的取值,将反推得到的该取值作为用于本次对焦时对拍摄场景的PD像素进行曝光时所采用的第一曝光参数的取值。
在本申请实施例中,可以结合拍摄场景来灵活确定与该拍摄场景匹配的对焦区域内PD像素的预设亮度条件;那么在图像传感器中对焦区域内的目标PD像素(指代Sensor中当前对焦区域内的PD像素)的亮度信息不符合所述预设亮度条件的情况下,说明用于PD像素曝光的曝光参数的取值无法使得PD对焦性能最优,需要确定与所述预设亮度条件匹配的第一曝光参数的取值,而所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件,使得本申请实施例能够结合拍摄场景准确地确定出符合该拍摄场景的、且能够使PD对焦效果最优的用于曝光PD像素的第一曝光参数的取值,不仅提升了用于PD像素曝光的曝光参数的取值的准确度,而且能够随着拍摄场景的变化,来合理灵活地调整该取值,能够在各种拍摄场景下,优化PD对焦性能。
步骤102,采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选相位像素进行第一曝光,生成第一图像,其中,所述候选相位像素为所述图像传感器中用于相位对焦的像素,所述候选相位像素包括所述目标相位像素,所述第一曝光参数由所述第一控制通路控制;
其中,候选相位像素,也即候选PD像素,该候选PD像素可以是Sensor中的每个PD像素,也可以是Sensor中对焦区域内的PD像素,因此,该候选PD像素是包括该目标PD像素的。
由于对焦区域可以灵活地发生变化,因此,可以不仅仅地对Sensor中当前对焦区域内的目标PD像素采用第一曝光参数进行曝光,而是,可以对Sensor中每个PD像素(这里候选PD像素表示Sensor中的每个PD像素)进行统一的曝光拍摄,来生成一帧第一图像。这样,即便对焦区域发生变化,那么仍旧可以从第一图像中确定出变化后的对焦区域所对应的各个PD像素的图像信息。
此外,虽然Sensor中不仅仅包括PD像素,还包括成像像素,但这里可以理解为只对Sensor中的PD像素进行曝光,来生成第一图像。
示例地,以对对焦区域内的PD像素进行曝光来生成第一图像为例进行说明。
在逆光的拍摄场景中,例如对焦区域为人脸,则人脸区域亮度较暗,则可以将Sensor中人脸区域内的PD像素的曝光参数,例如曝光时间设置长一些,从而将人脸调亮,利于对人脸区域进行对焦。
在点光源的拍摄场景中,点光源之外的区域很暗,为了将点光源周围区域提亮,优化成像效果,现有技术会将成像像素设置较高的曝光参数,从而导致点光源区域被过曝,那么本申请实施例的方法则可以对对焦区域,即点光源区域内的PD像素的曝光参数降低,不用于对成像像素进行曝光的较高的曝光参数,从而可以降低点光源区域的过曝问题。
由此可见,本申请实施例的方法在应用到逆光、点光源等极限场景下,可以改善因为成像质量而导致PD性能下降问题。
此外,在非上述极限场景下,也可以通过单独配置PD像素的曝光参数,来使得曝光后的PD像素在PD对焦时性能提升优化。
其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,其中,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、第二控制通路均连接。
示例地,在单独配置成像像素和PD像素的曝光参数时,可以通过在半导体硬件通路上的分离实现,即配置图像传感器与不同的半导体硬件通路连接。其中,图像传感器可以通过不同的半导体硬件通路与后端的控制器通信连接。
可以对所述图像传感器与控制器之间的第一控制通路配置所述第一曝光参数;以及对所述图像传感器与控制器之间的第二控制通路配置第二曝光参数,考虑到需要兼顾成像效果和对焦效果,因此,这两组曝光参数的取值大多数情况下不同。
本实施例可以通过第一控制通路和第二控制通路来分别实现对PD像素、成像像素的曝光参数的各自控制,确保成像效果的同时也可以优化PD对焦性能。
可选地,在图像传感器用于曝光用于显示的成像图像时,可以采用上述 第二曝光参数对图像传感器中的成像像素进行曝光,来生成成像图像。
步骤103,基于所述第一图像对所述拍摄场景进行相位对焦;
其中,在第一图像是Sensor中全部PD像素的曝光图的情况下,则可以根据当前的对焦区域,在第一图像中确定属于该对焦区域的部分图像,利用该部分图像进行拍摄场景的相位对焦,由于对焦区域可能发生变化,因此,这种方式可以准确地获取到最新的对焦区域对应的PD像素的曝光图像,从而用于辅助相位对焦。
在第一图像是Sensor中对焦区域内的PD像素的曝光图的情况下,则可以直接利用该第一图像进行拍摄场景的相位对焦。
在本申请实施例中,可以修改图像传感器中成像像素与PD像素的参数控制方式,通过独立设置图像传感器中成像像素的曝光参数和PD像素的曝光参数,从而可以利用PD像素的曝光参数来对图像传感器中的PD像素进行曝光生成第一图像,并利用第一图像进行相位对焦,从而可以在利用成像像素的曝光参数确保成像质量的同时,提升PD对焦效果,优化PD性能,改善因成像像素的曝光参数与PD像素的曝光参数的配置不独立导致成像效果与PD对焦效果无法得到最优效果的问题。
可选地,在步骤102之前,根据本申请实施例的方法还可以包括:
步骤100,根据所述第一曝光参数中的第一曝光时间,确定所述第一曝光对应的第一帧率;
其中,对于步骤100和步骤101的执行顺序不做限制。
其中,所述第一帧率高于第二曝光对应的第二帧率;
本实施例中,可以将成像像素对应的第二帧率和PD像素对应的第一帧率进行分离,并非采用统一的帧率进行像素曝光。在进行分离时,可以参照曝光参数的独立配置方式,通过在半导体硬件通路上的分离来实现对不同像素的不同帧率控制,即采用不同的半导体硬件通路,来分别实现对成像像素、PD像素的帧率的各自控制,从而可以将用于对PD像素进行曝光的第一帧率设置的高于用于对成像像素进行曝光的第二帧率。
由于PD像素用于相位对焦,那么在PD像素的曝光操作对应的第一帧率,相较于用于显示的成像像素的曝光操作对应的第二帧率更高时,则可以提升对焦速度,虽然为了提升对焦速度,PD像素对应的帧率越高越好,但是,在 配置PD像素对应的第一帧率时,还是受限于对PD像素进行曝光的第一曝光参数中的第一曝光时间这一参数信息。
例如PD像素的曝光时间为20ms,那么1s(秒)=1000ms(毫秒),1000ms/20ms=50帧,则PD像素对应的第一帧率的最大值为50帧/s。
那么在本实施例中,在执行步骤102时,则可以对所述第一控制通路配置所述第一帧率和所述第一曝光参数;采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生成多帧第一图像;
那么在本实施例中,在执行步骤103时,则可以基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦。
例如PD像素的第一帧率为50帧/s,即1s内对PD像素可以进行50次曝光,那么每对PD像素完成一次曝光输出一帧第一图像,就可以对该第一图像进行相位对焦。由于原本PD像素和成像像素对应的帧率相同,例如成像像素的帧率为30ms输出一帧图像,那么将pd像素的帧率提高,例如15ms输出一帧,则在30ms内可以对PD像素进行两次曝光输出两帧pd像素的第一图像,从而在30m内可以利用两帧第一图像分别进行对焦,完成两次对焦,使得利用pd像素进行相位对焦的速度变快。
在本申请实施例中,由于可以将成像像素的帧率和PD像素的帧率进行独立控制,那么通过将两个帧率设置为不同帧率,具体可以将PD像素对应的第一帧率设置的高于成像像素的第二帧率,从而使得在采用较低的第二帧率对成像像素进行一次曝光的时间内,可以利用第一帧率来对PD像素进行多次曝光,从而对每输出的一帧PD像素的第一图像,就完成一次相位对焦,从而完成多次相位对焦,因此,通过将成像像素的帧率设置的与PD像素的帧率不同,具体可以为将PD像素设置为更高速的帧率,从而可以在输出一帧用于显示的图像(对成像像素进行曝光而生成的图像)的时长内,通过PD像素的高速帧率,快速完成多次PD对焦。
可选地,所述基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦之后,根据本申请实施例的方法还可以包括:
采用所述第二控制通路配置的所述第二曝光参数和所述第二帧率,对所述图像传感器中的所述成像像素进行第二曝光,生成第二图像。
具体而言,在本实施例中,不仅可以将用于对焦的PD像素的第一帧率设置的高于用于显示的成像像素的第二帧率,从而提升对焦速度,而且还可以通过帧率分离,并配置对PD像素进行曝光的时序,在对成像像素进行曝光的时序之前,从而可改善sensor曝光过程中对焦导致的图像扭曲或者是画面拉伸变化等情况。
发明人发现存在上述对焦导致的画面扭曲和画面拉伸的原因在于,由于对焦需要移动镜头,那么在对焦过程中,对对焦区域进行了图像曝光,使得图像曝光过程中移动镜头导致了图像存在扭曲或拉伸的情况。那么为了解决该问题,在本实施例中,由于基于第一图像,对所述拍摄场景进行一次相位对焦的时长是可以确定的,即每次对焦的时长是已知的,那么可以控制对PD像素进行曝光的时间,在对本次成像像素进行曝光的时间之前,就可以对用于对焦的PD像素进行曝光,在完成对PD像素的曝光之后,再基于曝光生成的第一图像来确定焦距等对焦信息,从而利用该对焦信息来推动镜头中的马达来完成相位对焦操作;在马达推动完成之后,即相位对焦完成之后,再对成像图像进行曝光生成用于显示的第二图像,使得在镜头移动之前PD像素已经完成了曝光,在镜头移动之后,成像像素完成了曝光,使得镜头移动过程中,不存在曝光的图像,因此,不论是PD像素还是成像像素的曝光都不受移动镜头的影响,从而不会存在对焦时图像拉伸和变形的问题。
需要说明的是,本实施例中的第二图像用于送显,即用于显示,而非用于对焦,由于Sensor中并不是全部像素均为成像像素,既包括了成像像素也包括了PD像素,那么在生成第二图像时,PD像素也可以用于送显,即可以通过对Sensor中的全部成像像素和全部PD像素进行曝光,来生成第二图像,但是,由于PD像素在用于送显时,清晰度较低,因此,会使得第二图像中PD像素位置的画质较差。
那么可选地,在生成第二图像时,还可以对Sensor中原本包括PD像素的各个像素点位置(以目标位置命名)中的各个PD像素去除,并根据目标位置周围的成像像素的像素值来生成目标位置的像素值,将新生成的各个目标位置的像素值来作为该目标位置的成像像素的像素值来补充到该目标位置。那么在对各个目标位置的像素值进行重新补充后,再对Sensor中全部的成像像素(包括目标位置被补充的成像像素的像素值)进行曝光,从而生成第二 图像。输出的第二图像用于显示,那么这种方式生成的第二图像,在各个目标位置的清晰度更高,画质更好。
而在生成第二图像之前,生成并输出的第一图像则用于确定摄像头对焦时的焦距等信息。
在本申请实施例中,相比于传统技术中PD像素和成像像素只可以同时进行曝光,本实施例中将PD像素和成像像素进行曝光的各自的帧率采用不同的控制通路进行帧率配置,实现了帧率的分离,使得PD像素可以在成像像素之前或之后进行曝光。这里为了解决利用PD像素进行相位对焦时所存在的画面拉伸变形的问题,由于PD像素和成像像素各自的帧率分离,因此,可以通过将PD像素的曝光时机,即时序,控制输出的不同帧的成像像素的图像的间隙内,这里,将该时序控制在成像像素的当前帧(当前帧第二图像)和上一帧(第二图像)之间的场消隐期内,使得移动镜头的过程的时序落在成像像素的曝光时序之前,那么在镜头移动之前PD像素已经完成了曝光,在镜头移动之后,成像像素完成了曝光,因此,镜头移动过程中,不存在曝光的图像。所以,不论是PD像素还是成像像素的曝光都不受移动镜头的影响,从而不会存在对焦时图像拉伸和变形的问题。
可选地,在生成第一图像之后,Sensor可以输出第一图像;同理,在生成第二图像之后,Sensor可以输出第二图像;Sensor可以通过数据传输通道与后台处理端通信连接,Sensor可以将第一图像、第二图像通过该数据传输通道输出到后台处理端。
在传统技术中,如图7所示,是Sensor逐行输出像素的一种方式,具体为每一行输出的像素中先输出成像像素,再在当前行的水平消隐期内,即水平空白时间(horizontal blank time,HBLT)中输出PD像素;再如,如图8所示,是Sensor逐行输出像素的另一种方式,具体为先逐行输出成像像素,待当前帧的全部行的成像像素输出完成之后,再在当前帧的成像像素的图像帧和下一帧的成像像素的图像帧的垂直消隐期,即场消隐期垂直空白时间(vertical blank time,VBLT)中输出所有行的PD像素。这两种传统方案都只能先输出成像像素行,后输出PD像素行。
在本申请实施例中,由于上述第一帧率可以与上述第二帧率不同,因此,本申请实施例的方法可以控制成像像素的图像帧和PD像素的图像帧的输出 时刻点进行分离,那么在输出时刻点分离之后,Sensor可以如图9、图10所示,但不仅限于图9、图10的像素输出方式进行像素输出。
在本申请实施例中,PD像素的输出时刻点与成像像素的输出时刻点分离后,不同行的PD像素输出的时刻点不需与成像像素行的输出时刻点保持一致,PD像素行的输出时刻点可以先于成像像素行的输出时刻点。
例如如图9所示,在输出第一行成像像素之后,在第一行的水平消隐期内输出三行PD像素;然后输出第二行成像像素,在第二行成像像素输出之后,在第一行的水平消隐期内输出两行PD像素;
再如,如图10所示,在当前帧和上一帧的成像像素的图像帧的垂直消隐期内,输出3行PD像素;然后输出当前帧的首行成像像素;在当前帧的首行成像像素的水平消隐期内,输出两行PD像素。
可选地,在一个实施例中,Sensor还可以将PD像素的输出时刻点提前到当前帧(成像像素的图像帧,例如与第一图像对应的当前帧的第二图像)与上一帧(成像像素的图像帧,例如上一帧的第二图像)的场消隐期(VBLT,垂直空隙时间)中。
举例说明,以图10为例,可以对图10进行改进,在当前帧和上一帧的成像像素的图像帧的垂直消隐期内,所有行的PD像素,例如5行PD像素;然后输出当前帧的首行成像像素;接着依次输出其他行的成像像素。使得PD像素的输出时刻点可以提前到当前帧的成像图像和上一帧的成像图像的场消隐期内。那么在利用输出的所有行的PD像素的第一图像进行相位对焦,则可以在当前帧的成像图像输出之前完成相位对焦,而不影响下一帧成像图像。
其中,由于第一帧率和第二帧率相同,因此,相同时间内生成的第一图像的帧数和第二图像的帧数是相同的,因此,按照曝光顺序生成的一组第一图像和一组第二图像实际上是可以一一对应起来的,例如输出的首帧第一图像与输出的首帧第二图像是相互对应的,那么对于任意一组相互对应的第一图像和第二图像,虽然两种图像的曝光时长存在区别,但是可以通过消隐期的控制,来使得一帧第一图像在曝光完成之后,利用第一图像进行对焦的步骤,可以在该帧第一图像对应的第二图像输出之前进行。
也就是说,上述实施例中的采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生 成多帧第一图像的步骤,以及基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦的步骤中,在生成每一帧第一图像、和利用每一帧第一图像进行对焦的操作步骤,都可以在该帧第一图像对应的第二图像(即当前帧第二图像)与上一帧第二图像的场消隐期内完成。
具体的像素的具体输出方式为:成像像素的第一行曝光并输出之前,首先获取PD像素,完成所有行的PD像素的全部曝光并输出,其中,由于PD像素对应的第一帧率相较成像像素的第二帧率更高,从而可以在满足PD像素的帧率的情况下,Sensor一次传输很多行的PD像素;然后,推动马达移动镜头进行对焦;再然后,再进行首行成像像素的曝光并输出。使得输出的的成像像素不会抓取到镜头移动时的画面,且PD像素在镜头移动之前完成了曝光并输出,因此,曝光并输出得到PD像素的图像帧(例如第一图像)和成像像素的图像帧(例如第二图像)均无法抓取到镜头移动的画面,从而解决移动镜头对焦时的画面拉伸和变形扭曲的问题。其中,成像像素的全部曝光和输出以及之后的推动马达进行对焦的操作都在当前帧第二图像和上一帧第二图像的场消隐期内完成。
本实施例中,Sensor可以将PD像素的输出时刻点提前到当前帧(成像像素的图像帧,例如与一帧第一图像对应的当前帧第二图像)与上一帧(成像像素的图像帧,例如上一帧第二图像)的场消隐期中,从而可以更快获取到PD像素信息,从而在更早的时刻点进行PD对焦,从而使PD对焦能够在当前帧内完成,而不影响下一帧图像(即下一帧第二图像)。
可选地,在执行步骤102之前,根据本申请实施例的方法还可以包括:根据所述第一控制通路的带宽,确定所述候选PD像素的第一数据量;对所述第一控制通路配置所述第一数据量;根据所述第二控制通路的带宽,确定所述成像像素的第二数据量;对所述第二控制通路配置所述第二数据量,其中,所述第二数据量为所述图像传感器中成像像素的数据量大小。
其中,本实施例的附加步骤与上述步骤100、步骤101之间的执行顺序不做限制。
本实施例中,还可以将Sensor中PD像素和成像像素各自的字节数大小进行分离,所谓字节数大小,即每个PD像素的数据量大小,以及每个成像像素的数据量大小,其中,不同PD像素的数据量大小是相同的,不同成像像素 的数据量大小是相同,在传统技术中,Sensor中PD像素和成像像素各自的字节数大小是统一的,而本申请则通过不同控制通路来分别输出Sensor生成的用于送显的第二图像和用于对焦的第一图像,因此,可以通过对不同控制通路分别配置所需要传送的图像的像素的数据量大小,来传送不同数据量的图像数据。
那么在确定像素的数据量大小时,可以依据Sensor对曝光生成的图像数据的数据传输通道(这里包括第一控制通路和第二控制通路)的实际带宽情况,来合理确定所需要传送的像素的数据量大小;Sensor可以通过该数据传输通道来将图像数据传输给后台处理端。
以确定PD像素的数据量大小为例进行说明,对于确定成像像素的数据量大小的方式同理。其中,在第一控制通路的带宽紧张的情况下,则可以将原本与成像像素相同数据量大小的PD像素的数据量大小进行降低,例如每个成像像素的数据量大小(即像素值的数据量大小)是10个字节,可以配置每个PD像素的数据量大小为8个字节;在该第一控制通路的带宽空闲的情况下,则为了提高PD像素的图像精度,可以对PD像素的像素值的大小进行提高,例如每个成像像素的数据量大小为10字节,配置每个PD像素的数据量大小为12字节。
那么经过本申请实施例的对PD像素的数据量的大小进行调节后,生成的第一图像中每个PD像素的数据量大小均为本步骤调整后的数据量大小。
可选地,每个所述PD像素的数据量大小,与每个所述成像像素的数据量大小不同。
由于对PD像素和成像像素各自的数据量大小进行了单独配置,那么可以配置二者的数据量大小不同,从而可以依据图像传感器的带宽情况,以及PD像素的图像精度要求来灵活地调整PD像素的数据量大小。
在本申请实施例中,可以将Sensor中PD像素和成像像素各自的数据量大小进行分离来独立控制,从而可以根据所述图像传感器的第一控制通路和第二控制通路的带宽,来灵活地确定所需要传送的PD像素和成像像素的数据量大小,在带宽紧张的情况下,来降低像素(包括PD像素、成像像素)的数据量大小,从而减少传输带宽;在带宽空闲的情况下,来提升像素(PD像素、成像像素)的数据量大小,从而可以提升利用PD像素生成的第一图像的图像 精度,使得利用第一图像进行相位对焦时,对焦精度提升,对焦效果好,以及提升成像像素的第二图像的图像质量。
需要说明的是,本申请实施例提供的图像处理方法,执行主体可以为图像处理装置,或者该图像处理装置中的用于执行图像处理方法的控制模块。本申请实施例中以图像处理装置执行图像处理方法为例,说明本申请实施例提供的图像处理装置。
参照图11,示出了本申请一个实施例的图像处理装置的框图。该图像处理装置包括:
第一确定模块201,用于根据拍摄场景和图像传感器中对焦区域内目标相位PD像素的亮度信息,确定第一曝光参数;
第一生成模块202,用于采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括所述目标PD像素,所述第一曝光参数由所述第一控制通路控制;
对焦模块203,用于基于所述第一图像对所述拍摄场景进行相位对焦;
其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、所述第二控制通路均连接。
可选地,所述第一确定模块201包括:
第一确定子模块,用于确定与拍摄场景匹配的预设亮度条件;
第二确定子模块,用于在所述图像传感器中对焦区域内的目标PD像素的亮度信息不符合所述预设亮度条件的情况下,确定与所述预设亮度条件匹配的第一曝光参数的取值,其中,所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件。
可选地,所述装置还包括:
第二确定模块,用于根据所述第一曝光参数中的第一曝光时间,确定所述第一曝光对应的第一帧率;
其中,所述第一帧率高于第二曝光对应的第二帧率;
所述第一生成模块202包括:
配置子模块,用于对所述第一控制通路配置所述第一帧率和所述第一曝 光参数;
生成子模块,用于采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生成多帧第一图像;
所述对焦模块203,还用于基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦。
可选地,所述装置还包括:
第二生成模块,用于采用所述第二控制通路配置的所述第二曝光参数和所述第二帧率,对所述图像传感器中的所述成像像素进行第二曝光,生成第二图像。
可选地,所述装置还包括:
第二确定模块,用于根据所述第一控制通路的带宽,确定所述候选PD像素的第一数据量;
第一配置模块,用于对所述第一控制通路配置所述第一数据量;
第三确定模块,用于根据所述第二控制通路的带宽,确定所述成像像素的第二数据量;
第二配置模块,用于对所述第二控制通路配置所述第二数据量,其中,所述第二数据量为所述图像传感器中成像像素的数据量大小。
在本申请实施例中,可以修改图像传感器中成像像素与PD像素的参数控制方式,通过独立设置图像传感器中成像像素的曝光参数和PD像素的曝光参数,从而可以利用PD像素的曝光参数来对图像传感器中的PD像素进行曝光生成第一图像,并利用第一图像进行相位对焦,从而可以在利用成像像素的曝光参数确保成像质量的同时,提升PD对焦效果,优化PD性能,改善因成像像素的曝光参数与PD像素的曝光参数的配置不独立导致成像效果与PD对焦效果无法得到最优效果的问题。
本申请实施例中的图像处理装置可以是装置,也可以是终端中的部件、集成电路、或芯片。该装置可以是移动电子设备,也可以为非移动电子设备。示例性的,移动电子设备可以为手机、平板电脑、笔记本电脑、掌上电脑、车载电子设备、可穿戴设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本或者个人数字助理(personal digital assistant,PDA) 等,非移动电子设备可以为个人计算机(personal computer,PC)、电视机(television,TV)、柜员机或者自助机等,本申请实施例不作具体限定。
本申请实施例中的图像处理装置可以为具有操作***的装置。该操作***可以为安卓(Android)操作***,可以为iOS操作***,还可以为其他可能的操作***,本申请实施例不作具体限定。
本申请实施例提供的图像处理装置能够实现上述方法实施例实现的各个过程,为避免重复,这里不再赘述。
可选地,如图12所示,本申请实施例还提供一种电子设备2000,包括处理器2002,存储器2001,存储在存储器2001上并可在所述处理器2002上运行的程序或指令,该程序或指令被处理器2002执行时实现上述图像处理方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
需要注意的是,本申请实施例中的电子设备包括上述所述的移动电子设备和非移动电子设备。
图13为实现本申请实施例的一种电子设备的硬件结构示意图。
该电子设备1000包括但不限于:射频单元1001、网络模块1002、音频输出单元1003、输入单元1004、传感器1005、显示单元1006、用户输入单元1007、接口单元1008、存储器1009、以及处理器1010等部件。
本领域技术人员可以理解,电子设备1000还可以包括给各个部件供电的电源(比如电池),电源可以通过电源管理***与处理器1010逻辑相连,从而通过电源管理***实现管理充电、放电、以及功耗管理等功能。图13中示出的电子设备结构并不构成对电子设备的限定,电子设备可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置,在此不再赘述。
其中,传感器1005,用于根据拍摄场景和图像传感器中对焦区域内目标相位PD像素的亮度信息,确定第一曝光参数;采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括目标PD像素,所述第一曝光参数由所述第一控制通路控制;基于所述第一图像对所述拍摄场景进行相位对焦;
其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一 控制通路、所述第二控制通路均连接。
在本申请实施例中,可以修改图像传感器中成像像素与PD像素的参数控制方式,通过独立设置图像传感器中成像像素的曝光参数和PD像素的曝光参数,从而可以利用PD像素的曝光参数来对图像传感器中的PD像素进行曝光生成第一图像,并利用第一图像进行相位对焦,从而可以在利用成像像素的曝光参数确保成像质量的同时,提升PD对焦效果,优化PD性能,改善因成像像素的曝光参数与PD像素的曝光参数的配置不独立导致成像效果与PD对焦效果无法得到最优效果的问题。
可选地,传感器1005,用于确定与拍摄场景匹配的预设亮度条件;在所述图像传感器中对焦区域内的目标PD像素的亮度信息不符合所述预设亮度条件的情况下,确定与所述预设亮度条件匹配的第一曝光参数的取值,其中,所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件。
本申请实施例中,可以结合拍摄场景来灵活确定与该拍摄场景匹配的对焦区域内PD像素的预设亮度条件;那么在图像传感器中对焦区域内的目标PD像素(指代Sensor中当前对焦区域内的PD像素)的亮度信息不符合所述预设亮度条件的情况下,说明用于PD像素曝光的曝光参数的取值无法使得PD对焦性能最优,需要确定与所述预设亮度条件匹配的第一曝光参数的取值,而所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件,使得本申请实施例能够结合拍摄场景准确地确定出符合该拍摄场景的、且能够使PD对焦效果最优的用于曝光PD像素的第一曝光参数的取值,不仅提升了用于PD像素曝光的曝光参数的取值的准确度,而且能够随着拍摄场景的变化,来合理灵活地调整该取值,能够在各种拍摄场景下,优化PD对焦性能。
可选地,传感器1005,用于根据所述第一曝光参数中的第一曝光时间,确定所述第一曝光对应的第一帧率;其中,所述第一帧率高于第二曝光对应的第二帧率;对所述第一控制通路配置所述第一帧率和所述第一曝光参数;采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生成多帧第一图像;基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦。
本申请实施例中,由于可以将成像像素的帧率和PD像素的帧率进行独立控制,那么通过将两个帧率设置为不同帧率,具体可以将PD像素对应的第一帧率设置的高于成像像素的第二帧率,从而使得在采用较低的第二帧率对成像像素进行一次曝光的时间内,可以利用第一帧率来对PD像素进行多次曝光,从而对每输出的一帧PD像素的第一图像,就完成一次相位对焦,从而完成多次相位对焦,因此,通过将成像像素的帧率设置的与PD像素的帧率不同,具体可以为将PD像素设置为更高速的帧率,从而可以在输出一帧用于显示的图像(对成像像素进行曝光而生成的图像)的时长内,通过PD像素的高速帧率,快速完成多次PD对焦。
可选地,传感器1005,用于采用所述第二控制通路配置的所述第二曝光参数和所述第二帧率,对所述图像传感器中的所述成像像素进行第二曝光,生成第二图像。
在本申请实施例中,相比于传统技术中PD像素只可以在成像像素之后输出,本实施例中将PD像素和成像像素进行曝光的各自的帧率进行分离,使得PD像素可以在成像像素之前或之后进行曝光,这里为了解决利用PD像素进行相位对焦时所存在的画面拉伸变形的问题,由于PD像素和成像像素各自的帧率分离,因此,可以通过将PD像素的曝光时机,即时序,将PD像素的输出时机控制在不同帧的成像像素的图像的间隙内,这里,将该时序控制在成像像素的当前帧(当前帧第二图像)和上一帧(第二图像)之间的场消隐期内,使得移动镜头的过程的时序落在成像像素的曝光时序之前,那么在镜头移动之前PD像素已经完成了曝光,在镜头移动之后,成像像素完成了曝光,因此,镜头移动过程中,不存在曝光的图像。所以,不论是PD像素还是成像像素的曝光都不受移动镜头的影响,从而不会存在对焦时图像拉伸和变形的问题。
可选地,传感器1005,用于根据所述第一控制通路的带宽,确定所述候选PD像素的第一数据量;对所述第一控制通路配置所述第一数据量;根据所述第二控制通路的带宽,确定所述成像像素的第二数据量;对所述第二控制通路配置所述第二数据量,其中,所述第二数据量为所述图像传感器中成像像素的数据量大小。
本申请实施例中,可以将Sensor中PD像素和成像像素各自的数据量大 小进行分离来独立控制,从而可以根据所述图像传感器的第一控制通路和第二控制通路的带宽,来灵活地确定所需要传送的PD像素和成像像素的数据量大小,在带宽紧张的情况下,来降低像素(包括PD像素、成像像素)的数据量大小,从而减少传输带宽;在带宽空闲的情况下,来提升像素(PD像素、成像像素)的数据量大小,从而可以提升利用PD像素生成的第一图像的图像精度,使得利用第一图像进行相位对焦时,对焦精度提升,对焦效果好,以及提升成像像素的第二图像的图像质量。
应理解的是,本申请实施例中,输入单元1004可以包括图形处理器(Graphics Processing Unit,GPU)10041和麦克风10042,图形处理器10041对在视频捕获模式或图像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。显示单元1006可包括显示面板10061,可以采用液晶显示器、有机发光二极管等形式来配置显示面板10061。用户输入单元1007包括触控面板10071以及其他输入设备10072。触控面板10071,也称为触摸屏。触控面板10071可包括触摸检测装置和触摸控制器两个部分。其他输入设备10072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。存储器1009可用于存储软件程序以及各种数据,包括但不限于应用程序和操作***。处理器1010可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作***、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器1010中。
本申请实施例还提供一种可读存储介质,所述可读存储介质上存储有程序或指令,该程序或指令被处理器执行时实现上述图像处理方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
其中,所述处理器为上述实施例中所述的电子设备中的处理器。所述可读存储介质,包括计算机可读存储介质,如计算机只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等。
本申请实施例另提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现上述图像处理方法实施例的各个过程,且能达到相同的技术效果,为避免重复, 这里不再赘述。
应理解,本申请实施例提到的芯片还可以称为***级芯片、***芯片、芯片***或片上***芯片等。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。此外,需要指出的是,本申请实施方式中的方法和装置的范围不限按示出或讨论的顺序来执行功能,还可包括根据所涉及的功能按基本同时的方式或按相反的顺序来执行功能,例如,可以按不同于所描述的次序来执行所描述的方法,并且还可以添加、省去、或组合各种步骤。另外,参照某些示例所描述的特征可在其他示例中被组合。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以计算机软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,或者网络设备等)执行本申请各个实施例所述的方法。
上面结合附图对本申请的实施例进行了描述,但是本申请并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本申请的启示下,在不脱离本申请宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本申请的保护之内。

Claims (15)

  1. 一种图像处理方法,其中,所述方法包括:
    根据拍摄场景和图像传感器中对焦区域内目标相位PD像素的亮度信息,确定第一曝光参数;
    采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括目标PD像素,所述第一曝光参数由所述第一控制通路控制;
    基于所述第一图像对所述拍摄场景进行相位对焦;
    其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、所述第二控制通路均连接。
  2. 根据权利要求1所述的方法,其中,所述根据拍摄场景和图像传感器中对焦区域内目标相位PD像素的亮度信息,确定第一曝光参数,包括:
    确定与拍摄场景匹配的预设亮度条件;
    在所述图像传感器中对焦区域内的目标PD像素的亮度信息不符合所述预设亮度条件的情况下,确定与所述预设亮度条件匹配的第一曝光参数的取值,其中,所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件。
  3. 根据权利要求1所述的方法,其中,
    所述采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像之前,所述方法还包括:
    根据所述第一曝光参数中的第一曝光时间,确定所述第一曝光对应的第一帧率;
    其中,所述第一帧率高于第二曝光对应的第二帧率;
    所述采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,包括:
    对所述第一控制通路配置所述第一帧率和所述第一曝光参数;
    采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生成多帧第一图像;
    所述基于所述第一图像对所述拍摄场景进行相位对焦,包括:
    基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦。
  4. 根据权利要求3所述的方法,其中,
    所述基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦之后,所述方法还包括:
    采用所述第二控制通路配置的所述第二曝光参数和所述第二帧率,对所述图像传感器中的所述成像像素进行第二曝光,生成第二图像。
  5. 根据权利要求1所述的方法,其中,所述采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像之前,所述方法还包括:
    根据所述第一控制通路的带宽,确定所述候选PD像素的第一数据量;
    对所述第一控制通路配置所述第一数据量;
    根据所述第二控制通路的带宽,确定所述成像像素的第二数据量;
    对所述第二控制通路配置所述第二数据量,其中,所述第二数据量为所述图像传感器中成像像素的数据量大小。
  6. 一种图像处理装置,其中,所述装置包括:
    第一确定模块,用于根据拍摄场景和图像传感器中对焦区域内目标PD像素的亮度信息,确定第一曝光参数;
    第一生成模块,用于采用第一控制通路配置的所述第一曝光参数对所述图像传感器中的候选PD像素进行第一曝光,生成第一图像,其中,所述候选PD像素为所述图像传感器中用于相位对焦的像素,所述候选PD像素包括所述目标PD像素,所述第一曝光参数由所述第一控制通路控制;
    对焦模块,用于基于所述第一图像对所述拍摄场景进行相位对焦;
    其中,所述图像传感器中成像像素的第二曝光参数由第二控制通路控制,所述第一控制通路与所述第二控制通路不同,且所述图像传感器与所述第一控制通路、所述第二控制通路均连接。
  7. 根据权利要求6所述的装置,其中,所述第一确定模块包括:
    第一确定子模块,用于确定与拍摄场景匹配的预设亮度条件;
    第二确定子模块,用于在所述图像传感器中对焦区域内的目标PD像素的 亮度信息不符合所述预设亮度条件的情况下,确定与所述预设亮度条件匹配的第一曝光参数的取值,其中,所述第一曝光参数的取值,用于使得经所述第一曝光后的所述目标PD像素的亮度信息符合所述预设亮度条件。
  8. 根据权利要求6所述的装置,其中,所述装置还包括:
    第二确定模块,用于根据所述第一曝光参数中的第一曝光时间,确定所述第一曝光对应的第一帧率;
    其中,所述第一帧率高于第二曝光对应的第二帧率;
    所述第一生成模块包括:
    配置子模块,用于对所述第一控制通路配置所述第一帧率和所述第一曝光参数;
    生成子模块,用于采用所述第一控制通路配置的所述第一帧率和所述第一曝光参数,对所述图像传感器中的候选PD像素进行第一曝光,生成多帧第一图像;
    所述对焦模块,还用于基于所述多帧第一图像中的每帧第一图像,对所述拍摄场景进行多次相位对焦。
  9. 根据权利要求8所述的装置,其中,所述装置还包括:
    第二生成模块,用于采用所述第二控制通路配置的所述第二曝光参数和所述第二帧率,对所述图像传感器中的所述成像像素进行第二曝光,生成第二图像。
  10. 根据权利要求6所述的装置,其中,所述装置还包括:
    第二确定模块,用于根据所述第一控制通路的带宽,确定所述候选PD像素的第一数据量;
    第一配置模块,用于对所述第一控制通路配置所述第一数据量;
    第三确定模块,用于根据所述第二控制通路的带宽,确定所述成像像素的第二数据量;
    第二配置模块,用于对所述第二控制通路配置所述第二数据量,其中,所述第二数据量为所述图像传感器中成像像素的数据量大小。
  11. 一种电子设备,其中,包括处理器,存储器及存储在所述存储器上并可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如权利要求1至5中任意一项所述的图像处理方法的步骤。
  12. 一种可读存储介质,其中,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如权利要求1至5中任意一项所述的图像处理方法的步骤。
  13. 一种芯片,其中,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如权利要求1至5中任意一项所述的图像处理方法。
  14. 一种计算机程序产品,其特征在于,所述程序产品被存储在非易失的存储介质中,所述程序产品被至少一个处理器执行以实现如权利要求1至5中任意一项所述的图像处理方法。
  15. 一种图像处理装置,其特征在于,所述装置被配置成用于执行如权利要求1至5中任意一项所述的图像处理方法。
PCT/CN2022/112970 2021-08-19 2022-08-17 图像处理方法、装置、电子设备及可读存储介质 WO2023020527A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110953703.6 2021-08-19
CN202110953703.6A CN113660425B (zh) 2021-08-19 2021-08-19 图像处理方法、装置、电子设备及可读存储介质

Publications (1)

Publication Number Publication Date
WO2023020527A1 true WO2023020527A1 (zh) 2023-02-23

Family

ID=78492326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/112970 WO2023020527A1 (zh) 2021-08-19 2022-08-17 图像处理方法、装置、电子设备及可读存储介质

Country Status (2)

Country Link
CN (1) CN113660425B (zh)
WO (1) WO2023020527A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113660425B (zh) * 2021-08-19 2023-08-22 维沃移动通信(杭州)有限公司 图像处理方法、装置、电子设备及可读存储介质
CN114554086B (zh) * 2022-02-10 2024-06-25 支付宝(杭州)信息技术有限公司 一种辅助拍摄方法、装置及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247477A (zh) * 2007-02-16 2008-08-20 佳能株式会社 摄像设备及其控制方法
EP3098638A1 (en) * 2015-05-29 2016-11-30 Phase One A/S Adaptive autofocusing system
CN108322651A (zh) * 2018-02-11 2018-07-24 广东欧珀移动通信有限公司 拍摄方法和装置、电子设备、计算机可读存储介质
CN111586323A (zh) * 2020-05-07 2020-08-25 Oppo广东移动通信有限公司 图像传感器、控制方法、摄像头组件和移动终端
CN113660425A (zh) * 2021-08-19 2021-11-16 维沃移动通信(杭州)有限公司 图像处理方法、装置、电子设备及可读存储介质

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030193594A1 (en) * 2002-04-16 2003-10-16 Tay Hiok Nam Image sensor with processor controlled integration time
CN104755981B (zh) * 2012-11-14 2017-04-12 富士胶片株式会社 图像处理装置、摄像装置及图像处理方法
CN107948519B (zh) * 2017-11-30 2020-03-27 Oppo广东移动通信有限公司 图像处理方法、装置及设备
CN109005342A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 全景拍摄方法、装置和成像设备
CN108683863B (zh) * 2018-08-13 2020-01-10 Oppo广东移动通信有限公司 成像控制方法、装置、电子设备以及可读存储介质
CN109040609B (zh) * 2018-08-22 2021-04-09 Oppo广东移动通信有限公司 曝光控制方法、装置、电子设备和计算机可读存储介质
CN110278375B (zh) * 2019-06-28 2021-06-15 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN110381263B (zh) * 2019-08-20 2021-04-13 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101247477A (zh) * 2007-02-16 2008-08-20 佳能株式会社 摄像设备及其控制方法
EP3098638A1 (en) * 2015-05-29 2016-11-30 Phase One A/S Adaptive autofocusing system
CN108322651A (zh) * 2018-02-11 2018-07-24 广东欧珀移动通信有限公司 拍摄方法和装置、电子设备、计算机可读存储介质
CN111586323A (zh) * 2020-05-07 2020-08-25 Oppo广东移动通信有限公司 图像传感器、控制方法、摄像头组件和移动终端
CN113660425A (zh) * 2021-08-19 2021-11-16 维沃移动通信(杭州)有限公司 图像处理方法、装置、电子设备及可读存储介质

Also Published As

Publication number Publication date
CN113660425A (zh) 2021-11-16
CN113660425B (zh) 2023-08-22

Similar Documents

Publication Publication Date Title
WO2023020527A1 (zh) 图像处理方法、装置、电子设备及可读存储介质
US11470294B2 (en) Method, device, and storage medium for converting image from raw format to RGB format
CN108307125B (zh) 一种图像采集方法、装置和存储介质
US10616511B2 (en) Method and system of camera control and image processing with a multi-frame-based window for image data statistics
WO2015025740A1 (ja) 制御装置、制御方法、および電子機器
CN110958401B (zh) 一种超级夜景图像颜色校正方法、装置和电子设备
US11768423B2 (en) Image acquisition apparatus, electronic device, image acquisition method and storage medium
WO2020034702A1 (zh) 控制方法、装置、电子设备和计算机可读存储介质
WO2023077939A1 (zh) 摄像头的切换方法、装置、电子设备及存储介质
CN108702460B (zh) 使用显示器光来改善前置相机性能
EP3836532A1 (en) Control method and apparatus, electronic device, and computer readable storage medium
CN109547699A (zh) 一种拍照的方法及装置
KR20220064170A (ko) 이미지 센서를 포함하는 전자 장치 및 그 동작 방법
WO2023020532A1 (zh) 图像处理方法、装置、电子设备及可读存储介质
WO2021179142A1 (zh) 一种图像处理方法及相关装置
CN111835941B (zh) 图像生成方法及装置、电子设备、计算机可读存储介质
CN112437237A (zh) 拍摄方法及装置
WO2022262848A1 (zh) 图像处理方法、装置和电子设备
US12022214B2 (en) Sensitivity-biased pixels
WO2022042753A1 (zh) 拍摄方法、装置及电子设备
US9288461B2 (en) Apparatus and method for processing image, and computer-readable storage medium
JP2014179781A (ja) 撮像ユニット、撮像装置および撮像制御プログラム
CN113760077A (zh) 移动终端、移动终端的温度控制方法、装置及存储介质
JP2006332722A (ja) 画像処理装置、撮像装置、および画像処理方法
CN117956296A (zh) 视频拍摄方法及其装置

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE