CN115170676A - Sensor calibration and image correction method and device, electronic equipment and storage medium - Google Patents

Sensor calibration and image correction method and device, electronic equipment and storage medium

Info

Publication number
CN115170676A
Authority
CN
China
Prior art keywords
sensor
image
pixel
calibration
compensation value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210879467.2A
Other languages
Chinese (zh)
Inventor
唐城
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202210879467.2A
Publication of CN115170676A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Input (AREA)

Abstract

The application relates to a sensor calibration method and apparatus, a computer device, a storage medium and a computer program product. The method comprises the following steps: acquiring calibration images shot by each sensor; obtaining a first image corresponding to a sensor according to the calibration images shot by that sensor; obtaining a second image according to the first images corresponding to the respective sensors; and, for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image. The method can mask noise other than the PRNU to a great extent, so that the light response non-uniformity compensation value of the sensor can be accurately determined from the first image and the second image corresponding to the sensor.

Description

Sensor calibration and image correction method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of imaging technologies, and in particular, to a method and an apparatus for calibrating a sensor and correcting an image, an electronic device, and a computer-readable storage medium.
Background
With the rapid popularization of electronic devices such as smartphones and tablet computers, electronic devices have become an indispensable part of daily life. One important function of electronic devices is photographing, and with the development of photographing technology, people's requirements for the imaging quality of captured images are increasingly high.
To achieve better imaging quality and provide more dynamic range and image detail, the FWC (Full Well Capacity) of imaging sensors also keeps increasing. Other conditions being equal, the larger the FWC, the more electrons can be received, the stronger the electrical signal, the higher the signal-to-noise ratio, and the better the image quality.
However, as the FWC increases, the noise generated by the imaging sensor due to PRNU (Photo Response Non-Uniformity) also becomes larger.
Disclosure of Invention
The embodiment of the application provides a sensor calibration and image correction method, a sensor calibration and image correction device, electronic equipment and a computer readable storage medium, which can reduce noise generated by a PRNU.
In a first aspect, the present application provides a sensor calibration method. The method comprises the following steps:
acquiring a calibration image shot by each sensor;
obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
obtaining a second image according to the first image corresponding to each sensor;
for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image.
In a second aspect, the application further provides a sensor calibration device. The device comprises:
the calibration image acquisition module is used for acquiring calibration images shot by each sensor;
the first data processing module is used for obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
the second data processing module is used for obtaining a second image according to the first image corresponding to each sensor;
and the compensation data determining module is used for determining a light response nonuniformity compensation value of each sensor according to the first image and the second image corresponding to the sensor.
In a third aspect, the application also provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program:
acquiring a calibration image shot by each sensor;
obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
obtaining a second image according to the first image corresponding to each sensor;
for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image.
In a fourth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of:
acquiring calibration images shot by each sensor;
obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
obtaining a second image according to the first image corresponding to each sensor;
for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image.
In a fifth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of:
acquiring a calibration image shot by each sensor;
obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
obtaining a second image according to the first image corresponding to each sensor;
for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image.
According to the above sensor calibration method, sensor calibration device, electronic equipment, readable storage medium and computer program product, the calibration images shot by the sensors are obtained, a first image corresponding to each sensor is obtained according to the calibration images shot by that sensor, a second image is obtained according to the first images corresponding to the respective sensors, and the optical response nonuniformity compensation value of each sensor is determined according to the first image corresponding to the sensor and the second image. In the method and the device, for the obtained calibration images shot by the sensors, the calibration images shot by the same sensor are processed to obtain the first image, and the first images corresponding to the respective sensors are then processed to obtain the second image, so that noise other than the PRNU can be masked to a great extent, and the optical response nonuniformity compensation value of the sensor can then be accurately determined according to the first image and the second image corresponding to the sensor.
In a sixth aspect, the present application provides an image correction method. The method comprises the following steps:
acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
when the sensor identification is the same as the calibration sensor identification, correcting an initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image;
when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting an initial image acquired by the sensor according to the default compensation data to obtain a target image;
sending a calibration data updating request to a server according to the sensor identifier;
receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In a seventh aspect, the present application further provides an image correction apparatus. The device comprises:
the calibration data acquisition module is used for acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
the first correction module is used for correcting the initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier to obtain a target image when the sensor identifier is the same as the calibrated sensor identifier;
the second correction module is used for acquiring default compensation data when the sensor identifier is different from the calibrated sensor identifier, and correcting the initial image acquired by the sensor according to the default compensation data to obtain a target image;
the updating request sending module is used for sending a calibration data updating request to a server according to the sensor identifier;
the updating data receiving module is used for receiving the optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In an eighth aspect, the present application further provides a computer device. The computer device comprises a memory storing a computer program and a processor implementing the following steps when executing the computer program:
acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
when the sensor identification is the same as the calibration sensor identification, correcting an initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image;
when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting an initial image acquired by the sensor according to the default compensation data to obtain a target image;
sending a calibration data updating request to a server according to the sensor identifier;
receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In a ninth aspect, the present application further provides a computer-readable storage medium. The computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
when the sensor identification is the same as the calibration sensor identification, correcting the initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image;
when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting an initial image acquired by the sensor according to the default compensation data to obtain a target image;
sending a calibration data updating request to a server according to the sensor identifier;
receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In a tenth aspect, the present application further provides a computer program product. The computer program product comprising a computer program which when executed by a processor performs the steps of:
acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
when the sensor identification is the same as the calibration sensor identification, correcting the initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image;
when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting an initial image acquired by the sensor according to the default compensation data to obtain a target image;
sending a calibration data updating request to a server according to the sensor identifier;
receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
According to the above image correction method, image correction device, electronic equipment, readable storage medium and computer program product, a sensor identifier and sensor calibration data are obtained, wherein the sensor calibration data comprise a calibrated sensor identifier and an optical response nonuniformity compensation value corresponding to the calibrated sensor identifier; when the sensor identifier is the same as the calibrated sensor identifier, an initial image acquired by the sensor is corrected according to the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier to obtain a target image; when the sensor identifier is different from the calibrated sensor identifier, default compensation data are acquired and the initial image acquired by the sensor is corrected according to the default compensation data to obtain a target image; a calibration data update request is sent to a server according to the sensor identifier; and an optical response nonuniformity compensation value which is sent by the server according to the calibration data update request and matches the sensor identifier is received and stored. The optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are both obtained according to the sensor calibration method. In the method and the device, the optical response nonuniformity compensation value corresponding to the sensor identifier can be obtained in time, either from local storage or from the server, and the initial image acquired by the sensor is corrected using an accurate optical response nonuniformity compensation value, so that the image quality of the target image can be better improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a graph illustrating a comparison of signal-to-noise ratios for different PRNUs in one embodiment;
FIG. 2 is a diagram of an exemplary sensor calibration method and image correction method;
FIG. 3 is a flow chart of a method for sensor calibration in one embodiment;
FIG. 4 is a diagram illustrating the capture of calibration images in one embodiment;
FIG. 5 is a flow diagram of step 304 in one embodiment;
FIG. 6 is a flow chart of step 306 in one embodiment;
FIG. 7 is a flowchart of step 308 in one embodiment;
FIG. 8 is a flow diagram of step 702 in one embodiment;
FIG. 9 is a flowchart of step 704 in one embodiment;
FIG. 10 is a flow chart of a method of image correction in one embodiment;
FIG. 11 is a flowchart of step 1004 in one embodiment;
FIG. 12 is a flow chart of a method for sensor calibration in another embodiment;
FIG. 13 is a flow chart of a method for sensor calibration in another embodiment;
FIG. 14 is a flowchart of an image correction method in another embodiment;
FIG. 15 is a block diagram showing the structure of a sensor calibration apparatus according to an embodiment;
FIG. 16 is a block diagram showing the structure of an image correction apparatus according to an embodiment;
FIG. 17 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Due to volume constraints, an electronic device cannot enlarge the photosensitive element or add corresponding hardware processing structures without limit to enhance imaging quality, so most traditional methods improve imaging quality by acquiring and fusing multi-frame images, which essentially works from two aspects: improving sharpness on the one hand, and improving the Signal-to-Noise Ratio (SNR) on the other. However, as the FWC of the sensor increases, the fixed-pattern noise Nfp caused by the PRNU also becomes a dominant noise source, which prevents the image quality from being further improved.
In one example, the SNR is calculated as shown in the following formulas (1) to (3):
SNR = 20 × log10(Signal / Noise_total)    formula (1)
Noise_total = √(Nrd² + Nps² + Nfp²)    formula (2)
Nfp = Signal × PRNU    formula (3)
where Signal represents the signal value of the image; Noise_total represents the total noise value of the image, which comprises Nrd (readout noise), Nps (photon noise) and Nfp (fixed-pattern noise); and PRNU represents the photo response non-uniformity of the sensor.
Note that the readout noise Nrd is the error introduced when reading out the amount of detected photons; the photon noise Nps results from the deviation between the actual number of photons reaching the sensor and the theoretical number; the fixed-pattern noise Nfp, typically represented by the variation of single-pixel output signals under uniform illumination conditions, is mainly caused by the PRNU in the highlight regions of the image. Since each photodiode of the photosensitive device is configured with an ADC (Analog-to-Digital Converter) amplifier, a sensor with a million pixels requires more than a million ADC amplifiers. Although the product is manufactured uniformly, deviations in size, doping concentration, dust introduced in the manufacturing process, and parameters of the field-effect transistor (MOS) in each pixel structure may cause variations in the pixel output signals. Other conditions being equal, the image signal-to-noise ratio SNR for PRNU values of 1% and 2% is shown in fig. 1. It can be seen from fig. 1 that, as the luminance increases, the difference between the SNR corresponding to 1% and the SNR corresponding to 2% gradually grows; that is, in highlight areas of the image, the PRNU has a large influence on the SNR. Therefore, it is necessary to compensate for the sensor's PRNU.
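As an illustration of formulas (1) to (3), the following Python sketch computes the SNR for two PRNU levels; the root-sum-square combination of the noise terms follows formula (2) above, and all numeric values (read noise, signal range) are assumptions made purely for this example.

```python
import numpy as np

def snr_db(signal, read_noise, prnu):
    """Illustrative SNR in dB per formulas (1)-(3); total noise combined as root-sum-square."""
    nps = np.sqrt(signal)                                    # photon noise grows with sqrt(signal)
    nfp = signal * prnu                                      # fixed-pattern noise, formula (3)
    noise_total = np.sqrt(read_noise**2 + nps**2 + nfp**2)   # formula (2)
    return 20 * np.log10(signal / noise_total)               # formula (1)

signal = np.linspace(100, 50000, 6)                # electrons; illustrative range up to a large FWC
print(snr_db(signal, read_noise=2.0, prnu=0.01))   # PRNU = 1%
print(snr_db(signal, read_noise=2.0, prnu=0.02))   # PRNU = 2%: noticeably lower SNR at high signal
```

As in fig. 1, the gap between the two curves widens as the signal grows, since the PRNU-driven term dominates at high brightness.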
The sensor calibration method provided by the embodiment of the application can be applied to an application environment as shown in fig. 2, wherein the terminal 202 communicates with the server 204 through a network. The data storage system may store data that the server 204 needs to process. The data storage system may be integrated on the server 204, or may be placed on the cloud or other network server. The calibration images shot by the sensors can be obtained through the terminal 202 and then sent to the server 204, and the server 204 obtains a first image corresponding to the sensor according to the calibration images shot by the same sensor; obtaining a second image according to the first image corresponding to each sensor; for each sensor, a light response non-uniformity compensation value for the sensor is determined from the first and second images for the sensor. The terminal 202 may be, but not limited to, various personal computers, notebook computers, smart phones, tablet computers, cameras, internet of things devices, and portable wearable devices, and the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart car-mounted devices, and the like. The portable wearable device can be a smart watch, a smart bracelet, a head-mounted device, and the like. The server 204 may be implemented as a stand-alone server or a server cluster composed of a plurality of servers.
It should be noted that the sensor calibration method provided in the embodiment of the present application is not limited to the application environment shown in fig. 2, and may also be applied to the terminal 202 alone or applied to the server 204 alone.
The image correction method provided by the embodiment of the application can also be applied to the application environment shown in fig. 2. The terminal 202 acquires a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier; when the sensor identification is the same as the calibration sensor identification, correcting the initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image; when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting the initial image acquired by the sensor according to the default compensation data to obtain a target image; the terminal 202 sends a calibration data updating request to the server 204 according to the sensor identifier; the terminal 202 receives the optical response nonuniformity compensation value which is sent by the server 204 according to the calibration data update request and is matched with the sensor identifier, and stores the optical response nonuniformity compensation value; the optical response nonuniformity compensation value corresponding to the calibration sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In one embodiment, as shown in fig. 3, a sensor calibration method is provided, which is described by taking the method as an example applied to the server in fig. 2, and includes the following steps 302 to 308.
Step 302, obtaining calibration images shot by each sensor.
In this embodiment, the terminal may send the calibration image shot by each sensor to the server, and the server obtains the calibration image shot by each sensor sent by the terminal. The calibration image may be an image obtained by shooting the calibration plate by the sensor, or an image obtained by shooting a smooth plane by the sensor, for example, a wall surface, an open space, or the like. In this embodiment, each sensor faces the same scene at the same position and captures the corresponding calibration image under the same condition, that is, the sensors are different and the other factors are the same. Typically, the sensors to be calibrated are multiple sensors of the same model and produced in the same batch. Each sensor can shoot a plurality of frames of calibration images.
In one example, a single sensor is calibrated at a time; however, since a bare sensor cannot directly perform a shooting test, it usually needs to be assembled into a corresponding sensor module for shooting, that is, configured with corresponding shooting accessories such as a lens and a motor, which facilitates capturing the calibration images.
In one example, the calibration images captured by the respective sensors can be obtained through a capturing schematic diagram as shown in fig. 4. The sensor module 402 corresponding to the sensor is shot facing the calibration plate 404 with the positioning mark, and the calibration image corresponding to the sensor is obtained. The other sensors are placed at the same position, i.e. facing the calibration plate 404 at the same angle, so as to obtain calibration images corresponding to the other sensors.
And 304, obtaining a first image corresponding to the sensor according to the calibration image shot by the same sensor.
And the server obtains a first image corresponding to the sensor according to the calibration image shot by the same sensor. In this embodiment, each sensor has a unique sensor identifier, and the server may obtain the first image corresponding to the sensor identifier according to the calibration image corresponding to the same sensor identifier.
Optionally, the first image corresponding to the sensor identifier may be obtained according to multiple frames of calibration images corresponding to the same sensor identifier. For example, the first image corresponding to the sensor identifier may be obtained according to an average value of pixel values of pixels in multiple frames of calibration images corresponding to the same sensor identifier.
And step 306, obtaining a second image according to the first image corresponding to each sensor.
And the server obtains a second image according to the first image corresponding to each sensor. Generally, each sensor corresponds to one frame of first image, a plurality of sensors correspond to a plurality of frames of first images, and one frame of second image is obtained according to the plurality of frames of first images corresponding to the plurality of sensors.
Alternatively, the second image may be obtained according to an average value of pixel values of pixels in multiple frames of the first image corresponding to each sensor. Wherein the second image may characterize a pixel average of each sensor, the pixel average comprising a pixel signal value and a noise value, and thus the second image may eliminate random noise of the first image.
And 308, determining a light response nonuniformity compensation value of each sensor according to the first image and the second image corresponding to the sensor.
For each sensor, the server determines a light response non-uniformity compensation value for the sensor from the first and second images corresponding to the sensor. Among these, photo response non-uniformity (PRNU) is a source of digital camera mode noise, which is seen as a change in pixel responsivity under illumination on a photosensitive element. The optical response nonuniformity compensation value is data for compensating the optical response nonuniformity of the sensor, and the influence of the optical response nonuniformity on the imaging image quality of the sensor can be avoided after the optical response nonuniformity compensation value compensates the optical response nonuniformity of the sensor.
Alternatively, a third image corresponding to sensor PRNU noise may be derived from the first and second images corresponding to the sensor, and then an optical response non-uniformity compensation value for the sensor may be determined from the first and third images. For example, a third image corresponding to sensor PRNU noise is obtained from the difference between the first image and the second image corresponding to the sensor. Alternatively, the pixel value difference of the pixels at the same position in the first image and the second image may be used as the pixel value of the pixel at the corresponding position in the third image, and the third image corresponding to the sensor PRNU noise may be obtained according to the pixel value of the pixel at each position in the third image.
Alternatively, the light response non-uniformity compensation value of the sensor may be determined from the ratio of the third image to the first image. For example, the optical response non-uniformity compensation value corresponding to the corresponding pixel position is determined according to the ratio of the pixel in the third image to the pixel at the corresponding position in the first image, and the optical response non-uniformity compensation value corresponding to the sensor is determined according to the optical response non-uniformity compensation value corresponding to each pixel position.
In this embodiment, when processing the calibration images, the calibration images captured by the respective sensors are placed in the same coordinate system, and the number of pixels and the distribution of pixels included in the calibration images captured by the respective sensors are the same. The positions of the pixels in the calibration images are usually described using coordinates in a coordinate system, and it can be understood that the corresponding coordinates of the pixels at the same positions in different calibration images are the same.
In the sensor calibration method, calibration images shot by each sensor are obtained, a first image corresponding to the sensor is obtained according to the calibration images shot by the same sensor, a second image is obtained according to the first image corresponding to each sensor, and for each sensor, a light response nonuniformity compensation value of the sensor is determined according to the first image and the second image corresponding to the sensor. In the embodiment, for the obtained calibration images shot by each sensor, the calibration images shot by the same sensor are processed to obtain the first image, and then the first images corresponding to each sensor are processed to obtain the second image, so that noises except for the PRNU can be shielded to a great extent, and further, the optical response nonuniformity compensation value of the sensor can be accurately determined according to the first image and the second image corresponding to the sensor.
In some embodiments, as shown in fig. 5, the step 304 of obtaining a first image corresponding to the sensor according to the calibration image captured by the same sensor includes the following steps 502 to 504.
Step 502, taking the average value of the pixel values of the pixels at the same position in the calibration image shot by the same sensor as the pixel value of the pixel at the corresponding position in the first image.
And regarding a plurality of frames of calibration images shot by the same sensor, taking the average value of the pixel values of the pixels at the same position in the plurality of frames of calibration images as the pixel value of the pixel at the corresponding position in the first image. In one example, the same sensor a captures 10 frames of calibration images, the pixel value average value of the first pixel of each frame of image in the 10 frames of images is used as the pixel value of the first pixel in the first image, the pixel value average value of the second pixel of each frame of image in the 10 frames of images is used as the pixel value of the second pixel at the corresponding position in the first image, and the pixel values of the other pixels in the first image corresponding to the sensor a are obtained by analogy.
In one example, the pixel values of the pixels in the first image may be obtained according to the following formula (4):
image_average_i = (p_1_i + p_2_i + … + p_m_i) / m    formula (4)
In formula (4), m represents the number of frames of calibration images, p_k_i (k = 1, …, m) represents the pixel value of the ith pixel in the kth frame of calibration image, and image_average_i represents the pixel value of the ith pixel in the first image.
Step 504, obtaining a first image according to the pixel values of the pixels at each position in the first image.
After the pixel values of the pixels at the respective positions in the first image are obtained, the first image is obtained from the pixel values of all the pixels. Because the calibration images contain readout noise Nrd and photon noise Nps, and photon noise is random noise, averaging theoretically reduces the photon noise to 1/√m of its original value, where m represents the number of frames of calibration images; photon noise can therefore be effectively reduced, and the readout noise is also reduced.
In this embodiment, the average value of the pixel values of the pixels at the same position in the calibration image captured by the same sensor is used as the pixel value of the pixel at the corresponding position in the first image, so as to obtain the pixel values of all the pixels in the first image, and then obtain the first image.
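A minimal sketch of this per-pixel averaging (formula (4)) is given below, assuming the m calibration frames of one sensor are stacked into a single NumPy array; the array layout and names are illustrative only.

```python
import numpy as np

def first_image(calibration_frames: np.ndarray) -> np.ndarray:
    """calibration_frames: shape (m, H, W), the m calibration frames shot by one sensor."""
    # per-pixel mean over the m frames; random photon noise is reduced roughly by 1/sqrt(m)
    return calibration_frames.mean(axis=0)
```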
In some embodiments, as shown in fig. 6, the step 306 of obtaining the second image according to the first image corresponding to each sensor includes the following steps 602 to 604.
Step 602, taking the average value of the pixel values of the pixels at the same position in the first image corresponding to each sensor as the pixel value of the pixel at the corresponding position in the second image.
Each sensor corresponds to one frame of first image, a plurality of sensors correspond to a plurality of frames of first images, and the average value of the pixel values of the pixels at the same position in the plurality of frames of first images is used as the pixel value of the pixel at the corresponding position in the second image. Wherein, the plurality of sensors generally refers to the same type of sensors produced in the same batch.
In one example, the pixel values of the pixels in the second image may be obtained according to the following formula (5):
image_average_i' = (image_average_1_i + image_average_2_i + … + image_average_n_i) / n    formula (5)
In formula (5), n represents the number of sensors, i.e., the number of frames of first images, image_average_k_i (k = 1, …, n) represents the pixel value of the ith pixel in the kth frame of first image, and image_average_i' represents the pixel value of the ith pixel in the second image.
Step 604, obtaining a second image according to the pixel values of the pixels at each position in the second image.
After the pixel values of the pixels at the positions in the second image are obtained, the second image is obtained according to the pixel values of all the pixels.
In the sensor calibration method, the average value of the pixel values of the pixels at the same position in the first image corresponding to each sensor is used as the pixel value of the pixel at the corresponding position in the second image, so that the pixel values of all the pixels in the second image are obtained, noise caused by PRNU is reduced, only environmental noise remains in the second image, and thus, extraction of PRNU noise is facilitated, and the photoresponse non-uniformity compensation value of the sensor is determined more accurately.
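Correspondingly, a minimal sketch of formula (5) is shown below, assuming the n first images (one per same-batch sensor) are stacked into one array; names and shapes are assumptions.

```python
import numpy as np

def second_image(first_images: np.ndarray) -> np.ndarray:
    """first_images: shape (n, H, W), one first image per sensor of the same batch."""
    # per-pixel mean over the n sensors; each sensor's individual PRNU pattern is averaged out
    return first_images.mean(axis=0)
```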
In one embodiment, as shown in fig. 7, the step 308 of determining the optical response non-uniformity compensation value of the sensor according to the corresponding first image and second image of the sensor for each sensor includes the following steps 702 to 704.
Step 702, for each sensor, obtaining a third image corresponding to the sensor according to the first image and the second image corresponding to the sensor.
And aiming at each sensor, the server obtains a third image corresponding to the sensor according to the first image and the second image corresponding to the sensor. Wherein the third image can be understood as a noise model caused by the non-uniformity of the light response of the sensors, one sensor for each frame of the third image.
Optionally, a third image corresponding to the sensor may be obtained according to a difference between the first image and the second image corresponding to the sensor. Specifically, the pixel value difference of the pixels at the same position in the first image and the second image may be used as the pixel value of the pixel at the corresponding position in the third image, and the third image corresponding to the sensor may be obtained according to the pixel value of the pixel at each position in the third image.
In step 704, a photo-response non-uniformity compensation value of the sensor is determined based on the first image and the third image.
The server determines a light response non-uniformity compensation value of the sensor according to the first image and the third image. Optionally, the light response nonuniformity compensation value of the sensor corresponding to the first image is determined according to the ratio of the third image to the first image. For example, regarding a first target pixel in the first image, a pixel in the third image at the same position as the first target pixel is used as a second target pixel, a ratio of the second target pixel to the first target pixel is used as a light response unevenness compensation value corresponding to the first target pixel position, and the light response unevenness compensation value of the sensor is determined according to the light response unevenness compensation value corresponding to each first target pixel position. Wherein the first target pixel may be any one of the pixels in the first image.
In the sensor calibration method, the first image and the second image corresponding to each sensor are used for obtaining the third image of the sensor, and the third image represents a noise model caused by the light response nonuniformity of the sensor, so that the light response nonuniformity compensation value of the sensor can be determined more accurately according to the first image and the third image.
In some embodiments, as shown in fig. 8, the step 702 of obtaining the third image corresponding to the sensor according to the first image and the second image corresponding to the sensor includes the following steps 802 to 804.
Step 802, the pixel value difference of the pixels at the same position in the first image and the second image is used as the pixel value of the pixel at the corresponding position in the third image.
The server may use the pixel value difference of the pixel at the same position in the first image and the second image as the pixel value of the pixel at the corresponding position in the third image. That is, the pixels at the same position in the first image and the second image may be subjected to a difference, and the resulting difference may be used as the pixel value of the pixel at the same position in the third image.
Optionally, the absolute value of the pixel value difference of the pixels at the same position in the first image and the second image is used as the pixel value of the pixel at the corresponding position in the third image.
And step 804, obtaining a third image corresponding to the sensor according to the pixel values of the pixels at the positions in the third image.
After the pixel values of the pixels at the positions in the third image are obtained, that is, after the pixel values of all the pixels in the third image are obtained, the third image corresponding to the sensor is obtained from the pixel values of all the pixels.
According to the sensor calibration method, the pixel value difference of the pixels at the same position in the first image and the second image is used as the pixel value of the pixel at the corresponding position in the third image, the third image corresponding to the sensor is obtained according to the pixel value of the pixel at each position in the third image, and a noise model caused by the optical response nonuniformity represented by the third image can be accurately obtained, so that the optical response nonuniformity of the sensor can be accurately calibrated.
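A minimal sketch of steps 802 to 804 follows, assuming the third image is formed from the (optionally absolute) pixel-wise difference as described above; the function and parameter names are illustrative.

```python
import numpy as np

def third_image(first_img: np.ndarray, second_img: np.ndarray, use_abs: bool = True) -> np.ndarray:
    """Per-pixel PRNU noise model for one sensor: its first image minus the batch-wide second image."""
    diff = first_img.astype(np.float64) - second_img.astype(np.float64)
    return np.abs(diff) if use_abs else diff
```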
In some embodiments, as shown in fig. 9, the step 704 of determining the light response non-uniformity compensation value of the sensor from the first image and the third image may include the following steps 902 to 904.
Step 902, regarding a first target pixel in a first image, taking a pixel in a third image with the same position as the first target pixel as a second target pixel; and determining a photoresponse nonuniformity compensation value corresponding to the position of the first target pixel according to the first target pixel and the second target pixel.
In this embodiment, the server determines, for a first target pixel in the first image, a pixel in the third image at the same position as the first target pixel as a second target pixel, and determines, according to the first target pixel and the second target pixel, a light response nonuniformity compensation value corresponding to the first target pixel position. Specifically, a first target pixel in the first image is determined, where the first target pixel may be any one pixel in the first image, then a second target pixel in the third image, which is located at the same position as the first target pixel, is determined, and then a photoresponse nonuniformity compensation value corresponding to the first target pixel location is determined according to the first target pixel and the second target pixel.
Step 904, determining a compensation value of the optical response nonuniformity of the sensor according to the compensation value of the optical response nonuniformity corresponding to each first target pixel position.
After the light response nonuniformity compensation values corresponding to the first target pixels in the first image are obtained, the light response nonuniformity compensation values of the sensor are obtained according to the light response nonuniformity compensation values corresponding to all the first target pixels.
In this embodiment, the photoresponse nonuniformity compensation value corresponding to the position of the first target pixel is determined through the first target pixel in the first image and the second target pixel in the third image at the same position as the first target pixel, so as to obtain the photoresponse nonuniformity compensation value of the sensor, that is, the photoresponse nonuniformity compensation value of the sensor is determined through the photoresponse nonuniformity compensation value corresponding to each pixel, so that the photoresponse nonuniformity of the minimum unit pixel imaged by the sensor can be compensated, and the accuracy of the photoresponse nonuniformity compensation value of the sensor is greatly increased.
In some embodiments, determining the photoresponse nonuniformity compensation value corresponding to the first target pixel position from the first target pixel and the second target pixel in step 902 includes: and taking the ratio of the second target pixel to the first target pixel as the light response nonuniformity compensation value corresponding to the first target pixel position.
In this embodiment, the ratio of the second target pixel to the first target pixel may be used as the compensation value of the light response nonuniformity corresponding to the first target pixel position. Optionally, the ratio of the pixel value of the second target pixel to the pixel value of the first target pixel is used as the light response nonuniformity compensation value corresponding to the first target pixel position. In practical applications, the value used for the first target pixel may be the pixel value of that pixel in the first image, or may be the average pixel value of a plurality of pixels in the first image belonging to the same channel as the first target pixel, for example, the average pixel value of N × N same-channel pixels, such as N = 5.
It will be appreciated that one first target pixel location corresponds to one light response non-uniformity compensation value that compensates for the light response non-uniformity of the corresponding first target pixel location. The light response non-uniformity compensation values for different first target pixel locations may be the same or different.
In one example, the light response non-uniformity compensation value R corresponding to the first target pixel location can be obtained by the following formula (6):
R = p2 / p1    formula (6)
where p1 represents the pixel value of the first target pixel and p2 represents the pixel value of the second target pixel.
In this embodiment, the ratio of the second target pixel to the first target pixel is used as the light response nonuniformity compensation value corresponding to the first target pixel position, so that the ratio of the noise value and the signal value of the light response nonuniformity corresponding to the first target pixel position can be obtained, the light response nonuniformity compensation value corresponding to the first target pixel position can be more accurately reflected, and a more accurate light response nonuniformity compensation value of the sensor can be obtained.
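A minimal sketch of formula (6), including the optional N × N same-channel averaging of the first image mentioned above, is given below; a single-channel image and the use of SciPy's uniform filter are assumptions of this example, not part of the method itself.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def prnu_compensation(first_img: np.ndarray, third_img: np.ndarray, window=None) -> np.ndarray:
    """Per-pixel compensation value R = p2 / p1 (formula (6))."""
    denom = first_img.astype(np.float64)
    if window is not None:
        # optionally replace each first-image value by its local window x window mean (e.g. window = 5)
        denom = uniform_filter(denom, size=window)
    denom = np.maximum(denom, 1e-6)   # guard against division by zero in very dark pixels
    return third_img / denom
```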
In one embodiment, as shown in fig. 10, an image correction method is provided, which is described by taking the method as an example applied to the terminal in fig. 2, and includes the following steps 1002 to 1010.
Step 1002, obtaining a sensor identifier and sensor calibration data, where the sensor calibration data includes a calibrated sensor identifier and a photo-response non-uniformity compensation value corresponding to the calibrated sensor identifier.
When a terminal leaves a factory, sensor calibration data such as a sensor identifier in the terminal and an optical response nonuniformity compensation value corresponding to the sensor identifier are usually burned into a storage unit of the terminal, and after the terminal is started, the sensor calibration data are read from the storage unit, so that the sensor of the terminal is compensated, that is, an initial image acquired by the sensor is corrected.
The terminal can obtain a sensor identifier and sensor calibration data from a local place, wherein the sensor identifier is a sensor identifier in the terminal, the sensor calibration data comprises a calibration sensor identifier and an optical response nonuniformity compensation value corresponding to the calibration sensor identifier, and the calibration sensor identifier refers to an identifier corresponding to a sensor calibrated according to the sensor calibration method. In general, in order to enable the sensor calibration data to correct an initial image obtained by a sensor of the terminal in time and obtain a target image with better image quality, a sensor identifier of the terminal is the same as a calibration sensor identifier.
And 1004, when the sensor identifier is the same as the calibrated sensor identifier, correcting the initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier to obtain a target image.
When the terminal recognizes that the sensor identifier of the terminal is the same as the calibration sensor identifier, the optical response nonuniformity compensation value corresponding to the calibration sensor is matched with the sensor identifier of the terminal, and the initial image acquired by the sensor of the terminal is corrected according to the optical response nonuniformity compensation value corresponding to the calibration sensor identifier acquired locally, so that the target image is acquired.
Optionally, a part of pixels or all pixels of the initial image may be corrected according to the photoresponse nonuniformity compensation value corresponding to the calibration sensor identifier, so as to obtain the target image.
Alternatively, the pixel in the initial image and the light response nonuniformity compensation value corresponding to the pixel may be multiplied to obtain the pixel at the corresponding position in the target image.
Step 1006, when the sensor identifier is different from the calibrated sensor identifier, acquiring default compensation data, and correcting the initial image acquired by the sensor according to the default compensation data to obtain a target image.
And when the sensor identification of the terminal is different from the calibration sensor identification, acquiring default compensation data from the local, and correcting the initial image acquired by the sensor according to the default compensation data to obtain a target image. In this embodiment, the default compensation data may be stored in the terminal in advance, for example, in a shooting module of the terminal, and the default compensation data may be an average value of the optical response nonuniformity compensation values corresponding to the calibration sensors in the same batch, or an optical response nonuniformity compensation value obtained by processing according to the optical response nonuniformity compensation values corresponding to the calibration sensors.
Step 1008, sending a calibration data update request to the server according to the sensor identifier.
When the terminal identifies that its sensor identifier is different from the calibrated sensor identifier, it sends a calibration data update request to the server according to the sensor identifier. The calibration data update request is used to request from the server an optical response nonuniformity compensation value matching the sensor identifier, so that when the terminal starts the shooting application next time, the initial image acquired by the sensor of the terminal can be corrected using the optical response nonuniformity compensation value matching the sensor identifier to obtain a target image.
Step 1010, receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; the optical response nonuniformity compensation value corresponding to the calibration sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
The terminal receives the optical response nonuniformity compensation value which is sent by the server according to the calibration data update request and is matched with the sensor identifier, and stores the optical response nonuniformity compensation value, for example, the optical response nonuniformity compensation value can be stored in a storage unit of the terminal or a storage unit of the shooting module, and the like, and the storage location is not particularly limited in the present application. The optical response nonuniformity compensation value corresponding to the local calibration sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In this embodiment, the optical response nonuniformity compensation value corresponding to the sensor identifier can be obtained in time by means of local storage and server transmission, and the initial image obtained by the sensor is corrected by the accurate optical response nonuniformity compensation value, so that the image quality of the target image can be better improved.
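The branching among steps 1002 to 1010 can be sketched roughly as follows; the dictionary layout, the request_update callable and all key names are hypothetical and only illustrate the local-or-server lookup described above.

```python
def get_prnu_compensation(sensor_id, calibration_data, default_compensation, request_update):
    """Return the compensation data to use now, refreshing local calibration data if needed."""
    if sensor_id == calibration_data.get("calibrated_sensor_id"):
        return calibration_data["prnu_compensation"]     # local calibration data matches this sensor
    # identifiers differ: correct with default data now, and ask the server for matching data
    updated = request_update(sensor_id)                  # e.g. sends the calibration data update request
    if updated is not None:
        calibration_data["calibrated_sensor_id"] = sensor_id
        calibration_data["prnu_compensation"] = updated  # stored for the next start-up
    return default_compensation
```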
In some embodiments, as shown in fig. 11, the step 1004 of correcting the initial image acquired by the sensor according to the compensation value of the optical response non-uniformity corresponding to the calibrated sensor identifier to obtain the target image includes the following steps 1102 to 1106.
Step 1102, a first pixel larger than a preset brightness threshold in the initial image is obtained.
And selecting a pixel with the pixel brightness larger than a preset brightness threshold value as a first pixel aiming at the initial image acquired by the sensor. The first pixel may be all pixels in the initial image. The preset brightness threshold value can be different according to different using scenes of the sensor, that is, the preset brightness threshold values in different scenes can be different, and the preset brightness threshold value can be set according to actual processing requirements, so that the method is not further limited in the application.
Step 1104, correcting the first pixel according to the optical response nonuniformity compensation value corresponding to the first pixel position among the optical response nonuniformity compensation values, to obtain a corrected first pixel.
The optical response nonuniformity compensation value corresponding to the first pixel position is selected from the optical response nonuniformity compensation values corresponding to the calibration sensor identifier, and the first pixel is corrected accordingly to obtain the corrected first pixel. Optionally, the product of the photoresponse nonuniformity compensation value corresponding to the first pixel position and the pixel value of the first pixel may be taken as the pixel value of the corrected first pixel, thereby correcting the first pixel.
Step 1106, obtaining a target image according to the second pixel and the corrected first pixel; wherein the second pixel is a pixel in the initial image other than the first pixel.
The terminal obtains the target image from the second pixels and the corrected first pixels, where the second pixels are the pixels in the initial image other than the first pixels. It can be understood that the second pixels keep their original positions, the position of each corrected first pixel is the same as that of the corresponding first pixel, and the target image is obtained by placing the second pixels and the corrected first pixels at their corresponding positions.
In this embodiment, part of the pixels in the initial image are selected as first pixels by means of the preset brightness threshold, and these brighter first pixels are corrected according to the photoresponse nonuniformity compensation values at the corresponding positions to obtain the corrected target image. This improves the correction accuracy of the initial image and reduces the computational complexity, so that a target image with better image quality can be obtained more quickly.
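A minimal sketch of steps 1102 to 1106 is given below, assuming the compensation values form a per-pixel map of the same size as the initial image; the threshold value of 400 is only an assumed example (borrowed from the 400 LSB figure used later in this description), not a required setting.

```python
import numpy as np

def correct_bright_pixels(initial_image, prnu_compensation, threshold=400):
    """Correct only the first pixels (brightness above the preset threshold) with
    the PRNU compensation value at the same position; the remaining (second)
    pixels keep their original values and positions."""
    target = initial_image.astype(np.float32).copy()
    mask = initial_image > threshold                       # first pixels
    target[mask] = target[mask] * prnu_compensation[mask]  # product as corrected pixel value
    return target
```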
In one embodiment, the image correction method is applied to a terminal as shown in fig. 12, and when the terminal is turned on for the first time, the sensor identifier of the terminal and the sensor calibration data stored in the terminal are obtained, where the sensor calibration data includes the calibrated sensor identifier and the PRNU compensation value corresponding to the calibrated sensor identifier. When the terminal identifies that the sensor identifier is the same as the calibration sensor identifier, the PRNU compensation value corresponding to the calibration sensor identifier is used for compensating the sensor, namely, an initial image acquired by the sensor is corrected to obtain a target image; and when the terminal identifies that the sensor identifier is different from the calibration sensor identifier, the PRNU of the sensor is compensated by using default compensation data, a calibration data updating request is sent to the server, and a PRNU compensation value which is sent according to the data updating request and is matched with the sensor identifier is received from the server and stored. When the terminal is started for the second time, the PRNU compensation value corresponding to the calibration sensor identifier which is consistent with the sensor identifier and stored in the terminal can be directly obtained, and then the PRNU of the sensor is compensated. In the embodiment, the sensor is replaced in the using process of the terminal, so that when the condition of the sensor identification is changed, the initial image acquired by the sensor can be corrected timely and accurately, and the target image with better image quality can be obtained.
In one example, the sensor calibration and image correction methods can be implemented by the following steps 1302 to 1314 and steps 1402 to 1410. A schematic flowchart of the sensor calibration method is shown in fig. 13, and a schematic flowchart of the image correction method is shown in fig. 14.
Step 1302, acquiring calibration images shot by each sensor under a preset condition.
The preset conditions in this embodiment may be as follows: color temperature: 6500 K (Kelvin); illuminance of the calibration plate light source: 400 lux; distance between the calibration plate and the sensor: 10-20 cm; focusing distance of the sensor: infinity; shutter time of the sensor: set so that the brightness of the captured calibration image is 800 LSB (least significant bit); and each sensor captures 20 frames of calibration images. Under these preset conditions, each sensor is assembled into a sensor module, that is, the sensor module also includes shooting accessories such as a lens and a motor, and all the sensor modules shoot the same calibration plate at the same position and the same angle to obtain the corresponding calibration images. The calibration plate is a uniform surface light source with a positioning mark.
Step 1304, preprocessing the acquired calibration images shot by the sensors.
For example, the calibration images are subjected to dead pixel correction (DPC); a dead pixel may refer to a point in the image output by the sensor under uniform illumination whose brightness is significantly different from its surroundings. In the case of a Bayer sensor, Bayer array compensation processing is also required for the calibration images. Any conventional preprocessing method can be used in this embodiment, which is not limited herein.
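One conventional dead-pixel correction, replacing pixels that deviate strongly from their local neighborhood with the local median, might look like the following sketch; the 3x3 window and the relative deviation threshold of 0.2 are assumed values, not requirements of this application.

```python
import numpy as np
from scipy.ndimage import median_filter

def dead_pixel_correction(image, deviation_threshold=0.2):
    """Replace pixels whose value differs from the 3x3 local median by more than
    the given relative threshold; one conventional DPC approach."""
    img = image.astype(np.float32)
    med = median_filter(img, size=3)
    bad = np.abs(img - med) > deviation_threshold * np.maximum(med, 1.0)
    corrected = img.copy()
    corrected[bad] = med[bad]                      # replace suspected dead pixels with the local median
    return corrected
```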
Step 1306, taking the average value of the pixel values of the pixels at the same position in the calibration image shot by the same sensor as the pixel value of the pixel at the corresponding position in the first image; and obtaining the first image according to the pixel value of the pixel at each position in the first image.
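Step 1306 is a simple temporal average over the frames of one sensor; a minimal sketch, assuming the 20 preprocessed frames are stacked in a NumPy array:

```python
import numpy as np

def first_image(frames):
    """frames: array-like of shape (num_frames, H, W) captured by one sensor.
    Averaging the frames pixel by pixel suppresses photon noise and read noise."""
    return np.mean(np.stack(frames), axis=0)
```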
Step 1308, align the calibration images corresponding to different sensors.
The central pixel of the calibration image can be determined according to the positioning mark on the calibration plate, so that the calibration images corresponding to different sensors are aligned, and the coordinates of the pixels at corresponding positions in the calibration images corresponding to different sensors are the same.
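One way to realize the alignment of step 1308 is to locate the center pixel from the positioning mark and crop a window of fixed size around it, so that pixels at corresponding positions share the same coordinates across sensors. The sketch below assumes the center has already been detected by some hypothetical positioning-mark detector:

```python
def align_to_center(image, center, half_h, half_w):
    """Crop a (2*half_h, 2*half_w) window around the detected center pixel so that
    corresponding positions have identical coordinates across sensors."""
    cy, cx = center        # center pixel found from the positioning mark (detector not shown)
    return image[cy - half_h:cy + half_h, cx - half_w:cx + half_w]
```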
Step 1310, taking the average value of the pixel values of the pixels at the same position in the first image corresponding to each sensor as the pixel value of the pixel at the corresponding position in the second image; and obtaining a second image according to the pixel values of the pixels at the positions in the second image.
Step 1312, taking the pixel value difference of the pixels at the same position in the first image and the second image as the pixel value of the pixel at the corresponding position in the third image; and obtaining a third image corresponding to the sensor according to the pixel value of the pixel at each position in the third image.
Step 1314, for a first target pixel in the first image, taking the pixel in the third image at the same position as the first target pixel as a second target pixel; taking the ratio of the second target pixel to the first target pixel as the photoresponse nonuniformity compensation value corresponding to the first target pixel position; and obtaining the light response nonuniformity compensation value of the sensor according to the light response nonuniformity compensation values corresponding to the first target pixel positions.
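Steps 1310 to 1314 then reduce to element-wise operations on the aligned first images. A minimal sketch, assuming `first_images` holds the aligned first image of every sensor in a single array:

```python
import numpy as np

def prnu_compensation(first_images):
    """first_images: array of shape (num_sensors, H, W) of aligned first images.
    Returns one PRNU compensation map per sensor, following steps 1310-1314."""
    second = np.mean(first_images, axis=0)     # step 1310: average over sensors at each position
    third = first_images - second              # step 1312: per-sensor difference (PRNU noise)
    return third / first_images                # step 1314: ratio of second target pixel to first target pixel
```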
Step 1402, the terminal obtains a sensor identifier and sensor calibration data, wherein the sensor calibration data includes a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier.
Step 1404, when the sensor identifier is the same as the calibration sensor identifier, for each pixel in the initial image acquired by the sensor whose brightness exceeds 400 LSB, taking the product of the light response nonuniformity compensation value corresponding to the calibration sensor identifier at that pixel position and the pixel value as the pixel value at the corresponding position in the target image.
Step 1406, when the sensor identifier is different from the calibration sensor identifier, acquiring default compensation data and correcting the initial image acquired by the sensor according to the default compensation data to obtain a target image.
Step 1408, the terminal sends a calibration data update request to the server according to the sensor identifier.
Step 1410, the terminal receives the optical response nonuniformity compensation value which is sent by the server according to the calibration data update request and is matched with the sensor identifier, and stores the optical response nonuniformity compensation value.
The optical response nonuniformity compensation value corresponding to the calibration sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to steps 1302 to 1314.
Here, steps 1302 to 1314 may be performed by the terminal or by the server. Optionally, if the optical response nonuniformity compensation values are computed by the terminal, the obtained compensation values can be exported and stored to the terminal corresponding to each sensor identifier and also uploaded to the server for storage; if the optical response nonuniformity compensation values are computed by the server, they are downloaded from the server and then exported and stored to the terminal corresponding to each sensor identifier.
Optionally, in the process of obtaining the target image, before the initial image is corrected according to the light response nonuniformity compensation value, the dead pixels in the image acquired by the sensor may be removed, that is, the image is first processed by the DPC module; the initial image is then corrected according to the light response nonuniformity compensation value, and finally the image is processed by the LSC (Lens Shading Correction) module to obtain the target image.
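The ordering described here (dead-pixel correction, then PRNU correction, then lens shading correction) can be expressed as a simple processing chain. The sketch below is illustrative only; the gain map `lsc_gain`, the 400 LSB threshold, and the DPC parameters are assumed values.

```python
import numpy as np
from scipy.ndimage import median_filter

def process(raw_image, prnu_compensation, lsc_gain, threshold=400):
    """Illustrative ordering of this embodiment: DPC -> PRNU correction -> LSC."""
    img = raw_image.astype(np.float32)
    med = median_filter(img, size=3)
    bad = np.abs(img - med) > 0.2 * np.maximum(med, 1.0)
    img[bad] = med[bad]                                  # DPC module: remove dead pixels first
    mask = img > threshold
    img[mask] = img[mask] * prnu_compensation[mask]      # PRNU correction with the compensation map
    img = img * lsc_gain                                 # LSC module: per-pixel lens shading gain (assumed)
    return img
```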
According to the sensor calibration and image correction methods, for the calibration images shot by the sensors under the preset conditions, a first image is obtained by processing the calibration images shot by the same sensor, which removes photon noise and read noise; the first images corresponding to the sensors are then processed to obtain a second image, which shields noise other than the PRNU to a great extent. The noise value caused by the PRNU is obtained from the first image and the second image corresponding to a sensor, and the optical response nonuniformity compensation value of the sensor can be accurately determined from the ratio of the PRNU noise value to the corresponding pixel in the first image. The terminal can obtain the optical response nonuniformity compensation value corresponding to its sensor identifier in time through local storage or transmission from the server, and correct the initial image acquired by the sensor with an accurate optical response nonuniformity compensation value, so that the image quality of the target image can be better improved.
It should be understood that, although the steps in the flowcharts related to the above embodiments are displayed in sequence as indicated by the arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts related to the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; the execution order of these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, the embodiment of the application also provides a sensor calibration device for realizing the sensor calibration method. The implementation scheme for solving the problem provided by the device is similar to the implementation scheme recorded in the method, so the specific limitations in one or more embodiments of the sensor calibration device provided below can refer to the limitations on the sensor calibration method in the above, and are not described herein again.
In one embodiment, as shown in fig. 15, there is provided a sensor calibration apparatus including: a calibration image acquisition module 1502, a first data processing module 1504, a second data processing module 1506, and a compensation data determination module 1508, wherein:
a calibration image obtaining module 1502, configured to obtain calibration images captured by each sensor;
the first data processing module 1504 is used for obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
the second data processing module 1506 is configured to obtain a second image according to the first image corresponding to each sensor;
a compensation data determining module 1508, configured to determine, for each sensor, a light response non-uniformity compensation value for the sensor from the first image and the second image corresponding to the sensor.
In one embodiment, the first data processing module 1504 is further configured to: taking the average value of the pixel values of the pixels at the same position in the calibration image shot by the same sensor as the pixel value of the pixel at the corresponding position in the first image; and obtaining the first image according to the pixel value of the pixel at each position in the first image.
In one embodiment, the second data processing module 1506 is further configured to: taking the average value of the pixel values of the pixels at the same position in the first image corresponding to each sensor as the pixel value of the pixel at the corresponding position in the second image; and obtaining the second image according to the pixel value of the pixel at each position in the second image.
In one embodiment, the compensation data determination module 1508 is further configured to: for each sensor, obtaining a third image corresponding to the sensor according to the first image and the second image corresponding to the sensor; determining a photoresponse non-uniformity compensation value for the sensor from the first image and the third image.
In one embodiment, the compensation data determination module 1508 is further configured to: taking the pixel value difference value of the pixel at the same position in the first image and the second image as the pixel value of the pixel at the corresponding position in the third image; and obtaining a third image corresponding to the sensor according to the pixel value of the pixel at each position in the third image.
In one embodiment, the compensation data determination module 1508 is further configured to: regarding a first target pixel in the first image, taking a pixel in the third image at the same position as the first target pixel as a second target pixel; determining a photoresponse nonuniformity compensation value corresponding to the first target pixel position according to the first target pixel and the second target pixel; and determining the light response nonuniformity compensation value of the sensor according to the light response nonuniformity compensation value corresponding to each first target pixel position.
In one embodiment, the compensation data determination module 1508 is further configured to: and taking the ratio of the second target pixel to the first target pixel as the light response nonuniformity compensation value corresponding to the first target pixel position.
Based on the same inventive concept, the embodiment of the present application further provides an image correction apparatus for implementing the image correction method. The implementation scheme for solving the problem provided by the apparatus is similar to the implementation scheme described in the above method, so for the specific limitations in one or more embodiments of the image correction apparatus provided below, reference can be made to the limitations on the image correction method above, and details are not described herein again.
In one embodiment, as shown in fig. 16, there is provided an image correction apparatus including: a calibration data obtaining module 1602, a first correcting module 1604, a second correcting module 1606, an update request sending module 1608, and an update data receiving module 1610, wherein:
a calibration data obtaining module 1602, configured to obtain a sensor identifier and sensor calibration data, where the sensor calibration data includes a calibrated sensor identifier and a light response nonuniformity compensation value corresponding to the calibrated sensor identifier;
a first correction module 1604, configured to correct an initial image obtained by the sensor according to a compensation value of nonuniformity of optical response corresponding to the calibrated sensor identifier when the sensor identifier is the same as the calibrated sensor identifier, so as to obtain a target image;
a second correcting module 1606, configured to, when the sensor identifier is different from the calibrated sensor identifier, obtain default compensation data and correct the initial image obtained by the sensor according to the default compensation data to obtain a target image;
an update request sending module 1608, configured to send a calibration data update request to the server according to the sensor identifier;
an update data receiving module 1610, configured to receive an optical response nonuniformity compensation value that is sent by the server according to the calibration data update request and matches the sensor identifier, and store the optical response nonuniformity compensation value; and the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method.
In one embodiment, the first calibration module 1604 is further configured to: acquiring a first pixel which is larger than a preset brightness threshold value in the initial image; correcting the first pixel according to a photoresponse nonuniformity compensation value corresponding to the first pixel position in the photoresponse nonuniformity compensation values to obtain a corrected first pixel; obtaining the target image according to the second pixel and the corrected first pixel; the second pixel is a pixel other than the first pixel in the initial image.
All or part of the modules in the sensor calibration device or the image correction device can be implemented by software, hardware, or a combination thereof. The modules can be embedded in, or independent of, a processor in the computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and perform the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 17. The computer apparatus includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The input/output interface of the computer device is used for exchanging information between the processor and an external device. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a sensor calibration method or an image correction method. The display unit of the computer device is used for forming a visual picture and can be a display screen, a projection device or a virtual reality imaging device. The display screen can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structure shown in fig. 17 is merely a block diagram of part of the structure related to the solution of the present application and does not limit the computer device to which the solution of the present application is applied; a particular computer device may include more or fewer components than shown in the figure, combine certain components, or have a different arrangement of components.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of a sensor calibration method or an image correction method.
Embodiments of the present application also provide a computer program product containing instructions that, when run on a computer, cause the computer to perform a sensor calibration method or an image correction method.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data need to comply with the relevant laws and regulations and standards of the relevant country and region.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, resistive random access memory (ReRAM), magnetoresistive random access memory (MRAM), ferroelectric random access memory (FRAM), phase change memory (PCM), graphene memory, and the like. Volatile memory may include random access memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as static random access memory (SRAM) or dynamic random access memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of relational and non-relational databases; non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be general-purpose processors, central processing units, graphics processing units, digital signal processors, programmable logic devices, data processing logic devices based on quantum computing, and the like, without limitation.
The technical features of the above embodiments can be combined arbitrarily. For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, it should be considered to fall within the scope of this specification.
The above embodiments only express several implementations of the present application, and their descriptions are relatively specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that, for a person of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (13)

1. A sensor calibration method, comprising:
acquiring a calibration image shot by each sensor;
obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
obtaining a second image according to the first image corresponding to each sensor;
for each sensor, determining a light response non-uniformity compensation value for the sensor from the sensor's corresponding first image and the second image.
2. The method according to claim 1, wherein obtaining the first image corresponding to the sensor according to the calibration image captured by the same sensor comprises:
taking the average value of the pixel values of the pixels at the same position in the calibration image shot by the same sensor as the pixel value of the pixel at the corresponding position in the first image;
and obtaining the first image according to the pixel value of the pixel at each position in the first image.
3. The method of claim 1, wherein obtaining the second image from the first image corresponding to each sensor comprises:
taking the average value of the pixel values of the pixels at the same position in the first image corresponding to each sensor as the pixel value of the pixel at the corresponding position in the second image;
and obtaining the second image according to the pixel value of the pixel at each position in the second image.
4. The method of claim 1, wherein determining, for each sensor, a light response non-uniformity compensation value for the sensor from the first image and the second image for the sensor comprises:
for each sensor, obtaining a third image corresponding to the sensor according to the first image and the second image corresponding to the sensor;
determining a photoresponse non-uniformity compensation value for the sensor from the first image and the third image.
5. The method of claim 4, wherein obtaining a third image corresponding to the sensor from the first image and the second image corresponding to the sensor comprises:
taking the pixel value difference value of the pixel at the same position in the first image and the second image as the pixel value of the pixel at the corresponding position in the third image;
and obtaining a third image corresponding to the sensor according to the pixel value of the pixel at each position in the third image.
6. The method of claim 4, wherein determining the light response non-uniformity compensation value for the sensor from the first image and the third image comprises:
regarding a first target pixel in the first image, taking a pixel in the third image at the same position as the first target pixel as a second target pixel;
determining a photoresponse nonuniformity compensation value corresponding to the first target pixel position according to the first target pixel and the second target pixel;
and determining the light response nonuniformity compensation value of the sensor according to the light response nonuniformity compensation value corresponding to each first target pixel position.
7. The method of claim 6, wherein determining the photoresponse nonuniformity compensation value corresponding to the first target pixel position from the first target pixel and the second target pixel comprises:
and taking the ratio of the second target pixel to the first target pixel as the light response nonuniformity compensation value corresponding to the first target pixel position.
8. An image correction method, comprising:
acquiring a sensor identifier and sensor calibration data, wherein the sensor calibration data comprises a calibration sensor identifier and a light response nonuniformity compensation value corresponding to the calibration sensor identifier;
when the sensor identification is the same as the calibration sensor identification, correcting an initial image acquired by the sensor according to the optical response nonuniformity compensation value corresponding to the calibration sensor identification to obtain a target image;
when the sensor identification is different from the calibration sensor identification, acquiring default compensation data and correcting an initial image acquired by the sensor according to the default compensation data to obtain a target image;
sending a calibration data updating request to a server according to the sensor identifier;
receiving an optical response nonuniformity compensation value which is sent by the server according to the calibration data updating request and is matched with the sensor identifier, and storing the optical response nonuniformity compensation value; the optical response nonuniformity compensation value corresponding to the calibrated sensor identifier and the optical response nonuniformity compensation value sent by the server are obtained according to the sensor calibration method of any one of claims 1 to 7.
9. The method of claim 8, wherein the correcting the initial image obtained by the sensor according to the compensation value of the nonuniformity of optical response corresponding to the calibrated sensor identifier to obtain a target image comprises:
acquiring a first pixel which is larger than a preset brightness threshold value in the initial image;
correcting the first pixel according to the light response nonuniformity compensation value corresponding to the first pixel position in the light response nonuniformity compensation values to obtain a corrected first pixel;
obtaining the target image according to the second pixel and the corrected first pixel; the second pixel is a pixel other than the first pixel in the initial image.
10. A sensor calibration device, comprising:
the calibration image acquisition module is used for acquiring calibration images shot by each sensor;
the first data processing module is used for obtaining a first image corresponding to the sensor according to a calibration image shot by the same sensor;
the second data processing module is used for obtaining a second image according to the first image corresponding to each sensor;
and the compensation data determining module is used for determining a light response nonuniformity compensation value of each sensor according to the first image and the second image corresponding to the sensor.
11. An electronic device comprising a memory and a processor, the memory having stored thereon a computer program, wherein the computer program, when executed by the processor, causes the processor to carry out the steps of the method according to any of the claims 1 to 9.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 9.
13. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 9.
CN202210879467.2A 2022-07-25 2022-07-25 Sensor calibration and image correction method and device, electronic equipment and storage medium Pending CN115170676A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210879467.2A CN115170676A (en) 2022-07-25 2022-07-25 Sensor calibration and image correction method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210879467.2A CN115170676A (en) 2022-07-25 2022-07-25 Sensor calibration and image correction method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115170676A true CN115170676A (en) 2022-10-11

Family

ID=83497582

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210879467.2A Pending CN115170676A (en) 2022-07-25 2022-07-25 Sensor calibration and image correction method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115170676A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115760653A (en) * 2023-01-09 2023-03-07 武汉中导光电设备有限公司 Image correction method, device, equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination