CN114494209A - White point detection method, automatic white balance method, calibration method, medium, and apparatus

Info

Publication number
CN114494209A
Authority
CN
China
Prior art keywords
target
color
white point
color space
image
Prior art date
Legal status
Pending
Application number
CN202210105006.XA
Other languages
Chinese (zh)
Inventor
池晓芳
杨培杉
Current Assignee
Rockchip Electronics Co Ltd
Original Assignee
Rockchip Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Rockchip Electronics Co Ltd
Priority to CN202210105006.XA
Publication of CN114494209A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/90 Determination of colour characteristics
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Facsimile Image Signal Circuits (AREA)

Abstract

The invention provides a white point detection method, an automatic white balance method, a calibration method, a medium and equipment. The white point detection method comprises the following steps: acquiring a target image, wherein the target image corresponds to a target light source; performing color transformation on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by first color values and second color values, the first color values being indicative of a color temperature, the second color values being indicative of a color rendering index; and determining a pixel point in a white point area in the target color space image as a white point in the target image. The white point detection result obtained by the white point detection method has higher accuracy.

Description

White point detection method, automatic white balance method, calibration method, medium, and apparatus
Technical Field
The present invention relates to an image processing method, and more particularly, to a white point detection method, an automatic white balance method, a calibration method, a medium, and an apparatus.
Background
When one observes the natural world with the eye, the perception of the same color under different light is essentially the same; this ability to eliminate or mitigate the influence of the light source and to "see" the color of the actual object surface is called color constancy. Cameras, however, do not have this ability by themselves: a target object of one and the same actual color is rendered with different imaging colors under different lighting. This is because the image sensor of a camera simply records all the light projected onto it and cannot by itself distinguish whether that light results from the color reflected by the object or from color-shifting ambient light. With automatic white balance technology, the camera can automatically estimate the ambient colored light and remove the influence of the scene illumination color, so that the camera achieves color constancy similar to that of the human visual system.
Disclosure of Invention
The invention aims to provide a white point detection method, an automatic white balance method, a calibration method, a medium and equipment, which are used for solving the problem of low white point detection precision in the prior art.
A first aspect of the present invention provides a white point detection method, comprising: acquiring a target image, wherein the target image corresponds to a target light source; performing color transformation on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by first color values and second color values, the first color values being indicative of a color temperature, the second color values being indicative of a color rendering index; and determining a pixel point in a white point area in the target color space image as a white point in the target image.
In an embodiment of the first aspect, the white point detecting method further includes: determining transformation parameters associated with a device used to acquire the target image; wherein color transforming the target image comprises: and performing the color transformation according to the transformation parameters.
In an embodiment of the first aspect, determining the transformation parameter comprises: acquiring images of a standard color card shot by the equipment under the irradiation of a plurality of reference light sources as a plurality of first reference images; performing the color transformation according to a set of predetermined parameters to transform the plurality of first reference images into a plurality of target color space reference images; determining, for the set of predetermined parameters, a degree of dispersion of the second color values for pixels in the plurality of target color space reference images corresponding to reference white points in the plurality of first reference images; and determining the transformation parameter according to a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters.
In an embodiment of the first aspect, determining a degree of dispersion of the second color values of pixels in the plurality of target color space reference images corresponding to the reference white points in the plurality of first reference images comprises: determining a degree of deviation of the second color values from a reference line in the target color space.
In an embodiment of the first aspect, determining the transformation parameter according to the target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters comprises: determining whether the degree of dispersion is less than a predetermined value; and if the degree of dispersion is less than the predetermined value, determining the transformation parameter according to the target predetermined parameter corresponding to the degree of dispersion.
In an embodiment of the first aspect, the color transforming the target image includes: performing a first transformation on the target image to transform the target image to an intermediate color space to generate an intermediate color space image, the intermediate color space image being independent of a brightness of the target image; and performing a second transformation on the intermediate color space image to generate the target color space image such that the target color space is rotated by a set angle with respect to the intermediate color space.
In an embodiment of the first aspect, the target color space includes a first axis and a second axis intersecting the first axis, the first axis represents the color temperature, and the second axis represents the color rendering index.
In an embodiment of the first aspect, performing the first transformation on the target image comprises performing the first transformation by equations 1 and 2 below, and performing the second transformation on the intermediate color space image comprises performing the second transformation by equation 3 below (the formula images of equations 1 to 3 are not reproduced here); wherein ω1, ω2, ω3 and θ are the transformation parameters, x and y represent color values of pixels of the intermediate color space image in the intermediate color space, X and Y represent color values of pixels of the target color space image in the target color space, R, G and B represent color values of the R, G and B channels of pixels of the target image, x0 and y0 represent the color value of a white point in the intermediate color space under illumination by a specific light source and serve as the origin of the rotation, and θ represents the set angle.
In an embodiment of the first aspect, the white point detection method further includes: determining the white point region, wherein determining the white point region comprises: acquiring an image of a standard color chart shot under the irradiation of the target light source as a second reference image; performing the color transformation on the reference white point in the second reference image to transform the reference white point to the target color space to generate a target color space reference white point; and determining the white point area according to the distribution of the target color space reference white points.
In an embodiment of the first aspect, determining the white point region according to the distribution of the target color space reference white points includes: determining the white point region having a rectangular shape in the target color space.
A second aspect of the present invention provides another white point detection method, comprising: acquiring a target image, wherein the target image corresponds to a target light source; performing a first color transformation on the target image to transform the target image to a target color space to generate a target color space image, the target color space image being represented by a first color value and a second color value, the first color value being indicative of a color temperature, the second color value being indicative of a color rendering index; determining a pixel point in a first white point area in the target color space image as a first candidate white point; performing a second color transformation on the target image to transform the target image to a UV color space to generate a UV color space image; determining a pixel point in a second white point area in the UV color space image as a second candidate white point; and determining a white point in the target image according to the first candidate white point and the second candidate white point.
A third aspect of the present invention provides an automatic white balancing method including: acquiring a white point in the target image by adopting the white point detection method according to any one of the first aspect and the second aspect of the invention; and determining a white balance gain corresponding to the target light source according to the color value of the white point in the target image.
A fourth aspect of the present invention provides a transformation parameter calibration method, including: determining the transformation parameters by the white point detection method according to any one of the second to fifth embodiments of the first aspect of the invention; and determining the transformation parameters as calibration parameters corresponding to the color space of the device.
A fifth aspect of the invention provides a computer readable storage medium having stored thereon a computer program for execution by a processor to implement the white point detection method according to any one of the first or second aspects of the invention, the automatic white balancing method according to the third aspect of the invention or the transformation parameter calibration method according to the fourth aspect of the invention.
A sixth aspect of the present invention provides an electronic apparatus, comprising: a memory configured to store a computer program; and a processor, communicatively connected to the memory, configured to invoke the computer program to perform the white point detection method according to any one of the first or second aspects of the invention, the automatic white balance method according to the third aspect of the invention, or the transformation parameter calibration method according to the fourth aspect of the invention.
As described above, the white point detection method described in one or more embodiments of the present invention has the following advantageous effects:
compared with the prior art, the invention provides a different white point detection method, and the white point detection method is characterized in that a target image is converted into a target color space, and pixel points in a white point area in the target image after color conversion are obtained and used as white points in the target image. In this way, the white point detection result has higher accuracy, and the color temperature estimation accuracy is improved.
In addition, the colors in the target color space are represented by a first color value and a second color value, and the first color value and the second color value are independent of brightness, so that the white point detection method can counteract the influence of brightness change on white point distribution, and is favorable for further improving the accuracy of white point detection.
Drawings
Fig. 1 is a flowchart illustrating a white point detection method according to an embodiment of the present invention.
FIG. 2A is a detailed flowchart illustrating the determination of transformation parameters according to the white point detection method of the present invention in one embodiment.
Fig. 2B is an exemplary diagram of a standard color chart of the white point detection method according to an embodiment of the invention.
Fig. 2C is an exemplary diagram of the first reference images in an embodiment of the white point detection method according to the invention.
Fig. 2D is a detailed flowchart of step S24 of the white point detection method according to an embodiment of the invention.
FIG. 2E is a diagram illustrating an example of a result of color transformation in an embodiment of the white point detection method of the present invention.
FIG. 3 is a flowchart illustrating color transformation of the white point detection method according to an embodiment of the present invention.
Fig. 4A is a flowchart illustrating a specific process of acquiring a white point region according to the white point detection method of the present invention in an embodiment.
Fig. 4B is a diagram illustrating an example of a white point area obtained by the white point detection method according to an embodiment of the present invention.
Fig. 5 is a flowchart illustrating a white point detection method according to an embodiment of the invention.
Fig. 6 is a flowchart illustrating an automatic white balance method according to an embodiment of the present invention.
Fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the drawings only show the components related to the present invention rather than being drawn according to the number, shape and size of the components in actual implementation, and the type, number and proportion of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated. Moreover, in this document, relational terms such as "first," "second," and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
In some solutions, the RGB components of the image are usually used to estimate the color temperature and thus achieve automatic white balance. However, it is difficult to estimate the color temperature accurately from the RGB components alone. For example, the RGB response of a wood-colored surface under candlelight may be close to that of a white surface under a cool daylight lamp. The two color temperatures can therefore hardly be told apart from the RGB components of the image, and the performance of automatic white balance degrades when the color temperature estimate is wrong.
According to the embodiment of the disclosure, a novel white point detection method is provided, and a pixel point in a white point area in a target image after transformation is determined as a white point in the target image by transforming the target image into a target color space. In this way, the technical solution according to the embodiments of the present disclosure does not rely on RGB components of the target image for white point detection, so the accuracy of the obtained white point detection result is higher, which is beneficial to improving the accuracy of color temperature estimation.
Hereinafter, specific embodiments of the present disclosure will be described by way of exemplary embodiments with reference to the accompanying drawings.
Fig. 1 is a flow chart illustrating a white point detection method according to an embodiment of the present disclosure. Each step in this flowchart may be implemented by one or more specific modules. In some embodiments, these modules may be modules in a chip such as a GPU, DSP, etc. As shown in fig. 1, the white point detection method includes the following steps S11 to S13.
In step S11, a target image is acquired, the target image corresponding to a target light source. The target image may be an image of an object taken under illumination by the target light source. Devices for capturing images include, but are not limited to, digital cameras.
In step S12, the target image is color-transformed to transform the target image into a target color space to generate a target color space image, the target color space image being represented by a first color value and a second color value, the first color value being used for representing a color temperature, and the second color value being used for representing a color rendering index.
When the target color space is described in the form of a coordinate system, with the first color value corresponding to the abscissa and the second color value corresponding to the ordinate, the target color space has the following two properties: along the X direction (i.e., the horizontal direction) only the color temperature changes, from low on the left to high on the right; along the Y direction (i.e., the vertical direction) only the position on the iso-color-temperature line changes, that is, colors at different Y positions have different color rendering indexes. In this case, the target color space may be obtained by applying a warping transformation to the CIE 1931 chromaticity coordinate plane.
In step S13, a pixel point in the white point region in the target color space image is determined as a white point in the target image. The white point area may be preset according to actual requirements or experience, or may be obtained by calibrating with a corresponding calibration method, which is not limited in the present invention. The determined white point region may be used for various adjustments (e.g., adjustment of white balance) to the target image or the device that captured the image.
As is apparent from the above description, the present embodiment proposes a white point detection method different from the related art, which converts a target image into a target color space and acquires pixel points located in a white point region in the color-converted target image as white points in the target image. In this way, the white point detection result has higher accuracy, and the color temperature estimation accuracy is improved.
In addition, the colors in the target color space are represented by a first color value and a second color value, and the first color value and the second color value are independent of brightness, so that the white point detection method can counteract the influence of brightness change on white point distribution, and is favorable for further improving the accuracy of white point detection.
Practical test results show that in the embodiment, after the target image is converted into the target color space, all white points are distributed in a regular rectangular area, so that the white point judgment can be conveniently and quickly realized, and the white point judgment logic is greatly simplified.
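As a concrete illustration of step S13 only, the short Python sketch below selects the pixels of a target color space image that fall inside a rectangular white point area; the array names and the example bounds are hypothetical and not taken from the patent.

```python
import numpy as np

def white_point_mask(x_img, y_img, region):
    """Boolean mask of pixels inside a rectangular white point area.

    x_img, y_img : 2-D arrays of the first (color temperature) and second
                   (color rendering index) color values of each pixel in the
                   target color space.
    region       : (x_min, x_max, y_min, y_max) bounds of the white point area,
                   e.g. obtained by calibration.
    """
    x_min, x_max, y_min, y_max = region
    return (x_img >= x_min) & (x_img <= x_max) & (y_img >= y_min) & (y_img <= y_max)

# Pixels flagged True are treated as the white points of the target image, e.g.:
# mask = white_point_mask(x_img, y_img, region=(0.10, 0.25, -0.02, 0.02))
```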
In an embodiment of the present disclosure, the white point detection method further includes: determining transformation parameters associated with a device used to acquire the target image; wherein color transforming the target image comprises: and performing the color transformation according to the transformation parameters.
Fig. 2A is a flow chart illustrating obtaining the transformation parameters in an embodiment according to the present disclosure. As shown in fig. 2A, acquiring the transformation parameters includes the following steps S21 to S23.
In step S21, images of a standard color chart captured by the apparatus under illumination of a plurality of reference light sources are acquired as a plurality of first reference images, the standard color chart including at least white color patches.
Fig. 2B is an example diagram illustrating a standard color chart according to an embodiment of the present disclosure. For example, FIG. 2B shows an X-Rite 24 standard color chart. It should be noted that, in practice, the X-Rite 24 standard color chart includes 24 color patches of different colors, where the pixel points in color patches 19 to 22 can be regarded as white points and the pixel points in the other color patches can be regarded as non-white points. Fig. 2C is an example diagram illustrating the first reference images according to an embodiment of the present disclosure. Fig. 2C shows images of the standard color chart taken under different reference light sources (including the A, CWF, D50, D65, D75, HZ and TL84 light sources), i.e., the first reference images corresponding to the different reference light sources.
Preferably, each of the reference light sources is located on or near the blackbody locus. For any reference light source, being located near the blackbody locus means that the distance between the chromaticity of that reference light source and the blackbody locus is smaller than a preset distance value. It should be noted that having each reference light source on or near the blackbody locus is only a preferred mode of this embodiment; the embodiment is not limited thereto, and a reference light source far from the blackbody locus may also be selected in practical applications. For example, the CWF light source in fig. 2C is a light source away from the blackbody locus, while the remaining light sources are located on or near it.
Preferably, after acquiring each of the first reference images, the present embodiment further includes a step of performing demosaicing processing on each of the first reference images.
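The patent does not specify how the demosaicing is performed. As one possible illustration only, assuming the first reference images are captured as single-channel Bayer raw frames with a BGGR pattern, OpenCV's built-in conversion could be used:

```python
import cv2  # OpenCV

def demosaic(raw_bayer):
    """Demosaic a single-channel Bayer frame (assumed BGGR pattern) into a
    3-channel image; OpenCV returns the channels in BGR order."""
    return cv2.cvtColor(raw_bayer, cv2.COLOR_BayerBG2BGR)
```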
In step S22, the color transformation is performed according to a set of predetermined parameters to transform the plurality of first reference images into a plurality of target color space reference images.
In step S23, for the set of predetermined parameters, a degree of dispersion of the second color values of pixel points in the plurality of target color space reference images corresponding to the reference white points in the plurality of first reference images is determined.
The reference white point refers to a white point in the first reference image, corresponding to a white color patch in the standard color card. Specifically, the second color value of the pixel point corresponding to each reference white point can be obtained by performing the color transformation on the reference white point in each first reference image using the set of predetermined parameters. Therefore, the second color value Y of the pixel point corresponding to each reference white point can be regarded as a function of the predetermined parameters, i.e., Y = f(Q), where Q is a set of predetermined parameters and f is the correspondence between Q and Y.
In step S24, the transformation parameter is determined according to a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters.
As mentioned above, the second color value Y of the pixel point corresponding to each reference white point can be regarded as a function of the predetermined parameters. Therefore, assuming that the number of reference white points is N, for any group of predetermined parameters Q the second color values of the N corresponding pixel points can be obtained, and from these the degree of dispersion of the second color values can be computed; the degree of dispersion can be represented by, for example, the variance or the standard deviation. There is thus also a correspondence between the degree of dispersion of the second color values of the pixel points corresponding to the reference white points and the predetermined parameters, and the transformation parameter can accordingly be determined from a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters. For example, the predetermined parameters may be solved for with minimization of the degree of dispersion as the objective, so as to obtain a set of target predetermined parameters for which the degree of dispersion is sufficiently small, and this set is used as the transformation parameters. However, the present invention is not limited thereto, and other methods may be used to obtain the transformation parameters in practical applications.
As can be seen from the above description, the present embodiment can obtain a set of transformation parameters such that the degree of dispersion of the second color values of the pixels corresponding to the reference white points in the plurality of first reference images is as small as possible. After any image is subjected to color transformation according to the transformation parameters, the distribution of white points irradiated by light sources with different color temperatures and similar color rendering indexes in the target color space is approximately on a horizontal line, and the distribution of white points irradiated by light sources with similar color temperatures and different color rendering indexes in the target color space is approximately on a vertical line.
Optionally, determining a degree of dispersion of the second color values of pixel points in the plurality of target color space reference images corresponding to the reference white points in the plurality of first reference images comprises: determining a degree of deviation of the second color values from a reference line in the target color space, the degree of deviation being representable by, for example, variance, standard deviation, dispersion coefficient, or the like.
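As an illustrative sketch (not taken from the patent; the variable names and the reference line value are hypothetical), the degree of dispersion, or the deviation from a fixed reference line Y = y_ref, of the second color values could be computed as follows:

```python
import numpy as np

def dispersion(second_values):
    """Variance and standard deviation of the second color values of the
    reference white points."""
    v = np.asarray(second_values, dtype=float)
    return v.var(), v.std()

def deviation_from_line(second_values, y_ref=0.0):
    """Mean squared deviation of the second color values from the reference
    line Y = y_ref in the target color space."""
    v = np.asarray(second_values, dtype=float)
    return float(np.mean((v - y_ref) ** 2))
```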
Optionally, fig. 2D is a flowchart illustrating determining the transformation parameter according to a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters according to an embodiment of the present disclosure. As shown in fig. 2D, determining the transformation parameter according to the target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters includes the following steps S241 and S242.
In step S241, it is determined whether the degree of dispersion is less than a predetermined value. The predetermined value may be set according to actual requirements or experience.
In step S242, if the degree of dispersion is smaller than the predetermined value, the transformation parameter is determined according to the target predetermined parameter corresponding to the degree of dispersion.
Fig. 2E is a diagram illustrating an example of a result of color transformation according to an embodiment of the present disclosure. Specifically, fig. 2E shows the result of color transformation of the white points in each of the first reference images shown in fig. 2C based on the set of transformation parameters acquired in this embodiment. It can be seen from this figure that the color temperature in the target color space gradually increases from left to right, and that the white points illuminated by the HZ, A, TL84, D65 and D75 light sources located on the blackbody locus lie substantially on a straight line parallel to the X axis. In addition, for white points illuminated by the TL84 and CWF light sources, which have similar color temperatures but different color rendering indexes, the X coordinates are close while the Y coordinates differ greatly. Therefore, after the target image is color-transformed based on the transformation parameters acquired in this embodiment, the colors in the transformed image can all be represented by the first color value and the second color value, the first color value being related only to the color temperature and the second color value only to the color rendering index.
It should be noted that the above-mentioned manner of obtaining the transformation parameters is only an optional manner, but the present invention is not limited to this, and the transformation parameters may be set according to actual needs or experience in practical applications.
Fig. 3 is a flowchart illustrating a method of color transforming the target image according to an embodiment of the present disclosure. As shown in fig. 3, the method includes the following steps S31 to S32.
In step S31, a first transformation is performed on the target image to transform the target image to an intermediate color space to generate an intermediate color space image, the intermediate color space image being independent of the brightness of the target image.
In step S32, a second transformation is performed on the intermediate color space image to generate the target color space image such that the target color space is rotated by a set angle with respect to the intermediate color space.
In this embodiment, the first transformation and the second transformation may be implemented by using corresponding transformation parameters. The transformation parameters may be set according to actual requirements or experience, or may be obtained by using the method shown in fig. 2A.
Optionally, the target color space comprises a first axis representing the color temperature and a second axis intersecting the first axis representing the color rendering index.
In an embodiment of the present disclosure, the RGB image (including the target image, the first reference image, and/or the second reference image) is first transformed by equations (1) and (2) below, and the intermediate color space image is second transformed by equation (3) below:
Equation (1): [formula image not reproduced]

Equation (2): [formula image not reproduced]

Equation (3): [formula image not reproduced]

wherein ω1, ω2, ω3 and θ are the transformation parameters; x and y represent color values of pixels of the intermediate color space image in the intermediate color space; X and Y represent color values of pixels of the target color space image in the target color space; R, G and B represent color values of the R, G and B channels of pixels of the target image; x0 and y0 represent the color value of a white point in the intermediate color space under illumination by a specific light source and serve as the origin of the rotation; θ represents the set angle; and the specific light source is, for example, a D65 light source.
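Since the formula images are not reproduced here, the following LaTeX block is only a plausible reconstruction of a brightness-independent chromaticity mapping followed by a rotation about (x0, y0); it is consistent with the surrounding description, but the exact expressions in the patent may differ.

```latex
% Assumed form only; the original formula images are not available in this text.
% First transformation: brightness-independent chromaticity (roles of Eqs. (1)-(2)).
x = \frac{\omega_1 R}{\omega_1 R + \omega_2 G + \omega_3 B}, \qquad
y = \frac{\omega_2 G}{\omega_1 R + \omega_2 G + \omega_3 B}
% Second transformation: rotation by the set angle \theta about (x_0, y_0) (role of Eq. (3)).
\begin{pmatrix} X \\ Y \end{pmatrix}
= \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix}
  \begin{pmatrix} x - x_0 \\ y - y_0 \end{pmatrix}
```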
Alternatively, each of the transformation parameters in this embodiment may be obtained by the method shown in fig. 2A. At this time, if the variance varY in the Y direction is adopted to represent the dispersion degree of the second color value of the pixel point corresponding to the reference white point in the plurality of first reference images, the transformation parameter may be obtained by solving the following equation (4):
[ω1, ω2, ω3, θ] = argmin(varY), formula (4).
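A minimal numerical sketch of this calibration step is given below. It is illustrative only: it assumes the chromaticity-plus-rotation form sketched above (which is itself an assumption) and uses SciPy's Nelder-Mead minimizer, neither of which is prescribed by the patent; all function and variable names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def to_target_space(rgb, w1, w2, w3, theta, x0, y0):
    """Map Nx3 RGB samples to (X, Y) in the target color space using the
    assumed chromaticity + rotation form (not the patent's exact formulas)."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    s = w1 * r + w2 * g + w3 * b
    x, y = w1 * r / s, w2 * g / s
    X = (x - x0) * np.cos(theta) + (y - y0) * np.sin(theta)
    Y = -(x - x0) * np.sin(theta) + (y - y0) * np.cos(theta)
    return X, Y

def calibrate(ref_white_rgb, white_x0, white_y0):
    """Search w1, w2, w3, theta minimizing varY over the reference white points
    collected under all reference light sources (cf. equation (4))."""
    def var_y(params):
        w1, w2, w3, theta = params
        _, Y = to_target_space(ref_white_rgb, w1, w2, w3, theta, white_x0, white_y0)
        return Y.var()
    result = minimize(var_y, x0=np.array([1.0, 1.0, 1.0, 0.0]), method="Nelder-Mead")
    return result.x  # fitted [w1, w2, w3, theta]
```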
In an embodiment of the present disclosure, the white point detection method further includes: determining the white point region. Fig. 4A is a flow chart illustrating a method of determining the white point region in an embodiment in accordance with the present disclosure. As shown in fig. 4A, the method includes the following steps S41 to S43.
S41, acquiring an image of the standard color card shot under the irradiation of the target light source as a second reference image, wherein the capture device of the second reference image is the same as that of the target image.
S42, performing the color transformation on the reference white point in the second reference image to transform the reference white point to the target color space to generate a target color space reference white point. In this embodiment, the reference white point is a white point in the second reference image, and corresponds to a white color block in the standard color card.
And S43, determining the white point area according to the distribution of the target color space reference white point. For example, an area within a minimum bounding polygon of each reference white point may be obtained as the white point area, but the invention is not limited thereto.
Preferably, determining the white point region according to the distribution of the target color space reference white points comprises: the white point area with a rectangular shape is determined in the target color space, which is easy to implement and simpler in hardware structure.
Preferably, the range of the white point area can be adaptively adjusted according to the number of the reference white points and the ambient brightness, for example, when the number of the reference white points is large or the ambient brightness is high, the range of the white point area can be appropriately expanded; when the number of the reference white points is small or the ambient brightness is low, the range of the white point region can be appropriately narrowed.
Fig. 4B is an exemplary diagram illustrating white point areas according to an embodiment of the present disclosure. Specifically, when the target image is color-transformed using the above-described formula (1), the distribution of the pixel points of the different colors of the standard color card in the target color space is as shown in fig. 4B, where the numbers correspond to the numbers of the color patches and the pixel points in color patches 19 to 22 are white points. A rectangle is used to frame the white points under illumination by each light source, giving the white point area corresponding to that light source. For example, when the target light source is the HZ light source, the corresponding white point area is the area in the leftmost rectangular frame; when the target light source is the D75 light source, the corresponding white point area is the area in the rightmost rectangular frame.
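A sketch of step S43 under the same assumptions is given below (illustrative only; the optional margin parameter, used to widen or narrow the area as described above, is hypothetical): the rectangular white point area is taken as the axis-aligned bounding box of the transformed reference white points.

```python
import numpy as np

def white_point_region(ref_X, ref_Y, margin=0.0):
    """Axis-aligned rectangular white point area enclosing the target color
    space reference white points; margin > 0 widens it, margin < 0 narrows it."""
    ref_X, ref_Y = np.asarray(ref_X), np.asarray(ref_Y)
    return (ref_X.min() - margin, ref_X.max() + margin,
            ref_Y.min() - margin, ref_Y.max() + margin)
```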
Fig. 5 is a flow chart illustrating a white point detection method according to an embodiment of the present disclosure. As shown in fig. 5, the method includes the following steps S51 through S53.
In step S51, a target image is acquired, the target image corresponding to a target light source.
In step S52, the target image is transformed from the device color space to a target color space, i.e., the XY domain, to generate a target color space image. The target color space has the following two properties: along the X direction (horizontal direction) only the correlated color temperature changes, from low color temperature on the left to high color temperature on the right; along the Y direction (vertical direction) only the position on the iso-color-temperature line changes, i.e., the color rendering index differs.
In this embodiment, the target color space may be obtained by performing distortion transformation on a CIE1931 chromaticity coordinate plane. Based on the two properties of the target color space, the white point areas obtained after the target image is converted from the device color space to the target color space are regular rectangles, so that the white point judgment logic can be greatly simplified. In addition, since the device color spaces corresponding to different photographing devices are different, the conversion parameters used in the color conversion process are related to the photographing devices.
In step S53, a pixel point in the white point region in the target color space image is determined as a white point in the target image.
Optionally, step S52 in this embodiment may implement the color transformation by means of the transformation parameters ω1, ω2, ω3 and θ, where ω1, ω2 and ω3 are each preferably 1. The method for obtaining the transformation parameters includes the following steps.
First, the same shooting device as that used for the target image is used to shoot images of the standard color card under different reference light sources as the first reference images. Preferably, after the first reference images are acquired, this embodiment further includes a step of performing demosaicing processing on each first reference image.
Subsequently, a first transformation is performed on each of the first reference images to exclude the influence of luminance. Optionally, the first transformation is performed by equations (2) and (3) described above. R, G and B are color values of R channel, G channel and B channel of the pixel point before color transformation, namely color values of the pixel point in the device color space. The first transformation is used to transform the first reference image into an intermediate color space, and (x, y) is a first color value and a second color value of the first transformed pixel point in the intermediate color space, that is, an intermediate color value.
In order to satisfy the above two properties of the target color space, the white points illuminated by the reference light sources on or near the blackbody locus must lie approximately on a straight line in the intermediate color space, and the transformation parameters ω1, ω2 and ω3 can be obtained with this as the objective.
Then, a second transformation is performed on each first reference image after the first transformation, so that the white points under illumination by the different reference light sources are rotated about a reference color value by a specific angle θ in the intermediate color space to obtain the target color space, in which the X axis represents the change of color temperature and the Y axis represents the change of chromaticity, that is, the change of color rendering index; θ is the included angle between the aforementioned straight line and the X axis. Optionally, the second transformation in this embodiment may be performed by the above-described formula (1).
Optionally, in this embodiment, the transformation parameters ω1, ω2, ω3 and θ may be obtained by solving the above equation (4) for its optimal solution. varY represents the variance in the Y direction of the reference white points in the first reference images; a smaller variance means that the white points under illumination by the different reference light sources lie more nearly on a straight line.
After the color transformation is performed with the transformation parameters acquired by the above method, the white points illuminated by all the reference light sources are enclosed in an approximately rectangular area in the target color space, and the discrimination between white points and non-white points is relatively large. Therefore, a rectangle can be used to delimit the white point interval of each reference light source in the target color space, thereby obtaining the white point area corresponding to each reference light source. Further, since the rectangular area enclosed by the white points illuminated by all the reference light sources contains some non-white points, the rectangular area can be further subdivided into a plurality of quadrilateral areas to eliminate the influence of the non-white points.
The white point area corresponding to a target light source can thus be obtained by calibrating that target light source in the above manner. On this basis, after any target image is transformed into the target color space, the points located in the white point area corresponding to the target light source are the white points in the target image. For example, when the target image is shot entirely under illumination by the HZ light source, the target image is transformed into the target color space and all pixel points located in the white point area corresponding to the HZ light source are obtained as the white points in the target image.
According to another aspect of the present disclosure, another white point detection method is also provided. Fig. 6 is a flow chart illustrating a white point detection method according to an embodiment of the present disclosure. As shown in fig. 6, the method includes the following steps S61 to S66.
In step S61, a target image is acquired, the target image corresponding to a target light source.
In step S62, a first color transform is performed on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by a first color value and a second color value, the first color value being used for representing a color temperature, and the second color value being used for representing a color rendering index.
In step S63, a pixel point in the target color space image in the first white point region is determined as a first candidate white point, where the first white point region can be obtained by labeling.
In step S64, a second color transform is performed on the target image to transform the target image to a UV color space to generate a UV color space image.
In step S65, a pixel point in the UV color space image in a second white point region is determined as a second candidate white point, where the second white point region can be obtained by labeling.
In step S66, a white point in the target image is determined from the first candidate white point and the second candidate white point. Preferably, this embodiment may select an intersection of the first candidate white point and the second candidate white point as the white point in the target image.
The process of obtaining the first candidate white points in steps S61 to S63 is similar to steps S11 to S13 in fig. 1 and is not described again here. The process of obtaining the second candidate white points in steps S64 to S65 may be implemented using existing techniques and is likewise not described here. As can be seen from the above description, in this embodiment the intersection of the first candidate white points and the second candidate white points may be selected as the white points in the target image, so as to exclude non-white points that fall within only one of the two candidate areas, which is beneficial to further improving the accuracy of white point detection.
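Step S66 can be illustrated with the following hedged sketch, which simply intersects the two boolean candidate masks (the mask names are hypothetical; producing them is covered by the earlier sketches and by existing UV-domain techniques):

```python
import numpy as np

def final_white_points(mask_target_space, mask_uv_space):
    """White points of the target image: pixels that are candidates both in the
    target color space white point area and in the UV color space white point area."""
    return np.logical_and(mask_target_space, mask_uv_space)
```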
According to another aspect of the present disclosure, there is also provided an automatic white balancing method including: acquiring a white point in a target image by adopting a white point detection method shown in FIG. 1, FIG. 5 or FIG. 6; and determining a white balance gain corresponding to the target light source according to the color value of the white point in the target image.
Optionally, the white balance gain may be obtained as:

WBGainR = SumG / SumR,

WBGainG = 1,

WBGainB = SumG / SumB,

where WBGainR, WBGainG and WBGainB are the gains of the R, G and B channels respectively, and SumR, SumG and SumB are the weighted sums of the R, G and B channels of the white points in the target image. When the white balance gain is obtained, the same weight value may be assigned to the white points illuminated by different light sources, or different weight values may be assigned to them; in the latter case, SumR = Σ Ri × Wi, SumG = Σ Gi × Wi and SumB = Σ Bi × Wi, where Ri, Gi and Bi are respectively the accumulated R, G and B values of the white points under illumination by light source i, and Wi is the weight value corresponding to light source i.
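An illustrative implementation of the gain computation is sketched below; it assumes the SumG/SumR and SumG/SumB form given above, and the per-light-source weighting is optional. Array and function names are hypothetical.

```python
import numpy as np

def white_balance_gains(white_rgb, weights=None):
    """Per-channel white balance gains from the detected white points.

    white_rgb : Nx3 array of the R, G, B values of the white points.
    weights   : optional per-white-point weights (e.g. per light source);
                equal weights are used by default.
    """
    white_rgb = np.asarray(white_rgb, dtype=float)
    if weights is None:
        weights = np.ones(len(white_rgb))
    sum_r, sum_g, sum_b = (white_rgb * np.asarray(weights)[:, None]).sum(axis=0)
    return sum_g / sum_r, 1.0, sum_g / sum_b  # (WBGainR, WBGainG, WBGainB)
```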
According to another aspect of the disclosure, a transformation parameter calibration method is also provided. The transformation parameter calibration method comprises the following steps: determining the transformation parameters by a white point detection method; determining the transformation parameters as calibration parameters corresponding to a color space of the device.
The white point detection method comprises the following steps: acquiring a target image, wherein the target image corresponds to a target light source; determining transformation parameters associated with a device used to acquire the target image; performing color transformation on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by first color values and second color values, the first color values being indicative of a color temperature, the second color values being indicative of a color rendering index; and determining pixel points positioned in a white point area in the target color space image as white points in the target image. Wherein color transforming the target image comprises: and performing the color transformation according to the transformation parameters.
Optionally, determining the transformation parameters comprises: acquiring images of a standard color card shot by the equipment under the irradiation of a plurality of reference light sources as a plurality of first reference images; performing the color transformation according to a set of predetermined parameters to transform the plurality of first reference images into a plurality of target color space reference images; determining, for the set of predetermined parameters, a degree of dispersion of the second color values for pixels in the plurality of target color space reference images corresponding to reference white points in the plurality of first reference images; and determining the transformation parameter according to a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters.
Optionally, determining a degree of dispersion of the second color values of pixel points in the plurality of target color space reference images corresponding to the reference white points in the plurality of first reference images comprises: determining a degree of deviation of the second color value from a reference line in the target color space.
Optionally, determining the transformation parameter according to a target predetermined parameter of the set of predetermined parameters corresponding to the degree of dispersion comprises: determining whether the degree of dispersion is less than a predetermined value; and if the degree of dispersion is smaller than the predetermined value, determining the transformation parameter according to the target predetermined parameter corresponding to the degree of dispersion.
According to another aspect of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a computer program, which is executed by a processor to implement the above-mentioned white point detection method, automatic white balance method or transformation parameter calibration method.
According to another aspect of the present disclosure, an electronic device is also provided. Fig. 7 is a block diagram illustrating an electronic device 700 according to an embodiment of the disclosure. As shown in fig. 7, the electronic device 700 includes a memory 710 and a processor 720. The memory 710 is configured to store a computer program, and the processor 720 is communicatively coupled to the memory 710 and configured to invoke the computer program to perform the white point detection method or the automatic white balancing method of the present invention. In some embodiments, processor 720 may be a SoC, GPU, DSP, or a specific white point detection and white balance processing chip. In this embodiment, the electronic device 700 includes, but is not limited to, a mobile phone, a tablet computer, and other devices with a photographing function.
The protection scope of the white point detection method and the automatic white balance method of the present invention is not limited to the execution sequence of the steps listed in this embodiment, and all the schemes of adding, subtracting, and replacing the steps in the prior art according to the principle of the present invention are included in the protection scope of the present invention.
Compared with the prior art, the present invention provides a different white point detection method: the target image is converted into a target color space, and the pixel points located in a white point area of the color-converted image are taken as the white points in the target image. In this way, the white point detection result has higher accuracy, which is beneficial to improving the accuracy of color temperature estimation. In addition, the colors in the target color space are represented by a first color value and a second color value that are independent of brightness, so the white point detection method can counteract the influence of brightness changes on the white point distribution, which further improves the accuracy of white point detection.
In conclusion, the present invention effectively overcomes various disadvantages of the prior art and has high industrial utilization value. The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art can modify or change the above-mentioned embodiments without departing from the spirit and scope of the present invention. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical spirit of the present invention be covered by the claims of the present invention.

Claims (15)

1. A white point detection method, comprising:
acquiring a target image, wherein the target image corresponds to a target light source;
performing color transformation on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by first color values and second color values, the first color values being indicative of a color temperature, the second color values being indicative of a color rendering index; and
and determining pixel points positioned in a white point area in the target color space image as white points in the target image.
2. The white point detection method of claim 1, further comprising: determining transformation parameters associated with a device used to acquire the target image;
wherein color transforming the target image comprises: and performing the color transformation according to the transformation parameters.
3. The white point detection method of claim 2, wherein determining transformation parameters comprises:
acquiring images of a standard color card shot by the equipment under the irradiation of a plurality of reference light sources as a plurality of first reference images;
performing the color transformation according to a set of predetermined parameters to transform the plurality of first reference images into a plurality of target color space reference images;
determining, for the set of predetermined parameters, a degree of dispersion of the second color values for pixels in the plurality of target color space reference images corresponding to reference white points in the plurality of first reference images; and
determining the transformation parameter according to a target predetermined parameter corresponding to the degree of dispersion in the set of predetermined parameters.
4. The white point detection method of claim 3, wherein determining a degree of dispersion of the second color values for pixel points in the plurality of target color space reference images corresponding to the reference white points in the plurality of first reference images comprises:
determining a degree of deviation of the second color value from a reference line in the target color space.
5. The white point detection method of claim 3 wherein determining the transformation parameters from the target predetermined parameters of the set of predetermined parameters corresponding to the degree of dispersion comprises:
determining whether the degree of dispersion is less than a predetermined value; and
if the degree of dispersion is less than the predetermined value, the transformation parameter is determined according to the target predetermined parameter corresponding to the degree of dispersion.
6. The white point detection method of claim 1 or 2, wherein color transforming the target image comprises:
performing a first transformation on the target image to transform the target image to an intermediate color space to generate an intermediate color space image, the intermediate color space image being independent of a brightness of the target image; and
second transforming the intermediate color space image to generate the target color space image such that the target color space is rotated by a set angle relative to the intermediate color space.
7. The white point detection method of claim 6, wherein the target color space comprises a first axis and a second axis intersecting the first axis, the first axis representing the color temperature and the second axis representing the color rendering index.
8. The white point detection method of claim 6,
performing a first transformation on the target image includes performing the first transformation by equations 1 and 2 below:
Figure FDA0003493663350000021
Figure FDA0003493663350000022
and performing the second transformation on the intermediate color space image comprises performing the second transformation according to Equation 3 below:
[Equation 3, reproduced as an image in the published application: the target color space coordinates X and Y as a rotation of (x, y) about (x0, y0) by the set angle θ]
wherein ω1, ω2, ω3 and θ are the transformation parameters; x and y represent color values of pixels of the intermediate color space image in the intermediate color space; X and Y represent color values of pixels of the target color space image in the target color space; R, G and B represent color values of the R, G and B channels of pixels of the target image; x0 and y0 represent color values of a white point in the intermediate color space under illumination by a particular light source and serve as the origin of the rotation; and θ represents the set angle.
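The equations themselves are only published as images, so the Python sketch below implements one plausible form consistent with the definitions above: Equations 1 and 2 are taken as weighted, brightness-independent chromaticity coordinates, and Equation 3 as a rotation about (x0, y0) by the set angle θ. The exact formulas are assumptions.

import numpy as np

def rgb_to_intermediate(r, g, b, w1, w2, w3):
    """Assumed form of Equations 1 and 2: weighted chromaticity coordinates
    (x, y) that do not depend on overall brightness."""
    denom = w1 * r + w2 * g + w3 * b
    x = w1 * r / denom
    y = w3 * b / denom
    return x, y

def intermediate_to_target(x, y, x0, y0, theta):
    """Assumed form of Equation 3: rotate (x, y) about (x0, y0) by theta."""
    dx, dy = x - x0, y - y0
    X = dx * np.cos(theta) + dy * np.sin(theta)
    Y = -dx * np.sin(theta) + dy * np.cos(theta)
    return X, Y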
9. The white point detection method of claim 1, further comprising: determining the white point region,
wherein determining the white point region comprises:
acquiring an image of a standard color chart captured under illumination by the target light source as a second reference image;
performing the color transformation on a reference white point in the second reference image to transform the reference white point to the target color space to generate a target color space reference white point; and
determining the white point region according to the distribution of the target color space reference white points.
10. The white point detection method of claim 9, wherein determining the white point region according to the distribution of the target color space reference white points comprises: determining the white point region having a rectangular shape in the target color space.
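A short Python sketch of claims 9 and 10, assuming the rectangular white point region is taken as the bounding box of the reference white points after they have been transformed into the target color space, with an optional margin added for tolerance (the margin is an assumption):

import numpy as np

def rectangular_white_point_region(target_space_white_points, margin=0.0):
    """target_space_white_points: N x 2 array of (first, second) color values
    of reference white points already transformed to the target color space.
    Returns (first_min, first_max, second_min, second_max)."""
    first = target_space_white_points[:, 0]
    second = target_space_white_points[:, 1]
    return (first.min() - margin, first.max() + margin,
            second.min() - margin, second.max() + margin)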
11. A white point detection method, comprising:
acquiring a target image, wherein the target image corresponds to a target light source;
performing a first color transformation on the target image to transform the target image into a target color space to generate a target color space image, the target color space image being represented by first color values and second color values, the first color values being indicative of a color temperature, the second color values being indicative of a color rendering index;
determining a pixel point in a first white point region in the target color space image as a first candidate white point;
performing a second color transformation on the target image to transform the target image to a UV color space to generate a UV color space image;
determining a pixel point in a second white point region in the UV color space image as a second candidate white point; and
determining a white point in the target image according to the first candidate white point and the second candidate white point.
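Claim 11 leaves open how the two candidate sets are combined; intersecting them is one natural choice, and that is what this short Python sketch assumes:

def combine_candidates(mask_target_space, mask_uv_space):
    """Keep only pixels flagged as candidates both in the target color space
    (first white point region) and in the UV color space (second white point
    region). Both inputs are H x W boolean masks; the result is their
    element-wise AND."""
    return mask_target_space & mask_uv_space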
12. An automatic white balance method, characterized in that the automatic white balance method comprises:
acquiring a white point in a target image by using the white point detection method according to any one of claims 1 to 11; and
determining a white balance gain corresponding to the target light source according to the color value of the white point in the target image.
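A common way to turn the detected white points into white balance gains, consistent with claim 12; averaging over the white points and normalizing the gains to the green channel are assumptions added for this sketch:

import numpy as np

def white_balance_gains(rgb_image, white_point_mask):
    """Average R, G, B over the detected white points and derive per-channel
    gains normalized so that the green channel is left unchanged.
    Assumes at least one white point was detected."""
    whites = rgb_image[white_point_mask]           # K x 3 array of white pixels
    r_mean, g_mean, b_mean = whites.mean(axis=0)
    return g_mean / r_mean, 1.0, g_mean / b_mean   # (R gain, G gain, B gain)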
13. A transformation parameter calibration method, characterized in that the transformation parameter calibration method comprises:
determining the transformation parameters by the white point detection method of any one of claims 2 to 5; and
determining the transformation parameters as calibration parameters corresponding to a color space of the device.
14. A computer-readable storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the white point detection method according to any one of claims 1 to 11, the automatic white balance method according to claim 12, or the transformation parameter calibration method according to claim 13.
15. An electronic device, characterized in that the electronic device comprises:
a memory configured to store a computer program; and
a processor communicatively connected to the memory and configured to invoke the computer program to perform the white point detection method of any one of claims 1 to 11, the automatic white balance method of claim 12, or the transformation parameter calibration method of claim 13.
CN202210105006.XA 2022-01-28 2022-01-28 White point detection method, automatic white balance method, calibration method, medium, and apparatus Pending CN114494209A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210105006.XA CN114494209A (en) 2022-01-28 2022-01-28 White point detection method, automatic white balance method, calibration method, medium, and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210105006.XA CN114494209A (en) 2022-01-28 2022-01-28 White point detection method, automatic white balance method, calibration method, medium, and apparatus

Publications (1)

Publication Number Publication Date
CN114494209A true CN114494209A (en) 2022-05-13

Family

ID=81476608

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210105006.XA Pending CN114494209A (en) 2022-01-28 2022-01-28 White point detection method, automatic white balance method, calibration method, medium, and apparatus

Country Status (1)

Country Link
CN (1) CN114494209A (en)

Similar Documents

Publication Publication Date Title
CN106973278B (en) A kind of automatic white balance device and method with reference to face color character
CN105635593B (en) Multiple exposure imaging system and white balance method thereof
US8965120B2 (en) Image processing apparatus and method of controlling the same
US8902328B2 (en) Method of selecting a subset from an image set for generating high dynamic range image
US10559092B2 (en) Method and device for processing white balance of image and storage medium
US11277595B2 (en) White balance method for image and terminal device
JP6274931B2 (en) Multi-area white balance control device, multi-area white balance control method, multi-area white balance control program, computer recording multi-area white balance control program, multi-area white balance image processing device, multi-area white balance image processing method, multi-area White balance image processing program, computer recording multi-area white balance image processing program, and imaging apparatus provided with multi-area white balance image processing device
CN106851122A (en) The scaling method and device of the auto exposure parameter based on dual camera system
CN108234971B (en) White balance parameter determines method, white balance adjustment method and device, storage medium, terminal
CN107864342B (en) Image brightness adjusting method and device
CN103200409B (en) Color correction method of multi-projector display system
TWI820851B (en) Compensation lookup table configuration method, display panel compensation method, compensation lookup table configuration device
WO2021218603A1 (en) Image processing method and projection system
JP2016086246A (en) Image processing apparatus and method, and imaging device
CN106803920B (en) Image processing method and device and intelligent conference terminal
CN103905738B (en) High dynamic range images generate system and method
JP2015090562A (en) Image processing device, method, and program
CN113132700B (en) Method and system for adjusting contrast of projector
CN110290313B (en) Method for guiding automatic focusing equipment to be out of focus
CN117156289A (en) Color style correction method, system, electronic device, storage medium and chip
CN114494209A (en) White point detection method, automatic white balance method, calibration method, medium, and apparatus
JP4359662B2 (en) Color image exposure compensation method
CN106817542B (en) imaging method and imaging device of microlens array
TWI283532B (en) Image acquiring apparatus and image processing method thereof
JP2011176487A (en) Signal processing device, and imaging device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination