CN114286000B - Image color processing method and device and electronic equipment - Google Patents


Info

Publication number: CN114286000B
Application number: CN202111611798.XA
Authority: CN (China)
Prior art keywords: algorithm, image, color control, color, scene
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN114286000A (en)
Inventors: 吴佩媛, 熊佳, 何佳伟, 张威
Assignee (current and original): Spreadtrum Communications Shanghai Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by Spreadtrum Communications Shanghai Co Ltd
Priority patent: CN202111611798.XA (CN114286000B)
PCT application: PCT/CN2022/074396 (WO2023123601A1)
Publication of CN114286000A; application granted; publication of CN114286000B

Classifications

    • H — Electricity
    • H04 — Electric communication technique
    • H04N — Pictorial communication, e.g. television
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/61 — Control of cameras or camera modules based on recognised objects
    • H04N23/80 — Camera processing pipelines; components thereof
    • H04N23/84 — Camera processing pipelines; components thereof for processing colour signals
    • H04N23/88 — Camera processing pipelines; components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N9/00 — Details of colour television systems
    • H04N9/64 — Circuits for processing colour signals
    • H04N9/73 — Colour balance circuits, e.g. white balance circuits or colour temperature control
    • H04N9/77 — Circuits for processing the brightness signal and the chrominance signal relative to each other, e.g. adjusting the phase of the brightness signal relative to the colour signal, correcting differential gain or differential phase
    • Y02 — Technologies or applications for mitigation or adaptation against climate change
    • Y02B20/40 — Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Color Image Communication Systems (AREA)

Abstract

Embodiments of the invention relate to the field of Internet technology, and in particular to an image color processing method, an image color processing apparatus, and an electronic device. The image color processing method comprises the following steps: inputting image data captured by a camera module into a scene recognition algorithm, where the scene recognition algorithm outputs image scene information for the image data; determining a target color control algorithm from several color control algorithms contained in a color algorithm library according to the image scene information; and performing color processing on the image data according to the target color control algorithm. In these embodiments, the scene recognition algorithm identifies the current scene and the corresponding color control algorithm is executed for that scene, which helps invoke color control algorithms effectively.

Description

Image color processing method and device and electronic equipment
[ field of technology ]
Embodiments of the present invention relate to the technical field of image optimization, and in particular to an image color processing method, an image color processing apparatus, and an electronic device.
[ background Art ]
With the development of technology, cameras have become increasingly widespread. Mobile devices, vehicle-mounted products, smart-home and security equipment are all fitted with camera devices, and many electronic products are equipped with cameras. When shooting pictures with a camera, the image colors deviate to some degree in different scenes, and color optimization can bring the picture colors back to a normal level. However, the color adjustments required are not exactly the same in different scenes.
How to execute different image color processing schemes in different scenes has therefore become a problem to be solved.
[ invention ]
Embodiments of the present invention provide an image color processing method and apparatus and an electronic device, which use a scene recognition algorithm to identify the current scene and execute the corresponding color control algorithm for that scene, thereby helping to invoke color control algorithms effectively.
In a first aspect, an embodiment of the present invention provides an image color processing method, including:
inputting image data shot by a camera module into a scene recognition algorithm, wherein the scene recognition algorithm is used for outputting image scene information of the image data;
determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information;
and executing color processing on the image data according to the target color control algorithm.
In one possible implementation manner, before inputting the image data shot by the camera module into the scene recognition algorithm, the method further includes:
identifying a scene object from the image data by adopting an image feature identification algorithm;
and determining a target scene recognition algorithm from a plurality of scene recognition algorithms contained in a scene recognition algorithm library according to the scene objects, wherein the target scene recognition algorithm is used for outputting the image scene information.
In one possible implementation manner, the target scene recognition algorithm is configured to output the image scene information, and includes:
the target scene recognition algorithm is used for determining the image scene information according to the scene objects and the color information of the image data.
In one possible implementation manner, determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information includes:
and determining a target color control algorithm from the plurality of color control algorithms according to one or more of the ambient light level, the equipment parameters of the image pickup equipment and the use mode on the basis of the image scene information.
In one possible implementation manner, determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information includes:
determining a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and configuring effective conditions and effective proportions for the plurality of target color control algorithms;
wherein the color processing of the image data is performed in accordance with the validation conditions and the validation proportions configured for the plurality of target color control algorithms.
In one possible implementation, performing color processing on the image data according to the target color control algorithm includes:
acquiring input information of the target color control algorithm, wherein the input information comprises RGB values of the image data and environmental brightness statistical information;
and inputting the RGB values of the image data and the environmental brightness statistical information into the target color control algorithm to realize the color processing of the image data.
In one possible implementation, the color control algorithm includes one or more of the following: a lens shading correction (LSC) algorithm, an automatic white balance (AWB) algorithm, a color correction matrix (CCM) algorithm, a color correction proofing algorithm, and a post-processing color algorithm.
In a second aspect, an embodiment of the present invention provides an image color processing apparatus, including:
the input module is used for inputting the image data shot by the camera module into a scene recognition algorithm, and the scene recognition algorithm is used for outputting the image scene information of the image data;
the determining module is used for determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information;
and the execution module is used for executing color processing on the image data according to the target color control algorithm.
In one possible implementation manner, the apparatus further includes: an identification determining module, configured to identify a scene object from the image data by using an image feature identification algorithm;
and determining a target scene recognition algorithm from a plurality of scene recognition algorithms contained in a scene recognition algorithm library according to the scene objects, wherein the target scene recognition algorithm is used for outputting the image scene information.
In one possible implementation manner, the input module is specifically configured to use the target scene recognition algorithm to determine the image scene information according to the scene object and the color information of the image data.
In one possible implementation manner, the determining module is specifically configured to determine, based on the image scene information, a target color control algorithm from the plurality of color control algorithms according to one or more of an ambient light level, an equipment parameter of the image capturing device, and a usage mode.
In one possible implementation manner, the determining module is further specifically configured to determine a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and configure validation conditions and validation proportions for the plurality of target color control algorithms;
wherein the color processing of the image data is performed in accordance with the validation conditions and the validation proportions configured for the plurality of target color control algorithms.
In one possible implementation manner, the execution module is specifically configured to obtain input information of the target color control algorithm, where the input information includes RGB values of the image data and environmental brightness statistical information;
and inputting the RGB values of the image data and the environmental brightness statistical information into the target color control algorithm to realize the color processing of the image data.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor can invoke the program instructions to perform the method provided in the first aspect.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium storing computer instructions that cause a computer to perform the method provided in the first aspect.
It should be understood that the second to fourth aspects of the present disclosure are consistent with the technical solution of the first aspect; the beneficial effects obtained by each aspect and its corresponding possible embodiments are similar and are not repeated here.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flowchart of an image color processing method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an image color processing apparatus according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an embodiment of the electronic device of the present invention.
[ detailed description ] of the invention
For a better understanding of the technical solution of the present invention, the following detailed description of the embodiments of the present invention refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the present specification.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Fig. 1 is a flowchart of an image color processing method according to an embodiment of the present invention. As shown in fig. 1, the image color processing method includes:
step 101, inputting image data shot by a camera module into a scene recognition algorithm, wherein the scene recognition algorithm is used for outputting image scene information of the image data.
In some embodiments, multiple frames of image data captured by the camera module may be input into a scene recognition algorithm, and image scene information may be obtained by recognizing each frame of image data, and then by combining with the image scene information of each frame, it may be determined what scene the current image capturing apparatus is used in. For example, when the image scene information of each frame shows that the current scene contains blue sky and grassland, it may be determined that the current image capturing apparatus is operating in an outdoor scene.
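The multi-frame decision described above can be sketched as follows. This is an illustrative assumption, not the patent's method: the tag vocabulary, the majority rule, and the indoor/outdoor split are all placeholders chosen for the example.

```python
from collections import Counter

# Assumed tag vocabulary: the text only names blue sky and grassland
# as outdoor cues; "sunlight" is an added hypothetical tag.
OUTDOOR_TAGS = {"blue_sky", "grassland", "sunlight"}

def classify_environment(frame_tags):
    """Combine per-frame scene tags into one scene decision.

    A tag is considered stable if it appears in more than half of the
    frames; any stable outdoor cue marks the scene as outdoor.
    """
    n = len(frame_tags)
    counts = Counter(tag for tags in frame_tags for tag in set(tags))
    stable = {tag for tag, c in counts.items() if c > n / 2.0}
    return "outdoor" if stable & OUTDOOR_TAGS else "indoor"

frames = [{"blue_sky", "grassland"}, {"blue_sky"}, {"blue_sky", "grassland"}]
print(classify_environment(frames))  # -> outdoor
```

Three frames showing blue sky (and two also showing grassland) yield a stable outdoor cue, matching the blue-sky/grassland example in the text.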
Optionally, before inputting the image data captured by the camera module into the scene recognition algorithm, the method further includes: identifying a scene object from the image data by adopting an image feature identification algorithm; and determining a target scene recognition algorithm from a plurality of scene recognition algorithms contained in a scene recognition algorithm library according to the scene objects, wherein the target scene recognition algorithm is used for outputting the image scene information.
Specifically, the image data shot by the camera module can include various scene objects, and the various scene objects have corresponding scene recognition algorithms. Therefore, when using the scene recognition algorithm to process each scene object in the image data, the image feature recognition algorithm should be used to recognize each scene object first, and then different scene recognition algorithms should be selected according to different scene objects. For example, the image feature recognition algorithm is used to recognize the relevant features of the scene object in the image data, and if the scene object is determined to be a person according to the recognized relevant features, the face recognition algorithm can be called from a plurality of scene recognition algorithms.
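The object-to-algorithm dispatch described here can be sketched as a simple registry. The algorithm names and return values below are hypothetical placeholders; the text only states that each kind of scene object has a corresponding scene recognition algorithm (e.g. a person maps to face recognition).

```python
def recognize_face(image):
    """Placeholder for a face recognition algorithm."""
    return {"scene_object": "person"}

def recognize_landscape(image):
    """Placeholder for a landscape recognition algorithm."""
    return {"scene_object": "landscape"}

# Hypothetical scene recognition algorithm library keyed by the scene
# object reported by the image feature recognition step.
SCENE_ALGORITHM_LIBRARY = {
    "person": recognize_face,
    "landscape": recognize_landscape,
}

def select_scene_algorithm(scene_object):
    """Pick the target scene recognition algorithm for a detected object."""
    try:
        return SCENE_ALGORITHM_LIBRARY[scene_object]
    except KeyError:
        raise ValueError("no scene recognition algorithm for %r" % scene_object)
```

A detected "person" object then resolves to the face recognition placeholder: `select_scene_algorithm("person")` returns `recognize_face`.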
Optionally, the target scene recognition algorithm is configured to output the image scene information, including:
the target scene recognition algorithm is used for determining the image scene information according to the scene objects and the color information of the image data.
Specifically, the scene recognition algorithm may include a neural network model comprising an input layer, hidden layers, and a fully connected layer. The scene object is input into the neural network model, the fully connected layer outputs data, and classification recognition is performed with the softmax classification function to determine the specific category features of the scene object. Image scene information is then obtained by combining this result with the color information in the image data. For example, if the scene object is scenery, the scene recognition algorithm may determine that the object is the sky; combining the current color information then yields the current sky color and image scene information indicating whether the current scene is overcast or sunny.
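The softmax classification step at the end of the network can be illustrated in isolation. The logits and labels below are made-up values, not from the patent:

```python
import math

def softmax(logits):
    """Numerically stable softmax over the fully connected layer's output."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, labels):
    """Return the label with the highest softmax probability."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]

label, prob = classify([2.0, 0.5, 0.1], ["sky", "grassland", "person"])
```

Here the highest logit wins, so the scene object is classified as "sky", and the probability can be thresholded before the result is fused with color information.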
And 102, determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information.
In some embodiments, the image scene information primarily shows relevant scenes in the current image data, and analysis of the current scene may determine which color control algorithms to use for color optimization processing of the image data.
Optionally, on the basis of the image scene information, a target color control algorithm is determined from the several color control algorithms according to one or more of the ambient light level, the device parameters of the image capturing device, and the usage mode.
The ambient light level may be calculated by an automatic exposure (Auto Exposure, AE) algorithm. Common AE algorithms include the average luminance method, the weighted average method, and the luminance histogram method, of which the average luminance method is the most common. The average luminance method averages the brightness of the image's pixels and reaches the target ambient brightness by continuously adjusting the exposure parameters. The weighted average method assigns different weights to different areas of the image when calculating the ambient brightness; for example, selecting among the metering modes in a camera changes the weights of the different areas. The luminance histogram method calculates the ambient brightness by assigning different weights to the peaks in the histogram.
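A minimal sketch of the average-luminance AE loop described above. The target of 118 (8-bit mid-gray) and the 5% step are illustrative choices, not values from the text:

```python
def average_luminance(pixels):
    """Mean-brightness metering over 8-bit luma samples."""
    return sum(pixels) / len(pixels)

def adjust_exposure(exposure, pixels, target=118.0, tolerance=4.0, step=1.05):
    """One iteration of an average-luminance AE loop.

    Raises exposure when the frame is darker than the target and lowers
    it when brighter; within the tolerance band it leaves it unchanged.
    """
    mean = average_luminance(pixels)
    if mean < target - tolerance:
        return exposure * step
    if mean > target + tolerance:
        return exposure / step
    return exposure
```

Iterating this until the mean settles inside the tolerance band reproduces the "continuously adjust exposure parameters until the target brightness is reached" behavior; the weighted average method would simply replace `average_luminance` with a per-region weighted mean.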
The image scene information may be combined with one or more of the ambient light level, the parameters of the image capture device, and its usage mode to determine which color control algorithm should perform color optimization on the image data, and related information from the color control algorithm itself may also be taken into account. For example, when the image scene information obtained by the scene recognition algorithm indicates a blue-sky scene, the ambient brightness must reach a certain standard after the blue sky is recognized, and the decision is made jointly with information such as the color coordinate range.
When the target color control algorithm is determined, the corresponding differences among different camera modules can be resolved through module-consistency calibration based on one-time-programmable (One Time Programmable, OTP) data, which improves the generalization of the various algorithms across modules and compensates for individual differences between them.
Optionally, determining a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and configuring effective conditions and effective proportions for the plurality of target color control algorithms; wherein the color processing of the image data is performed in accordance with the validation conditions and the validation proportions configured for the plurality of target color control algorithms.
Specifically, according to one or more of the image scene information, the ambient light level, the parameters of the image capture device, and its usage mode, it is determined whether the current scene needs color optimization and whether a color control algorithm needs to be used. Whether a color control algorithm should perform color optimization on the image data can be decided through the validation conditions and validation proportions configured for it, where the proportions can be set according to different scenes. For example, if a blue-sky scene is detected and the current image capture device is determined to be shooting outdoors, the validation condition configured for the automatic white balance algorithm among the color control algorithms may be that the ambient light level is greater than 150 cd/m² and the color temperature in the color coordinates is less than 5000 K, with validation proportions of 30% and 70%, respectively. When the validation condition and validation proportion are satisfied, the automatic white balance algorithm is applied.
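The validation-condition-plus-proportion mechanism in the example above might look like the following sketch. Only the 150 cd/m² and 5000 K thresholds come from the text; the linear blend function and the sample RGB values are assumptions:

```python
def awb_should_fire(ambient_luminance_cd_m2, color_temp_k):
    """Validation condition from the blue-sky example in the text:
    ambient luminance above 150 cd/m^2 and color temperature below 5000 K."""
    return ambient_luminance_cd_m2 > 150 and color_temp_k < 5000

def blend(original_rgb, corrected_rgb, proportion):
    """Apply a correction at a configured validation proportion
    (e.g. 0.3 or 0.7); the linear blend itself is an assumption."""
    return tuple(o * (1.0 - proportion) + c * proportion
                 for o, c in zip(original_rgb, corrected_rgb))

if awb_should_fire(200, 4500):
    result = blend((120.0, 110.0, 90.0), (118.0, 118.0, 118.0), 0.7)
```

Gating each algorithm behind its own predicate, then mixing its output back at a configurable proportion, lets several target color control algorithms take partial effect in the same scene.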
And step 103, executing color processing on the image data according to the target color control algorithm.
In some embodiments, the color control algorithms include a lens shading correction (Lens Shading Correction, LSC) algorithm, an automatic white balance (Auto White Balance, AWB) algorithm, a color correction matrix (Color Correction Matrix, CCM) algorithm, a color correction proofing algorithm, and a post-processing color algorithm. The image data may be color optimized by one or more of the color control algorithms.
Optionally, performing color processing on the image data according to the target color control algorithm includes: acquiring input information of the target color control algorithm, wherein the input information comprises RGB values of the image data and environmental brightness statistical information;
and inputting the RGB values of the image data and the environmental brightness statistical information into the target color control algorithm to realize the color processing of the image data.
Specifically, some color control algorithms require information to be entered during processing, while some algorithms do not. The color control algorithms can obtain output results after calculation, and the image data can be optimized through the output results.
Among these, LSC algorithms generally fall into two methods: the concentric-circle method and the grid method. The concentric-circle method first finds the centers of the R, G and B channels, usually choosing the same point, and multiplies the three channels by different gains arranged as concentric circles from the picture's center to its edge. Because the falloff of lens shading steepens from center to edge, the equal-gain curves are sparse at the center and dense at the edge. The lens-shading gain should generally not exceed about 2x, since larger gains introduce noise. In the grid method, the gain within each cell of the grid is uniform, and the grid is distributed sparsely at the center and densely at the four corners. The output of the LSC algorithm is a gain table for the RGB channels, used mainly to guarantee uniform brightness and color between the center and four corners of the camera module.
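A toy version of the concentric-circle gain table for a single channel. The quadratic falloff model is an illustrative assumption; capping near 2x follows the text's warning that larger gains introduce noise:

```python
import math

def lsc_gain_table(width, height, max_gain=2.0):
    """Concentric-circle LSC gain table for one channel.

    Gain is 1.0 at the optical center and rises to max_gain at the
    corners; gain depends only on the normalized radius, so equal-gain
    contours are concentric circles.
    """
    cx, cy = (width - 1) / 2.0, (height - 1) / 2.0
    r_max = math.hypot(cx, cy)
    table = []
    for y in range(height):
        row = []
        for x in range(width):
            r = math.hypot(x - cx, y - cy) / r_max  # normalized radius
            row.append(1.0 + (max_gain - 1.0) * r * r)
        table.append(row)
    return table

gains = lsc_gain_table(5, 5)  # center gain 1.0, corner gain 2.0
```

A real pipeline would compute one such table per R/G/B channel and multiply each pixel by its local gain; the grid method instead stores one gain per grid cell.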
AWB algorithms vary widely and include the gray world algorithm, the perfect reflector algorithm, the dynamic threshold algorithm, the color temperature estimation algorithm, and so on. The AWB algorithm outputs white balance gain compensation used to correct overall color accuracy and keep the camera's overall color from deviating from expectations.
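Of the AWB variants listed, the gray world algorithm is simple enough to sketch in a few lines; it assumes the average scene color is gray and scales the R and B channels toward the G mean (the clipping to an 8-bit range is an added detail):

```python
def gray_world_gains(pixels):
    """Gray-world AWB: compute gains that pull the R and B channel
    means to the G channel mean."""
    n = float(len(pixels))
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return g / r, 1.0, g / b

def apply_awb(pixels, gains):
    """Apply per-channel white balance gains, clipping to 8-bit range."""
    gr, gg, gb = gains
    return [(min(255.0, p[0] * gr), min(255.0, p[1] * gg), min(255.0, p[2] * gb))
            for p in pixels]
```

After applying the gains, all three channel means coincide, which is exactly the gray-world assumption; the returned triple is the white balance gain compensation the text describes.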
The CCM algorithm mainly converts the sensor RGB space through a matrix M₂ into the XYZ space, and then through a matrix M₁ and gamma correction into the nonlinear sRGB space. Here the sensor RGB space is called the "source color space" and the nonlinear sRGB space the "target color space". Given the 24 color patches captured in the source color space (the "unsaturated" chart) and the corresponding 24 patches in the nonlinear sRGB space (the "saturated" chart), with M₁ and the gamma value known, the sRGB patches are inverse-gamma-corrected and converted into the XYZ space; combining these with the sensor RGB values yields the matrix M₂, and hence the overall matrix M. Two typical CCM algorithms are polynomial fitting and the three-dimensional lookup table (3D-LUT) method. The output is a color correction matrix that corrects the camera's color rendering and ensures that the camera's colors stay close to human visual perception across various environments.
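Applying a color correction matrix is a single 3x3 matrix-vector product. The sample matrix below is hypothetical, not from the patent; its rows each sum to 1 so that neutral gray (R = G = B) maps to itself:

```python
def apply_ccm(rgb, ccm):
    """Apply a 3x3 color correction matrix to a linear RGB triple."""
    return tuple(sum(ccm[i][j] * rgb[j] for j in range(3)) for i in range(3))

# Hypothetical correction matrix: boosts saturation by removing
# channel crosstalk; each row sums to 1 to preserve neutral gray.
CCM = [[ 1.50, -0.30, -0.20],
       [-0.20,  1.40, -0.20],
       [-0.10, -0.40,  1.50]]

corrected = apply_ccm((120.0, 100.0, 80.0), CCM)
```

In a real pipeline the matrix entries are fitted (e.g. by polynomial fitting against the 24-patch chart) and the product is applied to linear, white-balanced RGB before gamma.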
Color correction proofing algorithms include, but are not limited to, the Gamma, HSV, and 3D-LUT algorithms common in the industry. The Gamma algorithm can be illustrated on a pixel A with value 200; correcting this pixel performs the following steps. First, normalization converts the pixel value into a real number between 0 and 1 via the formula (I + 0.5) / 256, which involves one division and one addition; for pixel A the normalized value is 0.783203. Then pre-compensation raises the normalized value to the power 1/gamma:

f(I) = I^(1/gamma)

This step involves one exponentiation. With a gamma value of 2.2, 1/gamma is 0.454545, and the pre-compensation result of the normalized A value is 0.783203^0.454545 = 0.894872. Finally, inverse normalization transforms the pre-compensated real value back into an integer between 0 and 255 via the formula f × 256 − 0.5, which involves one multiplication and one subtraction. Substituting the pre-compensation result 0.894872 of A into this formula gives 228 as the corrected pixel value, and this 228 is the final output data.
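The three-step Gamma correction above translates directly into code and reproduces the worked example (200 → 228 for gamma = 2.2):

```python
def gamma_correct(value, gamma=2.2):
    """8-bit gamma pre-compensation in the three steps described above."""
    normalized = (value + 0.5) / 256.0         # step 1: normalize to (0, 1)
    compensated = normalized ** (1.0 / gamma)  # step 2: pre-compensate
    return int(compensated * 256.0 - 0.5)      # step 3: inverse normalize

print(gamma_correct(200))  # 228, matching the worked example
```

In practice the 256 possible outputs are precomputed once into a lookup table so the per-pixel cost is a single table read rather than an exponentiation.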
The HSV algorithm describes color with hue H, saturation S, and value (brightness) V. H ranges from 0° to 360°, measured counterclockwise starting from red: red is 0°, green is 120°, and blue is 240°. The higher the saturation S, the deeper and more vivid the color; a pure spectral color has a white-light component of 0 and thus the highest saturation. S usually ranges from 0% to 100%, with larger values meaning a more saturated color. V represents the brightness of the color: for a light-source color, the value relates to the luminance of the illuminant; for an object color, it relates to the object's transmittance or reflectance, and it usually ranges from 0% (black) to 100% (white).
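The RGB-to-HSV conversion under the conventions above (H in degrees, S and V in percent) can be checked with Python's standard colorsys module, which works on floats in [0, 1]:

```python
import colorsys

def rgb_to_hsv_deg(r, g, b):
    """Convert 8-bit RGB to HSV with H in degrees and S, V in percent,
    matching the conventions in the text (red 0°, green 120°, blue 240°)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s * 100.0, v * 100.0

print(rgb_to_hsv_deg(255, 0, 0))  # red: hue 0°, full saturation and value
```

An HSV-based proofing step would adjust H, S or V in this space (e.g. nudge sky hues, lift saturation) and convert back with `colorsys.hsv_to_rgb`.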
The 3D-LUT algorithm is a three-dimensional color mapping algorithm that retunes the hue of an image by creating a color mapping table.
The outputs of these three color correction proofing algorithms are mapping tables; the algorithms apply mappings to the color channels so as to finely control the rendering of specific colors.
Post-processing color algorithms include, but are not limited to, YUV-domain color biasing and the rendering post-processing algorithms common in the industry. These algorithms target different devices, realizing post-processing with different color styles for a given scene.
The corresponding result can be output through one or more algorithms to optimize the color of the image data.
Fig. 2 is a schematic structural diagram of an image color processing apparatus according to an embodiment of the present invention. As shown in fig. 2, the image color processing apparatus 200 includes: an input module 201, a determination module 202, and an execution module 203. The input module 201 is configured to input image data captured by the camera module into a scene recognition algorithm, where the scene recognition algorithm is configured to output image scene information of the image data; a determining module 202, configured to determine a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information; an execution module 203 for executing color processing on the image data according to the target color control algorithm.
In the above embodiment of the present invention, optionally, the apparatus further includes: an identification determining module, configured to identify a scene object from the image data by using an image feature identification algorithm, and to determine a target scene recognition algorithm from a plurality of scene recognition algorithms contained in a scene recognition algorithm library according to the scene object, wherein the target scene recognition algorithm is used for outputting the image scene information.
In the above embodiment of the present invention, optionally, the input module 201 is specifically configured to determine the image scene information according to the color information of the scene object and the image data by using the target scene recognition algorithm.
In the foregoing embodiment of the present invention, optionally, the determining module 202 is specifically configured to determine the target color control algorithm from the plurality of color control algorithms according to one or more of the ambient light level, the device parameters of the image capturing device, and the usage mode, in addition to the image scene information.
In the foregoing embodiment of the present invention, optionally, the determining module 202 is further specifically configured to determine a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and to configure effective conditions and effective proportions for the plurality of target color control algorithms, where the plurality of target color control algorithms perform color processing on the image data according to the configured effective conditions and effective proportions.
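A minimal sketch of how configured effective conditions and effective proportions might gate and blend several target color control algorithms; the data layout and the example "warming" algorithm are assumptions for illustration:

```python
# Each selected color control algorithm runs only when its effective
# condition holds for the current context, and its output is mixed into
# the result according to its configured effective proportion.

def blend_color_controls(pixel, controls, context):
    """controls: list of (algorithm, condition, proportion) triples.
    Each algorithm maps an (R, G, B) pixel to a processed (R, G, B) pixel."""
    r, g, b = pixel
    out = [float(r), float(g), float(b)]
    for algorithm, condition, proportion in controls:
        if not condition(context):
            continue  # effective condition not met: algorithm does not apply
        pr, pg, pb = algorithm(pixel)
        # Mix in only the configured proportion of this algorithm's change.
        out[0] += proportion * (pr - r)
        out[1] += proportion * (pg - g)
        out[2] += proportion * (pb - b)
    return tuple(round(c) for c in out)
```

For example, a hypothetical warming algorithm could be configured to take effect at 50% proportion only under low ambient light, leaving the pixel untouched otherwise.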
In the above embodiment of the present invention, optionally, the apparatus further includes an acquisition module, configured to acquire input information of the target color control algorithm, where the input information includes the RGB values of the image data and ambient brightness statistical information.
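The input information gathered by the acquisition module could be assembled as in the following sketch; the specific statistics chosen here (mean and peak luma with Rec. 601 weights) are illustrative assumptions, not mandated by the patent:

```python
# Assemble the input information for a target color control algorithm:
# the raw per-pixel RGB values plus simple ambient-brightness statistics
# derived from pixel luminance.

def brightness_stats(pixels):
    """Mean and max luma over (R, G, B) pixels, using Rec. 601 weights."""
    lumas = [0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels]
    return {"mean": sum(lumas) / len(lumas), "max": max(lumas)}

def build_input_info(pixels):
    """Bundle RGB data and brightness statistics for a color control algorithm."""
    return {"rgb": pixels, "brightness": brightness_stats(pixels)}
```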
Fig. 3 is a schematic structural diagram of an embodiment of the electronic device of the present invention.
As shown in fig. 3, the electronic device may include at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, which can be invoked by the processor to perform the image color processing method provided in the embodiment shown in fig. 1 of the present specification.
The electronic device may be a device capable of interacting with a user, for example a cloud server; the embodiments of the present disclosure do not limit the specific form of the electronic device. It is understood that the electronic device herein is the machine mentioned in the method embodiment.
Fig. 3 shows a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the invention. The electronic device shown in fig. 3 is only an example and should not be construed as limiting the functionality and scope of use of the embodiments of the present invention.
As shown in Fig. 3, the electronic device is in the form of a general purpose computing device. Components of the electronic device may include, but are not limited to: one or more processors 410, a communication interface 420, a memory 430, and a communication bus 440 that connects the various system components (including the memory 430 and the processor 410).
The communication bus 440 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor bus, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Electronic devices typically include a variety of computer system readable media. Such media can be any available media that can be accessed by the electronic device and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 430 may include computer system readable media in the form of volatile memory, such as random access memory (Random Access Memory; hereinafter: RAM) and/or cache memory. The electronic device may further include other removable/non-removable, volatile/nonvolatile computer system storage media. Memory 430 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiments of the invention.
A program/utility having a set (at least one) of program modules may be stored in the memory 430, such program modules including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules typically carry out the functions and/or methods of the embodiments described herein.
The processor 410 executes various functional applications and data processing by running programs stored in the memory 430, for example, to implement the image color processing method provided in the embodiment of fig. 1 of the present invention.
An embodiment of the present invention provides a computer-readable storage medium storing computer instructions that cause a computer to execute an image color processing method provided in the embodiment shown in fig. 1 of the present specification.
Any combination of one or more computer readable media may be utilized as the above-described computer readable storage medium. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM) or flash memory, an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present specification may be written in any combination of one or more programming languages, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present specification. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include at least one such feature. In the description of the present specification, the meaning of "plurality" means at least two, for example, two, three, etc., unless explicitly defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and additional implementations are included within the scope of the preferred embodiment of the present specification in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present specification.
Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", "in response to determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)", depending on the context.
It should be noted that the terminals in the embodiments of the present disclosure may include, but are not limited to, a Personal Computer (PC), a Personal Digital Assistant (PDA), a wireless handheld device, a tablet computer, a mobile phone, an MP3 player, an MP4 player, and the like.
In the several embodiments provided in this specification, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in each embodiment of the present specification may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units described above may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor to perform part of the steps of the methods described in the embodiments of the present specification. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disc.
The foregoing description of the preferred embodiments is provided for the purpose of illustration only, and is not intended to limit the scope of the disclosure, since any modifications, equivalents, improvements, etc. that fall within the spirit and principles of the disclosure are intended to be included within the scope of the disclosure.

Claims (9)

1. A method of image color processing, the method comprising:
inputting image data shot by a camera module into a scene recognition algorithm, wherein the scene recognition algorithm is used for outputting image scene information of the image data;
determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information;
performing color processing on the image data according to the target color control algorithm;
determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information, wherein the target color control algorithm comprises the following steps:
determining a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and configuring effective conditions and effective proportions for the plurality of target color control algorithms;
wherein the color processing of the image data is performed in accordance with the effective conditions and effective proportions configured for the plurality of target color control algorithms.
2. The method of claim 1, wherein before inputting the image data captured by the camera module into the scene recognition algorithm, the method further comprises:
identifying a scene object from the image data by adopting an image feature identification algorithm;
and determining a target scene recognition algorithm from a plurality of scene recognition algorithms contained in a scene recognition algorithm library according to the scene objects, wherein the target scene recognition algorithm is used for outputting the image scene information.
3. The method of claim 2, wherein the target scene recognition algorithm is configured to output the image scene information, comprising:
the target scene recognition algorithm is used for determining the image scene information according to the scene objects and the color information of the image data.
4. The method of claim 1, wherein determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library based on the image scene information comprises:
and determining a target color control algorithm from the plurality of color control algorithms according to one or more of the ambient light level, the equipment parameters of the image pickup equipment and the use mode on the basis of the image scene information.
5. The method of claim 1, wherein performing color processing on the image data according to the target color control algorithm comprises:
acquiring input information of the target color control algorithm, wherein the input information comprises RGB values of the image data and environmental brightness statistical information;
and inputting the RGB values of the image data and the environmental brightness statistical information into the target color control algorithm to realize the color processing of the image data.
6. The method of claim 1, wherein the plurality of color control algorithms includes one or more of: a lens shading correction LSC algorithm, an automatic white balance AWB algorithm, a color correction matrix CCM algorithm, a color correction proof algorithm, and a post-processing color algorithm.
7. An image color processing apparatus, comprising:
the input module is used for inputting the image data shot by the camera module into a scene recognition algorithm, and the scene recognition algorithm is used for outputting the image scene information of the image data;
the determining module is used for determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library according to the image scene information;
an execution module for executing color processing of the image data according to the target color control algorithm;
according to the image scene information, determining a target color control algorithm from a plurality of color control algorithms contained in a color algorithm library, wherein the target color control algorithm comprises the following steps:
determining a plurality of target color control algorithms from the plurality of color control algorithms according to the image scene information, and configuring effective conditions and effective proportions for the plurality of target color control algorithms;
wherein the color processing of the image data is performed in accordance with the effective conditions and effective proportions configured for the plurality of target color control algorithms.
8. An electronic device, comprising: at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the method of any of claims 1-6.
9. A computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 6.
CN202111611798.XA 2021-12-27 2021-12-27 Image color processing method and device and electronic equipment Active CN114286000B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111611798.XA CN114286000B (en) 2021-12-27 2021-12-27 Image color processing method and device and electronic equipment
PCT/CN2022/074396 WO2023123601A1 (en) 2021-12-27 2022-01-27 Image color processing method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111611798.XA CN114286000B (en) 2021-12-27 2021-12-27 Image color processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN114286000A (en) 2022-04-05
CN114286000B (en) 2023-06-16

Family

ID=80876263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111611798.XA Active CN114286000B (en) 2021-12-27 2021-12-27 Image color processing method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN114286000B (en)
WO (1) WO2023123601A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116668866B (en) * 2022-11-21 2024-04-19 荣耀终端有限公司 Image processing method and electronic equipment

Citations (1)

Publication number Priority date Publication date Assignee Title
CN109525782A (en) * 2018-12-25 2019-03-26 努比亚技术有限公司 A kind of image pickup method, terminal and computer readable storage medium

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
CN103942523B (en) * 2013-01-18 2017-11-03 华为终端有限公司 A kind of sunshine scene recognition method and device
CN106101547A (en) * 2016-07-06 2016-11-09 北京奇虎科技有限公司 The processing method of a kind of view data, device and mobile terminal
US10567721B2 (en) * 2017-08-23 2020-02-18 Motorola Mobility Llc Using a light color sensor to improve a representation of colors in captured image data
CN108600630A (en) * 2018-05-10 2018-09-28 Oppo广东移动通信有限公司 Photographic method, device and terminal device
CN110348291A (en) * 2019-05-28 2019-10-18 华为技术有限公司 A kind of scene recognition method, a kind of scene Recognition device and a kind of electronic equipment
CN112819703A (en) * 2019-11-18 2021-05-18 Oppo广东移动通信有限公司 Information processing method and apparatus, and storage medium
US11302095B2 (en) * 2020-01-09 2022-04-12 International Business Machines Corporation Cognitive motion picture analysis
CN112562019A (en) * 2020-12-24 2021-03-26 Oppo广东移动通信有限公司 Image color adjusting method and device, computer readable medium and electronic equipment


Also Published As

Publication number Publication date
WO2023123601A1 (en) 2023-07-06
CN114286000A (en) 2022-04-05

Similar Documents

Publication Publication Date Title
US10949958B2 (en) Fast fourier color constancy
CN112565636B (en) Image processing method, device, equipment and storage medium
US11457160B2 (en) Electronic device and method for adjusting color of image data by using infrared sensor
CN113170028A (en) Method for generating image data of imaging algorithm based on machine learning
CN115496668A (en) Image processing method, image processing device, electronic equipment and storage medium
CN114286000B (en) Image color processing method and device and electronic equipment
CN108629738B (en) Image processing method and device
CN116645527A (en) Image recognition method, system, electronic device and storage medium
CN111724447B (en) Image processing method, system, electronic equipment and storage medium
CN109348207B (en) Color temperature adjusting method, image processing method and device, medium and electronic equipment
CN113079362B (en) Video signal processing method and device and electronic equipment
CN113066020A (en) Image processing method and device, computer readable medium and electronic device
CN107454340B (en) Image synthesis method and device based on high dynamic range principle and mobile terminal
WO2022067761A1 (en) Image processing method and apparatus, capturing device, movable platform, and computer readable storage medium
CN115660997A (en) Image data processing method and device and electronic equipment
US11388348B2 (en) Systems and methods for dynamic range compression in multi-frame processing
CN113473101B (en) Color correction method, device, electronic equipment and storage medium
CN112243118B (en) White balance correction method, device, equipment and storage medium
CN112995634B (en) Image white balance processing method and device, electronic equipment and storage medium
CN116668838B (en) Image processing method and electronic equipment
CN112995633B (en) Image white balance processing method and device, electronic equipment and storage medium
CN111275725B (en) Method and device for determining color temperature and tone of image, storage medium and terminal
CN116055659B (en) Original image processing method and device, electronic equipment and storage medium
EP4089626A1 (en) Method and apparatus based on scene dependent lens shading correction
JP6403811B2 (en) Image processing apparatus and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant