CN108111831B - Photographing method, imaging apparatus, computer-readable storage medium, and computer device - Google Patents

Photographing method, imaging apparatus, computer-readable storage medium, and computer device

Info

Publication number
CN108111831B
Authority
CN
China
Prior art keywords
light sources
light source
image
color temperature
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711423801.9A
Other languages
Chinese (zh)
Other versions
CN108111831A (en)
Inventor
王会朝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201711423801.9A priority Critical patent/CN108111831B/en
Publication of CN108111831A publication Critical patent/CN108111831A/en
Application granted granted Critical
Publication of CN108111831B publication Critical patent/CN108111831B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/88 Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Color Television Image Signal Generators (AREA)
  • Processing Of Color Television Signals (AREA)

Abstract

The application discloses a shooting method. The shooting method includes the following steps: processing a preview image to determine the number of light sources in the preview image; judging whether the number of the light sources is larger than a predetermined number; issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number. The application also discloses an imaging device, a computer device and a computer-readable storage medium. According to the shooting method, the imaging device, the computer-readable storage medium and the computer device, the number of light sources in the preview image is detected, and the computer device is controlled to issue a prompt to adjust the field of view when there are multiple light sources, so that the user is assisted in adjusting the field of view to reduce the number of light sources in the preview image, and the subject of the imaged image presents more accurate colors after white balance processing.

Description

Photographing method, imaging apparatus, computer-readable storage medium, and computer device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a shooting method, an imaging apparatus, a computer-readable storage medium, and a computer device.
Background
A related-art photographing method performs white balance processing by detecting the color temperatures of the light sources in a scene and processing the image according to those color temperatures. However, when the scene contains too many light sources with different color temperatures, the white balance processing becomes complicated and its effect may even be poor.
Disclosure of Invention
Embodiments of the present application provide a photographing method, an imaging apparatus, a computer device, and a computer-readable storage medium.
The shooting method of the embodiment of the application comprises the following steps:
processing a preview image to determine the number of light sources in the preview image;
judging whether the number of the light sources is larger than a predetermined number;
issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and
detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
The imaging device of the embodiment of the application comprises:
the first processing module is used for processing a preview image to determine the number of light sources in the preview image;
the judging module is used for judging whether the number of the light sources is larger than a predetermined number;
the second processing module is used for issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and
the third processing module is used for detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
One or more non-transitory computer-readable storage media embodying computer-executable instructions that, when executed by one or more processors, cause the processors to perform the photographing method of the embodiments of the present application.
The computer device comprises a memory and a processor, wherein the memory stores computer readable instructions, and the instructions, when executed by the processor, cause the processor to execute the shooting method.
According to the shooting method, the imaging device, the computer-readable storage medium and the computer device of the embodiments of the present application, the number of light sources in the preview image is detected, and the computer device is controlled to issue a prompt to adjust the field of view when there are multiple light sources, so that the user is assisted in adjusting the field of view to reduce the number of light sources in the preview image, and the subject of the imaged image presents more accurate colors after white balance processing.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and other drawings can be obtained from these drawings by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a photographing method according to some embodiments of the present application.
FIG. 2 is a block schematic diagram of an imaging device according to certain embodiments of the present application.
FIG. 3 is a schematic plan view of a computer device according to some embodiments of the present application.
FIG. 4 is a schematic diagram of a scenario of a computer device according to some embodiments of the present application.
FIG. 5 is a schematic diagram of a scenario of a computer device according to some embodiments of the present application.
FIG. 6 is a schematic diagram of a scenario of a computer device according to some embodiments of the present application.
Fig. 7 is a flow chart illustrating a photographing method according to some embodiments of the present application.
FIG. 8 is a block diagram of a first processing module of some embodiments of the present application.
Fig. 9 is a scene diagram illustrating a photographing method according to some embodiments of the present application.
Fig. 10 is a scene diagram illustrating a photographing method according to some embodiments of the present application.
Fig. 11 shows histograms formed for a region in the photographing method according to some embodiments of the present application.
Fig. 12 is a schematic flow chart of a photographing method according to some embodiments of the present application.
FIG. 13 is a block schematic diagram of an imaging device according to certain embodiments of the present application.
Fig. 14 is a scene diagram illustrating a photographing method according to some embodiments of the present application.
Fig. 15 is a schematic flow chart of a photographing method according to some embodiments of the present application.
FIG. 16 is a block diagram of a third processing module in accordance with certain implementations of the present application.
Fig. 17 is a schematic flow chart of a photographing method according to some embodiments of the present application.
FIG. 18 is a block diagram of a third processing module in accordance with certain implementations of the present application.
Fig. 19 is a flow chart illustrating a photographing method according to some embodiments of the present application.
FIG. 20 is a block diagram representation of a processing unit in accordance with certain embodiments of the present application.
Fig. 21 is a scene diagram illustrating a photographing method according to some embodiments of the present application.
FIG. 22 is a graphical representation of color temperature curves for certain embodiments of the present application.
FIG. 23 is a block diagram of a computer device according to some embodiments of the present application.
FIG. 24 is a block schematic diagram of an image processing circuit according to some embodiments of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another.
Referring to fig. 1, a photographing method according to an embodiment of the present application includes the following steps:
S12: processing the preview image to determine the number of light sources in the preview image;
S14: judging whether the number of the light sources is larger than a predetermined number;
S16: issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and
S18: detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
Referring to fig. 2, an imaging apparatus 10 according to an embodiment of the present application includes a first processing module 12, a judging module 14, a second processing module 16, and a third processing module 18. The first processing module 12 is configured to process the preview image to determine the number of light sources in the preview image. The judging module 14 is configured to judge whether the number of the light sources is larger than a predetermined number. The second processing module 16 is configured to issue a prompt to adjust the field of view when the number of the light sources is greater than the predetermined number. The third processing module 18 is configured to detect the color temperature of the light sources and perform white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
The photographing method according to the embodiment of the present application may be implemented by the imaging apparatus 10 according to the embodiment of the present application, wherein the step S12 may be implemented by the first processing module 12, the step S14 may be implemented by the determination module 14, the step S16 may be implemented by the second processing module 16, and the step S18 may be implemented by the third processing module 18.
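For orientation, the correspondence between steps S12-S18 and the modules above can be summarized in a short sketch. This is a minimal illustrative outline, not the patented implementation; the four callables passed in are hypothetical stand-ins for the operations the modules perform, and the predetermined number of 1 is simply the smallest value the embodiments allow.

```python
# Minimal illustrative outline of steps S12-S18 (not the patented implementation).
# The four callables are hypothetical stand-ins for the module operations.

PREDETERMINED_NUMBER = 1  # the embodiments allow a predetermined number of 1 or more


def shooting_flow(preview_image, count_light_sources, prompt_adjust_fov,
                  detect_color_temperature, apply_white_balance):
    # S12: process the preview image to determine the number of light sources
    light_sources = count_light_sources(preview_image)

    # S14 / S16: too many light sources -> prompt the user to adjust the field of view
    if len(light_sources) > PREDETERMINED_NUMBER:
        prompt_adjust_fov(len(light_sources))
        return None

    # S18: detect the color temperature and perform white balance
    color_temperature = detect_color_temperature(preview_image, light_sources)
    return apply_white_balance(preview_image, color_temperature)
```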
Referring to fig. 3, the imaging apparatus 10 according to the embodiment of the present application may be applied to the computer device 100 according to the embodiment of the present application, that is, the computer device 100 according to the embodiment of the present application may include the imaging apparatus 10 according to the embodiment of the present application.
In some embodiments, the computer device 100 may be a cell phone, a tablet computer, a laptop computer, a smart bracelet, a smart watch, a smart helmet, smart glasses, or the like.
According to the shooting method, the imaging device 10 and the computer device 100 of the embodiments of the present application, the number of light sources in the preview image is detected, and the computer device 100 is controlled to issue a prompt to adjust the field of view when there are multiple light sources, so that the user is assisted in adjusting the field of view to reduce the number of light sources in the preview image, and the subject of the imaged image presents more accurate colors after white balance processing.
Referring to fig. 4, in some embodiments, the photographing method of the present application can be applied to a photographing mode of camera software, such as a guidance mode.
In this way, the needs of users who pursue high shooting quality can be met, while users with lower shooting requirements are also taken into consideration.
For example, as shown in fig. 5, after the user turns on the guidance mode and shoots a scene with multiple light sources, the computer device 100, such as a mobile phone, automatically determines the number of light sources. When the number of light sources is greater than the predetermined number, a prompt to adjust the field of view is issued; for example, when the light sources A and B are distributed as shown in fig. 5, the computer device 100 prompts the user to move to the left. Of course, the prompt is only a suggestion, and the user's shooting is not affected.
In some embodiments, the computer device 100 includes an electro-acoustic element (not shown), such as a speaker, through which the prompt to adjust the field of view may be issued.
When the number of the light sources is less than or equal to the predetermined number, a prompt indicating that the scene is suitable for shooting is given; for example, a check mark appears, as shown in fig. 6, indicating that the user can shoot directly.
In this way, a user with high-quality shooting demands can be guided to shoot an image that meets his or her expectations. Moreover, there is no barrier to use, and almost anyone can perform the operation.
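The patent does not spell out how the direction of the prompt (for example, "move to the left") is chosen; the sketch below shows one plausible heuristic, assuming the centers of the detected light sources are already known. The function prompt_direction and its centroid rule are illustrative assumptions, not the claimed method.

```python
# One plausible, patent-unspecified heuristic for the direction of the prompt:
# if the detected light sources sit mostly on one side of the frame, suggest
# moving the other way so that they leave the field of view.

def prompt_direction(light_source_centers, image_width):
    """light_source_centers: list of (x, y) centers of detected light sources."""
    if not light_source_centers:
        return "suitable for shooting"
    mean_x = sum(x for x, _ in light_source_centers) / len(light_source_centers)
    # sources clustered on the right half -> panning left pushes them out of view
    return "move left" if mean_x > image_width / 2 else "move right"


print(prompt_direction([(900, 200), (1000, 250)], image_width=1080))  # -> "move left"
```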
Referring to fig. 7, in some embodiments, step S12 includes the following steps:
S122: dividing the preview image into a plurality of regions;
S124: judging whether each region is a target region including a light source according to the histogram of the region;
S126: when a region is a target region including a light source, judging whether a plurality of adjacent target regions exist;
S128: when a plurality of adjacent target regions exist, stitching the plurality of adjacent target regions into one light source;
S121: when a plurality of adjacent target regions do not exist, determining each target region as a light source; and
S123: counting the number of the light sources.
Referring to fig. 8, in some embodiments, the first processing module 12 includes a dividing unit 122, a first judging unit 124, a second judging unit 126, a stitching unit 128, a first determining unit 121, and a counting unit 123. The dividing unit 122 is configured to divide the preview image into a plurality of regions. The first judging unit 124 is configured to judge whether a region is a target region including a light source according to the histogram of the region. The second judging unit 126 is configured to judge whether a plurality of adjacent target regions exist when the region is a target region including a light source. The stitching unit 128 is configured to stitch the plurality of adjacent target regions into one light source when the plurality of adjacent target regions exist. The first determining unit 121 is configured to determine each target region as a light source when a plurality of adjacent target regions do not exist. The counting unit 123 is configured to count the number of the light sources.
That is, step S122 may be implemented by the dividing unit 122, step S124 may be implemented by the first judging unit 124, step S126 may be implemented by the second judging unit 126, step S128 may be implemented by the stitching unit 128, step S121 may be implemented by the first determining unit 121, and step S123 may be implemented by the counting unit 123.
In this way, the location and number of light sources in the image can be determined.
Specifically, referring to fig. 9-11, in one embodiment, the photographing method first divides the image into a plurality of regions, for example, 4 × 5 regions. For each region, four histograms can be drawn according to the channel values of R, Gr, Gb, and B, and whether the region is a target region including a light source is then judged according to the four histograms of the region. In fig. 9 and 10, the images each include a plurality of target regions. For example, the image in fig. 9 includes 3 target regions, and the image in fig. 10 includes 8 target regions. When a region is a target region including a light source, the photographing method judges whether a plurality of adjacent target regions exist, that is, whether one light source covers a plurality of target regions simultaneously, where the covering may be partial or complete. When a plurality of adjacent target regions exist, the photographing method stitches the plurality of adjacent target regions into one light source; when there are no adjacent target regions, each target region is determined as a light source. Referring to fig. 9, the 3 target regions that are not adjacent to each other are respectively identified as a light source R, a light source G, and a light source B. Referring to fig. 10, the 6 adjacent target regions are combined to form a complete light source R, and the other two non-adjacent target regions are respectively identified as a light source G and a light source B.
Note that the method of drawing the histogram of the area in fig. 11 is merely an example, and the horizontal axis of the histogram in fig. 11 indicates the pixel value and the vertical axis indicates the number of pixels. In other embodiments, the horizontal axis of the histogram may also be the number of pixels, and the vertical axis is the pixel value; or the horizontal axis of the histogram is the proportion of the number of pixels, and the vertical axis is the pixel value; or the horizontal axis of the histogram is the pixel value, and the vertical axis of the histogram is the ratio of the number of pixels.
In some embodiments, whether a certain region is a target region including a light source may be judged according to the histogram of the region by determining whether the proportion of pixels whose pixel values exceed a predetermined value exceeds a predetermined ratio. For example, it may be determined whether the proportion of pixels with pixel values exceeding 239 exceeds 5%. When the proportion of pixels with pixel values exceeding 239 exceeds 5%, the region is a target region including a light source; when the proportion of pixels with pixel values exceeding 239 does not exceed 5%, the region is not a target region including a light source.
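The following is a minimal sketch of this region-based counting on a single channel, using NumPy. The 4 × 5 grid, the 239 pixel-value threshold and the 5% ratio are the example values quoted above; the 4-neighbour flood fill used to merge adjacent target regions is an implementation assumption, and the function name is hypothetical.

```python
import numpy as np

ROWS, COLS = 4, 5          # example grid size from the description
PIXEL_THRESHOLD = 239      # example pixel-value threshold
RATIO_THRESHOLD = 0.05     # example 5% ratio threshold


def count_light_sources(channel):
    """channel: 2-D array holding one primary color channel of the preview image."""
    h, w = channel.shape
    # S122 / S124: divide the image into a grid and mark target regions whose
    # proportion of bright pixels exceeds the ratio threshold
    target = np.zeros((ROWS, COLS), dtype=bool)
    for r in range(ROWS):
        for c in range(COLS):
            block = channel[r * h // ROWS:(r + 1) * h // ROWS,
                            c * w // COLS:(c + 1) * w // COLS]
            ratio = np.count_nonzero(block > PIXEL_THRESHOLD) / block.size
            target[r, c] = ratio > RATIO_THRESHOLD

    # S126 / S128 / S121: merge adjacent target regions into one light source
    # (4-neighbour flood fill); an isolated target region is a light source by itself
    labels = np.zeros((ROWS, COLS), dtype=int)
    count = 0
    for r in range(ROWS):
        for c in range(COLS):
            if target[r, c] and labels[r, c] == 0:
                count += 1
                stack = [(r, c)]
                while stack:
                    i, j = stack.pop()
                    if 0 <= i < ROWS and 0 <= j < COLS and target[i, j] and labels[i, j] == 0:
                        labels[i, j] = count
                        stack += [(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)]
    # S123: the number of connected groups of target regions is the number of light sources
    return count


demo = np.zeros((400, 500), dtype=np.uint8)
demo[0:100, 0:250] = 255            # one bright patch spanning adjacent regions
print(count_light_sources(demo))    # -> 1
```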
In certain embodiments, the predetermined number is 1 or more.
In this way, the user can be guided to shoot with a single light source or a small number of light sources, so that the subject color of the resulting white-balance-processed image is more accurate.
Referring to fig. 12, in some embodiments, step S16 includes the following step:
S162: displaying the number of the light sources.
Referring to fig. 13, in some embodiments, the second processing module 16 includes a display unit 162, and the display unit 162 is used for displaying the number of the light sources.
That is, step S162 may be implemented by the display unit 162.
Thus, as shown in fig. 14, when the number of the light sources in the preview image is greater than the predetermined number, the user can be informed in real time of the number of light sources in the scene being shot, so that the user can better adjust the field of view to reduce the number of the light sources to the predetermined number or fewer. This lowers the difficulty of white balance processing and makes the white balance effect more stable.
Referring to fig. 15, in some embodiments, step S18 includes the following steps:
S182: judging whether the number of the light sources is 1;
S184: determining the light source as a main light source when the number of the light sources is 1;
S186: when the number of the light sources is not 1, determining the main light source according to at least one of scene parameters, corresponding areas and brightness parameters of the light sources, wherein the scene parameters include the time at which the image is captured and the signal intensity of the GPS, and the brightness parameters include the respective brightness of the light sources and the average brightness of the image; and
S188: carrying out white balance processing on the image according to the color temperature of the main light source.
Referring to fig. 16, in some embodiments, the third processing module 18 includes a third judging unit 182, a second determining unit 184, a third determining unit 186 and a processing unit 188. The third judging unit 182 is configured to judge whether the number of the light sources is 1. The second determining unit 184 is configured to determine the light source as the main light source when the number of the light sources is 1. The third determining unit 186 is configured to determine the main light source according to at least one of scene parameters, corresponding areas and brightness parameters of the plurality of light sources when the number of the light sources is not 1, where the scene parameters include the time at which the image is captured and the signal intensity of the GPS, and the brightness parameters include the respective brightness of the plurality of light sources and the average brightness of the image. The processing unit 188 is configured to perform white balance processing on the image according to the color temperature of the main light source.
That is, step S182 is implemented by the third judging unit 182, step S184 is implemented by the second determining unit 184, step S186 is implemented by the third determining unit 186, and step S188 is implemented by the processing unit 188.
Therefore, when the image only contains one light source, the light source is determined to be the main light source, and when the image contains a plurality of light sources, the main light source is determined according to at least one of the scene parameters, the corresponding areas and the brightness parameters of the plurality of light sources.
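As a concrete illustration of this selection logic, the sketch below determines the main light source from a list of candidates. The patent only requires that at least one of the scene parameters, the corresponding areas and the brightness parameters be used, so the particular ranking here (largest area, then highest brightness) is a hypothetical choice, and the dictionary keys are invented for the example.

```python
# Illustrative sketch of S182-S186: picking the main light source.
# Ranking by area and then brightness is a hypothetical choice; the patent only
# requires using at least one of scene parameters, areas and brightness parameters.

def choose_main_light_source(light_sources):
    """light_sources: list of dicts with hypothetical 'area' and 'brightness' keys."""
    if len(light_sources) == 1:
        return light_sources[0]          # S184: a single light source is the main one
    # S186: otherwise rank by corresponding area, then by brightness
    return max(light_sources, key=lambda s: (s["area"], s["brightness"]))


candidates = [{"area": 1200, "brightness": 230}, {"area": 4000, "brightness": 210}]
print(choose_main_light_source(candidates))  # -> the larger, 4000-pixel source
```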
Referring to fig. 17, in some embodiments, step S18 further includes the following step:
S181: assisting in determining the main light source according to the user's operations on images subjected to white balance processing.
Referring to fig. 18, in some embodiments, the third processing module 18 further includes a fourth determining module 181. S181 may be implemented by the fourth determining module 181.
That is, the fourth determination module 181 may be configured to assist in determining the main light source according to a user operation on the image subjected to the white balance processing.
According to the photographing method of the embodiments of the present application, the main light source is determined directly when there is only one light source, or is determined according to at least one of the scene parameters, the corresponding areas and the brightness parameters of the plurality of light sources. In addition, the determination of the main light source can be assisted by the user's everyday operations on images that have undergone white balance processing. The operations include at least one of editing, saving and deleting.
Specifically, the images subjected to the white balance processing may be images stored in the local album after white balance processing. It can be understood that a user generally keeps the images whose white balance effect the user considers good and deletes the images whose white balance effect is poor. In addition, the user may also edit some images at leisure, for example by adjusting the color temperature of an image. Therefore, through long-term machine learning and feedback, the process of determining the main light source becomes more and more accurate, and the white balance processing of images can match the white balance effect expected by the user better and better.
Referring to fig. 19 and 21, in some embodiments, step S188 includes the following steps:
S1882: determining a high-brightness region H and a medium-brightness region M according to the radially outward brightness distribution from the center O of the main light source;
S1884: subtracting the primary color channel pixel average value of the medium-brightness region M from the primary color channel pixel average value of the high-brightness region H to determine the color of the main light source; and
S1886: determining the color temperature of the main light source according to the color of the main light source.
Referring to fig. 20 and 21, in some embodiments, the processing unit 188 includes a first determining subunit 1882, a calculating subunit 1884, and a second determining subunit 1886. The first determining subunit 1882 is configured to determine the high-brightness region H and the medium-brightness region M according to the radially outward brightness distribution from the center O of the main light source. The calculating subunit 1884 is configured to subtract the primary color channel pixel average value of the medium-brightness region M from the primary color channel pixel average value of the high-brightness region H to determine the color of the main light source. The second determining subunit 1886 is configured to determine the color temperature of the main light source according to the color of the main light source.
That is, step S1882 may be implemented by the first determining subunit 1882, step S1884 may be implemented by the calculating subunit 1884, and step S1886 may be implemented by the second determining subunit 1886.
In this way, the color of the main light source can be determined from the high-brightness region H and the medium-brightness region M, and the color temperature of the main light source can be determined according to the color of the main light source.
Referring to fig. 21, after the light source position in the image is determined, it can be understood that the central region O of the light source in the image is an overexposed region, which is generally a large white spot and does not contain information about the light source color. The light source color may therefore be determined from the primary color channel pixel averages of the high-brightness region H and the medium-brightness region M. The high-brightness region H may refer to a region formed by pixels whose brightness values, radially outward from the center of the light source, fall within a first brightness range L1, the first brightness range L1 being, for example, [200, 239). The medium-brightness region M may refer to a region formed by pixels whose brightness values, radially outward from the center of the light source, fall within a second brightness range L2, the second brightness range L2 being, for example, [150, 200). It should be noted that the specific values of the first brightness range L1 and the second brightness range L2 may be determined according to the radially outward brightness distribution from the center O of the light source: for example, if the brightness of the light source decays quickly, the first brightness range L1 and the second brightness range L2 may be increased; if the brightness of the light source decays relatively slowly, the first brightness range L1 and the second brightness range L2 may be reduced.
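The example brightness ranges above can be turned into region masks as follows. This is an illustrative NumPy sketch; the half-open ranges [200, 239) and [150, 200) are the example values from the text, while treating the "radially outward brightness distribution" as a simple per-pixel brightness threshold around the light source is an assumption of this sketch.

```python
import numpy as np

FIRST_RANGE = (200, 239)    # example first brightness range L1 (high-brightness region H)
SECOND_RANGE = (150, 200)   # example second brightness range L2 (medium-brightness region M)


def brightness_region_masks(luma, l1=FIRST_RANGE, l2=SECOND_RANGE):
    """luma: 2-D brightness array around one light source.

    Returns boolean masks for the high-brightness region H and the medium-brightness
    region M; pixels above the upper bound of L1 belong to the overexposed central
    region O and carry no usable colour information.
    """
    h_mask = (luma >= l1[0]) & (luma < l1[1])
    m_mask = (luma >= l2[0]) & (luma < l2[1])
    return h_mask, m_mask


luma = np.clip(np.random.default_rng(0).normal(180, 40, (64, 64)), 0, 255)
h_mask, m_mask = brightness_region_masks(luma)
print(h_mask.sum(), m_mask.sum())   # number of pixels falling in regions H and M
```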
The primary color channel pixel average value of the high-brightness region is the average of the pixel values of all the pixels in the high-brightness region, and the primary color channel pixel average value of the medium-brightness region is the average of the pixel values of all the pixels in the medium-brightness region. Assuming that the number of pixels in the high-brightness region is C1 and the number of pixels in the medium-brightness region is C2, the primary color channel pixel average value of the high-brightness region is
avg_H = (1/C1) * Σ_{i=1..C1} P_H,i,
and the primary color channel pixel average value of the medium-brightness region is
avg_M = (1/C2) * Σ_{j=1..C2} P_M,j,
where P_H,i and P_M,j are the pixel values of the i-th pixel in the high-brightness region and the j-th pixel in the medium-brightness region, respectively. Subtracting the primary color channel pixel average value of the medium-brightness region from that of the high-brightness region, i.e. computing avg_H − avg_M for each primary color channel, determines the color of the main light source. The color temperature of the main light source may then be determined according to the color of the main light source, specifically according to a correspondence between the color of the main light source and the color temperature of the main light source. The correspondence between the color of the main light source and the color temperature of the main light source may be a mapping table and/or a color temperature curve (as shown in fig. 22). Specifically, in one embodiment, images may be acquired under standard light boxes with color temperatures of 3000K, 4000K, 5000K, 6000K, and so on, and the channel difference avg_H − avg_M may be calculated for each image, so that a mapping table or a color temperature curve relating the channel difference to the color temperature of the light source can be formed and stored in a local database. In the embodiment of the present application, after the channel difference avg_H − avg_M is calculated, the color temperature of the corresponding main light source can be obtained by querying the color temperature curve or the mapping table. Then, the corresponding white balance parameters can be looked up according to the color temperature of the main light source and the correspondence between the color temperature of the main light source and the white balance parameters, so that white balance processing can be performed on the image according to the white balance parameters.
In some embodiments, the primary color channels refer to color channels, for example including at least one of an R (red) channel, a Gr (green-red) channel, a Gb (green-blue) channel, and a B (blue) channel. In some embodiments, the pixel value of the G (green) channel may be obtained from the pixel values of the Gr channel and the Gb channel. The pixel average value may refer to an arithmetic average of pixel values. In one example, the primary color channel pixel average values (Ravg, Gavg, Bavg) of the high-brightness region are (200, 210, 220), the primary color channel pixel average values (Ravg, Gavg, Bavg) of the medium-brightness region are (160, 180, 190), and the (R, G, B) channels of the light source color are therefore (200-160, 210-180, 220-190), i.e. (40, 30, 30).
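The worked example can be reproduced directly, and the subsequent lookup of the color temperature can be sketched as an interpolation over a stored color temperature curve. The calibration points in demo_curve and the use of the R/B ratio of the channel difference as the lookup key are illustrative assumptions; in the patent the correspondence would be built from images captured under standard light boxes and stored as a mapping table or color temperature curve.

```python
import numpy as np


def light_source_color(channels_h, channels_m):
    """channels_h / channels_m: per-channel pixel value collections for regions H and M.
    Returns the per-channel difference of averages, i.e. the light source colour."""
    return tuple(np.mean(h) - np.mean(m) for h, m in zip(channels_h, channels_m))


# Worked example from the text: (200, 210, 220) - (160, 180, 190) = (40, 30, 30)
print(light_source_color(([200], [210], [220]), ([160], [180], [190])))


def color_temperature_from_curve(rgb_diff, curve):
    """curve: list of (R/B ratio of the channel difference, colour temperature in K).
    The calibration points and the R/B-ratio key are illustrative assumptions."""
    r, _, b = rgb_diff
    ratio = r / b if b else float("inf")
    ratios, temps = zip(*sorted(curve))
    return float(np.interp(ratio, ratios, temps))


# Hypothetical calibration points standing in for a stored color temperature curve
demo_curve = [(0.6, 6000), (0.9, 5000), (1.2, 4000), (1.6, 3000)]
print(color_temperature_from_curve((40, 30, 30), demo_curve))  # -> about 3667 K
```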
Referring to fig. 23, an embodiment of the present application further provides one or more non-transitory computer-readable storage media 800 containing computer-executable instructions that, when executed by one or more processors 52, cause the processors 52 to perform the following steps:
S12: processing the preview image to determine the number of light sources in the preview image;
S14: judging whether the number of the light sources is larger than a predetermined number;
S16: issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and
S18: detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
FIG. 23 is a diagram showing an internal structure of a computer device according to an embodiment. As shown in fig. 23, the computer device 100 includes a processor 52, a memory 53 (e.g., a non-volatile storage medium), an internal memory 54, a display screen 55, and an input device 56, which are connected via a system bus 51. The memory 53 of the computer device 100 stores an operating system and computer-readable instructions. The computer-readable instructions can be executed by the processor 52 to implement the photographing method of the embodiments of the present application. The processor 52 is used to provide computing and control capabilities that support the operation of the entire computer device 100. The internal memory 54 of the computer device 100 provides an environment for the execution of the computer-readable instructions in the memory 53. The display screen 55 of the computer device 100 may be a liquid crystal display or an electronic ink display, and the input device 56 may be a touch layer covering the display screen 55, a button, a trackball or a touch pad arranged on the housing of the computer device 100, or an external keyboard, touch pad or mouse. The computer device 100 may be a mobile phone, a tablet computer, a notebook computer, a personal digital assistant, or a wearable device (e.g., a smart bracelet, a smart watch, a smart helmet, smart glasses), etc. It will be understood by those skilled in the art that the structure shown in fig. 23 is only a schematic diagram of the part of the structure related to the solution of the present application and does not constitute a limitation on the computer device 100 to which the solution of the present application is applied; a specific computer device 100 may include more or fewer components than shown in the figure, combine some components, or have a different arrangement of components.
Referring to fig. 24, the computer device 100 of the embodiments of the present application includes an image processing circuit 80. The image processing circuit 80 may be implemented by hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 24 is a schematic diagram of the image processing circuit 80 in one embodiment. As shown in fig. 24, for convenience of explanation, only the aspects of the image processing technique related to the embodiments of the present application are shown.
As shown in fig. 24, the image processing circuit 80 includes an ISP processor 81 (the ISP processor 81 may be the processor 52 or a part of the processor 52) and control logic 82. The image data captured by the camera 83 is first processed by the ISP processor 81, and the ISP processor 81 analyzes the image data to capture image statistics that may be used to determine one or more control parameters of the camera 83. The camera 83 may include one or more lenses 832 and an image sensor 834. Image sensor 834 may comprise an array of color filters (e.g., Bayer filters), and image sensor 834 may acquire light intensity and wavelength information captured by each imaging pixel and provide a raw set of image data that may be processed by ISP processor 81. The sensor 84 (e.g., a gyroscope) may provide parameters of the acquired image processing (e.g., anti-shake parameters) to the ISP processor 81 based on the type of sensor 84 interface. The sensor 84 interface may be a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interface, or a combination of the above.
In addition, the image sensor 834 may also send raw image data to the sensor 84, the sensor 84 may provide raw image data to the ISP processor 81 based on the sensor 84 interface type, or the sensor 84 may store raw image data in the image memory 85.
The ISP processor 81 processes the raw image data pixel by pixel in a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits, and the ISP processor 81 may perform one or more image processing operations on the raw image data, gathering statistical information about the image data. Wherein the image processing operations may be performed with the same or different bit depth precision.
The ISP processor 81 may also receive image data from an image memory 85. For example, the sensor 84 interface sends raw image data to the image memory 85, and the raw image data in the image memory 85 is then provided to the ISP processor 81 for processing. The image Memory 85 may be the Memory 53, a portion of the Memory 53, a storage device, or a separate dedicated Memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving raw image data from image sensor 834 interface or from sensor 84 interface or from image memory 85, ISP processor 81 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to image memory 85 for additional processing before being displayed. The ISP processor 81 receives the processed data from the image memory 85 and performs image data processing on the processed data in the raw domain and in the RGB and YCbCr color spaces. The image data processed by ISP processor 81 may be output to display 87 (display 87 may include display screen 55) for viewing by a user and/or further processed by a Graphics Processing Unit (GPU). Further, the output of the ISP processor 81 may also be sent to the image memory 85, and the display 87 may read image data from the image memory 85. In one embodiment, image memory 85 may be configured to implement one or more frame buffers. In addition, the output of the ISP processor 81 may be sent to an encoder/decoder 86 for encoding/decoding the image data. The encoded image data may be saved and decompressed before being displayed on the display 87 device. The encoder/decoder 86 may be implemented by a CPU or GPU or coprocessor.
The statistical data determined by ISP processor 81 may be sent to control logic 82 unit. For example, the statistical data may include image sensor 834 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, lens 832 shading correction, and the like. Control logic 82 may include a processing element and/or microcontroller that executes one or more routines (e.g., firmware) that determine control parameters for camera 83 and ISP processor 81 based on the received statistical data. For example, the control parameters of camera 83 may include sensor 84 control parameters (e.g., gain, integration time for exposure control, anti-shake parameters, etc.), camera flash control parameters, lens 832 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as lens 832 shading correction parameters.
The photographing method is implemented using the image processing technique of fig. 24 through the following steps:
S12: processing the preview image to determine the number of light sources in the preview image;
S14: judging whether the number of the light sources is larger than a predetermined number;
S16: issuing a prompt to adjust the field of view when the number of the light sources is larger than the predetermined number; and
S18: detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a non-volatile computer-readable storage medium, and can include the processes of the embodiments of the methods described above when the program is executed. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), or the like.
The above-mentioned embodiments express only several implementations of the present application, and their description is specific and detailed, but they should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (18)

1. A photographing method characterized by comprising the steps of:
processing a preview image to determine the number of light sources in the preview image;
judging whether the number of the light sources is larger than a preset number or not;
sending a prompt instructing a user to move to adjust the field of view when the number of the light sources is greater than the predetermined number, thereby assisting the user in adjusting the field of view to reduce the number of the light sources in the preview image; and
detecting the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number, and carrying out white balance according to the color temperature of the light sources.
2. The photographing method of claim 1, wherein the step of processing the preview image to determine the number of light sources in the preview image comprises the steps of:
dividing the preview image into a plurality of regions;
judging whether the region is a target region comprising the light source or not according to the histogram of each region;
when the area is the target area comprising the light source, judging whether a plurality of adjacent target areas exist or not;
when a plurality of adjacent target areas exist, splicing the plurality of adjacent target areas into the light source;
determining the target area as the light source when there are no adjacent target areas; and
counting the number of the light sources.
3. The photographing method according to claim 1, wherein the predetermined number is 1 or more.
4. The photographing method of claim 1, wherein the step of issuing a prompt to adjust a field of view when the number of light sources is greater than the predetermined number comprises:
displaying the number of the light sources.
5. The photographing method of claim 1, wherein the detecting of the color temperature of the light sources and the white balancing according to the color temperature of the light sources when the number of the light sources is equal to or less than the predetermined number comprises:
judging whether the number of the light sources is 1 or not;
determining the light source as a main light source when the number of the light sources is 1;
when the number of the light sources is not 1, determining the main light source according to at least one of scene parameters, corresponding areas and brightness parameters of the light sources, wherein the scene parameters comprise the time for shooting the image and the signal intensity of a GPS, and the brightness parameters comprise the corresponding brightness of the light sources and the average brightness of the image; and
carrying out white balance processing on the image according to the color temperature of the main light source.
6. The photographing method of claim 5, wherein the detecting of the color temperature of the light sources when the number of the light sources is equal to or less than the predetermined number and the white balancing according to the color temperature of the light sources comprises:
assisting in determining the main light source according to the operation of a user on the image subjected to white balance processing.
7. The photographing method according to claim 6, wherein the operation includes at least one of editing, saving, and deleting.
8. The photographing method according to claim 5, wherein the white balancing the image according to the color temperature of the main light source comprises:
determining a highlight area and a mid-highlight area surrounding the central area of the main light source according to the brightness distribution of the center of the main light source outwards along the radial direction;
subtracting the average primary color channel pixel value of the medium bright area from the average primary color channel pixel value of the high bright area to determine the color of the main light source; and
determining the color temperature of the main light source according to the color of the main light source.
9. An image forming apparatus, characterized in that the image forming apparatus comprises:
the first processing module is used for processing the preview image to determine the number of light sources in the preview image;
the judging module is used for judging whether the number of the light sources is larger than a predetermined number;
a second processing module, configured to issue a prompt instructing a user to move to adjust a field of view when the number of the light sources is greater than the predetermined number, so as to assist the user in adjusting the field of view to reduce the number of the light sources in the preview image; and
the third processing module is used for detecting the color temperature of the light sources and carrying out white balance according to the color temperature of the light sources when the number of the light sources is less than or equal to the predetermined number.
10. The imaging apparatus of claim 9, wherein the first processing module comprises:
a dividing unit for dividing the preview image into a plurality of regions;
a first judging unit, configured to judge whether the region is a target region including the light source according to a histogram of each of the regions;
a second determination unit configured to determine whether there are a plurality of adjacent target regions when the region is a target region including the light source;
a splicing unit, configured to splice a plurality of adjacent target regions into the light source when the plurality of adjacent target regions exist;
a first determination unit configured to determine the target area as the light source when there are no adjacent plurality of the target areas; and
the counting unit is used for counting the number of the light sources.
11. The imaging apparatus according to claim 9, wherein the predetermined number is 1 or more.
12. The imaging apparatus of claim 9, wherein the second processing module further comprises:
the display unit is used for displaying the number of the light sources.
13. The imaging apparatus of claim 9, wherein the third processing module further comprises:
a third judging unit, configured to judge whether the number of the light sources is 1;
a second determining unit, configured to determine that the light source is a main light source when the number of the light sources is 1;
a third determining unit, configured to determine a main light source according to at least one of a scene parameter, a corresponding area, and a brightness parameter of the plurality of light sources when the number of the light sources is not 1, where the scene parameter includes a time when the image is captured and a signal intensity of a GPS, and the brightness parameter includes corresponding brightness of the plurality of light sources and an average brightness of the image; and
the processing unit is used for carrying out white balance processing on the image according to the color temperature of the main light source.
14. The imaging apparatus of claim 13, wherein the third processing module comprises:
a fourth determining unit, configured to assist in determining the main light source according to the operation of a user on the image after the white balance processing.
15. The imaging apparatus of claim 14, wherein the operations comprise at least one of editing, saving, deleting.
16. The imaging apparatus of claim 13, wherein the processing unit comprises:
a first determining subunit configured to determine a highlight region and a mid-highlight region surrounding the central region of the main light source according to the radially outward brightness distribution of the center of the main light source;
a computing subunit configured to subtract the primary color channel pixel average of the mid-highlight region from the primary color channel pixel average of the highlight region to determine the color of the main light source; and
a second determining subunit configured to determine the color temperature of the main light source according to the color of the main light source.
17. A non-transitory computer-readable storage medium containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the photographing method of any one of claims 1 to 8.
18. A computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions that, when executed by the processor, cause the processor to perform the photographing method according to any one of claims 1 to 8.
CN201711423801.9A 2017-12-25 2017-12-25 Photographing method, imaging apparatus, computer-readable storage medium, and computer device Active CN108111831B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711423801.9A CN108111831B (en) 2017-12-25 2017-12-25 Photographing method, imaging apparatus, computer-readable storage medium, and computer device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711423801.9A CN108111831B (en) 2017-12-25 2017-12-25 Photographing method, imaging apparatus, computer-readable storage medium, and computer device

Publications (2)

Publication Number Publication Date
CN108111831A CN108111831A (en) 2018-06-01
CN108111831B true CN108111831B (en) 2020-01-10

Family

ID=62212966

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711423801.9A Active CN108111831B (en) 2017-12-25 2017-12-25 Photographing method, imaging apparatus, computer-readable storage medium, and computer device

Country Status (1)

Country Link
CN (1) CN108111831B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109327691B (en) * 2018-10-23 2021-05-04 Oppo广东移动通信有限公司 Image shooting method and device, storage medium and mobile terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4707450B2 (en) * 2005-05-18 2011-06-22 イーストマン コダック カンパニー Image processing apparatus and white balance adjustment apparatus
CN100551081C (en) * 2007-04-23 2009-10-14 北京中星微电子有限公司 A kind of method and device of realizing white balance correction
KR100977055B1 (en) * 2009-02-20 2010-08-19 주식회사 코아로직 Device and method for adjusting auto white balance(awb) and image processing apparatus comprising the same device
CN102892010B (en) * 2012-10-22 2015-05-06 浙江宇视科技有限公司 White balance processing method and device under multiple light sources
CN105959661B (en) * 2016-05-06 2019-07-26 联想(北京)有限公司 A kind of color temperature estimation method and electronic equipment

Also Published As

Publication number Publication date
CN108111831A (en) 2018-06-01


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant after: OPPO Guangdong Mobile Communications Co., Ltd.

Address before: No. 18 Wusha Beach Road, Chang'an Town, Dongguan, Guangdong 523860

Applicant before: Guangdong OPPO Mobile Telecommunications Corp., Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant