CN113516595A - Image processing method, image processing apparatus, electronic device, and storage medium - Google Patents

Image processing method, image processing apparatus, electronic device, and storage medium

Info

Publication number
CN113516595A
CN113516595A (application CN202110378967.3A)
Authority
CN
China
Prior art keywords
purple
region
area
green
pixel point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110378967.3A
Other languages
Chinese (zh)
Other versions
CN113516595B (en)
Inventor
刘万程 (Liu Wancheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110378967.3A priority Critical patent/CN113516595B/en
Priority claimed from CN202110378967.3A external-priority patent/CN113516595B/en
Publication of CN113516595A publication Critical patent/CN113516595A/en
Application granted granted Critical
Publication of CN113516595B publication Critical patent/CN113516595B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G06T5/94 Dynamic range modification based on local image properties, e.g. for local contrast enhancement
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/90 Determination of colour characteristics
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application discloses an image processing method, an image processing apparatus, an electronic device, and a storage medium. The image processing method comprises the following steps: acquiring one or more object edges in a color image; determining a green region and a purple region in the color image; and determining, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region. With the image processing method, the image processing apparatus, the electronic device, and the storage medium, the purple-fringed region can be located and then corrected, so that purple fringing at the edges of green regions in the color image is effectively eliminated, image quality is improved, and the user's visual experience is enhanced.

Description

Image processing method, image processing apparatus, electronic device, and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, an electronic device, and a storage medium.
Background
During photographing, when the luminance contrast of the scene is high, chromatic dispersion tends to appear in the captured image. This dispersion is usually rendered as purple, and the affected area is commonly called a purple fringe. Purple fringing seriously degrades image quality and reduces the user's visual experience; how to determine the purple-fringed region in an image has therefore become a problem to be solved urgently in this field.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, an electronic device and a storage medium.
The image processing method of the embodiments of the present application comprises the following steps: acquiring one or more object edges in a color image; determining a green region and a purple region in the color image; and determining, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region.
The image processing apparatus of the embodiments of the present application includes a first determination module, a second determination module, and a third determination module. The first determination module is used to acquire one or more object edges in a color image. The second determination module is used to determine a green region and a purple region in the color image. The third determination module is used to determine, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region.
The electronic device of the embodiments of the present application includes a processor. The processor is configured to: acquire one or more object edges in a color image; determine a green region and a purple region in the color image; and determine, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region.
The computer-readable storage medium of the embodiments of the present application stores a computer program which, when executed by a processor, implements the steps of the image processing method described in the above embodiments.
With the image processing method, the image processing apparatus, the electronic device, and the storage medium, the purple-fringed region at the edge of the green region can be determined according to the object edges, the green region, and the purple region, which makes the purple-fringed region convenient to correct; purple fringing at the edges of green regions in the color image can thus be effectively eliminated, improving image quality and the user's visual experience.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 3 is a schematic view of an electronic device of an embodiment of the present application;
FIG. 4 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 5 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
fig. 6 is a schematic diagram of sobel operators of the image processing method according to the embodiment of the present application;
FIG. 7 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 8 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 9 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 10 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 11 is a flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 12 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
fig. 13 is a flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 14 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
FIG. 15 is a schematic view of a scene of an image processing method according to an embodiment of the present application;
fig. 16 is a flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 17 is a schematic diagram of an image processing apparatus according to an embodiment of the present application;
fig. 18 is a flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 19 is a schematic diagram of an image processing apparatus according to an embodiment of the present application.
Description of the main element symbols:
image processing apparatus 100, first determination module 20, convolution unit 22, processing unit 24. A second determination module 30, a first conversion unit 32, a first determination unit 34, a second conversion unit 42, a second determination unit 44, a third determination unit 48, a dilation subunit 482, a determination subunit 484, a third determination module 50, a first calculation unit 52, a second calculation unit 54, a fourth determination unit 56, a correction module 60, a first correction unit 64, a second correction unit 66, a fourth determination module 70;
electronic device 200, processor 201, camera assembly 202, housing 203.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In the description of the embodiments of the present application, the terms "first", "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
Referring to fig. 1, an image processing method according to an embodiment of the present application includes:
013: acquiring one or more object edges in a color image;
015: determining a green region and a purple region in the color image;
019: and determining, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region.
Referring to fig. 2, the image processing method of the embodiments of the present application can be implemented by the image processing apparatus 100 of the embodiments of the present application. Specifically, the image processing apparatus 100 includes a first determination module 20, a second determination module 30, and a third determination module 50. The first determination module 20 is used to acquire one or more object edges in a color image. The second determination module 30 is used to determine the green region and the purple region in the color image. The third determination module 50 is configured to determine, as the purple-fringed region, the part of the purple region close to both the green region and an object edge, according to the object edges, the green region, and the purple region.
Referring to fig. 3, the image processing method of the embodiments of the present application can be implemented by the electronic device 200 of the embodiments of the present application. In particular, the electronic device 200 includes a processor 201. The processor 201 is configured to: acquire one or more object edges in a color image; determine a green region and a purple region in the color image; and determine, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as the purple-fringed region.
With the image processing method, the image processing apparatus 100, and the electronic device 200, the part of the purple region close to both the green region and an object edge can be determined as the purple-fringed region, which makes the purple-fringed region convenient to correct; purple fringing at the edges of green regions in the color image can thus be effectively eliminated, improving image quality and the user's visual experience.
In the related art, a first region corresponding to all purple objects is searched for in the color image by a global color search; the color image is then converted into a grayscale image, and a second region whose gray values exceed a preset value is searched for at the position corresponding to the first region; object edges are likewise searched for at that position in the grayscale image; finally, the intersection of the first region, the second region, and the object edges is taken, and the region where the intersection lies is determined as the purple-fringed region. This approach, however, cannot distinguish genuinely purple objects from purple fringing, which may cause false detection of the purple-fringed region.
The image processing method of the embodiments of the present application exploits the observation that purple fringing usually occurs at the edges of green plants: it detects the green region of the color image, i.e., the region where green vegetation is located, and, by combining the object edges of the color image with a purple region whose extent is closest to that of the purple fringing, it can determine the purple-fringed region at the edge of the green region, i.e., the purple-fringed region of the color image, more accurately.
Specifically, a color image is acquired; the color image may contain different objects, and the object edges are the outer edges of those objects, which may include the position information of the outer edges. It will be understood that, where the object is a leaf, the leaf has an outer edge, i.e., its outer contour, and inner edges, i.e., its inner texture (e.g., veins); in the present application the outer contour of the leaf is determined to obtain the object edge, so the obtained object edge does not include the inner texture of the leaf.
Further, the green region, i.e., the region where all green objects in the color image are located, may include the position information of all green objects. The purple region is the region whose extent is closest to that of the purple-fringed region; it should not be understood as the region where all purple objects in the color image are located, and it may include the position information of the region whose extent is closest to that of the purple-fringed region. The purple-fringed region, i.e., the purple region located at the edge of the green region, may include the position information of the purple pixels at the edge of the green region.
It is noted that in some embodiments step 015 may include: determining the green region in the color image; and determining the purple region in the color image. The green region may be determined in a different manner from the purple region, and either region may be determined first, or both may be determined simultaneously; no limitation is intended here.
Referring to fig. 3, in some embodiments, the electronic device 200 further includes a camera assembly 202 and a housing 203, the camera assembly 202 is mounted on the housing 203, and the processor 201 is disposed in a receiving space formed by the housing 203. The camera assembly 202 may capture external light and generate corresponding electrical signals, and the processor 201 may receive the electrical signals from the camera assembly 202 and generate a color image according to the received electrical signals.
It should be noted that in the embodiment shown in fig. 3, the electronic device 200 is a smart phone, and in other embodiments, the electronic device 200 may be a tablet computer, a personal computer, a display, a digital camera, a teller machine, an intelligent wearable device, or other electronic devices with a photographing function.
Referring to fig. 4, in some embodiments, step 013 includes:
0131: performing convolution processing on the color image to obtain a gradient image;
0133: and carrying out binarization processing on the gradient image according to a preset gradient value to obtain the edge of the object.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 5, the first determination module 20 includes a convolution unit 22 and a processing unit 24. The convolution unit 22 is used to perform convolution processing on the color image to obtain a gradient image. The processing unit 24 is configured to perform binarization processing on the gradient image according to a preset gradient value to obtain an object edge.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to perform convolution processing on the color image to obtain a gradient image, and is configured to perform binarization processing on the gradient image according to a preset gradient value to obtain an object edge.
Thus, the object edge of the color image can be accurately obtained.
Specifically, in step 0131, in some embodiments the color image is convolved with Sobel operators in 4 directions (as shown in fig. 6) to obtain a gradient image. More precisely, the color image is convolved with 1 horizontal, 1 vertical, and 2 diagonal Sobel operators, yielding 4 initial gradient images in different directions; the 4 initial gradient images are then fused pixel by pixel, i.e., the gradient values of the pixels at the same position in the 4 initial gradient images are averaged, and that average is used as the gradient value of the fused image, so that a gradient image incorporating gradients in 4 directions is obtained. It can be appreciated that using Sobel operators in 4 directions determines object edges more accurately than using only the horizontal and vertical operators.
In step 0133, the gradient image is threshold-filtered and binarized according to the preset gradient value. In some embodiments, gradients not smaller than the preset gradient value are retained and set to 1, and gradients smaller than it are set to 0, so that in the resulting image the regions with value 1 are object edges and the regions with value 0 are non-edges. In some embodiments the preset gradient value is 90. It can be understood that with a preset gradient value of 90, the region where the outer contour of a leaf lies is identified as an object edge while the leaf's inner texture (e.g., veins) is not, so the object edge of the leaf can be accurately determined for subsequent processing.
In other embodiments, the color image may instead be processed with a Scharr operator, a Laplacian operator, or the like to obtain the object edges; no limitation is intended here.
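By way of illustration, the following Python/OpenCV sketch shows one possible implementation of steps 0131 and 0133. The two diagonal kernels are assumptions, since fig. 6 is not reproduced here; the horizontal and vertical kernels and the preset gradient value of 90 follow the embodiment above.

import cv2
import numpy as np

def object_edges(color_image, preset_gradient=90):
    # Compute gradients on a grayscale version of the color image.
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY).astype(np.float32)

    # Horizontal, vertical, and two diagonal Sobel kernels
    # (the diagonal kernels are assumed, fig. 6 being unavailable).
    k_h = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=np.float32)
    k_v = k_h.T
    k_d1 = np.array([[-2, -1, 0], [-1, 0, 1], [0, 1, 2]], dtype=np.float32)
    k_d2 = np.array([[0, -1, -2], [1, 0, -1], [2, 1, 0]], dtype=np.float32)

    # One initial gradient image per direction, fused pixel by pixel
    # by averaging the four responses (step 0131).
    grads = [np.abs(cv2.filter2D(gray, -1, k)) for k in (k_h, k_v, k_d1, k_d2)]
    gradient = np.mean(grads, axis=0)

    # Binarization against the preset gradient value (step 0133):
    # 1 marks object edges, 0 marks non-edges.
    return (gradient >= preset_gradient).astype(np.uint8)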
Referring to fig. 7, in some embodiments, determining a green region in a color image includes:
0151: converting the color image into HSV format;
0153: the green regions are determined in the HSV formatted color image.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 8, the second determination module 30 includes a first conversion unit 32 and a first determination unit 34. The first conversion unit 32 is used to convert the color image into HSV format. The first determination unit 34 is configured to determine a green region in the HSV-formatted color image.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to convert the color image into HSV format, and to determine a green region in the color image in HSV format.
In this way, the green region in the color image can be determined more intuitively by the conversion of the color space.
Specifically, the color image is converted from RGB format into HSV format, and the green region is determined in the HSV-format color image according to a first threshold range. In RGB format, R, G, and B are the red, green, and blue component values. In HSV format, H represents hue and may range over 0-180; S represents saturation and may range over 0-255; V represents value (brightness) and may range over 0-255. In some embodiments the first threshold range is {35 < H < 77, 43 < S < 255, 46 < V < 255}, with which the green region of a color image can be accurately detected.
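A minimal sketch of steps 0151 and 0153, assuming OpenCV's BGR input and its 0-180 hue scale; since cv2.inRange uses inclusive bounds, the strict inequalities above are shifted by one:

def green_region(color_image):
    # Convert to HSV (OpenCV: H in 0..180, S and V in 0..255).
    hsv = cv2.cvtColor(color_image, cv2.COLOR_BGR2HSV)
    # First threshold range {35 < H < 77, 43 < S < 255, 46 < V < 255}.
    return cv2.inRange(hsv, (36, 44, 47), (76, 254, 254))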
Referring to fig. 9, in some embodiments, determining the purple region in the color image includes:
0171: converting the color image into HSV format;
0173: determining an initial purple region in the HSV-formatted color image;
0175: the purple region is determined from the initial purple region.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 10, the second determining module 30 includes a second converting unit 42, a second determining unit 44, and a third determining unit 48. The second conversion unit 42 is used to convert the color image into HSV format. The second determination unit 44 is configured to determine an initial purple region in the HSV-format color image. The third determination unit 48 is configured to determine the purple region from the initial purple region.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to convert the color image into HSV format, determine an initial purple region in the HSV-format color image, and determine the purple region from the initial purple region.
In this way, the purple region in the image can be determined effectively from the initial purple region. It can be understood that the initial purple region is a subset of the region where all purple objects in the color image are located, while the purple region is the region whose extent is closest to that of the purple-fringed region; starting directly from part of the region occupied by purple objects allows the purple region to be determined relatively quickly.
Specifically, the color image is converted from RGB format into HSV format, and the initial purple region is determined in the HSV-format color image according to a second threshold range. In some embodiments the second threshold range is {125 < H < 155, 30 < S < 255, 46 < V < 255}; by setting this stricter threshold, a more reliable initial purple region is detected, effectively avoiding false detection of the purple region.
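A sketch of this strict first pass under the same OpenCV conventions as above:

def initial_purple_region(hsv_image):
    # Second (strict) threshold range {125 < H < 155, 30 < S < 255, 46 < V < 255},
    # shifted by one because cv2.inRange uses inclusive bounds.
    return cv2.inRange(hsv_image, (126, 31, 47), (154, 254, 254))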
Referring to fig. 11, in some embodiments, step 0175 includes:
01751: performing morphological dilation on the initial purple region to obtain a dilated region;
01753: determining the purple region in the HSV-format color image according to the dilated region.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 12, the third determining unit 48 includes a dilation subunit 482 and a determination subunit 484. The dilation subunit 482 is used to morphologically dilate the initial purple region to obtain a dilated region. The determination subunit 484 is configured to determine the purple region in the HSV-format color image according to the dilated region.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to perform morphological dilation on the initial purple region to obtain a dilated region, and to determine the purple region in the HSV-formatted color image according to the dilated region.
In this way, by performing dilation processing on the initial purple region, omission of the purple region can be avoided as much as possible.
Specifically, the initial purple region is morphologically dilated so that parts of the initial purple region belonging to the same whole are merged together, forming a dilated region that includes the position information of those parts. Next, the region of the color image at the same position as the dilated region is detected a second time according to a third threshold range, so as to determine the purple region in the HSV-format color image. The H range of the third threshold range is broader than that of the second threshold range, while its S and V ranges are narrower. In some embodiments the third threshold range is {88 < H < 155, 60 < S < 255, 46 < V < 211}; by applying this looser hue threshold only to the region at the same position as the dilated region, the remaining purple pixels outside the initial purple region are recovered, avoiding missed detection of the purple region.
It should be noted that the purple of a purple-fringed region is generally not of a single saturation: darker purple areas are usually accompanied by lighter ones. If only a single detection with the second (strict) threshold range is performed, lighter purple areas go undetected; if detection uses only the third (loose) threshold range, many areas other than purple fringing are falsely detected as purple, increasing the computational cost. The above embodiment therefore detects twice: the most probable purple region is detected first, and is then dilated and supplemented, finally yielding a more accurate purple region.
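A sketch of steps 01751 and 01753 under the same assumptions; the 9x9 elliptical structuring element is an assumption, as the embodiment does not fix a kernel size:

def purple_region(hsv_image, initial_mask):
    # Morphological dilation merges fragments belonging to the same whole.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (9, 9))
    dilated = cv2.dilate(initial_mask, kernel)

    # Second detection with the looser third threshold range
    # {88 < H < 155, 60 < S < 255, 46 < V < 211}, restricted to the
    # dilated area so stray purple elsewhere is not picked up.
    loose = cv2.inRange(hsv_image, (89, 61, 47), (154, 254, 210))
    return cv2.bitwise_and(loose, dilated)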
Referring to FIG. 13, in some embodiments, step 019 includes:
0191: calculating a first distance between each pixel point in the purple region and the edge of the object;
0193: calculating a second distance between each pixel point in the purple region and the green region;
0195: and when the first distance of a pixel is smaller than a first preset threshold and its second distance is smaller than a second preset threshold, determining that the pixel belongs to the purple-fringed region.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 14, the third determination module 50 includes a first calculation unit 52, a second calculation unit 54, and a fourth determination unit 56. The first calculation unit 52 is configured to calculate a first distance between each pixel point in the purple region and an edge of the object. The second calculating unit 54 is configured to calculate a second distance between each pixel point in the purple region and the green region. The fourth determining unit 56 is configured to determine that the pixel point is a pixel point in the purple-fringed region when the first distance of the pixel point is smaller than the first preset threshold and the second distance of the pixel point is smaller than the second preset threshold.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to calculate a first distance between each pixel in the purple region and the object edge, calculate a second distance between each pixel in the purple region and the green region, and determine that a pixel belongs to the purple-fringed region when its first distance is smaller than a first preset threshold and its second distance is smaller than a second preset threshold.
In this way, by comparing the first distance with the first preset threshold and the second distance with the second preset threshold, the purple-fringed region within a specific range of the green region's edge can be screened out. It can be understood that the purple region contains a plurality of pixels; for each pixel in the purple region, its first distance to the object edge and its second distance to the green region must be calculated to judge whether it is a pixel of the purple-fringed region.
Specifically, referring to fig. 15, region G represents the green region, region P represents the purple region, pixel A is a point in the purple region, pixel B is a point on the object edge, and pixel C is a point in the green region. In some embodiments, the first distance d1 is the minimum Euclidean distance between pixel A and the object edge, and the second distance d2 is the minimum Euclidean distance between pixel A and the green region. When d1 is smaller than the first preset threshold and d2 is smaller than the second preset threshold, pixel A is determined to be a pixel of the purple-fringed region; when d1 is not smaller than the first preset threshold or d2 is not smaller than the second preset threshold, pixel A is determined not to be a pixel of the purple-fringed region, and the remaining pixels of the purple region are then judged in the same way. After all pixels in the purple region have been judged, the region formed by the pixels belonging to the purple-fringed region is determined as the purple-fringed region.
Further, for a pixel B with coordinates (x1, y1) on the object edge, the Euclidean distance between pixel A at (x, y) and pixel B is d1 = sqrt((x - x1)^2 + (y - y1)^2); keeping the coordinates of pixel A fixed and varying pixel B, the minimum value of d1 is taken as the first distance. For a pixel C with coordinates (x2, y2) in the green region, the Euclidean distance between pixel A and pixel C is d2 = sqrt((x - x2)^2 + (y - y2)^2); keeping the coordinates of pixel A fixed and varying pixel C, the minimum value of d2 is taken as the second distance.
In one example, the first preset threshold is 5 and the second preset threshold is also 5; when both the first distance and the second distance of a pixel in the purple region are smaller than 5, the pixel is determined to be a pixel of the purple-fringed region; otherwise it is determined not to be, and the remaining pixels of the purple region are judged in turn.
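A sketch of steps 0191 to 0195. A distance transform computes, for every pixel at once, the minimum Euclidean distance to a mask, so it is used here as an efficient stand-in for the per-pixel minimization described above; the mask conventions (non-zero marks membership) are assumptions:

def purple_fringe_mask(purple_mask, edge_mask, green_mask,
                       first_threshold=5, second_threshold=5):
    # cv2.distanceTransform measures each pixel's distance to the nearest
    # zero pixel, so the edge and green masks are inverted first; d1 and d2
    # are then the minimum Euclidean distances of steps 0191/0193.
    d1 = cv2.distanceTransform((edge_mask == 0).astype(np.uint8), cv2.DIST_L2, 3)
    d2 = cv2.distanceTransform((green_mask == 0).astype(np.uint8), cv2.DIST_L2, 3)
    # Step 0195: a purple pixel belongs to the purple-fringed region only
    # when both distances fall below their preset thresholds.
    return (purple_mask > 0) & (d1 < first_threshold) & (d2 < second_threshold)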
Referring to fig. 16, in some embodiments, the color image includes a red channel, a blue channel and a green channel, and the image processing method further includes:
021: determining, for each pixel in the purple-fringed region, the nearest pixel in the green region;
023: correcting the red-channel and blue-channel gray values of each pixel in the purple-fringed region according to its nearest pixel.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 17, the image processing apparatus 100 further includes a fourth determining module 70 and a correction module 60. The fourth determining module 70 is configured to determine the nearest pixel in the green region to each pixel in the purple-fringed region. The correction module 60 is configured to correct the red-channel and blue-channel gray values of each pixel in the purple-fringed region according to its nearest pixel.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to determine the nearest pixel in the green region to each pixel in the purple-fringed region, and to correct the red-channel and blue-channel gray values of each pixel in the purple-fringed region according to its nearest pixel.
In this way, the purple-fringed region is effectively corrected, improving image quality and the user's visual experience. It can be understood that the gray value of the green channel is generally accurate; the red-channel and blue-channel gray values of each pixel in the purple-fringed region are therefore corrected, and the corrected red and blue channels are then fused with the uncorrected green channel to obtain an image free of purple fringing.
Specifically, in step 021, the nearest pixel in the green region to each pixel in the purple-fringed region may be determined by finding the minimum Euclidean distance. Since different pixels of the purple-fringed region may have different nearest pixels in the green region, there may be a plurality of nearest pixels. Once a nearest pixel is determined, the gray values of its red, blue, and green channels can be obtained.
It should be noted that determining the purple-fringed region and correcting it may be performed by the same apparatus or by different apparatuses; no limitation is intended here. For example, both may be performed by the electronic device; or the determination may be performed by the electronic device and the correction by a cloud server; or the determination by a cloud server and the correction by the electronic device.
Referring to fig. 18, in some embodiments, step 023 includes:
0231: correcting the red-channel gray value of the corresponding pixel in the purple-fringed region according to the red-channel and green-channel gray values of the nearest pixel;
0233: correcting the blue-channel gray value of the corresponding pixel in the purple-fringed region according to the blue-channel and green-channel gray values of the nearest pixel.
The image processing method according to the above embodiment can be realized by the image processing apparatus 100 according to the present embodiment. Specifically, referring to fig. 19, the correction module 60 includes a first correction unit 64 and a second correction unit 66. The first correction unit 64 is configured to correct the red-channel gray value of the corresponding pixel in the purple-fringed region according to the red-channel and green-channel gray values of the nearest pixel. The second correction unit 66 is configured to correct the blue-channel gray value of the corresponding pixel in the purple-fringed region according to the blue-channel and green-channel gray values of the nearest pixel.
The image processing method of the above embodiment can be implemented by the electronic device 200 of the embodiment of the present application. Specifically, the processor 201 is configured to correct the red-channel gray value of the corresponding pixel in the purple-fringed region according to the red-channel and green-channel gray values of the nearest pixel, and to correct the blue-channel gray value of the corresponding pixel in the purple-fringed region according to the blue-channel and green-channel gray values of the nearest pixel.
In this way, according to the constant-color-difference criterion, the red-channel and blue-channel gray values of each pixel in the purple-fringed region are corrected more accurately. The constant-color-difference criterion states that the color differences at two adjacent pixels (i, j) and (i, j+1) are substantially equal: the red-green differences are substantially equal, i.e., R(i, j) - G(i, j) = R(i, j+1) - G(i, j+1), and the blue-green differences are substantially equal, i.e., B(i, j) - G(i, j) = B(i, j+1) - G(i, j+1).
Specifically, in step 0231, the corrected red-channel gray value Lr' of a pixel in the purple-fringed region can be expressed as Lr' = Lg + (Lr_near - Lg_near), where Lg is the green-channel gray value of the corresponding pixel in the purple-fringed region before correction, Lr_near is the red-channel gray value of the nearest pixel, and Lg_near is the green-channel gray value of the nearest pixel.
In step 0233, the corrected blue-channel gray value Lb' of a pixel in the purple-fringed region can be expressed as Lb' = Lg + (Lb_near - Lg_near), where Lg is the green-channel gray value of the corresponding pixel in the purple-fringed region before correction, Lb_near is the blue-channel gray value of the nearest pixel, and Lg_near is the green-channel gray value of the nearest pixel.
Further, after the corrected red-channel and blue-channel gray values of the pixels in the purple-fringed region are determined, the corrected red and blue channels are fused with the uncorrected green channel to obtain an image free of purple fringing. Because the gray values of each pixel in the purple-fringed region are corrected according to those of its nearest pixel in the green region, the corrected region appears vivid and lifelike rather than uniformly gray or purple, improving the user's visual experience.
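A sketch of steps 021 and 023 combining the pieces above. scipy's distance transform is used because it can return, for every pixel, the coordinates of the nearest green-region pixel; the BGR channel order and the mask conventions are assumptions:

from scipy import ndimage

def correct_purple_fringe(color_image, fringe_mask, green_mask):
    img = color_image.astype(np.int32)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]  # OpenCV BGR order

    # Step 021: coordinates of each pixel's nearest green-region pixel
    # (the zeros of the inverted green mask).
    _, (iy, ix) = ndimage.distance_transform_edt(green_mask == 0,
                                                 return_indices=True)
    r_near, g_near, b_near = r[iy, ix], g[iy, ix], b[iy, ix]

    # Step 023, constant-color-difference correction:
    # Lr' = Lg + (Lr_near - Lg_near), Lb' = Lg + (Lb_near - Lg_near).
    m = fringe_mask.astype(bool)
    r[m] = g[m] + (r_near[m] - g_near[m])
    b[m] = g[m] + (b_near[m] - g_near[m])

    # The corrected red/blue channels are fused with the uncorrected green.
    return np.clip(img, 0, 255).astype(np.uint8)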
It should be noted that the specific numerical values mentioned above are only for illustrating the implementation of the present application in detail and should not be construed as limiting the present application. In other examples or embodiments or examples, other values may be selected according to the application and are not specifically limited herein.
The computer-readable storage medium of the embodiments of the present application stores thereon a computer program that, when executed by the processor 201, implements the steps of the image processing method of any of the above-described embodiments.
For example, in the case where the program is executed by the processor 201, the steps of the following image processing method are implemented:
013: acquiring one or more object edges in a color image;
015: determining a green region and a purple region in the color image;
019: and determining, according to the object edges, the green region, and the purple region, the part of the purple region close to both the green region and an object edge as a purple-fringed region.
It will be appreciated that the computer program comprises computer program code, which may be in source-code form, object-code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), a software distribution medium, and the like. The processor 201 may be a central processing unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (11)

1. An image processing method, characterized in that the image processing method comprises:
acquiring one or more object edges in a color image;
determining a green region and a purple region in the color image;
and determining, according to the object edges, the green region, and the purple region, a region of the purple region close to both the green region and the object edges as a purple-fringed region.
2. The image processing method according to claim 1, wherein the determining, according to the object edges, the green region, and the purple region, a region of the purple region close to both the green region and the object edges as a purple-fringed region comprises:
calculating a first distance between each pixel in the purple region and the object edges;
calculating a second distance between each pixel in the purple region and the green region;
and when the first distance of a pixel is smaller than a first preset threshold and the second distance of the pixel is smaller than a second preset threshold, determining the pixel to be a pixel of the purple-fringed region.
3. The method of claim 1, wherein the obtaining one or more object edges in the color image comprises:
performing convolution processing on the color image to obtain a gradient image;
and carrying out binarization processing on the gradient image according to a preset gradient value to obtain the edge of the object.
4. The method according to claim 1, wherein the determining a green region in the color image comprises:
converting the color image into HSV format;
determining the green region in the color image in HSV format.
5. The method of claim 1, wherein determining the purple region in the color image comprises:
converting the color image into HSV format;
determining an initial purple region in the color image in HSV format;
and determining the purple region according to the initial purple region.
6. The method of claim 5, wherein the determining a purple region from the initial purple region comprises:
performing morphological expansion on the initial purple area to obtain an expanded area;
determining the purple region in the color image in HSV format according to the dilated region.
7. The image processing method according to claim 1, wherein the color image includes a red channel, a blue channel, and a green channel, the image processing method further comprising:
determining the nearest pixel in the green region to each pixel in the purple-fringed region;
and correcting the pixel gray values of the red channel and the blue channel of each pixel in the purple-fringed region according to the nearest pixel.
8. The method according to claim 7, wherein the correcting the pixel gray values of the red channel and the blue channel of each pixel in the purple-fringed region according to the nearest pixel comprises:
correcting the pixel gray value of the red channel of the corresponding pixel in the purple-fringed region according to the pixel gray values of the red channel and the green channel of the nearest pixel;
and correcting the pixel gray value of the blue channel of the corresponding pixel in the purple-fringed region according to the pixel gray values of the blue channel and the green channel of the nearest pixel.
9. An image processing apparatus characterized by comprising:
the first determining module is used for acquiring one or more object edges in the color image;
a second determining module for determining a green region and a purple region in the color image;
and the third determining module is configured to determine, according to the object edges, the green region, and the purple region, a region of the purple region close to both the green region and the object edges as a purple-fringed region.
10. An electronic device, comprising a processor configured to:
acquiring one or more object edges in a color image;
determining a green region and a purple region in the color image;
and determining, according to the object edges, the green region, and the purple region, a region of the purple region close to both the green region and the object edges as a purple-fringed region.
11. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 8.
CN202110378967.3A 2021-04-08 Image processing method, image processing apparatus, electronic device, and storage medium Active CN113516595B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110378967.3A CN113516595B (en) 2021-04-08 Image processing method, image processing apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110378967.3A CN113516595B (en) 2021-04-08 Image processing method, image processing apparatus, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN113516595A true CN113516595A (en) 2021-10-19
CN113516595B CN113516595B (en) 2024-06-11


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113132705A (en) * 2021-04-20 2021-07-16 Oppo广东移动通信有限公司 Image color edge correction method, correction device, electronic device and storage medium


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101325678A (en) * 2007-06-11 2008-12-17 富士胶片株式会社 Image recording apparatus and image recording method
US20090189997A1 (en) * 2008-01-28 2009-07-30 Fotonation Ireland Limited Methods and Apparatuses for Addressing Chromatic Abberations and Purple Fringing
CN101771882A (en) * 2008-12-26 2010-07-07 佳能株式会社 Image processing apparatus and image processing method
CN102158730A (en) * 2011-05-26 2011-08-17 威盛电子股份有限公司 Image processing system and method
JP2015211325A (en) * 2014-04-25 2015-11-24 株式会社朋栄 Purple fringe cancellation processing method and purple fringe cancellation processing device for implementing the processing
CN105389786A (en) * 2015-10-28 2016-03-09 努比亚技术有限公司 Image processing method and device
CN107864365A (en) * 2017-10-31 2018-03-30 上海集成电路研发中心有限公司 A kind of method for eliminating image purple boundary

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Lin et al., "基于物体颜色信息的图像紫边矫正方法" [Image Purple-Fringing Correction Method Based on Object Color Information], 《光学学报》 [Acta Optica Sinica], vol. 36, no. 12, pp. 1-9 *


Similar Documents

Publication Publication Date Title
US10298864B2 (en) Mismatched foreign light detection and mitigation in the image fusion of a two-camera system
CN111862195A (en) Light spot detection method and device, terminal and storage medium
US20030053692A1 (en) Method of and apparatus for segmenting a pixellated image
US9256928B2 (en) Image processing apparatus, image processing method, and storage medium capable of determining a region corresponding to local light from an image
US10382712B1 (en) Automatic removal of lens flares from images
EP2372655A1 (en) Image processing device and method, and program therefor
JP6553624B2 (en) Measurement equipment and system
US20100195902A1 (en) System and method for calibration of image colors
US9361669B2 (en) Image processing apparatus, image processing method, and program for performing a blurring process on an image
US8948452B2 (en) Image processing apparatus and control method thereof
JP2011188496A (en) Backlight detection device and backlight detection method
JP5779089B2 (en) Edge detection apparatus, edge detection program, and edge detection method
CN101983507A (en) Automatic redeye detection
US11503262B2 (en) Image processing method and device for auto white balance
CN108615030B (en) Title consistency detection method and device and electronic equipment
JP5381565B2 (en) Image processing apparatus, image processing program, and image processing method
US10438376B2 (en) Image processing apparatus replacing color of portion in image into single color, image processing method, and storage medium
US8885971B2 (en) Image processing apparatus, image processing method, and storage medium
JP2017229061A (en) Image processing apparatus, control method for the same, and imaging apparatus
JP6922399B2 (en) Image processing device, image processing method and image processing program
CN108965646A (en) Image processing apparatus, image processing method and storage medium
US9094617B2 (en) Methods and systems for real-time image-capture feedback
KR20150059302A (en) Method for recognizing character by fitting image shot, and information processing device for executing it
CN111669492A (en) Method for processing shot digital image by terminal and terminal
JP5441669B2 (en) Image processing apparatus and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant