CN107886481B - Image processing method and device and mobile terminal - Google Patents


Info

Publication number
CN107886481B
CN107886481B (application CN201711078087.4A)
Authority
CN
China
Prior art keywords
map
pixel value
sampling
pixel
image
Prior art date
Legal status
Active
Application number
CN201711078087.4A
Other languages
Chinese (zh)
Other versions
CN107886481A (en)
Inventor
孙向华
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201711078087.4A priority Critical patent/CN107886481B/en
Publication of CN107886481A publication Critical patent/CN107886481A/en
Application granted granted Critical
Publication of CN107886481B publication Critical patent/CN107886481B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention provides an image processing method, an image processing device and a mobile terminal. The method comprises the following steps: performing downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map; processing the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map; performing upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampled map and a second upsampled map; and filtering the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map. By processing the input map and the guide map in parallel during guided filtering, the invention accelerates computation in the image processing process and improves image processing efficiency, thereby improving the real-time performance of image processing.

Description

Image processing method and device and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an image processing method and apparatus, and a mobile terminal.
Background
With the rapid development of communication technology, mobile terminals such as smart phones and tablet computers have become increasingly widespread and are now indispensable tools in daily life, and user expectations of them keep rising, particularly for image processing functions such as processing shot photos. Guided filtering is one such image processing technology applied on mobile terminals: an input image is filtered under the guidance of a guide image, so that the filtered output image remains broadly similar to the input image while its texture portions resemble those of the guide image. This preserves smoothness, enables fine matting of the image, and improves image quality.
However, when a mobile terminal processes an image with the conventional guided filtering technology, the input image usually has a high resolution and the guided filtering involves a large amount of computation, so processing usually takes a long time. As a result, the image filtering on the mobile terminal cannot meet the requirement of real-time operation, degrading the user experience. The current guided filtering of an input image on a mobile terminal therefore suffers from poor real-time performance caused by its long processing time.
Disclosure of Invention
The embodiment of the invention provides an image processing method, an image processing device and a mobile terminal, and aims to solve the problem of poor processing real-time performance caused by long time consumption in the process of performing guided filtering processing on an input image by the mobile terminal at present.
In order to solve the technical problem, the invention is realized as follows: an image processing method applied to a mobile terminal including a digital signal processor, comprising:
performing downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map;
processing the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map;
performing upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampled map and a second upsampled map;
and filtering the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
In a first aspect, an embodiment of the present invention provides an image processing method applied to a mobile terminal including a digital signal processor, the method comprising:
performing downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map;
processing the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map;
performing upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampled map and a second upsampled map;
and filtering the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
In a second aspect, an embodiment of the present invention further provides an image processing apparatus including a digital signal processor, the apparatus further including:
a downsampling module, configured to perform downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map;
a filter parameter calculation module, configured to process the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map;
an upsampling module, configured to perform upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampled map and a second upsampled map;
and a filtering processing module, configured to filter the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the image processing method.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements the steps of the image processing method.
According to the embodiment of the invention, in the process of conducting the guided filtering processing on the image, the input image and the guide image are processed in parallel, so that the computing speed in the image processing process can be increased, the image processing efficiency is improved, and the real-time performance of the image processing is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a flow chart of another image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic processing procedure diagram of an application example of an image processing method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a table lookup for horizontal interpolation by a bilinear interpolation technique according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a zoom provided by an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a filter parameter calculation module in an image processing apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an electronic submodule in an image processing apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a filter parameter map generation sub-module in an image processing apparatus according to an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of a computing unit in an image processing apparatus according to an embodiment of the present invention;
fig. 11 is a schematic structural diagram of a first lookup subunit in the image processing apparatus according to the embodiment of the present invention;
fig. 12 is a schematic structural diagram of a down-sampling module in an image processing apparatus according to an embodiment of the present invention;
fig. 13 is a hardware structure diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present invention, applied to a mobile terminal including a Digital Signal Processor (DSP), as shown in fig. 1, including the following steps:
and a step 101 of performing downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map.
In the embodiment of the invention, while filtering the input map, the mobile terminal can downsample the input map and the guide map in parallel to obtain the downsampled input map and the downsampled guide map. The input map is the image to be processed; the guide map may be an image of the same size as the input map that carries a specific texture portion.
For example: suppose the input map and the guide map both have size M × N and the preset sampling multiple is s. The DSP of the mobile terminal can execute at least two threads simultaneously; one of the threads downsamples the input map, producing a downsampled input map of size (M/s) × (N/s), while another thread downsamples the guide map, producing a downsampled guide map of the same size (M/s) × (N/s).
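A minimal sketch of step 101, with Python threads standing in for the DSP's parallel threads and simple s × s block averaging standing in for the patent's downsampling (the names P, I, s follow the text; everything else is an illustrative assumption):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def downsample(img, s):
    """Reduce an M x N image to (M/s) x (N/s) by averaging s x s blocks."""
    m, n = img.shape
    return img[:m - m % s, :n - n % s].reshape(m // s, s, n // s, s).mean(axis=(1, 3))

s = 4
P = np.random.rand(64, 48)   # input map, M x N
I = np.random.rand(64, 48)   # guide map, same size

# one thread per image, mirroring the two parallel DSP threads
with ThreadPoolExecutor(max_workers=2) as pool:
    fut_p = pool.submit(downsample, P, s)   # thread 1: input map
    fut_i = pool.submit(downsample, I, s)   # thread 2: guide map
    P_small, I_small = fut_p.result(), fut_i.result()
# both results have size (M/s) x (N/s), here (16, 12)
```
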
Step 102: process the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map.
In the embodiment of the present invention, when the downsampled input map and the downsampled guide map are obtained in step 101, the mobile terminal may process the downsampled input map and the downsampled guide map to obtain the filter coefficient map and the filter offset map.
For example: using a direct summation method, the DSP of the mobile terminal can compute, for each pixel point of the downsampled input map, the sum of pixel values inside a square window of preset width centered on that pixel point. The DSP likewise computes such window sums over the downsampled guide map, over a first image obtained by multiplying the downsampled guide map by itself pixel by pixel, and over a second image obtained by multiplying the downsampled input map by the downsampled guide map pixel by pixel. From the window sums of the downsampled input map, the downsampled guide map, the first image, and the second image, it then computes a pixel value proportion coefficient and a pixel value offset for each pixel point of the downsampled input map. A pixel value proportion coefficient map is generated from the proportion coefficients and processed to produce the filter coefficient map; a pixel value offset map is generated from the offsets and processed to produce the filter offset map.
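The window sums and the derived coefficient/offset maps of step 102 can be sketched in floating point with integral-image box sums (ε, the helper names, and the floating-point arithmetic are assumptions; the patent's fixed-point DSP pipeline is not reproduced):

```python
import numpy as np

def box_sum(img, r):
    """Sum of pixel values in the (2r+1)-wide square window around each pixel."""
    pad = np.pad(img, r, mode='edge')            # replicate borders
    c = pad.cumsum(axis=0).cumsum(axis=1)        # integral image
    c = np.pad(c, ((1, 0), (1, 0)))              # zero row/column on top/left
    h, w = img.shape
    W = 2 * r + 1
    return c[W:W + h, W:W + w] - c[:h, W:W + w] - c[W:W + h, :w] + c[:h, :w]

def filter_params(P_small, I_small, r, eps=1e-3):
    """Filter coefficient map A and filter offset map B from the two small maps."""
    t = (2 * r + 1) ** 2                         # pixels per window (t')
    mean_I = box_sum(I_small, r) / t
    mean_P = box_sum(P_small, r) / t
    mean_II = box_sum(I_small * I_small, r) / t  # guide map times itself
    mean_IP = box_sum(I_small * P_small, r) / t  # guide map times input map
    var = mean_II - mean_I * mean_I              # pixel value variance
    cov = mean_IP - mean_I * mean_P              # pixel value covariance
    A = cov / (var + eps)                        # proportion coefficients
    B = mean_P - A * mean_I                      # offsets
    return A, B
```
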
Step 103: perform upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampled map and a second upsampled map.
In this embodiment of the present invention, if the filter coefficient map and the filter offset map are obtained in step 102, the DSP of the mobile terminal may perform upsampling processing on the filter coefficient map and the filter offset map in parallel, so as to obtain a first upsampled map and a second upsampled map which are consistent with the image size of the input map.
For example: taking the guide map with the size of M × N as an example, and the sizes of the filter coefficient map and the filter offset map are (M/s) × (N/s), the DSP of the mobile terminal may perform upsampling on the filter coefficient map according to the sampling multiple s in one thread to obtain a first upsampled map with the image size of M × N; similarly, the DSP of the mobile terminal may perform upsampling on the filter offset map in another thread to obtain a second upsampled map with an image size of M × N.
Step 104: filter the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
In this embodiment of the present invention, once the first upsampled map and the second upsampled map are obtained in step 103, the DSP of the mobile terminal may perform filtering processing on them to obtain an output map corresponding to the input map. The obtained output map is broadly similar to the input map, but its texture portion resembles the specific texture portion of the guide map.
The filtering of the first upsampled map and the second upsampled map may proceed as follows: for each target pixel point, the DSP of the mobile terminal multiplies the pixel value of that point in the guide map by the corresponding pixel value in the first upsampled map (i.e. the pixel value proportion coefficient of the target pixel point), adds the corresponding pixel value in the second upsampled map (i.e. the pixel value offset of the target pixel point), and takes the result as the pixel value of the corresponding pixel point in the output map, thereby generating the output map.
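The per-pixel combination just described — guide pixel times its proportion coefficient, plus its offset — reduces to one array expression (the array names are assumed for illustration):

```python
import numpy as np

def apply_filter(I, A_up, B_up):
    """Output map: each guide pixel scaled by its coefficient plus its offset."""
    return A_up * I + B_up

I = np.full((4, 4), 2.0)      # guide map
A_up = np.full((4, 4), 0.5)   # first upsampled map (proportion coefficients)
B_up = np.full((4, 4), 1.0)   # second upsampled map (offsets)
Q = apply_filter(I, A_up, B_up)   # every pixel: 0.5 * 2.0 + 1.0
```
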
In the embodiment of the present invention, the mobile terminal may be any mobile terminal including a digital signal processor, for example: a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or the like.
According to the image processing method provided by the embodiment of the invention, the input image and the guide image are processed in parallel in the process of guiding filtering processing on the image, so that the calculation speed in the image processing process can be increased, the image processing efficiency is improved, and the real-time performance of image processing is improved.
Referring to fig. 2, fig. 2 is a schematic flowchart of an image processing method according to an embodiment of the present invention, and as shown in fig. 2, the method includes the following steps:
Step 201: perform downsampling processing on the input map in at least one input map processing thread to obtain a downsampled input map, and perform downsampling processing on the guide map in at least one guide map processing thread to obtain a downsampled guide map.
In the embodiment of the invention, while filtering the input map, the DSP can downsample the input map through one or more input map processing threads to obtain a downsampled input map, and downsample the guide map through one or more guide map processing threads to obtain a downsampled guide map. The at least one input map processing thread and the at least one guide map processing thread run in parallel in the DSP.
For example: the DSP can downsample an M × N input map with two input map processing threads; one thread downsamples the image area containing the first M/2 rows of pixel points while the other downsamples the area containing the remaining M/2 rows, thereby improving the image processing efficiency of the mobile terminal.
Optionally, step 201 may include: in the at least one input map processing thread, scaling the input map by the preset sampling multiple to obtain a scaled map corresponding to the input map; looking up the pixel values of the interpolation points of the scaled map in the horizontal and vertical directions using a bilinear interpolation technique; taking the pixel value of each interpolation point as the pixel value of the corresponding pixel point in the scaled map to generate the downsampled input map; and performing downsampling processing on the guide map in at least one guide map processing thread to obtain a downsampled guide map.
In this embodiment, during downsampling of the input map the DSP may perform horizontal interpolation followed by vertical interpolation using a bilinear interpolation technique. Because the interpolation-point pixel values can be fetched in batch on the DSP by table lookup against index values, the interpolation for a plurality of pixel points can be computed simultaneously, further increasing the operation speed of the DSP during image processing and the efficiency of the image filtering.
It should be noted that, the guide map may be processed by a bilinear interpolation technique to obtain a down-sampled guide map, and the implementation process is similar to the down-sampling process of the input map, and is not described again here.
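One conventional way to realize the table-driven bilinear interpolation described above is to precompute, per output row and column, the two source indices and their blend weights, so each pass becomes a batched gather plus a blend. The sketch below (index/weight construction assumed, not taken from the patent) runs the horizontal pass first and the vertical pass second, as in step 201:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize via precomputed index/weight tables, one batched gather per neighbour."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)            # source row coordinates
    xs = np.linspace(0, w - 1, out_w)            # source column coordinates
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1); wy = ys - y0
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1); wx = xs - x0
    # horizontal pass: gather both neighbour columns in batch, then blend
    rows = img[:, x0] * (1 - wx) + img[:, x1] * wx
    # vertical pass on the horizontally interpolated result
    return rows[y0, :] * (1 - wy)[:, None] + rows[y1, :] * wy[:, None]
```

The same routine serves both downsampling (step 201) and upsampling (step 203), only the output size changes.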
Step 202, processing the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map.
In the embodiment of the present invention, if the downsampled input map and the downsampled guide map are obtained in step 201, the mobile terminal may process the downsampled input map and the downsampled guide map to obtain the filter coefficient map and the filter offset map.
Optionally, step 202 may include: calculating the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; and generating a filtering coefficient graph and a filtering offset graph according to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input graph.
In the embodiment of the invention, the mobile terminal generates the filter coefficient map and the filter offset map from the pixel value variance and pixel value covariance of each pixel point of the downsampled input map. Computing these statistics per pixel point makes the generated filter coefficient map and filter offset map more accurate, so that the texture portion of the output map after guided filtering more closely resembles the texture portion of the guide map, improving image quality.
Further optionally, the step of calculating the pixel value variance and the pixel value covariance of each pixel point in the downsampled input image may include: calculating to obtain the pixel value variance of a target pixel point by using a pixel value variance calculation formula, wherein the target pixel point is any pixel point in the down-sampling input image, and the pixel value variance calculation formula is as follows:
var = t′ · sum_{I′I′} − sum_{I′} · sum_{I′}
where var represents the pixel value variance of the target pixel point;
t′ = r′ × r′ and r′ = r/s, where r represents the preset filter window width, s represents the sampling multiple, and t′ represents the total number of pixel points in a filter window of width r′;
sum_{I′} represents the window pixel sum, at the position corresponding to the target pixel point, taken from the integral image of the downsampled guide map;
sum_{I′I′} represents the window pixel sum, at the position corresponding to the target pixel point, taken from the integral image of the image obtained by multiplying the downsampled guide map by itself pixel by pixel;
calculating to obtain the pixel value covariance of the target pixel point by using a pixel value covariance calculation formula, wherein the pixel value covariance calculation formula is as follows:
cov = t′ · sum_{I′P′} − sum_{I′} · sum_{P′}
where cov represents the pixel value covariance of the target pixel point;
sum_{P′} represents the window pixel sum, at the position corresponding to the target pixel point, taken from the integral image of the downsampled input map;
sum_{I′P′} represents the window pixel sum, at the position corresponding to the target pixel point, taken from the integral image of the image obtained by multiplying the downsampled guide map by the downsampled input map pixel by pixel.
In the embodiment of the invention, by calculating the pixel value variance of the target pixel point with the pixel value variance calculation formula and its pixel value covariance with the pixel value covariance calculation formula, the mobile terminal obtains more accurate per-pixel variances and covariances, which further improves the image quality of the output map.
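As a quick sanity check of the two formulas above (reading sum_{I′}, sum_{I′I′}, sum_{I′P′} as sums over one r′-wide window), the expressions equal t′² times the ordinary variance and covariance of the window pixels:

```python
import numpy as np

x = np.array([1.0, 2.0, 4.0, 7.0])   # pixels of one window of the guide map (I')
p = np.array([3.0, 1.0, 5.0, 2.0])   # matching window of the input map (P')
t = x.size                           # t' = total pixel points in the window

var = t * np.sum(x * x) - np.sum(x) * np.sum(x)   # t'*sum_{I'I'} - sum_{I'}*sum_{I'}
cov = t * np.sum(x * p) - np.sum(x) * np.sum(p)   # t'*sum_{I'P'} - sum_{I'}*sum_{P'}

# both equal t'^2 times the usual (population) variance / covariance
assert np.isclose(var, t**2 * x.var())
assert np.isclose(cov, t**2 * ((x * p).mean() - x.mean() * p.mean()))
```
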
Further optionally, the step of generating a filter coefficient map and a filter offset map according to the pixel value variance and the pixel value covariance of each pixel point in the downsampled input map includes: calculating a pixel value proportional coefficient and a pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; generating a pixel value proportion coefficient map associated with the pixel value proportion coefficient of each pixel point in the down-sampling input map and a pixel value offset map associated with the pixel value offset of each pixel point in the down-sampling input map; and processing the pixel value scale coefficient map to obtain the filter coefficient map, and processing the pixel value offset map to obtain the filter offset map.
In the embodiment of the invention, the mobile terminal computes, from the pixel value variance and pixel value covariance of each pixel point of the downsampled input map, the corresponding pixel value proportion coefficient and pixel value offset, and generates a pixel value proportion coefficient map associated with the proportion coefficients and a pixel value offset map associated with the offsets. Processing the proportion coefficient map yields the filter coefficient map, and processing the offset map yields the filter offset map, making both maps more accurate and thereby yielding a higher-quality output map.
Further optionally, the step of calculating a pixel value proportionality coefficient and a pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the downsampled input image includes: acquiring at least two multiplication factors corresponding to the pixel value variance of the target pixel point, and searching a corresponding value of each multiplication factor in the at least two multiplication factors in a preset lookup table in the digital signal processor; and calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point based on the corresponding value of each multiplication factor in the at least two multiplication factors.
In this embodiment, the mobile terminal looks up, in a preset lookup table in the digital signal processor, the value corresponding to each of at least two multiplication factors associated with the pixel value variance of a pixel point of the downsampled input map, and computes the pixel value proportion coefficient and pixel value offset of that pixel point from those factors and their looked-up values, which improves both the operation speed and the calculation accuracy of the DSP of the mobile terminal.
Further optionally, the step of calculating to obtain the pixel value proportionality coefficient and the pixel value offset of the target pixel point may include: calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point according to a proportional coefficient calculation formula and an offset calculation formula, wherein the proportional coefficient calculation formula is as follows:
a′ = (cov × w₁ × w₂ × … × w_s) >> (n × s)
a' is a pixel value proportion coefficient of the target pixel point;
n is a preset shift amount and is greater than 7;
here s denotes the number of multiplication factors corresponding to the pixel value variance of the target pixel point, with s greater than or equal to 2;
w_i denotes the value corresponding to the i-th multiplication factor in the second preset lookup table;
the offset calculation formula is as follows:
b′ = (sum_{P′} − a′ × sum_{I′}) / t′
and b' is the pixel value offset of the target pixel point.
In this embodiment, the mobile terminal may calculate the pixel value proportion coefficient and pixel value offset of each pixel point of the downsampled input map through the proportion coefficient calculation formula and the offset calculation formula, so that floating-point operations in the image processing are converted into multiplication and shift operations, further improving the operation speed and calculation accuracy of the DSP of the mobile terminal.
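The multiplication-and-shift idea behind these formulas can be illustrated with a hypothetical reciprocal table (the table construction, range, and the value of n here are assumptions, not the patent's actual tables): a division cov / d is replaced by a multiplication by w ≈ 2ⁿ/d followed by a right shift by n.

```python
# Hypothetical illustration: replace a division a = cov / d with a
# multiplication by a table value w ~ 2^n / d followed by a right shift.
n = 10                                              # preset shift amount (> 7, per the text)
table = {d: (1 << n) // d for d in range(1, 256)}   # lookup table: w = 2^n // d

d = 37             # var + deviation for some pixel (within the table range)
cov = 9000
approx = (cov * table[d]) >> n    # multiply-and-shift in place of a division
exact = cov // d
assert abs(approx - exact) <= exact * 0.05 + 1      # close to the true quotient
```

Chaining s such factors (one shift of n per factor, i.e. a total shift of n × s) extends the same trick to divisors outside the table range.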
Further optionally, the step of searching, in the first preset lookup table in the digital signal processor, for the target shift amount corresponding to the pixel value variance of the target pixel point includes: judging whether the sum of the pixel value variance of the target pixel point and a preset deviation amount falls within a preset lookup-table value range, and obtaining a judgment result; if the judgment result is negative, factorizing that sum into at least two multiplication factors, and looking up the value corresponding to each of the at least two multiplication factors in the preset lookup table, where each of the at least two multiplication factors is a value within the preset lookup-table value range.
In this embodiment, when the sum of the pixel value variance of a pixel point of the downsampled input map and the preset deviation amount exceeds the preset lookup-table value range, the mobile terminal factorizes that sum into at least two multiplication factors and looks up the value corresponding to each factor in the preset lookup table. This makes the calculation more flexible, reduces the computational load of the mobile terminal, and improves image processing efficiency.
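The factorization fallback can be sketched as follows; the table bound and the peel-off strategy are assumptions for illustration, not the patent's scheme:

```python
TABLE_MAX = 255              # assumed upper bound of the lookup-table value range

def factorize_in_range(v, limit=TABLE_MAX):
    """Split v into at least two factors, each within the table range.
    Crude scheme: repeatedly peel off the largest exact divisor <= limit."""
    factors = []
    while v > limit:
        for f in range(limit, 1, -1):
            if v % f == 0:
                factors.append(f)
                v //= f
                break
        else:
            v += 1           # no exact divisor: nudge v (tiny rounding error)
    factors.append(v)
    if len(factors) == 1:    # value already in range: still return two factors
        factors.append(1)
    return factors

fs = factorize_in_range(60000)   # e.g. [250, 240]; each factor fits the table
```
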
Step 203, performing upsampling processing on the filter coefficient map in at least one filter coefficient map processing thread to obtain a first upsampled map, and performing upsampling processing on the filter offset map in at least one filter offset map processing thread to obtain a second upsampled map.
In this embodiment of the present invention, when obtaining the filter coefficient map and the filter offset map in step 202, the mobile terminal may perform upsampling processing on the filter coefficient map in at least one filter coefficient map processing thread to obtain a first upsampled map, and perform upsampling processing on the filter offset map in at least one filter offset map processing thread to obtain a second upsampled map. The at least one filtering coefficient map processing thread and the at least one filtering offset map processing thread are threads processed in parallel in the digital signal processor respectively.
For example, taking the up-sampling of the filter coefficient map as an example: the DSP may perform the up-sampling processing on the filter coefficient map with the size of (M/s) × (N/s) through two filter coefficient map processing threads. During the up-sampling processing, one filter coefficient map processing thread of the DSP may up-sample an image area containing the first M/(2s) rows of pixel points in the filter coefficient map, while the other filter coefficient map processing thread up-samples the image area containing the remaining M/(2s) rows, thereby improving the image processing efficiency of the mobile terminal.
It should be noted that the up-sampling processing on the filter coefficient map and the filter offset map may also be performed by using a bilinear interpolation technique to obtain the first upsampled map and the second upsampled map; the implementation process is similar to the downsampling process on the input map and further improves the image processing efficiency of the mobile terminal, and is not described here again.
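The row-partitioned, two-thread upsampling described above can be sketched as follows. This is only an illustration: the patent's DSP threads are not Python threads, and nearest-neighbour repetition stands in for whatever upsampling kernel is actually used.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np


def upsample_rows(block: np.ndarray, s: int) -> np.ndarray:
    """Nearest-neighbour upsampling of one horizontal strip by factor s."""
    return np.repeat(np.repeat(block, s, axis=0), s, axis=1)


def parallel_upsample(coeff_map: np.ndarray, s: int) -> np.ndarray:
    # Split the (M/s) x (N/s) map into two strips of M/(2s) rows each,
    # mirroring the two DSP threads described above.
    half = coeff_map.shape[0] // 2
    strips = [coeff_map[:half], coeff_map[half:]]
    with ThreadPoolExecutor(max_workers=2) as pool:
        top, bottom = pool.map(lambda b: upsample_rows(b, s), strips)
    return np.vstack([top, bottom])
```

Each thread touches a disjoint strip of rows, so no synchronisation beyond the final join is needed.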
Step 204, performing filtering processing on the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
In this embodiment of the present invention, if the first upsampled map and the second upsampled map are obtained in step 203, the DSP of the mobile terminal may perform filtering processing on the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map. The obtained output map is substantially similar to the input map, except that the texture of the input map is made similar to the corresponding texture in the guide map.
As shown in fig. 3, fig. 3 is a schematic processing procedure diagram of an application example of the image processing method provided in an embodiment of the present invention. In this application example, a single-channel image is taken as an example: the input map is P, the guide map is I, the preset filtering window width is r, the scaling factor is s, the output map is Q, and the preset parameter ε is a small constant. The processing procedure of this application example is as follows:
step 301, a first thread in the DSP downsamples an input graph P to generate a downsampled input graph P';
The DSP calculates the zoom map of the input map P according to the zoom factor, i.e. the sampling factor, and uses the bilinear interpolation technique to obtain the interpolation points in the horizontal and vertical directions of the zoom map. Because the index value and the weight of each interpolation point are stored sequentially in arrays, the mobile terminal can obtain the pixel values of the interpolation points in batch on the DSP by a table-lookup method according to the indices. The specific table-lookup process, taking horizontal interpolation as an example, is shown in FIG. 4, where the numerical values in FIG. 4A represent the index values of the interpolation points, FIG. 4B represents the pixel values, and FIG. 4C represents the table-lookup result.
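The batched table lookup described above can be sketched as a vectorised gather. The arrays below are hypothetical stand-ins for the index, pixel-value and result tables of FIG. 4:

```python
import numpy as np

# Hypothetical row of pixel values and precomputed horizontal-interpolation data.
pixels = np.array([10, 20, 30, 40, 50], dtype=np.float64)
idx = np.array([0, 1, 3])        # left-neighbour index of each interpolation point
w = np.array([0.25, 0.5, 0.0])   # weight of the right neighbour for each point

# Batched table lookup (gather) of both neighbours, then horizontal interpolation.
left = pixels[idx]
right = pixels[idx + 1]
interp = left * (1 - w) + right * w
```

Because the indices and weights are precomputed once, the per-pixel work reduces to gathers and multiply-adds, which vectorise well on a DSP.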
The principle of bilinear interpolation is as follows: if the input image has a size of a × b and the scaled image has a size of m × n, the scaling ratios of the input image to the scaled image in the horizontal and vertical directions are a/m and b/n respectively.

As shown in FIG. 5, the pixel point (x, y) of the scaled image should correspond to the position (x·a/m, y·b/n) in the input image. However, since a/m and b/n are floating point numbers, (x·a/m, y·b/n) may also be a floating point position. Since pixel positions are all integers, it is necessary to find the 4 pixel points in the input image near the position (x·a/m, y·b/n), such as the point G in FIG. 5, where G = (x·a/m, y·b/n).

Since the pixel value of point G does not exist in the input image, to obtain the pixel value of the scaled image at position (x, y), bilinear interpolation must be performed on the 4 pixel points (Q12, Q22, Q11, Q21) adjacent to that position. First, in the x-axis direction, horizontal interpolation is performed on (Q12, Q22) and (Q11, Q21) respectively to obtain R2 and R1. During horizontal interpolation, the distances from point G to its left and right neighbouring pixel points differ, so their weights differ: the pixel point closer to point G receives the larger weight. In the same way, vertical interpolation is performed on (R2, R1) to obtain the pixel value of point G. Each pixel point in the zoom map is traversed in this way to obtain the down-sampled input map.
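The traversal just described can be sketched in floating point as follows. This is a minimal illustration only: the precomputed index/weight arrays and DSP table lookups of FIG. 4 are omitted, and the boundary handling is one of several reasonable choices.

```python
import numpy as np


def bilinear_resize(img: np.ndarray, m: int, n: int) -> np.ndarray:
    """Resize an a x b image to m x n by bilinear interpolation, as in FIG. 5."""
    a, b = img.shape
    out = np.empty((m, n), dtype=np.float64)
    for x in range(m):
        for y in range(n):
            # Position in the input image corresponding to output pixel (x, y).
            fx, fy = x * a / m, y * b / n
            x1, y1 = int(fx), int(fy)                    # top-left neighbour
            x2, y2 = min(x1 + 1, a - 1), min(y1 + 1, b - 1)
            dx, dy = fx - x1, fy - y1                    # nearer points weigh more
            # Horizontal interpolation (R1, R2), then vertical interpolation (G).
            r1 = img[x1, y1] * (1 - dy) + img[x1, y2] * dy
            r2 = img[x2, y1] * (1 - dy) + img[x2, y2] * dy
            out[x, y] = r1 * (1 - dx) + r2 * dx
    return out
```

A production implementation would precompute the integer indices and weights per row/column once and reuse them, as the table-lookup scheme above does.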
Step 302, a second thread in the DSP performs down-sampling on the guide map I to generate a down-sampling guide map I';
Similarly to step 301 above, the DSP may also generate the down-sampled guide map I′ by using the bilinear interpolation technique. The first thread and the second thread are threads processed in parallel.
Step 303, calculating, in the first thread, the pixel sum value sum_P′ of each pixel point in the first integral map and the pixel sum value sum_I′P′ of each pixel point in the second integral map;

The first integral map is the integral map of the down-sampled input map P′, and the pixel sum value sum_P′ of a pixel point in the first integral map is the sum of all pixel values within a filtering window of length and width r′ centred on that pixel point, where r′ = r/s. The second integral map is the integral map of the image obtained by multiplying the down-sampled guide map I′ and the down-sampled input map P′ pixel by pixel, and the pixel sum value sum_I′P′ of a pixel point in the second integral map is the sum of all pixel values within a filtering window of length and width r′ centred on that pixel point.
Step 304, calculating, in the second thread, the pixel sum value sum_I′ of each pixel point in the third integral map and the pixel sum value sum_I′I′ of each pixel point in the fourth integral map;

The third integral map is the integral map of the down-sampled guide map I′, and the pixel sum value sum_I′ of a pixel point in the third integral map is the sum of all pixel values within a filtering window of length and width r′ centred on that pixel point. The fourth integral map is the integral map of the image obtained by multiplying the down-sampled guide map I′ by itself pixel by pixel, and the pixel sum value sum_I′I′ of a pixel point in the fourth integral map is the sum of all pixel values within a filtering window of length and width r′ centred on that pixel point.
Step 305, generating a filter coefficient map and a filter offset map according to the pixel value variance and the pixel value covariance of each pixel point in the down-sampled input map;
If the pixel sum values of each pixel point in the first, second, third and fourth integral maps are obtained through the calculations of steps 303 and 304, the mobile terminal can calculate the pixel value variance and the pixel value covariance of each pixel point in the down-sampled input map, namely:

var = t′ · sum_I′I′ − sum_I′ · sum_I′

cov = t′ · sum_I′P′ − sum_I′ · sum_P′

where t′ = r′ × r′.
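The window sums in the formulas above can be read off integral images with four lookups each. The following sketch assumes numpy, an odd window width r′, and returns only the valid (fully covered) region; boundary handling on the DSP may differ.

```python
import numpy as np


def integral_image(img: np.ndarray) -> np.ndarray:
    """Summed-area table with a zero top row and left column."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    return ii


def window_sum(ii: np.ndarray, w: int) -> np.ndarray:
    """Sum over every w x w window: four reads per window from the table."""
    return ii[w:, w:] - ii[:-w, w:] - ii[w:, :-w] + ii[:-w, :-w]


def var_cov(i_ds: np.ndarray, p_ds: np.ndarray, w: int):
    """var = t'*sum_I'I' - sum_I'*sum_I', cov = t'*sum_I'P' - sum_I'*sum_P'."""
    t = w * w  # t' = r' x r'
    s_i = window_sum(integral_image(i_ds), w)
    s_p = window_sum(integral_image(p_ds), w)
    s_ii = window_sum(integral_image(i_ds * i_ds), w)
    s_ip = window_sum(integral_image(i_ds * p_ds), w)
    return t * s_ii - s_i * s_i, t * s_ip - s_i * s_p
```

Note that var and cov here carry a constant factor of t′² relative to the statistical variance/covariance; the factor cancels in the ratio cov/(var + ε).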
the mobile terminal may generate a filter coefficient map and a filter offset map according to the pixel value variance and the pixel value covariance of each pixel point in the downsampled input map. Setting the pixel value variance var of each pixel point in the sampling input image as y, and y being a divisor, and setting the pixel value covariance cov of each pixel point in the sampling input image as x, and x being a dividend, the above-mentioned implementation process of generating the filter coefficient image and the filter offset image is as follows:
if y<127 by y look-up table
Figure GDA0002731378720000151
Then
Figure GDA0002731378720000152
The division can be converted into a multiplication and shift operation, N representing the shift amount, where the symbol ">>"represents a shift right operation of a binary number, and when s is 1;
if y ≧ 127, then depending on the order of magnitude of y, it can be determined to divide y into
Figure GDA0002731378720000153
Multiplication by a multiplication factor, denoted y respectively0,y1……ys-1Then, then
Figure GDA0002731378720000154
Thus, a divisor with a large value range can be decomposed into 2 or more numbers which are less than or equal to 127, and division can be realized by looking up the table, multiplying and shifting for many times.
The mobile terminal can obtain y by looking up the division table0,y1……ys-1Corresponding value of w0,w1……ws-1Then a pixel value scaling factor and a pixel value offset may be obtained, the pixel value scaling factor being
Figure GDA0002731378720000155
Namely, it is
Figure GDA0002731378720000161
The pixel value is offset by
Figure GDA0002731378720000162
Since a may be less than 1, a is expanded by 2NAnd multiplying, and calculating by taking an integer value.
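The lookup-multiply-shift division can be sketched as follows. The shift amount N = 15 is an illustrative choice, not necessarily the DSP's, and the example factorisation of a large divisor is supplied by the caller:

```python
N = 15  # preset shift amount; an illustrative value, not necessarily the DSP's

# Division table: TABLE[y] ~= 2**N / y for divisors within the range 1..127.
TABLE = [0] + [round((1 << N) / y) for y in range(1, 128)]


def div_by_table(x: int, factors: list[int]) -> int:
    """Compute x / (f0 * f1 * ... * f_{s-1}) by s lookup-multiply-shift steps.

    Each factor must lie within the table range; a divisor y >= 127 is first
    decomposed by the caller into such factors (e.g. 12700 = 127 * 100).
    """
    for f in factors:
        x = (x * TABLE[f]) >> N  # one table lookup, one multiply, one shift
    return x
```

Each step introduces a small rounding error from the quantised reciprocal, so the result may differ from exact integer division by a unit or so; a larger N trades table precision against intermediate word width.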
The mobile terminal may take the value a′ calculated from the pixel value variance and pixel value covariance of each pixel point in the down-sampled input map as the pixel value of the corresponding pixel point in the pixel value proportion coefficient map, and the value b′ calculated from the pixel value variance and pixel value covariance of each pixel point as the pixel value of the corresponding pixel point in the pixel value offset map; the pixel value proportion coefficient map is then processed to obtain the filter coefficient map, and the pixel value offset map is processed to obtain the filter offset map.
The processing of the pixel value proportion coefficient map to obtain the filter coefficient map may be as follows: the mobile terminal obtains the integral map of the proportion coefficient map, uses it to calculate, for each pixel point, the average value of all pixels in a filtering window of length and width r′ centred on that pixel point, and updates each pixel point to the corresponding calculated average value; the updated map is used as the filter coefficient map. Similarly, the processing of the pixel value offset map to obtain the filter offset map follows the same procedure and is not repeated here.
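The mean filtering described here, driven by the integral map of the coefficient map, can be sketched as below (numpy assumed; valid-region output only, whereas the DSP implementation may pad the borders):

```python
import numpy as np


def box_mean(img: np.ndarray, w: int) -> np.ndarray:
    """Mean of each w x w window, computed from an integral image of img."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = img.cumsum(0).cumsum(1)
    sums = ii[w:, w:] - ii[:-w, w:] - ii[w:, :-w] + ii[:-w, :-w]
    return sums / (w * w)
```

Because the integral image is built once, each window mean costs four reads and a divide regardless of the window size.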
Step 306, performing upsampling processing on the filter coefficient map in a third thread in the DSP to obtain a first upsampled map;
step 307, performing upsampling processing on the filtering offset map in a fourth thread in the DSP to obtain a second upsampling map;
and the third thread and the fourth thread are threads processed by the DSP in parallel.
Step 308, filtering the first upsampled map and the second upsampled map to obtain the output map Q corresponding to the input map P.
The filtering processing is performed on the first upsampled map and the second upsampled map to obtain the output map Q corresponding to the input map P; that is, the pixel value of each pixel point in the output map Q is calculated from the pixel values of the corresponding pixel points in the first upsampled map, the second upsampled map and the guide map, and the calculation formula is:

q = (a′ · I + b′) / 2^N

where q represents the pixel value of a target pixel point in the output map Q; I represents the pixel value of the pixel point corresponding to the target pixel point in the guide map I; a′ represents the pixel value of the pixel point corresponding to the target pixel point in the first upsampled map obtained from the filter coefficient map; and b′ represents the pixel value of the pixel point corresponding to the target pixel point in the second upsampled map obtained from the filter offset map.
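Assuming a′ and b′ were scaled up by 2^N in the fixed-point pipeline (N = 15 here is an illustrative choice), the final blend of step 308 can be sketched as:

```python
import numpy as np

N = 15  # shift amount used when a' and b' were scaled up by 2**N


def blend_output(a_up: np.ndarray, b_up: np.ndarray, guide: np.ndarray) -> np.ndarray:
    """q = (a' * I + b') / 2**N, applied element-wise over the full-size maps."""
    q = (a_up.astype(np.int64) * guide.astype(np.int64)
         + b_up.astype(np.int64)) >> N
    return np.clip(q, 0, 255).astype(np.uint8)  # back to 8-bit pixel range
```

With a′ = 2^N (i.e. a = 1) and b′ = 0, the output reproduces the guide map exactly, which is a convenient sanity check for the fixed-point scaling.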
According to the image processing method, the input map is down-sampled in at least one input map processing thread to obtain the down-sampled input map, and the guide map is down-sampled in at least one guide map processing thread to obtain the down-sampled guide map; likewise, the filter coefficient map is up-sampled in at least one filter coefficient map processing thread to obtain the first upsampled map, and the filter offset map is up-sampled in at least one filter offset map processing thread to obtain the second upsampled map. Therefore, the calculation speed of the mobile terminal in the guided filtering process can be further increased, and the real-time performance of image processing on the mobile terminal is improved.
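A floating-point sketch tying the whole flow of steps 301-308 together is given below. Decimation and nearest-neighbour repetition stand in for the bilinear resampling, edge-padded box means replace the explicit integral-map arithmetic, and the fixed-point table division is replaced by an ordinary division; none of this is the patent's fixed-point DSP implementation.

```python
import numpy as np


def box_mean_same(x: np.ndarray, w: int) -> np.ndarray:
    """Mean over a w x w window (w odd), same-size output via edge padding."""
    pad = w // 2
    xp = np.pad(x, pad, mode="edge")
    ii = np.zeros((xp.shape[0] + 1, xp.shape[1] + 1))
    ii[1:, 1:] = xp.cumsum(0).cumsum(1)
    s = ii[w:, w:] - ii[:-w, w:] - ii[w:, :-w] + ii[:-w, :-w]
    return s / (w * w)


def fast_guided_filter(p, i, r, s, eps):
    """Downsample P and I, solve a = cov/(var+eps), b = mean_P - a*mean_I at
    low resolution, mean-filter and upsample a and b, then blend with I."""
    p_ds, i_ds = p[::s, ::s].astype(float), i[::s, ::s].astype(float)
    w = 2 * max(r // s, 1) + 1                       # odd low-resolution window
    m_i, m_p = box_mean_same(i_ds, w), box_mean_same(p_ds, w)
    var = box_mean_same(i_ds * i_ds, w) - m_i * m_i
    cov = box_mean_same(i_ds * p_ds, w) - m_i * m_p
    a = cov / (var + eps)
    b = m_p - a * m_i
    a_m, b_m = box_mean_same(a, w), box_mean_same(b, w)   # smooth a and b
    up = lambda x: np.repeat(np.repeat(x, s, 0), s, 1)[: p.shape[0], : p.shape[1]]
    return up(a_m) * i + up(b_m)
```

On a constant image the filter is an identity (a = 0, b = the constant), which provides a quick correctness check of the data flow.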
Referring to fig. 6, fig. 6 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention, as shown in fig. 6, the image processing apparatus 600 includes a digital signal processor, and the image processing apparatus 600 further includes a down-sampling module 601, a filter parameter calculating module 602, an up-sampling module 603, and a filter processing module 604, which are connected in sequence:
a down-sampling module 601, configured to perform down-sampling processing on the input map and the guide map in parallel to obtain a down-sampled input map and a down-sampled guide map;
a filter parameter calculation module 602, configured to process the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map;
an upsampling module 603, configured to perform upsampling processing on the filter coefficient map and the filter offset map in parallel to obtain a first upsampling map and a second upsampling map;
a filtering processing module 604, configured to perform filtering processing on the first upsampled map and the second upsampled map to obtain an output map corresponding to the input map.
Optionally, the downsampling module 601 is further configured to perform downsampling on the input map in at least one input map processing thread to obtain a downsampled input map, and perform downsampling on the guide map in at least one guide map processing thread to obtain a downsampled guide map, where the at least one input map processing thread and the at least one guide map processing thread are threads that are processed in parallel in the digital signal processor, respectively;
the upsampling module 603 is further configured to perform upsampling on the filter coefficient map in at least one filter coefficient map processing thread to obtain a first upsampled map, and perform upsampling on the filter offset map in at least one filter offset map processing thread to obtain a second upsampled map, where the at least one filter coefficient map processing thread and the at least one filter offset map processing thread are threads of parallel processing in the digital signal processor, respectively.
Optionally, as shown in fig. 7, the filtering parameter calculating module 602 includes:
the calculation submodule 6021 is configured to calculate a pixel value variance and a pixel value covariance of each pixel point in the downsampled input image;
the filtering parameter map generating sub-module 6022 is configured to generate a filtering coefficient map and a filtering offset map according to the pixel value variance and the pixel value covariance of each pixel point in the downsampled input map.
Optionally, as shown in fig. 8, the calculation sub-module 6021 includes:
a variance calculating unit 60211, configured to calculate the pixel value variance of a target pixel point by using a pixel value variance calculation formula, where the target pixel point is any pixel point in the down-sampled input map, and the pixel value variance calculation formula is:

var = t′ · sum_I′I′ − sum_I′ · sum_I′

where var represents the pixel value variance of the target pixel point; t′ = r′ × r′ and r′ = r/s, r representing a preset filtering window width and s representing the sampling multiple; sum_I′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the down-sampled guide map; and sum_I′I′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the image obtained by multiplying the down-sampled guide map by itself pixel by pixel;

a covariance calculation unit 60212, configured to calculate the pixel value covariance of the target pixel point by using a pixel value covariance calculation formula:

cov = t′ · sum_I′P′ − sum_I′ · sum_P′

where cov represents the pixel value covariance of the target pixel point; sum_P′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the down-sampled input map; and sum_I′P′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the image obtained by multiplying the down-sampled guide map and the down-sampled input map pixel by pixel.
Optionally, as shown in fig. 9, the filtering parameter map generation sub-module 6022 includes:
a calculating unit 60221 configured to calculate a pixel value proportionality coefficient and a pixel value offset corresponding to a pixel value variance and a pixel value covariance of each pixel point in the downsampled input image;
an image generating unit 60222, configured to generate a pixel value scale coefficient map associated with a pixel value scale coefficient of each pixel in the downsampled input map, and a pixel value offset map associated with a pixel value offset of each pixel in the downsampled input map;
a filtering parameter processing unit 60223, configured to process the pixel value scaling coefficient map to obtain the filtering coefficient map, and process the pixel value offset map to obtain the filtering offset map.
Optionally, as shown in fig. 10, the calculating unit 60221 includes:
a searching subunit 602211, configured to obtain at least two multiplication factors corresponding to the variance of the pixel value of the target pixel point, and search, in a preset lookup table in the digital signal processor, a corresponding value of each of the at least two multiplication factors;
and the calculating subunit 602212 is configured to calculate, based on a corresponding value of each of the at least two multiplication factors, a pixel value scaling coefficient and a pixel value offset of the target pixel point.
Optionally, the calculating subunit 602212 is further configured to calculate the pixel value proportion coefficient and the pixel value offset of the target pixel point according to a proportion coefficient calculation formula and an offset calculation formula, where the proportion coefficient calculation formula is:

a′ = (cov · w0 · w1 · … · w_{s−1}) >> ((s − 1) · N)

where a′ is the pixel value proportion coefficient of the target pixel point; N is a preset shift amount, N being greater than 7; s is the number of multiplication factors corresponding to the pixel value variance of the target pixel point, s being greater than or equal to 2; and w_i is the value corresponding to the i-th multiplication factor in the second preset lookup table;

and the offset calculation formula is:

b′ = (2^N · sum_P′ − a′ · sum_I′) / t′

where b′ is the pixel value offset of the target pixel point.
Optionally, as shown in fig. 11, the searching subunit 602211 further includes:
a judgment subunit 6022111, configured to judge whether the sum of the pixel value variance of the target pixel point and a preset deviation amount is within a preset table lookup value range, and obtain a judgment result;
a look-up sub-unit 6022112, configured to factorize the sum of the variance of the pixel value of the target pixel and a preset deviation into at least two multiplication factors if the determination result is negative, and look up a value corresponding to each of the at least two multiplication factors in the preset look-up table, where each of the at least two multiplication factors is a value within a range of a value in the preset look-up table.
Optionally, as shown in fig. 12, the downsampling module 601 includes:
a zoom map obtaining submodule 6011, configured to, in the at least one input map processing thread, zoom the input map by using a preset sampling multiple, and obtain a zoom map corresponding to the input map;
a bilinear interpolation submodule 6012, configured to search, by using a bilinear interpolation technique, a pixel value of each interpolation point of the scaled map in the horizontal direction and the vertical direction;
a first generating submodule 6013, configured to generate the downsampled input map by using a pixel value of each interpolation point as a pixel value of a pixel point corresponding to the interpolation point in the scaled map;
a second generating sub-module 6014 is configured to perform downsampling on the guide map in at least one guide map processing thread to obtain a downsampled guide map.
The image processing apparatus 600 can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 5, and is not described herein again to avoid repetition.
The image processing apparatus 600 of the embodiment of the present invention performs parallel processing on the input map and the guide map during the guided filtering processing of the image, so as to accelerate the calculation speed during the image processing, improve the image processing efficiency, and thus improve the real-time performance of the image processing.
Fig. 13 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 1300 includes, but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, a power supply 1311, and the like, wherein the processor 1310 is a digital signal processor. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 13 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 1310 is configured to perform downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map; processing the down-sampling input map and the down-sampling guide map to obtain a filter coefficient map and a filter offset map; performing up-sampling processing on the filter coefficient graph and the filter offset graph in parallel to obtain a first up-sampling graph and a second up-sampling graph; and filtering the first up-sampling image and the second up-sampling image to obtain an output image corresponding to the input image.
Optionally, the processor 1310 is further configured to: the method comprises the steps of performing downsampling processing on an input map in at least one input map processing thread to obtain a downsampled input map, and performing downsampling processing on a guide map in at least one guide map processing thread to obtain a downsampled guide map, wherein the at least one input map processing thread and the at least one guide map processing thread are threads which are processed in parallel in the digital signal processor respectively; the method comprises the steps of carrying out up-sampling processing on a filter coefficient map in at least one filter coefficient map processing thread to obtain a first up-sampling map, and carrying out up-sampling processing on the filter offset map in at least one filter offset map processing thread to obtain a second up-sampling map, wherein the at least one filter coefficient map processing thread and the at least one filter offset map processing thread are threads which are processed in parallel in a digital signal processor respectively.
Optionally, the processor 1310 is further configured to: calculating the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; and generating a filtering coefficient graph and a filtering offset graph according to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input graph.
Optionally, the processor 1310 is further configured to: calculate the pixel value variance of a target pixel point by using a pixel value variance calculation formula, where the target pixel point is any pixel point in the down-sampled input map, and the pixel value variance calculation formula is: var = t′ · sum_I′I′ − sum_I′ · sum_I′, where var represents the pixel value variance of the target pixel point; t′ = r′ × r′ and r′ = r/s, r representing a preset filtering window width and s representing the sampling multiple; sum_I′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the down-sampled guide map; and sum_I′I′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the image obtained by multiplying the down-sampled guide map by itself pixel by pixel; and calculate the pixel value covariance of the target pixel point by using a pixel value covariance calculation formula: cov = t′ · sum_I′P′ − sum_I′ · sum_P′, where cov represents the pixel value covariance of the target pixel point; sum_P′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the down-sampled input map; and sum_I′P′ represents the pixel sum value of the pixel point corresponding to the target pixel point in the integral map of the image obtained by multiplying the down-sampled guide map and the down-sampled input map pixel by pixel.
Optionally, the processor 1310 is further configured to: calculating a pixel value proportional coefficient and a pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; generating a pixel value proportion coefficient map associated with the pixel value proportion coefficient of each pixel point in the down-sampling input map and a pixel value offset map associated with the pixel value offset of each pixel point in the down-sampling input map; and processing the pixel value scale coefficient map to obtain the filter coefficient map, and processing the pixel value offset map to obtain the filter offset map.
Optionally, the processor 1310 is further configured to: acquiring at least two multiplication factors corresponding to the pixel value variance of the target pixel point, and searching a corresponding value of each multiplication factor in the at least two multiplication factors in a preset lookup table in the digital signal processor; and calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point based on the corresponding value of each multiplication factor in the at least two multiplication factors.
Optionally, the processor 1310 is further configured to: calculate the pixel value proportion coefficient and the pixel value offset of the target pixel point according to a proportion coefficient calculation formula and an offset calculation formula, where the proportion coefficient calculation formula is:

a′ = (cov · w0 · w1 · … · w_{s−1}) >> ((s − 1) · N)

where a′ is the pixel value proportion coefficient of the target pixel point; N is a preset shift amount, N being greater than 7; s is the number of multiplication factors corresponding to the pixel value variance of the target pixel point, s being greater than or equal to 2; and w_i is the value corresponding to the i-th multiplication factor in the second preset lookup table;

and the offset calculation formula is:

b′ = (2^N · sum_P′ − a′ · sum_I′) / t′

where b′ is the pixel value offset of the target pixel point.
Optionally, the processor 1310 is further configured to: judging whether the sum of the pixel value variance and the preset deviation amount of the target pixel point is within a preset table look-up value range or not, and acquiring a judgment result; if the judgment result is negative, factorizing the sum of the pixel value variance and the preset deviation amount of the target pixel point into at least two multiplication factors, and searching a value corresponding to each multiplication factor in the at least two multiplication factors in the preset lookup table, wherein each multiplication factor in the at least two multiplication factors is a value within the range of the preset lookup table value.
Optionally, the processor 1310 is further configured to: in the at least one input graph processing thread, zooming the input graph by using a preset sampling multiple to obtain a zoom graph corresponding to the input graph; searching pixel values of interpolation points of the zoom image in the horizontal direction and the vertical direction by utilizing a bilinear interpolation technology; taking the pixel value of each interpolation point as the pixel value of the pixel point corresponding to the interpolation point in the zoom image to generate the down-sampling input image; and performing downsampling processing on the guide map in at least one guide map processing thread to obtain a downsampled guide map.
The mobile terminal 1300 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 1300 according to the embodiment of the present invention performs downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map; processing the down-sampling input map and the down-sampling guide map to obtain a filter coefficient map and a filter offset map; performing up-sampling processing on the filter coefficient graph and the filter offset graph in parallel to obtain a first up-sampling graph and a second up-sampling graph; and filtering the first up-sampling image and the second up-sampling image to obtain an output image corresponding to the input image. Therefore, the mobile terminal can process the input graph and the guide graph in parallel in the process of conducting guide filtering processing on the image, the computing speed in the image processing process is increased, the image processing efficiency is improved, and the real-time performance of image processing of the mobile terminal is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 1301 may be configured to receive and transmit signals during a message transmission or call process; specifically, it receives downlink data from a base station and delivers the received downlink data to the processor 1310 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 1301 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 1301 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 1302, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1303 can convert audio data received by the radio frequency unit 1301 or the network module 1302, or stored in the memory 1309, into an audio signal and output it as sound. The audio output unit 1303 may also provide audio output related to a specific function performed by the mobile terminal 1300 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 1303 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1304 is used to receive audio or video signals. The input unit 1304 may include a Graphics Processing Unit (GPU) 13041 and a microphone 13042; the graphics processor 13041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 1306. The image frames processed by the graphics processor 13041 may be stored in the memory 1309 (or other storage medium) or transmitted via the radio frequency unit 1301 or the network module 1302. The microphone 13042 can receive sounds and process them into audio data. In a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 1301 and output.
The mobile terminal 1300 also includes at least one sensor 1305, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 13061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 13061 and/or backlight when the mobile terminal 1300 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 1305 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be described in detail herein.
The display unit 1306 is used to display information input by a user or information provided to the user. The Display unit 1306 may include a Display panel 13061, and the Display panel 13061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 1307 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 1307 includes a touch panel 13071 and other input devices 13072. The touch panel 13071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 13071 using a finger, stylus, or any other suitable object or attachment). The touch panel 13071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1310, and receives and executes commands sent from the processor 1310. In addition, the touch panel 13071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 13071, the user input unit 1307 may include other input devices 13072. In particular, the other input devices 13072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 13071 can be overlaid on the display panel 13061, and when the touch panel 13071 detects a touch operation on or near the touch panel, the touch operation can be transmitted to the processor 1310 to determine the type of the touch event, and then the processor 1310 can provide a corresponding visual output on the display panel 13061 according to the type of the touch event. Although the touch panel 13071 and the display panel 13061 are shown in fig. 13 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 13071 and the display panel 13061 may be integrated to implement the input and output functions of the mobile terminal, and are not limited herein.
The interface unit 1308 is an interface through which an external device is connected to the mobile terminal 1300. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. Interface unit 1308 may be used to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more elements within mobile terminal 1300 or may be used to transmit data between mobile terminal 1300 and an external device.
The memory 1309 may be used to store software programs as well as various data. The memory 1309 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 1309 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1310 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 1309 and calling data stored in the memory 1309, thereby performing overall monitoring of the mobile terminal. Processor 1310 may include one or more processing units; preferably, the processor 1310 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 1310.
The mobile terminal 1300 may also include a power supply 1311 (e.g., a battery) for powering the various components. Preferably, the power supply 1311 may be logically coupled to the processor 1310 via a power management system, which manages charging, discharging, and power consumption.
In addition, the mobile terminal 1300 includes some functional modules that are not shown, and are not described herein again.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 1310, a memory 1309, and a computer program stored in the memory 1309 and capable of running on the processor 1310, where the computer program, when executed by the processor 1310, implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and details are not described here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements is not limited to those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. An image processing method applied to a mobile terminal including a digital signal processor, comprising:
performing downsampling processing on the input graph and the guide graph in parallel to obtain a downsampled input graph and a downsampled guide graph;
processing the down-sampling input map and the down-sampling guide map to obtain a filter coefficient map and a filter offset map;
performing up-sampling processing on the filter coefficient graph and the filter offset graph in parallel to obtain a first up-sampling graph and a second up-sampling graph;
filtering the first up-sampling graph and the second up-sampling graph to obtain an output graph corresponding to the input graph;
the step of processing the downsampled input map and the downsampled guide map to obtain a filter coefficient map and a filter offset map includes:
calculating the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image;
generating a filtering coefficient graph and a filtering offset graph according to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input graph;
generating a filter coefficient graph and a filter offset graph according to the pixel value variance and the pixel value covariance of each pixel point in the downsampled input graph, comprising: calculating a pixel value proportional coefficient and a pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; generating a pixel value proportion coefficient map associated with the pixel value proportion coefficient of each pixel point in the down-sampling input map and a pixel value offset map associated with the pixel value offset of each pixel point in the down-sampling input map; processing the pixel value scale coefficient map to obtain the filter coefficient map, and processing the pixel value offset map to obtain the filter offset map;
wherein, the step of calculating the pixel value proportionality coefficient and the pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image comprises: acquiring at least two multiplication factors corresponding to the pixel value variance of a target pixel point, and searching a corresponding value of each multiplication factor in the at least two multiplication factors in a preset lookup table in the digital signal processor; calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point based on a corresponding value of each multiplication factor in the at least two multiplication factors;
the step of calculating to obtain the pixel value proportionality coefficient and the pixel value offset of the target pixel point comprises: calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point according to a proportional coefficient calculation formula and an offset calculation formula, wherein the proportional coefficient calculation formula is as follows:
[proportionality-coefficient formula, rendered as image FDA0002731378710000021 in the original]
the a′ is the pixel value proportion coefficient of the target pixel point; the cov represents the pixel value covariance of the target pixel point; the n is a preset shift amount and is greater than 7; the s is the number of multiplication factors corresponding to the pixel value variance of the target pixel point, and is greater than or equal to 2; the W_i is the value corresponding to the i-th multiplication factor in the preset lookup table;
the offset calculation formula is as follows:
[offset formula, rendered as image FDA0002731378710000022 in the original]
b' is the pixel value offset of the target pixel point;
the sumI′A pixel sum value indicating a pixel point corresponding to the target pixel point in the integral map of the down-sampling guide map;
the sumP′A pixel sum value representing a pixel point corresponding to a target pixel point in an integrogram of the downsampled input graph;
the t' r/s represents a preset filter window width, and the s represents the sampling multiple.
2. The method according to claim 1, wherein the step of performing downsampling processing on the input map and the guide map in parallel to obtain a downsampled input map and a downsampled guide map comprises:
the method comprises the steps of performing downsampling processing on an input map in at least one input map processing thread to obtain a downsampled input map, and performing downsampling processing on a guide map in at least one guide map processing thread to obtain a downsampled guide map, wherein the at least one input map processing thread and the at least one guide map processing thread are threads which are processed in parallel in the digital signal processor respectively;
the step of performing parallel upsampling processing on the filter coefficient map and the filter offset map to obtain a first upsampling map and a second upsampling map comprises the following steps:
the method comprises the steps of carrying out up-sampling processing on a filter coefficient map in at least one filter coefficient map processing thread to obtain a first up-sampling map, and carrying out up-sampling processing on the filter offset map in at least one filter offset map processing thread to obtain a second up-sampling map, wherein the at least one filter coefficient map processing thread and the at least one filter offset map processing thread are threads which are processed in parallel in a digital signal processor respectively.
3. The method of claim 1, wherein the step of calculating the pixel value variance and the pixel value covariance for each pixel point in the downsampled input map comprises:
calculating to obtain the pixel value variance of a target pixel point by using a pixel value variance calculation formula, wherein the target pixel point is any pixel point in the down-sampling input image, and the pixel value variance calculation formula is as follows:
var = t′ · sum_I′I′ − sum_I′ · sum_I′
the var represents the pixel value variance of the target pixel point;
the sumI′I′Expressing the pixel sum value of a pixel point corresponding to the target pixel point in an integral image of an image obtained by multiplying two down-sampling guide images pixel by pixel point;
calculating to obtain the pixel value covariance of the target pixel point by using a pixel value covariance calculation formula, wherein the pixel value covariance calculation formula is as follows:
cov = t′ · sum_I′P′ − sum_I′ · sum_P′
the sumI′P′And expressing the pixel sum value of the pixel point corresponding to the pixel point in the integral image of the image obtained by multiplying the down-sampling guide image and the down-sampling input image pixel by pixel point.
4. An image processing apparatus comprising a digital signal processor, characterized in that the apparatus further comprises:
the down-sampling module is used for carrying out down-sampling processing on the input image and the guide image in parallel to obtain a down-sampling input image and a down-sampling guide image;
the filter parameter calculation module is used for processing the downsampling input map and the downsampling guide map to obtain a filter coefficient map and a filter offset map;
the up-sampling module is used for carrying out up-sampling processing on the filtering coefficient graph and the filtering offset graph in parallel to obtain a first up-sampling graph and a second up-sampling graph;
the filtering processing module is used for carrying out filtering processing on the first upper sampling image and the second upper sampling image to obtain an output image corresponding to the input image;
the filtering parameter calculation module comprises:
the calculation submodule is used for calculating the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image;
the filtering parameter image generating submodule is used for generating a filtering coefficient image and a filtering offset image according to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image;
the filtering parameter map generation submodule is further configured to: calculating a pixel value proportional coefficient and a pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image; generating a pixel value proportion coefficient map associated with the pixel value proportion coefficient of each pixel point in the down-sampling input map and a pixel value offset map associated with the pixel value offset of each pixel point in the down-sampling input map; processing the pixel value scale coefficient map to obtain the filter coefficient map, and processing the pixel value offset map to obtain the filter offset map;
wherein, the step of calculating the pixel value proportionality coefficient and the pixel value offset corresponding to the pixel value variance and the pixel value covariance of each pixel point in the down-sampling input image comprises: acquiring at least two multiplication factors corresponding to the pixel value variance of a target pixel point, and searching a corresponding value of each multiplication factor in the at least two multiplication factors in a preset lookup table in the digital signal processor; calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point based on a corresponding value of each multiplication factor in the at least two multiplication factors;
the step of calculating to obtain the pixel value proportionality coefficient and the pixel value offset of the target pixel point comprises: calculating to obtain a pixel value proportional coefficient and a pixel value offset of the target pixel point according to a proportional coefficient calculation formula and an offset calculation formula, wherein the proportional coefficient calculation formula is as follows:
[proportionality-coefficient formula, rendered as image FDA0002731378710000051 in the original]
cov represents the covariance of the pixel value of the target pixel point;
the a′ is the pixel value proportion coefficient of the target pixel point; the n is a preset shift amount and is greater than 7; the s is the number of multiplication factors corresponding to the pixel value variance of the target pixel point, and is greater than or equal to 2; the W_i is the value corresponding to the i-th multiplication factor in the preset lookup table;
the offset calculation formula is as follows:
[offset formula, rendered as image FDA0002731378710000052 in the original]
b' is the pixel value offset of the target pixel point;
the sumP′A pixel sum value representing a pixel point corresponding to a target pixel point in an integrogram of the downsampled input graph;
the sumI′Is shown inIn the integral image of the down-sampling guide image, the pixel sum value of the pixel point corresponding to the target pixel point;
the t' r/s represents a preset filter window width, and the s represents the sampling multiple.
5. The apparatus of claim 4,
the down-sampling module is further configured to perform down-sampling processing on the input map in at least one input map processing thread to obtain a down-sampled input map, and perform down-sampling processing on the guide map in at least one guide map processing thread to obtain a down-sampled guide map, where the at least one input map processing thread and the at least one guide map processing thread are threads that are processed in parallel in the digital signal processor, respectively;
the up-sampling module is further configured to perform up-sampling processing on the filter coefficient map in at least one filter coefficient map processing thread to obtain a first up-sampling map, and perform up-sampling processing on the filter offset map in at least one filter offset map processing thread to obtain a second up-sampling map, where the at least one filter coefficient map processing thread and the at least one filter offset map processing thread are threads of parallel processing in the digital signal processor, respectively.
6. The apparatus of claim 4, wherein the computation submodule comprises:
the variance calculation unit is used for calculating the pixel value variance of a target pixel point by using a pixel value variance calculation formula, wherein the target pixel point is any one pixel point in the down-sampling input image, and the pixel value variance calculation formula is as follows:
var = t′ · sum_I′I′ − sum_I′ · sum_I′
the var represents the pixel value variance of the target pixel point;
the sumI′I′Representing the step by step of the guide map under two said downsamplesIn an integral image of the image obtained by multiplying the pixel points, the pixel sum value of the pixel point corresponding to the target pixel point;
the covariance calculation unit is configured to calculate a pixel value covariance of the target pixel point by using a pixel value covariance calculation formula, where the pixel value covariance calculation formula is:
cov = t′ · sum_I′P′ − sum_I′ · sum_P′
the sumI′P′And expressing the pixel sum value of the pixel point corresponding to the pixel point in the integral image of the image obtained by multiplying the down-sampling guide image and the down-sampling input image pixel by pixel point.
7. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 3.
8. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 3.
CN201711078087.4A 2017-11-06 2017-11-06 Image processing method and device and mobile terminal Active CN107886481B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711078087.4A CN107886481B (en) 2017-11-06 2017-11-06 Image processing method and device and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711078087.4A CN107886481B (en) 2017-11-06 2017-11-06 Image processing method and device and mobile terminal

Publications (2)

Publication Number Publication Date
CN107886481A CN107886481A (en) 2018-04-06
CN107886481B true CN107886481B (en) 2021-01-08

Family

ID=61778834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711078087.4A Active CN107886481B (en) 2017-11-06 2017-11-06 Image processing method and device and mobile terminal

Country Status (1)

Country Link
CN (1) CN107886481B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109064414B (en) * 2018-07-06 2020-11-10 维沃移动通信有限公司 Image denoising method and device
CN109474784B (en) * 2018-11-21 2020-07-17 维沃移动通信有限公司 Preview image processing method and terminal equipment
CN110263730B (en) 2019-06-24 2022-01-21 北京达佳互联信息技术有限公司 Image recognition method and device, electronic equipment and storage medium
CN110458766B (en) * 2019-07-11 2023-08-25 天津大学 Snapshot hyperspectral image demosaicing method
CN111199523B (en) * 2019-12-24 2023-08-25 深圳供电局有限公司 Power equipment identification method, device, computer equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571401A (en) * 2013-10-18 2015-04-29 中国航天科工集团第三研究院第八三五八研究所 Implementing device of high-speed guiding filter on FPGA (field programmable gate array) platform
CN103745446B (en) * 2014-01-27 2017-08-29 广东威创视讯科技股份有限公司 Image guiding filtering method and system
CN104063847A (en) * 2014-06-18 2014-09-24 长春理工大学 FPGA based guide filter and achieving method thereof
CN106933579A (en) * 2017-03-01 2017-07-07 西安电子科技大学 Image rapid defogging method based on CPU+FPGA

Also Published As

Publication number Publication date
CN107886481A (en) 2018-04-06

Similar Documents

Publication Publication Date Title
CN107886481B (en) Image processing method and device and mobile terminal
CN108513070B (en) Image processing method, mobile terminal and computer readable storage medium
CN107909583B (en) Image processing method and device and terminal
CN108495029B (en) Photographing method and mobile terminal
CN108989672B (en) Shooting method and mobile terminal
CN107749046B (en) Image processing method and mobile terminal
CN112308806A (en) Image processing method, image processing device, electronic equipment and readable storage medium
CN111147752B (en) Zoom factor adjusting method, electronic device, and medium
CN111031234B (en) Image processing method and electronic equipment
CN111145087B (en) Image processing method and electronic equipment
CN111352892B (en) Operation processing method and electronic equipment
CN110007821B (en) Operation method and terminal equipment
CN116033269A (en) Linkage auxiliary anti-shake shooting method, equipment and computer readable storage medium
CN111008929A (en) Image correction method and electronic equipment
CN109104573B (en) Method for determining focusing point and terminal equipment
CN111062261A (en) Image processing method and device
CN107798662B (en) Image processing method and mobile terminal
CN110769162B (en) Electronic equipment and focusing method
CN110490953B (en) Text-based image generation method, terminal device and medium
CN108965701B (en) Jitter correction method and terminal equipment
CN111031265B (en) FSR (frequency selective response) determining method and electronic equipment
CN110807411B (en) Moon identification method and electronic equipment
CN111145083B (en) Image processing method, electronic equipment and computer readable storage medium
CN108845753B (en) Picture processing method and terminal
CN115221888A (en) Entity mention identification method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant