CN109903224B - Image scaling method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN109903224B
Authority
CN
China
Prior art keywords: value, original pixel, neighborhood, pixel point, point
Legal status: Active
Application number
CN201910071588.2A
Other languages
Chinese (zh)
Other versions
CN109903224A (en)
Inventor
谭伟
Current Assignee
Zhuhai Jieli Technology Co Ltd
Original Assignee
Zhuhai Jieli Technology Co Ltd
Application filed by Zhuhai Jieli Technology Co Ltd
Priority to CN201910071588.2A
Publication of CN109903224A
Application granted
Publication of CN109903224B

Landscapes

  • Image Processing (AREA)

Abstract

The application relates to an image scaling method, an image scaling apparatus, a computer device, and a storage medium. The method comprises the following steps: acquiring the pixel values of the original pixel points in a preset neighborhood around the mapping coordinate point of a pixel point to be interpolated in the target image; determining edge gradient direction information and edge strength information of the mapping coordinate point according to the pixel values of the original pixel points; determining the weight of each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient for each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information, and the weights; weighting the weight of each original pixel point by its edge enhancement attenuation correction coefficient to obtain the convolution filtering weights; and determining the pixel value of the pixel point to be interpolated in the target image according to the products of the convolution filtering weights and the pixel values of the original pixel points in the neighborhood. With this method, the edge information of the original image can be retained during scaling, and the quality of the scaled target image is improved.

Description

Image scaling method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image scaling method and apparatus, a computer device, and a storage medium.
Background
With the development of display technology, the demand for scaling images between different resolutions keeps growing. For example, in high-definition and ultra-high-definition display, existing video resources need to be scaled to a matching size by image scaling technology so that they can be displayed on a small screen or an ultra-high-definition screen at the required display resolution.
Traditional image scaling methods mainly comprise interpolation based on low-pass filtering and interpolation based on edges. Low-pass-filtering interpolation includes nearest-neighbor interpolation, bilinear interpolation, bicubic interpolation, and the like. Among the traditional interpolation algorithms, nearest-neighbor interpolation is simple and easy to implement and was widely used early on, but it produces obvious jagged edges and mosaic artifacts in the new image. Bilinear interpolation has a smoothing effect and can effectively overcome the defects of the nearest-neighbor method, but it degrades the high-frequency components of the image and blurs its details. At higher magnification factors, high-order interpolation such as bicubic and cubic spline interpolation performs better than low-order interpolation.
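As a point of comparison (illustrative only, and not part of the method claimed herein), a minimal sketch of bilinear interpolation for a single target pixel is shown below; the function name and array conventions are assumptions. Its four-pixel distance-weighted averaging is what produces the smoothing and detail blurring described above.

```python
import numpy as np

def bilinear_sample(img, y, x):
    """Conventional bilinear interpolation at fractional source coordinate (y, x).

    Illustrative sketch: the result is a distance-weighted average of the four
    nearest source pixels, which smooths edges and blurs fine detail.
    """
    h, w = img.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    dy, dx = y - y0, x - x0
    return ((1 - dy) * (1 - dx) * img[y0, x0] +
            (1 - dy) * dx * img[y0, x1] +
            dy * (1 - dx) * img[y1, x0] +
            dy * dx * img[y1, x1])
```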
In traditional image scaling, the gray values produced by interpolation continue the continuity of the gray-level changes of the original image, so the gray-level changes of the scaled image are natural and smooth. In an image, however, the gray value of some pixels changes abruptly relative to that of their neighbors, i.e., the gray level is discontinuous; these pixels with abrupt gray-value changes are the edge pixels that describe the contour or texture of objects in the image. Because interpolation smooths across such discontinuities, traditional scaling methods tend to blur or distort these edge pixels.
Disclosure of Invention
In view of the foregoing, it is desirable to provide an image scaling method, an image scaling apparatus, a computer device, and a storage medium, which can retain image edge information and improve the quality of a target image obtained after scaling.
A method of image scaling, the method comprising:
acquiring a pixel point to be interpolated in a target image; determining a mapping coordinate point of a pixel point to be interpolated in an original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image; determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel value of each original pixel point in the neighborhood; determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance; weighting the weight of each original pixel point by utilizing the edge enhancement attenuation correction coefficient to obtain the convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
An image scaling apparatus, the apparatus comprising:
the pixel point to be interpolated acquisition module is used for acquiring the pixel point to be interpolated in the target image;
the neighborhood pixel acquisition module is used for determining the mapping coordinate point of the pixel point to be interpolated in the original image and acquiring the pixel value of each original pixel point in the preset neighborhood around the mapping coordinate point in the original image;
the edge information determining module is used for determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel values of all original pixel points in the neighborhood;
the original weight determining module is used for determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood;
the enhancement attenuation correction coefficient determining module is used for determining the edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance;
the pixel value determining module of the pixel point to be interpolated is used for weighting the weight of each original pixel point by utilizing the edge enhancement attenuation correction coefficient to obtain the convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring a pixel point to be interpolated in a target image; determining a mapping coordinate point of a pixel point to be interpolated in an original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image; determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel value of each original pixel point in the neighborhood; determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance; weighting the weight of each original pixel point by using the edge enhancement attenuation correction coefficient to obtain a convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a pixel point to be interpolated in a target image; determining a mapping coordinate point of a pixel point to be interpolated in an original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image; determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel value of each original pixel point in the neighborhood; determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance; weighting the weight of each original pixel point by using the edge enhancement attenuation correction coefficient to obtain a convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
According to the above image scaling method and apparatus, computer device, and storage medium, the edge gradient direction information and edge strength information of the mapping coordinate point are determined from the pixel values of the original pixel points in the preset neighborhood around the mapping coordinate point in the original image corresponding to the pixel point to be interpolated; the weight of each original pixel point in the neighborhood is then enhanced or attenuated accordingly, and the pixel value of the pixel point to be interpolated is obtained from the enhanced or attenuated weights and the pixel values of the original pixel points. The method preserves the correlation of the pixels before and after interpolation and can interpolate edges in any direction while retaining the edge information of the original image, so the image remains sharp after being enlarged or reduced and phenomena such as jagged distortion are avoided; the integrity and correlation of the image edge information are maintained, the contour and texture of the scaled image are clearer, and the quality of the target image obtained after scaling is improved.
Drawings
FIG. 1 is a flow diagram illustrating an embodiment of an image scaling method;
FIG. 2 is a flowchart illustrating an image scaling method according to another embodiment;
FIG. 3 is a schematic diagram in one embodiment;
FIG. 4 is a schematic diagram in one embodiment;
FIG. 5 is a schematic diagram in one embodiment;
FIG. 6 is a schematic diagram in one embodiment;
FIG. 7 is a schematic view in one embodiment;
FIG. 8 is a block diagram showing the configuration of an image scaling apparatus according to an embodiment;
FIG. 9 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are intended only to illustrate the present application and not to limit it.
The image zooming method can be applied to the terminal. The terminal can be, but is not limited to, various televisions, personal computers, notebook computers, smart phones, tablet computers, portable wearable devices and the like.
In one embodiment, as shown in fig. 1, there is provided an image scaling method, including the steps S110-S180 of:
s110, acquiring pixel points to be interpolated in a target image;
the target image is obtained by zooming an original image, the image difference is a process of calculating unknown pixel points in the target image by using known pixel points in the original image, and the pixel points to be interpolated are the unknown pixel points which need to be determined through calculation in the zoomed target image.
S120, determining a mapping coordinate point of a pixel point to be interpolated in the original image;
By way of example, as shown in FIG. 3, assume that a low-resolution input original image is enlarged into a high-resolution target image, where the width and height of the original image are W_src and h_src and the width and height of the target image are W_tar and h_tar. For the pixel point to be interpolated at row I and column J of the target image, with coordinate value (I_tar, J_tar), the coordinate value of its mapping coordinate point in the original image is (I_src, J_src), which is obtained from (I_tar, J_tar) and the width and height ratios between the original image and the target image.
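As an illustration, the backward mapping might be implemented as in the following sketch. The simple ratio form (with no half-pixel offset) and the function name are assumptions layered on the description above rather than the patent's exact formula.

```python
def map_to_source(i_tar, j_tar, h_src, w_src, h_tar, w_tar):
    """Map target pixel (i_tar, j_tar) to fractional source coordinates (assumed ratio mapping)."""
    i_src = i_tar * h_src / h_tar   # row index scales with the height ratio
    j_src = j_tar * w_src / w_tar   # column index scales with the width ratio
    return i_src, j_src
```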
S130, acquiring a pixel value of each original pixel point in a preset neighborhood around a mapping coordinate point in the original image;
The number of original pixel points in the preset neighborhood around the mapping coordinate point can be determined according to actual needs; for example, the neighborhood can contain 2 × 2, 3 × 4, 3 × 3, or 6 × 6 original pixel points, and so on. The more original pixel points in the neighborhood are used, the better the effect of the target image generated after scaling and the more complete the information, but the amount of computation increases correspondingly, so an appropriate number of original pixel points in the neighborhood can be selected according to the scaling scenario.
As an example, as shown in FIG. 4, the original pixel points in the 4 × 4 neighborhood around the mapping coordinate point (I_src, J_src) of the pixel point to be interpolated (I_tar, J_tar) in the original image may be taken as the neighborhood pixels.
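A minimal sketch of gathering such a 4 × 4 neighbourhood is shown below; which 16 pixels surround a fractional coordinate and how the image border is handled are illustrative assumptions, since the text above does not spell out border behaviour.

```python
import numpy as np

def get_4x4_neighborhood(img, i_src, j_src):
    """Return the 4x4 block of original pixels around the fractional point (i_src, j_src).

    Takes the two rows/columns on either side of the containing cell and
    clamps indices at the image border (assumed handling).
    """
    h, w = img.shape
    i0 = int(np.floor(i_src)) - 1           # first row of the window
    j0 = int(np.floor(j_src)) - 1           # first column of the window
    rows = np.clip(np.arange(i0, i0 + 4), 0, h - 1)
    cols = np.clip(np.arange(j0, j0 + 4), 0, w - 1)
    return img[np.ix_(rows, cols)]          # p[i, j] with i, j in 0..3
```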
S140, determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel values of all original pixel points in the neighborhood;
the pixel value is a quantity representing a value of a pixel, and may be, for example, a channel value of the pixel in a certain color space, and taking an RGB color space as an example, the pixel value of the pixel may be represented as (R, G, B), where R, G, and B are channel values of three channels of the RGB color space, respectively.
In one embodiment, as shown in fig. 2, the step S140 of determining the edge gradient direction information and the edge strength information of the mapping coordinate point according to the pixel values of the original pixel points in the neighborhood includes steps S141 to S142:
s141, determining a horizontal edge gradient value of a mapping coordinate point and a corresponding horizontal edge strength value according to the pixel value of each original pixel point in the neighborhood;
in one embodiment, the step S141 of determining the horizontal edge gradient value and the corresponding horizontal edge strength value of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood includes: calculating the mean value of the pixel values of all original pixel points in the neighborhood; acquiring original pixel points of a preset row in original pixel points in a neighborhood; determining a horizontal edge gradient value of a mapping coordinate point according to a difference value of original pixel points of a preset row in the horizontal direction; and determining the horizontal edge strength value of the mapping coordinate point according to the difference value between the original pixel point of the preset line and the average value.
Further, in an embodiment, obtaining original pixel points in a preset row of original pixel points in a neighborhood includes: and acquiring original pixel points of the rows except the first row and the last row in the original pixel points in the neighborhood as original pixel points of a preset row.
As an example, taking the original pixel points in the 4 × 4 neighborhood of the mapping coordinate point (I_src, J_src), the mean value of the pixel values of the original pixel points in the neighborhood is calculated as follows:

avg = ( Σ_{i=0..3} Σ_{j=0..3} p_ij ) / 16

In the above formula, avg is the mean value of the pixel values of the original pixel points in the neighborhood, i is the row index of an original pixel point in the neighborhood and takes values 0 to 3, j is the column index and takes values 0 to 3, and p_ij is the pixel value of the original pixel point (i, j) in the neighborhood.
The horizontal edge gradient value of the mapping coordinate point (I_src, J_src) is calculated as follows:

h_diff0 = -1*p10 - 2*p11 + 2*p12 + 1*p13
h_diff1 = -1*p20 - 2*p21 + 2*p22 + 1*p23
h_diff = h_diff0 + h_diff1

In the above formulas, h_diff0 is the difference value of the first-row original pixel points in the neighborhood in the horizontal direction, h_diff1 is the difference value of the second-row original pixel points in the neighborhood in the horizontal direction, and h_diff is the horizontal edge gradient value of the mapping coordinate point.
The horizontal edge strength value corresponding to the mapping coordinate point (I_src, J_src) is calculated as follows:

h_var = abs(p10 - avg) + abs(p11 - avg) + abs(p12 - avg) + abs(p13 - avg) + abs(p20 - avg) + abs(p21 - avg) + abs(p22 - avg) + abs(p23 - avg)

In the above formula, h_var is the horizontal edge strength value and abs denotes the absolute value.
In the above embodiment, in order to calculate the horizontal edge gradient value and the corresponding horizontal edge intensity value of the mapping coordinate point simply from the original pixel points in the 4 × 4 neighborhood, the pixel values of the first and second rows, which have the highest correlation with the pixel point to be interpolated, are used (the rows and columns of the neighborhood are indexed 0, 1, 2, 3). Because the first and second columns of each row in the neighborhood are more strongly correlated with the pixel point to be interpolated than the zeroth and third columns, the column vector [-1, -2, 2, 1]^T is used to calculate the difference values of the first-row and second-row original pixel points in the horizontal direction, and the horizontal gradient direction of the edge is judged from the sign of the sum of the difference values along the row direction. The differences between the pixel values of all columns of the first and second rows and the mean pixel value of the neighborhood are then calculated, and the sum of their absolute values gives the horizontal edge strength value, which serves as the strength information of the edge in the horizontal direction.
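The horizontal computation above can be sketched directly from the formulas; the code below assumes p is the 4 × 4 neighbourhood indexed by rows 0-3 and columns 0-3, applies the [-1, -2, 2, 1] kernel to rows 1 and 2, and uses the absolute-difference-from-mean form of h_var described in this step.

```python
import numpy as np

def horizontal_edge_info(p):
    """p: 4x4 array of neighbourhood pixel values (rows 0-3, columns 0-3)."""
    avg = p.mean()                                   # mean of all 16 pixel values
    k = np.array([-1.0, -2.0, 2.0, 1.0])             # horizontal difference kernel
    h_diff0 = k @ p[1, :]                            # row 1 difference value
    h_diff1 = k @ p[2, :]                            # row 2 difference value
    h_diff = h_diff0 + h_diff1                       # horizontal edge gradient value
    h_var = np.abs(p[1, :] - avg).sum() + np.abs(p[2, :] - avg).sum()
    return h_diff, h_var                             # gradient direction info, strength info
```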
And S142, determining the vertical edge gradient value of the mapping coordinate point and the corresponding vertical edge strength value according to the pixel value of each original pixel point in the neighborhood.
In one embodiment, the S142 determines, according to the pixel value of each original pixel point in the neighborhood, a vertical edge gradient value and a corresponding vertical edge strength value of the mapping coordinate point, including: calculating the mean value of the pixel values of all original pixel points in the neighborhood; acquiring original pixel points of a preset column in original pixel points in a neighborhood; determining a vertical edge gradient value of a mapping coordinate point according to a difference value of original pixel points of a preset column in the vertical direction; and determining the vertical edge strength value of the mapping coordinate point according to the difference value between the original pixel point and the mean value of the preset column.
In one embodiment, obtaining a preset column of original pixel points in the neighborhood includes: acquiring the original pixel points of the columns other than the first column and the last column among the original pixel points in the neighborhood as the original pixel points of the preset columns.
By way of example, again taking the original pixel points in the 4 × 4 neighborhood of the mapping coordinate point (I_src, J_src), the mean value of the pixel values of the original pixel points in the neighborhood is calculated as follows:

avg = ( Σ_{i=0..3} Σ_{j=0..3} p_ij ) / 16

In the above formula, avg is the mean value of the pixel values of the original pixel points in the neighborhood, i is the row index of an original pixel point in the neighborhood and takes values 0 to 3, j is the column index and takes values 0 to 3, and p_ij is the pixel value of the original pixel point (i, j) in the neighborhood.
The vertical edge gradient value of the mapping coordinate point (I_src, J_src) is calculated as follows:

v_diff0 = -1*p01 - 2*p11 + 2*p21 + 1*p31
v_diff1 = -1*p02 - 2*p12 + 2*p22 + 1*p32
v_diff = v_diff0 + v_diff1

In the above formulas, v_diff0 is the difference value of the first-column original pixel points in the neighborhood in the vertical direction, v_diff1 is the difference value of the second-column original pixel points in the neighborhood in the vertical direction, and v_diff is the vertical edge gradient value of the mapping coordinate point.
The vertical edge strength value corresponding to the mapping coordinate point (I_src, J_src) is calculated as follows:

v_var = abs(p01 - avg) + abs(p11 - avg) + abs(p21 - avg) + abs(p31 - avg) + abs(p02 - avg) + abs(p12 - avg) + abs(p22 - avg) + abs(p32 - avg)

In the above formula, v_var is the vertical edge strength value and abs denotes the absolute value.
In the above embodiment, in order to calculate the vertical edge gradient value and the corresponding vertical edge intensity value of the mapping coordinate point simply from the original pixel points in the 4 × 4 neighborhood, the pixel values of the first and second columns, which have the highest correlation with the pixel point to be interpolated, are used (the rows and columns of the neighborhood are indexed 0, 1, 2, 3). Because the first and second rows of each column in the neighborhood are more strongly correlated with the pixel point to be interpolated than the zeroth and third rows, the vector [-1, -2, 2, 1] is used to calculate the difference values of the first-column and second-column original pixel points in the vertical direction, and the vertical gradient direction of the edge is judged from the sign of the sum of the difference values along the column direction. The differences between the pixel values of all rows of the first and second columns and the mean pixel value of the neighborhood are then calculated, and the sum of their absolute values gives the vertical edge strength value, which serves as the strength information of the edge in the vertical direction.
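The vertical computation mirrors the horizontal one, applying the same kernel down columns 1 and 2; the sketch below makes the same assumptions as the horizontal sketch.

```python
import numpy as np

def vertical_edge_info(p):
    """p: 4x4 array of neighbourhood pixel values (rows 0-3, columns 0-3)."""
    avg = p.mean()
    k = np.array([-1.0, -2.0, 2.0, 1.0])             # vertical difference kernel
    v_diff0 = k @ p[:, 1]                            # column 1 difference value
    v_diff1 = k @ p[:, 2]                            # column 2 difference value
    v_diff = v_diff0 + v_diff1                       # vertical edge gradient value
    v_var = np.abs(p[:, 1] - avg).sum() + np.abs(p[:, 2] - avg).sum()
    return v_diff, v_var
```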
S150, determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood;
Illustratively, the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood is calculated as the Euclidean distance between the two points:

r_sd(x, y) = sqrt( (x - I_src)^2 + (y - J_src)^2 )

where (x, y) is the coordinate value of any original pixel point in the 4 × 4 neighborhood centered on the mapping coordinate point of the pixel point to be interpolated, and r_sd(x, y) is the Gaussian distance between the mapping coordinate point and the original pixel point with coordinates (x, y) in the neighborhood.

Taking the original pixel points in the 4 × 4 neighborhood as the neighborhood pixels, the 16 original pixel points in this step have 16 coordinate values (x, y), from which 16 Gaussian distances r_sd(x, y) can be calculated respectively.
According to r_sd(x, y), the weight of the corresponding original pixel point in the neighborhood is determined as follows:

w(x, y) = n_x * exp( -r_sd(x, y)^2 / (2*σ^2) )

where (x, y) is the coordinate value of the corresponding original pixel point in the neighborhood, w(x, y) is the weight corresponding to r_sd(x, y), n_x is the normalization coefficient, r_sd(x, y) is the distance between the original pixel point in the neighborhood and the mapping coordinate point, and σ is the standard deviation of the corresponding Gaussian distribution.
Taking the original pixel points in the 4 × 4 neighborhood as the neighborhood pixels, in this step 16 weights for the 16 original pixel points can be calculated from the 16 Gaussian distances r_sd(x, y).
The above is the basic principle of the weight calculation in this step. In practical applications, to simplify the computation, the weights corresponding to different Gaussian distances can be obtained in this step directly by querying a stored weight table. Specifically, the corresponding weight value can be looked up, according to the Gaussian distance of each point in the neighborhood, from the Gaussian curve of the corresponding smooth low-pass filter, and the normalization coefficient n_x takes the value of the reciprocal of the sum of the weights of all points in the neighborhood.
In this embodiment, the contribution weight of each original pixel point in the neighborhood to the current pixel point to be interpolated can be effectively obtained through the Gaussian distance, so that the correlation between the pixel point to be interpolated and the original pixel points in the neighborhood of the original image is preserved to a certain extent, and the image edge characteristics of the original image are likewise preserved to a certain extent.
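A sketch of the Gaussian-distance weighting is given below; the Euclidean form of r_sd, the value of σ, and computing the weights on the fly rather than reading them from a stored weight table are illustrative assumptions consistent with this step (the normalization matches the reciprocal-of-the-sum rule stated above).

```python
import numpy as np

def gaussian_weights(i_src, j_src, i0, j0, sigma=1.0):
    """Gaussian distances and weights of the 4x4 neighbourhood starting at integer (i0, j0).

    (i_src, j_src) is the fractional mapping coordinate point; sigma is a tuning
    parameter of the assumed Gaussian low-pass curve.
    """
    ii, jj = np.meshgrid(np.arange(i0, i0 + 4), np.arange(j0, j0 + 4), indexing="ij")
    r_sd = np.sqrt((ii - i_src) ** 2 + (jj - j_src) ** 2)   # distance to the mapping point
    w = np.exp(-(r_sd ** 2) / (2.0 * sigma ** 2))           # Gaussian weight before normalization
    return r_sd, w / w.sum()                                # n_x = 1 / sum of the raw weights
```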
S160, determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance;
in one embodiment, as shown in fig. 2, the step S160 of determining the edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the gaussian distance includes steps S161-S163:
s161, determining a horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and the horizontal edge intensity value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point;
in one embodiment, the S161 determines the horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value of the mapping coordinate point, the horizontal edge strength value, and the gaussian distance of each original pixel point from the mapping coordinate point, including: determining the positive and negative of a horizontal edge enhancement attenuation index according to the horizontal edge gradient value of the mapping coordinate point; determining the magnitude of the enhancement attenuation base number of the horizontal edge according to the horizontal edge strength value of the mapping coordinate point; aiming at each original pixel point, taking the horizontal edge enhancement attenuation base number as the base number, taking the product of the horizontal edge enhancement attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index, and performing power operation to obtain the horizontal edge correction coefficient of the original pixel point.
As an example, the sign of the horizontal edge enhancement attenuation exponent is determined from the horizontal edge gradient value of the mapping coordinate point as follows:

h_signed = 1 when h_diff ≥ 0; h_signed = -1 when h_diff < 0

In the above formula, h_signed is the horizontal edge enhancement attenuation exponent.
Determining the magnitude of a horizontal edge enhancement attenuation base number alpha according to the horizontal edge intensity value of the mapping coordinate point;
aiming at each original pixel point, taking the horizontal edge enhancement attenuation base number as the base number, taking the product of the horizontal edge enhancement attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index, and performing power operation to obtain the horizontal edge correction coefficient of the original pixel point as shown in the following formula:
α'(x, y) = α ^ ( h_signed * r_sd(x, y) )

In the above formula, α'(x, y) is the horizontal edge correction coefficient of the original pixel point with coordinates (x, y).
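The power operation can be sketched as follows. Because the text specifies the mapping from the horizontal edge strength value to the base α only qualitatively, that mapping is passed in as a user-supplied function, and the saturating default shown is purely an assumption.

```python
import numpy as np

def horizontal_correction(h_diff, h_var, r_sd,
                          base_from_strength=lambda v: 1.0 + min(v, 64.0) / 256.0):
    """Horizontal edge correction coefficients alpha'(x, y) over a 4x4 neighbourhood.

    h_diff, h_var: horizontal edge gradient and strength of the mapping point.
    r_sd: 4x4 array of Gaussian distances from the previous step.
    base_from_strength: assumed monotone, saturating map from strength to the base alpha.
    """
    h_signed = 1.0 if h_diff >= 0 else -1.0      # sign of the enhancement attenuation exponent
    alpha = base_from_strength(h_var)            # base alpha grows with edge strength (assumption)
    return alpha ** (h_signed * r_sd)            # alpha'(x, y) = alpha^(h_signed * r_sd(x, y))
```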
To simplify the computation, in one embodiment, the step S161 of determining the horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and horizontal edge intensity value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point includes: determining the corresponding curve set according to the sign of the horizontal edge gradient value of the mapping coordinate point, the curve set being an enhancement curve set whose curves have slopes greater than zero or an attenuation curve set whose curves have slopes less than zero; selecting the corresponding curve from the determined curve set according to the horizontal edge strength value, the horizontal edge strength value being proportional to the absolute value of the slope of the selected curve; and, from the selected curve, querying the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point taken as the abscissa, and using it as the horizontal edge correction coefficient of that original pixel point.
As an example, when the horizontal edge gradient value h_diff of the mapping coordinate point is greater than or equal to zero, the enhancement curve set is selected; the enhancement curve set includes a preset number of enhancement curves, and each enhancement curve corresponds to one value interval of the horizontal edge strength value h_var;
FIG. 5 is a schematic diagram of 7 enhancement curves in an enhancement curve set in one embodiment. According to the horizontal edge strength value h_var, the corresponding enhancement curve is selected from the enhancement curve set as follows:

when h_var < strength_level_th0, the level0 curve is selected;
when strength_level_th0 ≤ h_var < strength_level_th1, the level1 curve is selected;
when strength_level_th1 ≤ h_var < strength_level_th2, the level2 curve is selected;
when strength_level_th2 ≤ h_var < strength_level_th3, the level3 curve is selected;
when strength_level_th3 ≤ h_var < strength_level_th4, the level4 curve is selected;
when strength_level_th4 ≤ h_var < strength_level_th5, the level5 curve is selected;
when strength_level_th5 ≤ h_var < strength_level_th6, the level6 curve is selected;
when strength_level_th6 ≤ h_var < strength_level_th7, the level7 curve is selected;
when h_var ≥ strength_level_th7, the level8 curve is selected;
Here, strength_level_th0 to strength_level_th7 are thresholds of the preset value intervals, which can be set as required, and the slopes of the corresponding curves level1 to level8 increase in turn; the slope of the level0 curve is 0, i.e., no enhancement is performed, and the value of α'(x, y) queried from that curve is 1 for every corresponding original pixel point (x, y).
After the corresponding enhancement curve is selected, the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point, taken as the abscissa, is queried from the selected enhancement curve and used as the horizontal edge correction coefficient of that original pixel point.
When the horizontal edge gradient value h_diff of the mapping coordinate point is less than zero, the attenuation curve set is selected; the attenuation curve set includes a preset number of attenuation curves, and each attenuation curve corresponds to one value interval of the horizontal edge strength value h_var;
FIG. 6 is a schematic diagram of 7 attenuation curves in an attenuation curve set in one embodiment. According to the horizontal edge intensity value h_var, the corresponding attenuation curve is selected from the attenuation curve set as follows:

when h_var < strength_level_th0, the level0 curve is selected;
when strength_level_th0 ≤ h_var < strength_level_th1, the level1 curve is selected;
when strength_level_th1 ≤ h_var < strength_level_th2, the level2 curve is selected;
when strength_level_th2 ≤ h_var < strength_level_th3, the level3 curve is selected;
when strength_level_th3 ≤ h_var < strength_level_th4, the level4 curve is selected;
when strength_level_th4 ≤ h_var < strength_level_th5, the level5 curve is selected;
when strength_level_th5 ≤ h_var < strength_level_th6, the level6 curve is selected;
when strength_level_th6 ≤ h_var < strength_level_th7, the level7 curve is selected;
when h_var ≥ strength_level_th7, the level8 curve is selected;
Here, strength_level_th0 to strength_level_th7 are thresholds of the preset value intervals, which can be set as required, and the slopes of the corresponding curves level1 to level8 decrease in turn (their absolute values increase in turn); the slope of the level0 curve is 0, i.e., no attenuation is performed, and the value of α'(x, y) queried from that curve is 1 for every corresponding original pixel point (x, y).
After the corresponding attenuation curve is selected, the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point, taken as the abscissa, is queried from the selected attenuation curve and used as the horizontal edge correction coefficient of that original pixel point.
In other embodiments, there are various alternatives to the above scheme of obtaining the corresponding horizontal edge correction coefficient through curve queries. For example, the curves may be converted into corresponding tables, and the horizontal edge correction coefficient of each original pixel point may be determined by table lookup according to the horizontal edge gradient value and horizontal edge intensity value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point.
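The threshold comparisons above amount to quantising h_var into a level index followed by a curve (or table) lookup. The sketch below assumes the curves of FIG. 5 and FIG. 6 have been pre-sampled into tables indexed by level and by a discretised Gaussian distance; the table contents and the sampling step are illustrative assumptions, not values from the patent.

```python
import numpy as np

def select_level(strength, thresholds):
    """Map an edge strength value to a curve level 0..len(thresholds).

    thresholds: ascending list [strength_level_th0, ..., strength_level_th7].
    """
    return int(np.searchsorted(thresholds, strength, side="right"))

def lookup_correction(level, r_sd, curve_table, r_step=0.125):
    """Query correction coefficients from a pre-sampled curve table.

    curve_table[level][k] is assumed to hold the curve value at abscissa k * r_step,
    so the Gaussian distance acts as the abscissa and the returned value as the ordinate.
    """
    samples = np.asarray(curve_table[level])
    idx = np.minimum((np.asarray(r_sd) / r_step).astype(int), len(samples) - 1)
    return samples[idx]
```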
S162, determining a vertical edge correction coefficient of each original pixel according to the vertical edge gradient value and the vertical edge strength value of the mapping coordinate and the Gaussian distance of each original pixel relative to the mapping coordinate;
In one embodiment, the step S162 of determining the vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point from the mapping coordinate point includes: determining the sign of the vertical edge enhancement attenuation exponent according to the vertical edge gradient value of the mapping coordinate point; determining the magnitude of the vertical edge enhancement attenuation base according to the vertical edge strength value of the mapping coordinate point; and, for each original pixel point, taking the vertical edge enhancement attenuation base as the base and the product of the vertical edge enhancement attenuation exponent and the Gaussian distance from the original pixel point to the mapping coordinate point as the exponent, performing a power operation to obtain the vertical edge correction coefficient of the original pixel point.
By way of example, the sign of the vertical edge enhancement attenuation exponent is determined from the vertical edge gradient value of the mapping coordinate point as follows:

v_signed = 1 when v_diff ≥ 0; v_signed = -1 when v_diff < 0

In the above formula, v_signed is the vertical edge enhancement attenuation exponent.
Determining the magnitude of the enhancement attenuation base number alpha of the vertical edge according to the vertical edge strength value of the mapping coordinate point;
aiming at each original pixel point, taking the enhancement attenuation base number of the vertical edge as a base number, taking the product of the enhancement attenuation index of the vertical edge and the Gaussian distance from the original pixel point to the mapping coordinate point as an index, and performing power operation to obtain the correction coefficient of the vertical edge of the original pixel point as shown in the following formula:
β'(x, y) = α ^ ( v_signed * r_sd(x, y) )

In the above formula, β'(x, y) is the vertical edge correction coefficient of the original pixel point with coordinates (x, y).
To simplify the computation, in one embodiment, the step S162 of determining the vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and vertical edge intensity value of the mapping coordinate point and the Gaussian distance between each original pixel point and the mapping coordinate point includes: determining the corresponding curve set according to the sign of the vertical edge gradient value of the mapping coordinate point, the curve set being an enhancement curve set whose curves have slopes greater than zero or an attenuation curve set whose curves have slopes less than zero; selecting the corresponding curve from the determined curve set according to the vertical edge strength value, the vertical edge strength value being proportional to the absolute value of the slope of the selected curve; and, from the selected curve, querying the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point taken as the abscissa, and using it as the vertical edge correction coefficient of that original pixel point.
As an example, when the vertical edge gradient value v_diff of the mapping coordinate point is greater than or equal to zero, the enhancement curve set is selected; the enhancement curve set includes a preset number of enhancement curves, and each enhancement curve corresponds to one value interval of the vertical edge strength value v_var;
FIG. 5 is a schematic diagram of 7 enhancement curves in an enhancement curve set in one embodiment. According to the vertical edge strength value v_var, the corresponding enhancement curve is selected from the enhancement curve set as follows:

when v_var < strength_level_th0, the level0 curve is selected;
when strength_level_th0 ≤ v_var < strength_level_th1, the level1 curve is selected;
when strength_level_th1 ≤ v_var < strength_level_th2, the level2 curve is selected;
when strength_level_th2 ≤ v_var < strength_level_th3, the level3 curve is selected;
when strength_level_th3 ≤ v_var < strength_level_th4, the level4 curve is selected;
when strength_level_th4 ≤ v_var < strength_level_th5, the level5 curve is selected;
when strength_level_th5 ≤ v_var < strength_level_th6, the level6 curve is selected;
when strength_level_th6 ≤ v_var < strength_level_th7, the level7 curve is selected;
when v_var ≥ strength_level_th7, the level8 curve is selected;
Here, strength_level_th0 to strength_level_th7 are thresholds of the preset value intervals, which can be set as required, and the slopes of the corresponding curves level1 to level8 increase in turn; the slope of the level0 curve is 0, i.e., no enhancement is performed, and the value of β'(x, y) queried from that curve is 1 for every corresponding original pixel point (x, y).
After the corresponding enhancement curve is selected, the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point, taken as the abscissa, is queried from the selected enhancement curve and used as the vertical edge correction coefficient of that original pixel point.
When the vertical edge gradient value v_diff of the mapping coordinate point is less than zero, the attenuation curve set is selected; the attenuation curve set includes a preset number of attenuation curves, and each attenuation curve corresponds to one value interval of the vertical edge strength value v_var;
FIG. 6 is a schematic diagram of 7 attenuation curves in an attenuation curve set in one embodiment. According to the vertical edge strength value v_var, the corresponding attenuation curve is selected from the attenuation curve set as follows:

when v_var < strength_level_th0, the level0 curve is selected;
when strength_level_th0 ≤ v_var < strength_level_th1, the level1 curve is selected;
when strength_level_th1 ≤ v_var < strength_level_th2, the level2 curve is selected;
when strength_level_th2 ≤ v_var < strength_level_th3, the level3 curve is selected;
when strength_level_th3 ≤ v_var < strength_level_th4, the level4 curve is selected;
when strength_level_th4 ≤ v_var < strength_level_th5, the level5 curve is selected;
when strength_level_th5 ≤ v_var < strength_level_th6, the level6 curve is selected;
when strength_level_th6 ≤ v_var < strength_level_th7, the level7 curve is selected;
when v_var ≥ strength_level_th7, the level8 curve is selected;
Here, strength_level_th0 to strength_level_th7 are thresholds of the preset value intervals, which can be set as required, and the slopes of the corresponding curves level1 to level8 decrease in turn (their absolute values increase in turn); the slope of the level0 curve is 0, i.e., no attenuation is performed, and the value of β'(x, y) queried from that curve is 1 for every corresponding original pixel point (x, y).
After the corresponding attenuation curve is selected, the ordinate corresponding to the Gaussian distance between each original pixel point and the mapping coordinate point, taken as the abscissa, is queried from the selected attenuation curve and used as the vertical edge correction coefficient of that original pixel point.
In other embodiments, there are various alternatives to the above scheme of obtaining the corresponding vertical edge correction coefficient through curve queries. For example, the curves may be converted into corresponding tables, and the vertical edge correction coefficient of each original pixel point may be determined by table lookup according to the vertical edge gradient value and vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point.
And S163, determining the edge enhancement attenuation correction coefficient of each original pixel point according to the horizontal edge correction coefficient and the vertical edge correction coefficient of each original pixel point.
By way of example, the edge enhancement attenuation correction coefficient coe_ij of each original pixel point may be determined from the horizontal edge correction coefficient α'(x, y) and the vertical edge correction coefficient β'(x, y) of that original pixel point, as shown in the following formula:

coe_ij = α'(x, y) × β'(x, y)

where i is the row index of the original pixel point in the neighborhood, j is the column index of the original pixel point in the neighborhood, coe_ij is the edge enhancement attenuation correction coefficient of the original pixel point (i, j) in the neighborhood, and (x, y) is the coordinate value of the original pixel point (i, j) in the neighborhood. Taking a neighborhood containing 4 × 4 original pixel points as an example, i takes values 0 to 3 and j takes values 0 to 3.
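Since α'(x, y) and β'(x, y) are computed for every original pixel point in the 4 × 4 neighbourhood, the combination reduces to an element-wise product; a minimal sketch, assuming the two coefficient sets are held as 4 × 4 arrays:

```python
def edge_correction_coefficients(alpha_h, beta_v):
    """coe[i, j] = alpha'(x, y) * beta'(x, y) for each original pixel point in the neighbourhood."""
    return alpha_h * beta_v   # element-wise product of the 4x4 horizontal and vertical coefficients
```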
S170, weighting the weight of each original pixel point by utilizing the edge enhancement attenuation correction coefficient to obtain a convolution filtering weight;
As an example, taking a neighborhood containing 4 × 4 original pixel points, the weight of each original pixel point is weighted by its edge enhancement attenuation correction coefficient to obtain the convolution filtering weight, as shown in the following formula:
w_ij = coe_ij × w_sad_ij

where the subscript ij denotes the original pixel point in row i and column j of the corresponding neighborhood, w_00 to w_33 are the enhanced or attenuated weights (the convolution filtering weights) of the original pixel points in the neighborhood, w_sad00 to w_sad33 are the weights of the original pixel points in the neighborhood before enhancement or attenuation, i.e., the weights calculated from the Gaussian distances of the original pixel points in step S150, and coe_00 to coe_33 are the edge enhancement attenuation correction coefficients of the original pixel points in the neighborhood determined in step S160.
and S180, determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
As an example, taking a neighborhood containing 4 × 4 original pixel points, the pixel value of the pixel point to be interpolated in the target image is determined from the products of the convolution filtering weights and the pixel values of the original pixel points in the neighborhood as follows:
pixel(I_tar, J_tar) = w_00*p00 + w_01*p01 + … + w_32*p32 + w_33*p33

where pixel(I_tar, J_tar) denotes the pixel value of the pixel point to be interpolated at row I and column J of the target image, w_00 to w_33 are the convolution filtering weights, and p00 to p33 are the pixel values of the original pixel points in the 4 × 4 neighborhood around the mapping coordinate point of the pixel point to be interpolated.
FIG. 7 is a schematic diagram of the scaling process of the convolution filtering window, in which h_step and v_step are the step offsets of the pixel points to be interpolated in the horizontal and vertical directions, respectively. With the mapping position of the pixel point to be interpolated as the center of the convolution window, the original pixel points in the 4 × 4 neighborhood around the mapping coordinates in the original image are taken, weighted, summed, and normalized to obtain the pixel value of the pixel point to be interpolated. Obtaining the pixel values of all pixel points to be interpolated in the target image by this convolution procedure completes the process of scaling the original image into the target image.
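Putting the steps together for one pixel point to be interpolated, a hedged end-to-end sketch is shown below. It reuses the illustrative helper functions from the preceding steps (all of which rest on the stated assumptions), simplifies border handling, and reuses the horizontal power-operation helper for the vertical direction because the two have the same form.

```python
import numpy as np

def interpolate_pixel(img, i_tar, j_tar, h_tar, w_tar, sigma=1.0):
    """Compute one target pixel by edge-aware convolution filtering (illustrative sketch)."""
    h_src, w_src = img.shape
    i_src, j_src = map_to_source(i_tar, j_tar, h_src, w_src, h_tar, w_tar)
    i0, j0 = int(np.floor(i_src)) - 1, int(np.floor(j_src)) - 1   # window origin (borders simplified)
    p = get_4x4_neighborhood(img, i_src, j_src)                   # 4x4 original pixel values
    h_diff, h_var = horizontal_edge_info(p)                       # edge gradient/strength, horizontal
    v_diff, v_var = vertical_edge_info(p)                         # edge gradient/strength, vertical
    r_sd, w_sad = gaussian_weights(i_src, j_src, i0, j0, sigma)   # distances and Gaussian weights
    alpha_h = horizontal_correction(h_diff, h_var, r_sd)          # horizontal correction coefficients
    beta_v = horizontal_correction(v_diff, v_var, r_sd)           # same power form, vertical inputs
    coe = edge_correction_coefficients(alpha_h, beta_v)           # combined correction coefficients
    w = coe * w_sad                                               # convolution filtering weights
    return float((w * p).sum() / w.sum())                         # weighted sum, then normalize
```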
According to the image scaling method, the edge gradient direction information and the edge strength information of the mapping coordinate point are determined according to the pixel value of each original pixel point in the neighborhood preset around the mapping coordinate point in the original image corresponding to the pixel point to be interpolated, the weight of each original pixel point in the neighborhood is enhanced or attenuated to obtain the weight of each enhanced or attenuated original pixel point, and the pixel value of the pixel point to be interpolated is obtained according to the weight of each enhanced or attenuated original pixel point and the pixel value of each original pixel point. The method can keep the correlation of the pixels before and after the interpolation process, and can carry out interpolation processing on the edge in any direction so as to keep the edge information of the original image, so that the image can be kept clear after being amplified or reduced, the phenomena of sawtooth distortion and the like are avoided, the integrity and the correlation of the edge information of the image are kept, the outline and the texture of the zoomed image are clearer, and the quality of the target image obtained after the zooming is improved.
It should be understood that although the steps in the flowcharts of FIGS. 1-3 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, there is no strict restriction on the order in which these steps are performed, and they may be performed in other orders. Moreover, at least some of the steps in FIGS. 1-3 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and these sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least part of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided an image scaling apparatus 800 comprising: a pixel point to be interpolated obtaining module 810, a neighborhood pixel obtaining module 820, an edge information determining module 830, an original weight determining module 840, an enhanced attenuation correction coefficient determining module 850, and a pixel point to be interpolated pixel value determining module 860, wherein:
a to-be-interpolated pixel point obtaining module 810, configured to obtain a to-be-interpolated pixel point in a target image;
a neighborhood pixel obtaining module 820, configured to determine a mapping coordinate point of a pixel to be interpolated in an original image, and obtain a pixel value of each original pixel in a preset neighborhood around the mapping coordinate point in the original image;
an edge information determining module 830, configured to determine edge gradient direction information and edge strength information of the mapping coordinate point according to a pixel value of each original pixel point in a neighborhood;
the original weight determining module 840 is used for determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood;
an enhanced attenuation correction coefficient determining module 850, configured to determine an edge enhanced attenuation correction coefficient of each original pixel point in a neighborhood according to the edge gradient direction information, the edge strength information, and the gaussian distance;
a pixel value determining module 860 for weighting the weight of each original pixel by using the edge-enhanced attenuation correction coefficient to obtain a convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
In one embodiment, the edge information determination module 830 includes:
the horizontal edge information determining module is used for determining the horizontal edge gradient value of the mapping coordinate point and the corresponding horizontal edge strength value according to the pixel value of each original pixel point in the neighborhood;
and the vertical edge information determining module is used for determining the vertical edge gradient value of the mapping coordinate point and the corresponding vertical edge strength value according to the pixel value of each original pixel point in the neighborhood.
In one embodiment, the horizontal edge information determination module is to: calculating the mean value of the pixel values of all original pixel points in the neighborhood; acquiring original pixel points of a preset row in original pixel points in a neighborhood; determining a horizontal edge gradient value of a mapping coordinate point according to a difference value of original pixel points of a preset row in the horizontal direction; and determining the horizontal edge strength value of the mapping coordinate point according to the difference value between the original pixel point of the preset line and the average value.
In one embodiment, the vertical edge information determination module is to: calculating the mean value of the pixel values of all original pixel points in the neighborhood; acquiring original pixel points of a preset column in original pixel points in a neighborhood; determining a vertical edge gradient value of a mapping coordinate point according to a difference value of original pixel points of a preset column in the vertical direction; and determining the vertical edge strength value of the mapping coordinate point according to the difference value between the original pixel point and the mean value of the preset column.
In one embodiment, the enhanced attenuation correction coefficient determining module 850 includes:
the horizontal edge correction coefficient determining module is used for determining the horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and the horizontal edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point;
the vertical edge correction coefficient determining module is used for determining the vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and the vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point;
and the edge enhancement attenuation correction coefficient determining module is used for determining the edge enhancement attenuation correction coefficient of each original pixel point according to the horizontal edge correction coefficient and the vertical edge correction coefficient of each original pixel point.
In one embodiment, the horizontal edge correction factor determination module is to: determining the positive and negative of a horizontal edge enhancement attenuation index according to the horizontal edge gradient value of the mapping coordinate point; determining the magnitude of the horizontal edge enhancement attenuation base number according to the horizontal edge strength value of the mapping coordinate point; aiming at each original pixel point, taking the horizontal edge enhancement attenuation base number as the base number, taking the product of the horizontal edge enhancement attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index, and performing power operation to obtain the horizontal edge correction coefficient of the original pixel point.
In one embodiment, the horizontal edge correction factor determination module is to: determining a corresponding curve set according to the positive and negative of the gradient value of the horizontal edge of the mapping coordinate point; the curve set comprises an enhancement curve set with the curve slope larger than zero or an attenuation curve set with the curve slope smaller than zero; selecting a corresponding curve from the determined curve set according to the horizontal edge strength value, wherein the horizontal edge strength value is in direct proportion to the absolute value of the slope of the selected corresponding curve; and inquiring, from the selected curve, a vertical coordinate corresponding to the abscissa by taking the Gaussian distance between each original pixel point and the mapping coordinate point as the abscissa, and taking the vertical coordinate as the horizontal edge correction coefficient of each original pixel point.
In one embodiment, the vertical edge correction factor determination module is to: determining the positive and negative of a vertical edge enhanced attenuation index according to the vertical edge gradient value of the mapping coordinate point; determining the enhancement attenuation base number of the vertical edge according to the vertical edge strength value of the mapping coordinate point; and aiming at each original pixel point, performing power operation by taking the vertical edge enhanced attenuation base number as the base number and taking the product of the vertical edge enhanced attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index to obtain the vertical edge correction coefficient of the original pixel point.
In one embodiment, the vertical edge correction factor determination module is to: determining a corresponding curve set according to the positive and negative of the gradient value of the vertical edge of the mapping coordinate point; the curve set comprises an enhancement curve set with the curve slope larger than zero or an attenuation curve set with the curve slope smaller than zero; selecting a corresponding curve from the determined curve set according to the vertical edge strength value, wherein the vertical edge strength value is in direct proportion to the absolute value of the slope of the selected corresponding curve; and inquiring a vertical coordinate corresponding to the abscissa by taking the Gaussian distance between each original pixel point and the mapping coordinate point as the abscissa from the selected curve, and taking the vertical coordinate as the vertical edge correction coefficient of each original pixel point.
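The curve-based variants of the two correction coefficient modules above can be read as a table lookup: the sign of the edge gradient value selects either the enhancement curve set or the attenuation curve set, the edge strength value selects a curve within the set (a stronger edge giving a steeper curve), and the Gaussian distance of each original pixel point is used as the abscissa to read off its correction coefficient. The Python sketch below illustrates this reading; the number of curves, the exponential curve family, the abscissa range and the mapping from strength value to curve index are all assumptions, as is the convention that a positive gradient selects the enhancement set.

import numpy as np

# Hypothetical discretization: 8 curves per set, 64 abscissa samples over [0, R_MAX].
N_CURVES, N_SAMPLES, R_MAX = 8, 64, 4.0
r_axis = np.linspace(0.0, R_MAX, N_SAMPLES)

# Enhancement curves (slope > 0) and attenuation curves (slope < 0); the
# exponential family used here is only an illustrative shape.
slopes = np.linspace(0.05, 0.6, N_CURVES)                  # |slope| grows with edge strength
enhance_set = np.exp(+slopes[:, None] * r_axis[None, :])   # shape (N_CURVES, N_SAMPLES)
attenuate_set = np.exp(-slopes[:, None] * r_axis[None, :])

def curve_correction(grad: float, strength: float, r_sd: float,
                     strength_max: float = 255.0) -> float:
    """Correction coefficient of one original pixel point, read from a curve."""
    curve_set = enhance_set if grad > 0 else attenuate_set  # sign of gradient picks the set
    idx = min(int(strength / strength_max * (N_CURVES - 1)), N_CURVES - 1)
    return float(np.interp(r_sd, r_axis, curve_set[idx]))   # Gaussian distance as abscissa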
In one embodiment, the original weight determining module is configured to determine, according to a Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood, the weight of each original pixel point in the neighborhood as follows:
w_(x,y) = n_x * exp(-r_sd(x,y)^2 / (2σ^2))
wherein (x, y) is the coordinate value of an original pixel point in the neighborhood, r_sd(x, y) is the Gaussian distance between the original pixel point in the neighborhood and the mapping coordinate point, w_(x,y) is the weight of the original pixel point (x, y), n_x is the normalization coefficient, and σ is the standard deviation of the corresponding Gaussian distribution.
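Assuming the weight takes the standard Gaussian form implied by the symbols above, with n_x acting as a normalization coefficient, the original weight computation can be sketched as follows; the normalization to a unit sum is an assumption made for the sketch.

import numpy as np

def gaussian_weights(r_sd: np.ndarray, sigma: float) -> np.ndarray:
    """Weights of the neighborhood pixels from their Gaussian distances r_sd.

    Assumes w_(x,y) = n_x * exp(-r_sd(x,y)^2 / (2 * sigma^2)), with n_x chosen
    so that the weights sum to one.
    """
    w = np.exp(-(r_sd ** 2) / (2.0 * sigma ** 2))
    return w / w.sum()  # n_x: normalization so that the weights sum to 1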
In one embodiment, the enhanced attenuation correction coefficient determining module 850 is configured to determine the edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information, and the Gaussian distance as follows:
coe_ij = alpha^(h_signed * r_sd(x,y)) * beta^(v_signed * r_sd(x,y))
in the above formula, coe_ij is the edge enhancement attenuation correction coefficient of the original pixel point in the ith row and jth column of the neighborhood, (x, y) is the coordinate value of that original pixel point, r_sd(x, y) is the Gaussian distance between that original pixel point and the mapping coordinate point, the positive and negative of h_signed is determined according to the positive and negative of the horizontal edge gradient value, the positive and negative of v_signed is determined according to the positive and negative of the vertical edge gradient value, the value of alpha is positively correlated with the horizontal edge strength value, and the value of beta is positively correlated with the vertical edge strength value.
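As a concrete illustration of the formula above, the following sketch assumes that the horizontal and vertical factors combine as a product and that the bases alpha and beta grow linearly with the corresponding strength values through a hypothetical scale factor k:

import numpy as np

def correction_coeffs(r_sd: np.ndarray, h_grad: float, v_grad: float,
                      h_var: float, v_var: float,
                      k: float = 1.0 / 255.0) -> np.ndarray:
    """Edge enhancement attenuation correction coefficients coe_ij.

    Assumptions: h_signed/v_signed are +1 or -1 taken from the gradient signs,
    alpha/beta grow with the strength values via the hypothetical factor k,
    and the two directional factors are multiplied together.
    """
    h_signed = 1.0 if h_grad >= 0 else -1.0
    v_signed = 1.0 if v_grad >= 0 else -1.0
    alpha = 1.0 + k * h_var  # base positively correlated with the horizontal strength
    beta = 1.0 + k * v_var   # base positively correlated with the vertical strength
    return (alpha ** (h_signed * r_sd)) * (beta ** (v_signed * r_sd))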
For the specific definition of the image scaling apparatus, reference may be made to the above definition of the image scaling method, which is not repeated here. The modules in the image scaling apparatus may be implemented wholly or partially by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor of the computer device in the form of hardware, or stored in a memory of the computer device in the form of software, so that the processor can invoke and execute the operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and whose internal structure may be as shown in fig. 9. The computer device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by the processor to implement an image scaling method. The display screen of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer device may be a touch layer covering the display screen, a button, a trackball or a touchpad provided on the housing of the computer device, or an external keyboard, touchpad or mouse.
Those skilled in the art will appreciate that the architecture shown in fig. 9 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computer devices to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring a pixel point to be interpolated in a target image; determining a mapping coordinate point of a pixel point to be interpolated in an original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image; determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel value of each original pixel point in the neighborhood; determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance; weighting the weight of each original pixel point by utilizing the edge enhancement attenuation correction coefficient to obtain the convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
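Pulling the listed steps together, and reusing the edge_info, gaussian_weights and correction_coeffs helpers sketched earlier, one possible end-to-end routine for a single target pixel looks as follows. The mapping of target coordinates to source coordinates, the use of the Euclidean distance to the mapping point as the Gaussian distance, the border clipping and the final renormalization of the convolution filtering weights are assumptions made to keep the sketch self-contained.

import numpy as np

def scale_pixel(src: np.ndarray, tx: int, ty: int,
                scale_x: float, scale_y: float, sigma: float = 1.0) -> float:
    """Interpolated value of target pixel (tx, ty) from the source image src."""
    h, w = src.shape
    # Mapping coordinate point of the pixel to be interpolated in the original image.
    mx, my = tx / scale_x, ty / scale_y

    # 4x4 neighborhood of original pixel points around the mapping coordinate
    # point (indices clipped at the image border for simplicity).
    x0, y0 = int(np.floor(mx)) - 1, int(np.floor(my)) - 1
    xs = np.clip(np.arange(x0, x0 + 4), 0, w - 1)
    ys = np.clip(np.arange(y0, y0 + 4), 0, h - 1)
    nb = src[np.ix_(ys, xs)].astype(np.float64)

    # Gaussian distances of the neighborhood pixels to the mapping coordinate
    # point (taken here as plain Euclidean distances).
    gx, gy = np.meshgrid(xs, ys)
    r_sd = np.hypot(gx - mx, gy - my)

    h_grad, h_var, v_grad, v_var = edge_info(nb)           # edge direction and strength
    w_base = gaussian_weights(r_sd, sigma)                 # original weights
    coe = correction_coeffs(r_sd, h_grad, v_grad, h_var, v_var)

    w_filt = w_base * coe                                  # convolution filtering weights
    w_filt /= w_filt.sum()                                 # renormalize (assumption)
    return float((w_filt * nb).sum())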
In other embodiments, the processor, when executing the computer program, further performs the steps of the image scaling method according to any of the above embodiments.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
acquiring a pixel point to be interpolated in a target image; determining a mapping coordinate point of a pixel point to be interpolated in an original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image; determining edge gradient direction information and edge strength information of the mapping coordinate points according to the pixel value of each original pixel point in the neighborhood; determining the weight of each original pixel point in the neighborhood according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood; determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance; weighting the weight of each original pixel point by using the edge enhancement attenuation correction coefficient to obtain a convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
In other embodiments, the computer program, when executed by the processor, further implements the steps of the image scaling method of any of the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database or other medium used in the embodiments provided herein can include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in the combination of these technical features, such combinations should be considered to be within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A method of image scaling, the method comprising:
acquiring a pixel point to be interpolated in a target image;
determining a mapping coordinate point of the pixel point to be interpolated in the original image, and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image;
determining edge gradient direction information and edge strength information of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood, wherein the edge gradient direction information comprises a horizontal edge gradient value and a vertical edge gradient value, the edge strength information comprises a horizontal edge strength value and a vertical edge strength value,
selecting original pixel points in a neighborhood of 4x4 around the mapping coordinate point, wherein the mean value of pixel values of all the original pixel points in the neighborhood is calculated as follows:
avg = (1/16) * Σ_(i=0..3) Σ_(j=0..3) p_(i,j)
avg is the average value of the pixel values of all the original pixel points in the neighborhood, i is the row subscript of the original pixel points in the neighborhood and takes values 0 to 3, j is the column subscript of the original pixel points in the neighborhood and takes values 0 to 3, and p_(i,j) is the pixel value of the original pixel point (i, j) in the neighborhood,
the horizontal edge strength value is calculated as follows:
h_var = Σ abs(p_(i,j) - avg), summed over the original pixel points of the preset rows of the neighborhood
wherein h_var is the horizontal edge strength value and abs denotes the absolute value,
the vertical edge strength value is calculated as follows:
v_var = Σ abs(p_(i,j) - avg), summed over the original pixel points of the preset columns of the neighborhood
in the formula, v_var is the vertical edge strength value and abs denotes the absolute value;
determining the weight of each original pixel point in the neighborhood by adopting the following formula according to the Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood:
w_(x,y) = n_x * exp(-r_sd(x,y)^2 / (2σ^2))
wherein (x, y) is the coordinate value of an original pixel point in the neighborhood, r_sd(x, y) is the Gaussian distance between the original pixel point with coordinate value (x, y) in the neighborhood and the mapping coordinate point, w_(x,y) is the weight of the original pixel point with coordinate value (x, y), n_x is the normalization coefficient, and σ is the standard deviation of the corresponding Gaussian distribution;
determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance, wherein the edge enhancement attenuation correction coefficient is calculated by adopting the following formula:
coe_ij = alpha^(h_signed * r_sd(x,y)) * beta^(v_signed * r_sd(x,y))
wherein coe_ij is the edge enhancement attenuation correction coefficient of the original pixel point in the ith row and jth column of the neighborhood, h_signed is the horizontal edge enhancement attenuation index, the positive and negative of h_signed being determined according to the positive and negative of the horizontal edge gradient value, v_signed is the vertical edge enhancement attenuation index, the positive and negative of v_signed being determined according to the positive and negative of the vertical edge gradient value, alpha is the horizontal edge enhancement attenuation base number, the value of alpha being positively correlated with the horizontal edge strength value, and beta is the vertical edge enhancement attenuation base number, the value of beta being positively correlated with the vertical edge strength value;
weighting the weight of each original pixel point by using the edge enhancement attenuation correction coefficient to obtain a convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
2. The method according to claim 1, wherein the determining edge gradient direction information and edge strength information of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood comprises:
determining a horizontal edge gradient value and a corresponding horizontal edge strength value of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood;
and determining the vertical edge gradient value and the corresponding vertical edge strength value of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood.
3. The method of claim 2, wherein determining the horizontal edge gradient value and the corresponding horizontal edge strength value for the mapped coordinate point according to the pixel values of the original pixels in the neighborhood comprises:
calculating the mean value of the pixel values of all the original pixel points in the neighborhood;
acquiring original pixel points of a preset row in the original pixel points in the neighborhood;
determining a horizontal edge gradient value of the mapping coordinate point according to the difference value of the original pixel points of the preset row in the horizontal direction;
and determining the horizontal edge strength value of the mapping coordinate point according to the difference value between the original pixel point of the preset row and the mean value.
4. The method as claimed in claim 2, wherein determining the vertical edge gradient value and the corresponding vertical edge strength value of the mapping coordinate point according to the pixel values of the original pixels in the neighborhood includes:
calculating the mean value of the pixel values of all the original pixel points in the neighborhood;
acquiring original pixel points of a preset column in the original pixel points in the neighborhood;
determining a vertical edge gradient value of the mapping coordinate point according to the difference value of the original pixel points of the preset column in the vertical direction;
and determining the vertical edge strength value of the mapping coordinate point according to the difference value between the original pixel point of the preset column and the average value.
5. The method according to any one of claims 1 to 4, wherein the determining an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood according to the edge gradient direction information, the edge strength information and the Gaussian distance comprises:
determining a horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and the horizontal edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point;
determining a vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and the vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point;
and determining the edge enhancement attenuation correction coefficient of each original pixel point according to the horizontal edge correction coefficient and the vertical edge correction coefficient of each original pixel point.
6. The method of claim 5, wherein the determining a horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and the horizontal edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point comprises:
determining the positive and negative of a horizontal edge enhancement attenuation index according to the horizontal edge gradient value of the mapping coordinate point;
determining the magnitude of the horizontal edge enhancement attenuation base number according to the horizontal edge strength value of the mapping coordinate point;
aiming at each original pixel point, taking the horizontal edge enhancement attenuation base number as the base number, taking the product of the horizontal edge enhancement attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index, and performing power operation to obtain the horizontal edge correction coefficient of the original pixel point.
7. The method of claim 5, wherein the determining a horizontal edge correction coefficient of each original pixel point according to the horizontal edge gradient value and the horizontal edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point comprises:
determining a corresponding curve set according to the positive and negative of the horizontal edge gradient value of the mapping coordinate point; the curve set comprises an enhancement curve set with a curve slope larger than zero or an attenuation curve set with a curve slope smaller than zero;
selecting a corresponding curve from the determined curve set according to the horizontal edge strength value, wherein the horizontal edge strength value is in direct proportion to the absolute value of the slope of the selected corresponding curve;
and inquiring a vertical coordinate corresponding to the abscissa by taking the Gaussian distance between each original pixel point and the mapping coordinate point as the abscissa from the selected curve, and taking the vertical coordinate as the horizontal edge correction coefficient of each original pixel point.
8. The method of claim 5, wherein the determining a vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and the vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point comprises:
determining the positive and negative of a vertical edge enhanced attenuation index according to the vertical edge gradient value of the mapping coordinate point;
determining the enhancement attenuation base number of the vertical edge according to the vertical edge strength value of the mapping coordinate point;
and aiming at each original pixel point, performing power operation by taking the vertical edge enhanced attenuation base number as the base number and taking the product of the vertical edge enhanced attenuation index and the Gaussian distance from the original pixel point to the mapping coordinate point as the index to obtain the vertical edge correction coefficient of the original pixel point.
9. The method of claim 5, wherein the determining a vertical edge correction coefficient of each original pixel point according to the vertical edge gradient value and the vertical edge strength value of the mapping coordinate point and the Gaussian distance of each original pixel point relative to the mapping coordinate point comprises:
determining a corresponding curve set according to the positive and negative of the vertical edge gradient value of the mapping coordinate point; the curve set comprises an enhancement curve set with a curve slope larger than zero or an attenuation curve set with a curve slope smaller than zero;
selecting a corresponding curve from the determined curve set according to the vertical edge strength value, wherein the vertical edge strength value is in direct proportion to the absolute value of the slope of the selected corresponding curve;
and inquiring a vertical coordinate corresponding to an abscissa by taking the Gaussian distance between each original pixel point and the mapping coordinate point as the abscissa from the selected curve, and taking the vertical coordinate as a vertical edge correction coefficient of each original pixel point.
10. An image scaling apparatus, characterized in that the apparatus comprises:
the to-be-interpolated pixel point acquisition module is used for acquiring a to-be-interpolated pixel point in the target image;
the neighborhood pixel acquisition module is used for determining a mapping coordinate point of the pixel point to be interpolated in the original image and acquiring a pixel value of each original pixel point in a preset neighborhood around the mapping coordinate point in the original image;
an edge information determining module, configured to determine, according to a pixel value of each original pixel point in the neighborhood, edge gradient direction information and edge strength information of the mapping coordinate point, where the edge gradient direction information includes a horizontal edge gradient value and a vertical edge gradient value, and the edge strength information includes a horizontal edge strength value and a vertical edge strength value; selecting original pixel points in a 4x4 neighborhood around the mapping coordinate point, wherein the average value of the pixel values of all the original pixel points in the neighborhood is calculated as follows:
avg = (1/16) * Σ_(i=0..3) Σ_(j=0..3) p_(i,j)
avg is the average value of the pixel values of all the original pixel points in the neighborhood, i is the row subscript of the original pixel points in the neighborhood and takes values 0 to 3, j is the column subscript of the original pixel points in the neighborhood and takes values 0 to 3, and p_(i,j) is the pixel value of the original pixel point (i, j) in the neighborhood; the horizontal edge strength value is calculated as follows:
h_var = Σ abs(p_(i,j) - avg), summed over the original pixel points of the preset rows of the neighborhood
in the formula, h_var is the horizontal edge strength value and abs denotes the absolute value; the vertical edge strength value is calculated as follows:
v_var = Σ abs(p_(i,j) - avg), summed over the original pixel points of the preset columns of the neighborhood
in the formula, v_var is the vertical edge strength value and abs denotes the absolute value;
an original weight determining module, configured to determine, according to a Gaussian distance between the mapping coordinate point and each original pixel point in the neighborhood, a weight of each original pixel point in the neighborhood by using the following formula:
w_(x,y) = n_x * exp(-r_sd(x,y)^2 / (2σ^2))
wherein (x, y) is the coordinate value of an original pixel point in the neighborhood, r_sd(x, y) is the Gaussian distance between the original pixel point with coordinate value (x, y) in the neighborhood and the mapping coordinate point, w_(x,y) is the weight of the original pixel point with coordinate value (x, y), n_x is the normalization coefficient, and σ is the standard deviation of the corresponding Gaussian distribution;
an enhanced attenuation correction coefficient determining module, configured to determine, according to the edge gradient direction information, the edge strength information, and the Gaussian distance, an edge enhancement attenuation correction coefficient of each original pixel point in the neighborhood, where the edge enhancement attenuation correction coefficient is calculated by using the following formula:
coe_ij = alpha^(h_signed * r_sd(x,y)) * beta^(v_signed * r_sd(x,y))
wherein coe_ij is the edge enhancement attenuation correction coefficient of the original pixel point in the ith row and jth column of the neighborhood, h_signed is the horizontal edge enhancement attenuation index, the positive and negative of h_signed being determined according to the positive and negative of the horizontal edge gradient value, v_signed is the vertical edge enhancement attenuation index, the positive and negative of v_signed being determined according to the positive and negative of the vertical edge gradient value, alpha is the horizontal edge enhancement attenuation base number, the value of alpha being positively correlated with the horizontal edge strength value, and beta is the vertical edge enhancement attenuation base number, the value of beta being positively correlated with the vertical edge strength value;
the pixel value determining module of the pixel point to be interpolated is used for weighting the weight of each original pixel point by utilizing the edge enhancement attenuation correction coefficient to obtain the convolution filtering weight; and determining the pixel value of the pixel point to be interpolated in the target image according to the product of the convolution filtering weight and the pixel value of each original pixel point in the neighborhood.
11. The apparatus of claim 10, wherein the edge information determining module comprises:
the horizontal edge information determining module is used for determining a horizontal edge gradient value and a corresponding horizontal edge strength value of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood;
and the vertical edge information determining module is used for determining the vertical edge gradient value and the corresponding vertical edge strength value of the mapping coordinate point according to the pixel value of each original pixel point in the neighborhood.
12. The apparatus according to claim 11, wherein the horizontal edge information determining module is configured to calculate a mean value of pixel values of original pixels in the neighborhood; acquiring original pixel points of a preset row in the original pixel points in the neighborhood; determining a horizontal edge gradient value of the mapping coordinate point according to the difference value of the original pixel points of the preset row in the horizontal direction; and determining the horizontal edge intensity value of the mapping coordinate point according to the difference value between the original pixel point of the preset row and the mean value.
13. The apparatus of claim 11, wherein the vertical edge information determining module is configured to calculate a mean value of pixel values of original pixels in the neighborhood; acquiring original pixel points of a preset column in the original pixel points in the neighborhood; determining a vertical edge gradient value of the mapping coordinate point according to a difference value of the original pixel points of the preset column in the vertical direction; and determining the vertical edge strength value of the mapping coordinate point according to the difference value between the original pixel point of the preset column and the average value.
14. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the image scaling method according to any of claims 1 to 9 when executing the computer program.
15. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image scaling method according to any one of claims 1 to 9.
CN201910071588.2A 2019-01-25 2019-01-25 Image scaling method and device, computer equipment and storage medium Active CN109903224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910071588.2A CN109903224B (en) 2019-01-25 2019-01-25 Image scaling method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910071588.2A CN109903224B (en) 2019-01-25 2019-01-25 Image scaling method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109903224A CN109903224A (en) 2019-06-18
CN109903224B true CN109903224B (en) 2023-03-31

Family

ID=66944245

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910071588.2A Active CN109903224B (en) 2019-01-25 2019-01-25 Image scaling method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109903224B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110738625B (en) * 2019-10-21 2022-03-11 Oppo广东移动通信有限公司 Image resampling method, device, terminal and computer readable storage medium
CN112862673B (en) * 2019-11-12 2024-06-14 上海途擎微电子有限公司 Adaptive image scaling method, adaptive image scaling device, and storage device
CN111126254A (en) * 2019-12-23 2020-05-08 Oppo广东移动通信有限公司 Image recognition method, device, equipment and storage medium
CN111275730A (en) * 2020-01-13 2020-06-12 平安国际智慧城市科技股份有限公司 Method, device and equipment for determining map area and storage medium
KR20210112042A (en) * 2020-03-04 2021-09-14 에스케이하이닉스 주식회사 Image sensing device and operating method of the same
CN111724304B (en) * 2020-06-12 2024-04-19 深圳市爱协生科技股份有限公司 Image scaling method and device, terminal equipment and storage medium
CN113808012A (en) * 2020-06-17 2021-12-17 京东方科技集团股份有限公司 Image processing method, computer device, and computer-readable storage medium
CN112037273B (en) * 2020-09-09 2023-05-19 南昌虚拟现实研究院股份有限公司 Depth information acquisition method and device, readable storage medium and computer equipment
CN112508790B (en) * 2020-12-16 2023-11-14 上海联影医疗科技股份有限公司 Image interpolation method, device, equipment and medium
CN112435171B (en) * 2021-01-28 2021-04-20 杭州西瞳智能科技有限公司 Reconstruction method of image resolution
CN113850732B (en) * 2021-08-10 2024-06-21 深圳曦华科技有限公司 Local image processing method and device, electronic equipment and storage medium
CN113781370A (en) * 2021-08-19 2021-12-10 北京旷视科技有限公司 Image enhancement method and device and electronic equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101216935A (en) * 2008-01-17 2008-07-09 四川虹微技术有限公司 Image amplification method based on spline function interpolation algorithm
CN101615288A (en) * 2008-06-27 2009-12-30 富士通株式会社 The equipment, method and the computer readable recording medium storing program for performing that are used for pixel interpolating
CN106251339A (en) * 2016-07-21 2016-12-21 深圳市大疆创新科技有限公司 Image processing method and device
CN107644398A (en) * 2017-09-25 2018-01-30 上海兆芯集成电路有限公司 Image interpolation method and its associated picture interpolating device
CN108805806A (en) * 2017-04-28 2018-11-13 华为技术有限公司 Image processing method and device
CN108961167A (en) * 2018-07-12 2018-12-07 安徽理工大学 A kind of Bayer-CFA interpolation method based on finite difference and gradient

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9076232B2 (en) * 2011-11-29 2015-07-07 Industry-Academic Cooperation Foundation, Yonsei University Apparatus and method for interpolating image, and apparatus for processing image using the same


Also Published As

Publication number Publication date
CN109903224A (en) 2019-06-18

Similar Documents

Publication Publication Date Title
CN109903224B (en) Image scaling method and device, computer equipment and storage medium
WO2022199583A1 (en) Image processing method and apparatus, computer device, and storage medium
WO2018040463A1 (en) Data compression and decompression methods for demura table, and mura compensation method
WO2018032763A1 (en) Method and device for generating thermodynamic diagram
CN104794685B (en) A kind of method and device for realizing image denoising
CN106485720A (en) Image processing method and device
CN104376542A (en) Image enhancement method
CN112967207B (en) Image processing method and device, electronic equipment and storage medium
CN108961260B (en) Image binarization method and device and computer storage medium
CN110580693A (en) Image processing method, image processing device, computer equipment and storage medium
CN109934783B (en) Image processing method, image processing device, computer equipment and storage medium
CN111754429A (en) Motion vector post-processing method and device, electronic device and storage medium
CN111209908A (en) Method and device for updating label box, storage medium and computer equipment
CN112837254B (en) Image fusion method and device, terminal equipment and storage medium
CN111489318B (en) Medical image enhancement method and computer-readable storage medium
CN116843566A (en) Tone mapping method, tone mapping device, display device and storage medium
US9594955B2 (en) Modified wallis filter for improving the local contrast of GIS related images
US20230186442A1 (en) Image processing method, image processing system, and non-transitory computer readable storage medium
CN115147296A (en) Hyperspectral image correction method, device, computer equipment and storage medium
CN112967208B (en) Image processing method and device, electronic equipment and storage medium
CN114820674A (en) Arc contour extraction method, device, computer equipment and storage medium
CN111784733B (en) Image processing method, device, terminal and computer readable storage medium
CN112465931B (en) Image text erasing method, related equipment and readable storage medium
WO2020241337A1 (en) Image processing device
CN111543045B (en) Real-time brightness optimization and contrast optimization of images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 519000 No. 333, Kexing Road, Xiangzhou District, Zhuhai City, Guangdong Province

Applicant after: ZHUHAI JIELI TECHNOLOGY Co.,Ltd.

Address before: Floor 1-107, building 904, ShiJiHua Road, Zhuhai City, Guangdong Province

Applicant before: ZHUHAI JIELI TECHNOLOGY Co.,Ltd.

GR01 Patent grant