CN104933736B - Vision entropy acquisition method and device - Google Patents

Vision entropy acquisition method and device

Info

Publication number
CN104933736B
CN104933736B CN201410105824.5A CN201410105824A
Authority
CN
China
Prior art keywords
pixel points
target pixel
parameter
gradient
entropy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201410105824.5A
Other languages
Chinese (zh)
Other versions
CN104933736A (en)
Inventor
赵寅
杨海涛
周炯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201410105824.5A priority Critical patent/CN104933736B/en
Publication of CN104933736A publication Critical patent/CN104933736A/en
Application granted granted Critical
Publication of CN104933736B publication Critical patent/CN104933736B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The embodiments of the invention disclose a vision entropy acquisition method and device, including: obtaining at least two target pixels in a specified image region; determining the texture direction and texture strength corresponding to each target pixel; determining the entropy of the texture directions and texture strengths corresponding to the target pixels, and obtaining the vision entropy of the specified image region from all these entropies. Because the present invention considers not only the texture direction of each target pixel but also its texture strength, the obtained vision entropy is highly accurate and can truly reflect the complexity of the specified image region.

Description

Vision entropy acquisition method and device
Technical field
The present invention relates to the field of computer technology, and in particular to a vision entropy acquisition method and device.
Background technology
An image is formed by combining multiple pixels, which are typically obtained by sampling a photographed scene. Different specified image regions within an image are composed of different pixels, so their complexities differ: a distortion signal of the same physical intensity (for example, the same mean square error) exhibits different visual intensities when it occurs on specified image regions of different complexities. Visual experiments show that the more complex a specified image region is, the harder a distortion appearing in that region is to notice; this phenomenon is called the entropy masking effect. In practical applications, obtaining the entropy masking effect of a specified image region usually requires obtaining the region's complexity, and this complexity can be represented by the region's vision entropy. To represent the complexity of a specified image region, the industry generally computes the vision entropy from the region's texture. Texture has two important attributes, direction and strength: the direction of the texture is determined by the arrangement of the pixels in the region, and the strength of the texture is determined by the richness of their pixel values.
A common vision entropy computation method in the industry is: choose M pixels in the specified image region and take the entropy of their pixel values, i.e. the entropy of the texture strengths of these M pixels, as the vision entropy of the region. This existing method only considers the richness of the pixel values in the specified image region, i.e. it measures the probability distribution of each pixel's texture strength, but it does not measure the probability distribution of each pixel's texture direction. The method is therefore limited: its result is inaccurate and cannot truly reflect the complexity of the specified image region.
Summary of the invention
The embodiments of the present invention provide a vision entropy acquisition method and device that can truly reflect the complexity of a specified image region.
A first aspect of the embodiments of the present invention provides a vision entropy acquisition method, which may include:
obtaining at least two target pixels in a specified image region;
determining the texture direction and texture strength corresponding to each target pixel;
determining the entropy of the texture directions and texture strengths corresponding to the target pixels, and obtaining the vision entropy of the specified image region from all these entropies.
Based on the first aspect, in a first feasible embodiment, determining the entropy of the texture directions and texture strengths corresponding to the target pixels and obtaining the vision entropy of the specified image region from all these entropies includes:
calculating the joint entropy of the texture directions and texture strengths corresponding to all the target pixels in the specified image region, and using this joint entropy as the vision entropy of the specified image region;
or calculating the entropy of the texture directions corresponding to all the target pixels in the specified image region and the entropy of the texture strengths corresponding to all the target pixels, and using a weighted average of the two entropies as the vision entropy of the specified image region.
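As a concrete illustration, the two options above can be sketched as follows. This is a minimal sketch under stated assumptions, not the patented implementation: the example (direction, strength) labels, the 0.5/0.5 weights, and the use of base-2 logarithms are all choices made here for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of the empirical distribution of `labels`."""
    counts = Counter(labels)
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Each target pixel contributes a (texture direction, texture strength) pair.
pairs = [(0, 2), (0, 2), (1, 2), (1, 3)]

# Option 1: joint entropy of the (direction, strength) pairs as the vision entropy.
ve_joint = entropy(pairs)

# Option 2: weighted average of the two marginal entropies.
dirs, strengths = zip(*pairs)
ve_weighted = 0.5 * entropy(dirs) + 0.5 * entropy(strengths)
```

Either scalar can serve as the region's vision entropy; the joint entropy additionally captures how direction and strength co-occur, while the weighted average lets the two attributes be emphasized unequally.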
Based on the first aspect, in a second feasible embodiment, determining the texture direction and texture strength corresponding to each target pixel includes:
obtaining a parameter set for each target pixel, the parameter set including the target pixel's gradient parameters in at least one preset direction, where each preset direction corresponds to one gradient parameter;
determining the texture direction corresponding to each target pixel from its parameter set;
quantizing the gradient parameters included in each target pixel's parameter set to obtain the texture strength corresponding to the target pixel.
Based on the second feasible embodiment of the first aspect, in a third feasible embodiment, determining the texture direction corresponding to each target pixel from its parameter set includes:
comparing the gradient parameters included in each target pixel's parameter set with a preset threshold;
if all the gradient parameters in the target pixel's parameter set are below the preset threshold, determining the flat direction or any one of the preset directions as the texture direction corresponding to the target pixel;
if at least one gradient parameter in the target pixel's parameter set exceeds the preset threshold, determining the preset direction corresponding to the largest gradient parameter in the set as the texture direction corresponding to the target pixel, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the first aspect, in a fourth feasible embodiment, determining the texture direction corresponding to each target pixel from its parameter set includes:
comparing the gradient parameters included in each target pixel's parameter set with a preset threshold;
if all the gradient parameters in the target pixel's parameter set are below the preset threshold, determining a preconfigured default direction as the texture direction corresponding to the target pixel, where the preconfigured default direction is the flat direction or any one of the at least one preset direction;
if at least one gradient parameter in the target pixel's parameter set exceeds the preset threshold, determining the preset direction corresponding to the smallest gradient parameter in the set as the texture direction corresponding to the target pixel, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter as the texture direction corresponding to the target pixel.
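The threshold logic of the third and fourth embodiments can be sketched as below. The function names, the use of integer indices as direction labels, and the flat-direction label are assumptions made for illustration; the third embodiment picks the largest gradient's direction, the fourth the smallest's.

```python
def texture_direction_max(grads, threshold, flat_label=-1):
    """Third-embodiment variant: if every gradient parameter is below the
    threshold the pixel is treated as flat; otherwise the preset direction
    with the largest gradient parameter (or, per the other option, the
    direction perpendicular to it) is the texture direction."""
    if all(g < threshold for g in grads):
        return flat_label
    return max(range(len(grads)), key=lambda i: grads[i])

def texture_direction_min(grads, threshold, default_label=-1):
    """Fourth-embodiment variant: same threshold test, but below threshold a
    preconfigured default is used, and otherwise the preset direction with
    the smallest gradient parameter is chosen."""
    if all(g < threshold for g in grads):
        return default_label
    return min(range(len(grads)), key=lambda i: grads[i])
```

Whether the maximum-gradient direction or its perpendicular is reported depends on the gradient operator used: a large gradient along a direction typically indicates an edge perpendicular to it.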
Based on the second feasible embodiment of the first aspect, in a fifth feasible embodiment, determining the texture direction corresponding to each target pixel from its parameter set includes:
obtaining the neighborhood pixels of each target pixel, a neighborhood pixel being a pixel within a preset area range centered on the target pixel;
forming a target area from each target pixel and its neighborhood pixels, and obtaining the pixel values of all the pixels in each target area;
determining the pixel with the maximum pixel value and the pixel with the minimum pixel value in each target area; among the angles formed between the line joining these two pixels and each preset direction, determining the preset direction with the smallest angle as the texture direction corresponding to the target pixel.
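The extreme-pixel heuristic of the fifth embodiment can be sketched as below; the angle set (0°, 45°, 90°, 135°), the (row, column, value) point representation, and the tie-breaking are illustrative assumptions.

```python
import math

def direction_from_extremes(points, preset_angles=(0, 45, 90, 135)):
    """
    points: list of (y, x, pixel_value) covering a target pixel's area.
    The texture direction is the preset angle closest to the angle of the
    line joining the maximum- and minimum-valued pixels.
    """
    y1, x1, _ = max(points, key=lambda p: p[2])
    y0, x0, _ = min(points, key=lambda p: p[2])
    line = math.degrees(math.atan2(abs(y1 - y0), abs(x1 - x0)))  # in [0, 90]
    # Compare against each preset direction, treating angles modulo 180.
    return min(preset_angles,
               key=lambda a: min(abs(a - line), 180 - abs(a - line)))
```

This variant needs no gradient operator at all: the brightest-to-darkest line is a crude proxy for the dominant intensity transition in the area.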
Based on the second feasible embodiment of the first aspect, in a sixth feasible embodiment, determining the texture direction corresponding to each target pixel from its parameter set includes:
determining the preset direction corresponding to the largest gradient parameter in each target pixel's parameter set as the texture direction corresponding to the target pixel, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the first aspect, in a seventh feasible embodiment, determining the texture direction corresponding to each target pixel from its parameter set includes:
determining the preset direction corresponding to the smallest gradient parameter in each target pixel's parameter set as the texture direction corresponding to the target pixel, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the first aspect, in an eighth feasible embodiment, quantizing the gradient parameters included in each target pixel's parameter set to obtain the texture strength corresponding to the target pixel includes:
quantizing the largest gradient parameter in each target pixel's parameter set, and determining the quantized value as the texture strength corresponding to the target pixel.
Based on the second feasible embodiment of the first aspect, in a ninth feasible embodiment, quantizing the gradient parameters included in each target pixel's parameter set to obtain the texture strength corresponding to the target pixel includes:
when each target pixel's parameter set includes at least two gradient parameters, quantizing the weighted average of the largest and second-largest gradient parameters in each target pixel's parameter set, and determining the quantized result as the texture strength corresponding to the target pixel.
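The two strength-quantization options (eighth and ninth embodiments) might look as follows; the uniform step-size quantizer and the 0.75/0.25 weights are assumptions for illustration, as the patent does not fix a particular quantization scheme here.

```python
def strength_from_max(grads, step=8):
    """Eighth embodiment: quantize the largest gradient parameter
    (here with a uniform quantization step, an assumed scheme)."""
    return int(max(grads) // step)

def strength_from_top_two(grads, step=8, weights=(0.75, 0.25)):
    """Ninth embodiment: quantize a weighted average of the largest and
    second-largest gradient parameters; the weights are illustrative."""
    top = sorted(grads, reverse=True)[:2]
    return int((weights[0] * top[0] + weights[1] * top[1]) // step)
```

Quantizing to a small number of strength levels keeps the (direction, strength) alphabet compact, which makes the entropy estimate over a limited number of target pixels more stable.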
Based on the first aspect, in a tenth feasible embodiment, determining the texture direction and texture strength corresponding to each target pixel includes:
obtaining a first gradient parameter and a second gradient parameter for each target pixel, the first gradient parameter being the target pixel's gradient parameter in a first preset direction and the second gradient parameter being the target pixel's gradient parameter in a second preset direction, where the first preset direction is different from the second preset direction;
calculating a reference angle for each target pixel from its first and second gradient parameters, the reference angle being the angle between the target pixel's actual texture direction and the first preset direction or the second preset direction;
comparing the reference angle of each target pixel with at least one preselected reference angle, and determining the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to the target pixel; where a preselected reference angle is the angle between its corresponding preselected direction and the first preset direction or the second preset direction;
quantizing the larger of each target pixel's first and second gradient parameters to obtain the texture strength corresponding to the target pixel, or quantizing the weighted average of each target pixel's first and second gradient parameters to obtain the texture strength corresponding to the target pixel.
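A sketch of the tenth embodiment's reference-angle approach, assuming the two preset directions are orthogonal (e.g. horizontal and vertical), so the reference angle can be estimated from the gradient ratio with `atan2`; the preselected angles (0°, 45°, 90°, 135°) and the step-8 quantizer are illustrative assumptions.

```python
import math

def classify_by_reference_angle(g1, g2, preselected=(0, 45, 90, 135), step=8):
    """
    g1, g2: gradient parameters along two orthogonal preset directions.
    The reference angle relative to the first preset direction is estimated
    from the gradient ratio; the preselected direction whose reference angle
    is closest gives the direction label, and the larger gradient parameter,
    quantized, gives the texture strength.
    """
    angle = math.degrees(math.atan2(abs(g2), abs(g1)))   # in [0, 90]
    direction = min(preselected, key=lambda a: abs(a - angle))
    strength = int(max(abs(g1), abs(g2)) // step)
    return direction, strength
```

Compared with the threshold variants, this option resolves intermediate orientations (e.g. diagonals) instead of snapping to whichever single preset direction has the extreme gradient.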
A second aspect of the present invention provides a vision entropy acquisition device, which may include:
an acquisition module, configured to obtain at least two target pixels in a specified image region;
a first determining module, configured to determine the texture direction and texture strength corresponding to each target pixel;
a second determining module, configured to determine the entropy of the texture directions and texture strengths corresponding to the target pixels, and to obtain the vision entropy of the specified image region from all these entropies.
Based on the second aspect, in a first feasible embodiment, the second determining module is configured to calculate the joint entropy of the texture directions and texture strengths corresponding to all the target pixels in the specified image region, and to use this joint entropy as the vision entropy of the specified image region;
or the second determining module is configured to calculate the entropy of the texture directions corresponding to all the target pixels in the specified image region and the entropy of the texture strengths corresponding to all the target pixels, and to use a weighted average of the two entropies as the vision entropy of the specified image region.
Based on the second aspect, in a second feasible embodiment, the first determining module includes:
a first acquisition unit, configured to obtain a parameter set for each target pixel, the parameter set including the target pixel's gradient parameters in at least one preset direction, where each preset direction corresponds to one gradient parameter;
a determining unit, configured to determine the texture direction corresponding to each target pixel from its parameter set;
a first quantizing unit, configured to quantize the gradient parameters included in each target pixel's parameter set to obtain the texture strength corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a third feasible embodiment, the determining unit includes:
a first comparing subunit, configured to compare the gradient parameters included in each target pixel's parameter set with a preset threshold;
a first determining subunit, configured to determine the flat direction or any one of the preset directions as the texture direction corresponding to the target pixel if all the gradient parameters in the target pixel's parameter set are below the preset threshold;
a second determining subunit, configured to, if at least one gradient parameter in the target pixel's parameter set exceeds the preset threshold, determine the preset direction corresponding to the largest gradient parameter in the set as the texture direction corresponding to the target pixel, or determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a fourth feasible embodiment, the determining unit includes:
a second comparing subunit, configured to compare the gradient parameters included in each target pixel's parameter set with a preset threshold;
a third determining subunit, configured to determine a preconfigured default direction as the texture direction corresponding to the target pixel if all the gradient parameters in the target pixel's parameter set are below the preset threshold, where the preconfigured default direction is the flat direction or any one of the at least one preset direction;
a fourth determining subunit, configured to, if at least one gradient parameter in the target pixel's parameter set exceeds the preset threshold, determine the preset direction corresponding to the smallest gradient parameter in the set as the texture direction corresponding to the target pixel, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a fifth feasible embodiment, the determining unit includes:
an obtaining subunit, configured to obtain the neighborhood pixels of each target pixel, a neighborhood pixel being a pixel within a preset area range centered on the target pixel;
a forming subunit, configured to form a target area from each target pixel and its neighborhood pixels, and to obtain the pixel values of all the pixels in each target area;
a fifth determining subunit, configured to determine the pixel with the maximum pixel value and the pixel with the minimum pixel value in each target area, and, among the angles formed between the line joining these two pixels and each preset direction, to determine the preset direction with the smallest angle as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a sixth feasible embodiment, the determining unit is configured to determine the preset direction corresponding to the largest gradient parameter in each target pixel's parameter set as the texture direction corresponding to the target pixel, or to determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a seventh feasible embodiment, the determining unit is configured to determine the preset direction corresponding to the smallest gradient parameter in each target pixel's parameter set as the texture direction corresponding to the target pixel, or to determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter as the texture direction corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in an eighth feasible embodiment, the first quantizing unit is configured to quantize the largest gradient parameter in each target pixel's parameter set, and to determine the quantized value as the texture strength corresponding to the target pixel.
Based on the second feasible embodiment of the second aspect, in a ninth feasible embodiment, the first quantizing unit is configured to, when each target pixel's parameter set includes at least two gradient parameters, quantize the weighted average of the largest and second-largest gradient parameters in each target pixel's parameter set, and to determine the quantized result as the texture strength corresponding to the target pixel.
Based on the second aspect, in a tenth feasible embodiment, the first determining module includes:
a second acquisition unit, configured to obtain a first gradient parameter and a second gradient parameter for each target pixel, the first gradient parameter being the target pixel's gradient parameter in a first preset direction and the second gradient parameter being the target pixel's gradient parameter in a second preset direction, where the first preset direction is different from the second preset direction;
a computing unit, configured to calculate a reference angle for each target pixel from its first and second gradient parameters, the reference angle being the angle between the target pixel's actual texture direction and the first preset direction or the second preset direction;
a comparison determining unit, configured to compare the reference angle of each target pixel with at least one preselected reference angle, and to determine the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to the target pixel, where a preselected reference angle is the angle between its corresponding preselected direction and the first preset direction or the second preset direction;
a second quantizing unit, configured to quantize the larger of each target pixel's first and second gradient parameters to obtain the texture strength corresponding to the target pixel, or to quantize the weighted average of each target pixel's first and second gradient parameters to obtain the texture strength corresponding to the target pixel.
In the embodiments of the present invention, after the texture direction and texture strength corresponding to each target pixel in a specified image region are obtained, the entropy of the texture directions and texture strengths corresponding to the target pixels is determined, and the vision entropy of the specified image region is obtained from the entropies of all the target pixels. Because the acquisition process considers not only the texture direction of each target pixel but also its texture strength, the obtained vision entropy is highly accurate and can truly reflect the complexity of the specified image region.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings needed for describing the embodiments are briefly introduced below. Apparently, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a vision entropy acquisition method according to an embodiment of the present invention;
Fig. 2 is a schematic flowchart of another vision entropy acquisition method according to an embodiment of the present invention;
Fig. 3 is a schematic flowchart of another vision entropy acquisition method according to an embodiment of the present invention;
Fig. 4 is a schematic flowchart of another vision entropy acquisition method according to an embodiment of the present invention;
Fig. 5a is a schematic flowchart of a texture direction determination method according to an embodiment of the present invention;
Fig. 5b is a schematic flowchart of another texture direction determination method according to an embodiment of the present invention;
Fig. 5c is a schematic flowchart of another texture direction determination method according to an embodiment of the present invention;
Fig. 6 is a schematic flowchart of another vision entropy acquisition method according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of preselected directions according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a vision entropy acquisition device according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention;
Fig. 10a is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention;
Fig. 10b is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention;
Fig. 10c is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention;
Fig. 12 is a schematic structural diagram of another vision entropy acquisition device according to an embodiment of the present invention.
Description of embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings in the embodiments. Apparently, the described embodiments are only some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The vision entropy acquisition method described in the embodiments of the present invention can be applied in an image quality evaluation method or an image coding processing method.
When applied in an image quality evaluation method, the detailed image quality evaluation process is: when an image IX is processed (for example, by lossy compression or blurring) to obtain an image IY, U specified image regions Yu (u = 1, …, U, where U is a positive integer) are chosen from image IY, and the corresponding specified image regions Xu are chosen from image IX. The vision entropy Eu of each specified image region Xu is calculated; the distortion intensities of several sub-regions within the specified image region Yu are weighted using the vision entropy Eu, and the weighted distortion intensities are averaged to obtain the average distortion intensity ERu of the specified image region Yu. All the ERu values are aggregated into an average value ERA, which serves as the quality score of image IY: the larger the ERA, the worse the quality of image IY.
When applied in an image coding processing method, the coding mode with the lowest cost can be selected from multiple coding modes, with the following specific processing: for an image region X, precoding is performed with each of M coding modes, yielding reconstructed image regions Ym and coded bit counts BITm, m = 1, …, M; the M coding modes can be an intra coding mode, an inter coding mode, a differential pulse-code modulation (DPCM) coding mode, and so on. The vision entropy Eu of image region X is calculated, and the distortion intensities of the reconstructed image regions Ym are weighted using this vision entropy Eu; the distortion intensity ERRu,r can be measured with any of various existing distortion evaluation methods, such as mean square error or absolute difference. The weighted distortion intensities are summed to obtain Ym's total distortion strength SEm. The cost of coding image region X with the m-th coding mode is COSTm = f(SEm), where f(SEm) can be computed in various ways, for example f(SEm) = SEm or f(SEm) = SEm + λ·BITm (λ is a positive real number); finally, the coding mode with the lowest cost is selected to encode image region X.
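The mode-selection rule f(SEm) = SEm + λ·BITm can be sketched as follows; the helper name, the example numbers, and the λ value are illustrative, and the distortions are assumed to already carry the vision-entropy weighting described above.

```python
def pick_coding_mode(weighted_distortions, bits, lam=0.1):
    """COST_m = SE_m + lambda * BIT_m, where SE_m is the vision-entropy-
    weighted total distortion of mode m's reconstruction and BIT_m is its
    coded bit count; the index of the cheapest mode is returned."""
    costs = [se + lam * b for se, b in zip(weighted_distortions, bits)]
    return min(range(len(costs)), key=lambda m: costs[m])
```

This is the familiar rate-distortion trade-off: a mode with lower distortion can still lose if it spends too many bits, and λ controls the exchange rate between the two.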
Refer to Fig. 1, which is a schematic flowchart of a vision entropy acquisition method according to an embodiment of the present invention; as shown in Fig. 1, the vision entropy acquisition method described in this embodiment includes the following steps:
S100: obtain at least two target pixels in a specified image region.
In one embodiment, the specified image region can be the entire image region of an image or a region within an image. One method of obtaining at least two target pixels in the specified image region is to select N positions An (n = 1, …, N) in the specified image region and its neighborhood; the positions can be selected regularly within the specified image region or chosen arbitrarily from it. The pixels at the selected positions are then obtained and used as the target pixels corresponding to those positions. Each target pixel carries a pixel value; the richness and arrangement of the pixel values determine the texture of the specified image region: the strength of the texture characterizes the richness of the pixel values, and the direction of the texture characterizes their arrangement.
S101: determine the texture direction and texture strength corresponding to each of the target pixel points;
As an alternative embodiment, the texture direction and texture strength corresponding to each target pixel point can be determined by first calculating, in at least one preset direction, a gradient parameter over a predetermined region centered on the target pixel point, and then determining the texture direction and texture strength of the target pixel point from the gradient parameters of the at least one preset direction. Note that a gradient parameter can be the gradient value, or its absolute value, obtained by convolving the predetermined region with a common gradient operator such as the Sobel, Prewitt, or Gabor operator; it can also be computed directly from the pixel values of the pixels in the predetermined region.
S102: determine the entropy of the texture direction and texture strength corresponding to each of the target pixel points, and obtain the visual entropy of the specified image region from all of those entropies.
As an alternative embodiment, the visual entropy of the specified image region can be calculated as follows: taking the texture direction and texture strength obtained for each target pixel point as calculation parameters, obtain the entropy of the texture directions and texture strengths of the target pixel points. The entropy here can be the joint entropy of texture direction and texture strength, or the separate entropies of texture direction and texture strength, that is, an entropy of texture direction and an entropy of texture strength. The visual entropy of the specified image region is then calculated from the entropies computed over all the pixel points. This visual entropy computation considers not only the texture direction corresponding to each target pixel point, i.e. the arrangement of the pixel values, but also the texture strength corresponding to each target pixel point, i.e. the richness of the pixel values, so it can truly reflect the complexity of the specified image region.
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture direction and texture strength corresponding to each target pixel point is determined, and the visual entropy of the specified image region is obtained from the entropies of all the target pixel points. The visual entropy acquisition process in this embodiment considers not only the texture direction of each target pixel point but also its texture strength, so the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer to Fig. 2, a schematic flowchart of another visual entropy acquisition method provided by an embodiment of the present invention. As shown in Fig. 2, the visual entropy acquisition method described in this embodiment includes the following steps:
S200: obtain at least two target pixel points in a specified image region;
For step S200 of this embodiment, refer to step S100 of the embodiment shown in Fig. 1; it is not repeated here.
S201: determine the texture direction and texture strength corresponding to each of the target pixel points;
For step S201 of this embodiment, refer to step S101 of the embodiment shown in Fig. 1; it is not repeated here.
S202: calculate the joint entropy of the texture directions and texture strengths corresponding to all target pixel points in the specified image region, and take the joint entropy as the visual entropy of the specified image region;
As an alternative embodiment, the visual entropy can be the joint entropy of the texture directions and texture strengths corresponding to the target pixel points. A specific computation is as follows. Suppose the number of target pixel points is N, and the texture direction and texture strength corresponding to the n-th target pixel point are Dn and Sn (n=1,…,N, N>1, N a positive integer). From these N pairs of texture direction and texture strength, the joint entropy of texture direction and texture strength is calculated and taken as the visual entropy E of the specified image region, i.e.:

E = -Σ_{i=1..I} Σ_{j=1..J} p(d_i, s_j) · log p(d_i, s_j)

where i and j are positive integers no greater than I and J, respectively; I is the number of texture direction categories, each texture direction denoted d_i; J is the number of texture strength levels, each level denoted s_j; p(d_i, s_j) is the probability that the combination of texture direction d_i and texture strength s_j occurs among the N pairs of texture direction and texture strength, that is, the probability that a pair equals (d_i, s_j); log is the logarithm operator, whose base can be 2, 10, e (the natural constant), etc. The specified image region can be an entire image or a region within an image, for example a rectangular region, a circular region, or a triangular region.
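The joint entropy can be computed directly from the N (Dn, Sn) pairs, since p(d_i, s_j) is simply the empirical frequency of each (direction, strength) combination. A minimal sketch, using base-2 logarithms:

```python
from collections import Counter
from math import log2

def joint_visual_entropy(directions, strengths):
    """Joint entropy E of (texture direction, texture strength) pairs.

    directions, strengths: equal-length sequences Dn and Sn (n = 1..N).
    """
    n = len(directions)
    counts = Counter(zip(directions, strengths))
    # p(di, sj) = count of the pair (di, sj) divided by N
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

With four pixels split evenly over two (direction, strength) combinations, E is 1 bit; a region where every pixel has the same pair gives E = 0.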
In this embodiment of the present invention, as in the embodiment shown in Fig. 1, the visual entropy is obtained from both the texture direction and the texture strength of every target pixel point, so the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer to Fig. 3, a schematic flowchart of another visual entropy acquisition method provided by an embodiment of the present invention. As shown in Fig. 3, the visual entropy acquisition method described in this embodiment includes the following steps:
S300: obtain at least two target pixel points in a specified image region;
For step S300 of this embodiment, refer to step S100 of the embodiment shown in Fig. 1; it is not repeated here.
S301: determine the texture direction and texture strength corresponding to each of the target pixel points;
For step S301 of this embodiment, refer to step S101 of the embodiment shown in Fig. 1; it is not repeated here.
S302: calculate the weighted average of the entropy of the texture directions corresponding to all target pixel points in the specified image region and the entropy of the texture strengths corresponding to all target pixel points in the specified image region, and take the weighted average as the visual entropy of the specified image region.
As an alternative embodiment, the visual entropy can be acquired by first obtaining the entropy of the texture directions corresponding to the target pixel points, then obtaining the entropy of the texture strengths corresponding to the target pixel points, and finally taking a weighted average of the two entropies, thereby obtaining the visual entropy of the specified image region. This is detailed further with N target pixel points in the specified image region as an example. The texture direction and texture strength corresponding to the n-th target pixel point are Dn and Sn (n=1,…,N, N>1, N a positive integer). The entropy Ed of texture direction and the entropy Es of texture strength are computed, and their weighted average is taken as the visual entropy E of the specified image region, i.e.:

E = (w1·Ed^γ + w2·Es^γ)^(1/γ)

where i and j are positive integers no greater than I and J, respectively; I is the number of texture direction categories, each texture direction denoted d_i; J is the number of texture strength levels, each level denoted s_j; p(d_i) is the probability that texture direction d_i occurs among the N texture directions, and p(s_j) is the probability that texture strength s_j occurs among the N texture strengths, so that Ed = -Σ_{i=1..I} p(d_i)·log p(d_i) and Es = -Σ_{j=1..J} p(s_j)·log p(s_j); log is the logarithm operator; w1 and w2 are weighting factors with w1, w2 > 0 and w1 + w2 = 1, for example w1=0.7 and w2=0.3, or w1=w2=0.5; γ is a positive real number, for example γ=0.5, γ=1, or γ=3.
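A compact sketch of this variant follows, with base-2 logarithms; Ed and Es are the marginal entropies of the direction and strength sequences, combined by the weighted power mean given above.

```python
from collections import Counter
from math import log2

def entropy(values):
    # Shannon entropy of the empirical distribution of `values`, base 2
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def visual_entropy(directions, strengths, w1=0.5, w2=0.5, gamma=1.0):
    # E = (w1 * Ed^gamma + w2 * Es^gamma)^(1/gamma)
    ed, es = entropy(directions), entropy(strengths)
    return (w1 * ed ** gamma + w2 * es ** gamma) ** (1.0 / gamma)
```

Unlike the joint entropy of the previous embodiment, this form lets the two factors be traded off explicitly through w1 and w2.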
In this embodiment of the present invention, as in the embodiment shown in Fig. 1, the visual entropy is obtained from both the texture direction and the texture strength of every target pixel point, so the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer to Fig. 4, a schematic flowchart of another visual entropy acquisition method provided by an embodiment of the present invention. As shown in Fig. 4, the visual entropy acquisition method described in this embodiment includes the following steps:
S400: obtain at least two target pixel points in a specified image region;
For step S400 of this embodiment, refer to step S100 of the embodiment shown in Fig. 1; it is not repeated here.
S401: obtain a parameter set for each of the target pixel points, the parameter set including gradient parameters of the target pixel point in at least one preset direction, where each preset direction corresponds to one gradient parameter;
In one embodiment, the parameter set includes gradient parameters of the target pixel point in at least one preset direction, where each preset direction corresponds to one gradient parameter and each target pixel point corresponds to one parameter set; one target pixel point can therefore correspond to multiple gradient parameters. Specifically, the parameter set of each target pixel point can be obtained as follows: for a region Bn centered on the position An of the target pixel point, calculate the gradient parameters Vn,k (k=1,…,K) of K preset directions (K≥2, K a positive integer), where Vn,k denotes the k-th gradient parameter at position An. The K preset directions can take many forms; for example, the two preset directions of the horizontal direction and the vertical direction of the image region can be selected, or the four preset directions of the horizontal direction, the lower-right 45-degree direction, the vertical direction, and the lower-left 45-degree direction of the image region, or other preset directions. The gradient parameter of each preset direction can be the gradient value, or its absolute value, obtained by convolving the region Bn with a common gradient operator such as the Sobel, Prewitt, or Gabor operator; it can also be computed from the pixel values of the pixels in the region Bn by the method presented below, which introduces the gradient parameter computation with four preset directions as an example:
G0° = max(|P(x-1,y) - P(x+1,y)|, |P(x-2,y) - P(x+2,y)|)
G45° = max(|P(x-1,y-1) - P(x+1,y+1)|, |P(x-2,y-2) - P(x+2,y+2)|)
G90° = max(|P(x,y-1) - P(x,y+1)|, |P(x,y-2) - P(x,y+2)|)
G135° = max(|P(x+1,y-1) - P(x-1,y+1)|, |P(x+2,y-2) - P(x-2,y+2)|)
where G0°, G45°, G90°, and G135° denote the gradient parameters of the four preset directions, namely the horizontal direction, the lower-right 45-degree direction, the vertical direction, and the lower-left 45-degree direction of the image region, respectively; (x, y) are the coordinates of the position An of the target pixel point, in the reference coordinate system whose horizontal axis is the straight line tangent to the horizontal edge of the specified image region, whose vertical axis is the straight line tangent to the vertical edge of the specified image region, and whose origin is the intersection point of the two tangent lines; P(x, y) denotes the pixel value of the pixel whose horizontal coordinate is x and whose vertical coordinate is y; max(C1, C2) denotes the larger of C1 and C2; |C3| denotes the absolute value of C3.
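Under the assumption that P is stored as a row-major 2-D array indexed `P[y][x]` and that (x, y) lies at least two pixels inside the border, the four formulas can be sketched as:

```python
def directional_gradients(P, x, y):
    """Gradient parameters (G0, G45, G90, G135) at position (x, y).

    Directions follow the text: 0 deg horizontal, 45 deg lower-right
    diagonal, 90 deg vertical, 135 deg lower-left diagonal.
    """
    g0 = max(abs(P[y][x - 1] - P[y][x + 1]), abs(P[y][x - 2] - P[y][x + 2]))
    g45 = max(abs(P[y - 1][x - 1] - P[y + 1][x + 1]),
              abs(P[y - 2][x - 2] - P[y + 2][x + 2]))
    g90 = max(abs(P[y - 1][x] - P[y + 1][x]), abs(P[y - 2][x] - P[y + 2][x]))
    g135 = max(abs(P[y - 1][x + 1] - P[y + 1][x - 1]),
               abs(P[y - 2][x + 2] - P[y + 2][x - 2]))
    return g0, g45, g90, g135
```

On a purely horizontal luminance ramp the vertical gradient G90° is zero while the other three are nonzero, consistent with pixel values varying only along x.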
S402: determine, from the parameter set of each of the target pixel points, the texture direction corresponding to that target pixel point;
In one embodiment, the parameter set includes gradient parameters of the target pixel point in at least one preset direction, so the parameter set can include multiple gradient parameters. The texture direction corresponding to each target pixel point can be determined from its obtained parameter set; specifically, the determination can be based on the magnitudes of the gradient parameter values in the parameter set and the preset directions corresponding to those gradient parameters.
Specifically, the texture direction corresponding to a target pixel point can be determined in five embodiments, which are introduced one by one below:
In the first embodiment, refer to Fig. 5a, a schematic flowchart of a texture direction determination method provided by an embodiment of the present invention. As shown in Fig. 5a, the texture direction determination method described in this embodiment includes the following steps:
S500: compare the gradient parameters included in the parameter set of each of the target pixel points with a predetermined threshold;
As an alternative embodiment, the gradient parameters included in the parameter set of each target pixel point are compared with a predetermined threshold. The predetermined threshold is a positive real number; for example, it can be 10, or 14.14, or 5% of the difference between the maximum pixel value and the minimum pixel value in the specified image region.
S501: if the gradient parameters included in the parameter set of the target pixel point are all smaller than the predetermined threshold, determine the flat direction or any one of the preset directions as the texture direction corresponding to the target pixel point;
As an alternative embodiment, if all gradient parameters in the parameter set of a target pixel point are smaller than the predetermined threshold, a default direction set in advance is determined as the texture direction corresponding to that target pixel point. Note that the default direction can be set by the user. For example, the flat direction can be set as the default direction: when all gradient parameters in the parameter set of a target pixel point are smaller than the predetermined threshold, the target pixel point is considered to lie in a flat area, and the flat direction is the texture direction corresponding to that target pixel point. Alternatively, any one of the at least one preset directions corresponding to the gradient parameters in the parameter set of the target pixel point can be set as the default direction: when all gradient parameters of a target pixel point are smaller than the predetermined threshold, the texture direction corresponding to the target pixel point is taken to be that preset direction.
S502: if at least one gradient parameter in the parameter set of the target pixel point is greater than the predetermined threshold, determine the preset direction corresponding to the largest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point, or determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point.
As an alternative embodiment, when at least one gradient parameter in the parameter set of a target pixel point is greater than the predetermined threshold, the preset direction corresponding to the largest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point; alternatively, the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point.
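A minimal sketch of the first embodiment's decision rule. Representing directions as angles in degrees and the flat direction as the string `"flat"` are illustrative choices, not part of the method:

```python
def texture_direction(gradients, preset_dirs, threshold, perpendicular=False):
    """First embodiment: return the texture direction for one pixel.

    gradients: gradient parameters, one per preset direction
    preset_dirs: matching preset directions, as angles in degrees
    """
    if all(g < threshold for g in gradients):
        return "flat"  # default direction; any preset direction also allowed
    k = max(range(len(gradients)), key=lambda i: gradients[i])
    if perpendicular:
        return (preset_dirs[k] + 90) % 180  # direction perpendicular to it
    return preset_dirs[k]
```

The second embodiment differs only in replacing `max` with `min` in the above-threshold branch.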
In the second embodiment, refer to Fig. 5b, a schematic flowchart of another texture direction determination method provided by an embodiment of the present invention. As shown in Fig. 5b, the texture direction determination method described in this embodiment includes the following steps:
S503: compare the gradient parameters included in the parameter set of each of the target pixel points with a predetermined threshold;
For step S503 of this embodiment, refer to step S500 of the embodiment shown in Fig. 5a; it is not repeated here.
S504: if the gradient parameters included in the parameter set of the target pixel point are all smaller than the predetermined threshold, determine a default direction set in advance as the texture direction corresponding to the target pixel point, where the default direction set in advance is the flat direction or any one of the at least one preset direction;
For step S504 of this embodiment, refer to step S501 of the embodiment shown in Fig. 5a; it is not repeated here.
S505: if at least one gradient parameter in the parameter set of the target pixel point is greater than the predetermined threshold, determine the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point.
As an alternative embodiment, when at least one gradient parameter in the parameter set of a target pixel point is greater than the predetermined threshold, the preset direction corresponding to the smallest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point; alternatively, the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point.
In the third embodiment, refer to Fig. 5c, a schematic flowchart of another texture direction determination method provided by an embodiment of the present invention. As shown in Fig. 5c, the texture direction determination method described in this embodiment includes the following steps:
S506: obtain the neighborhood pixel points of each of the target pixel points, the neighborhood pixel points being the pixels within a preset area centered on the target pixel point;
As an alternative embodiment, when the texture direction corresponding to a target pixel point is determined, the neighborhood pixel points of each target pixel point are obtained. Note that a neighborhood pixel point is a pixel adjacent to the target pixel point, and such an adjacent pixel must lie within the preset area centered on the target pixel point. The preset area can be defined, for example, as the circular area centered on the target pixel point whose radius is a preset length. Note that the shape of the preset area is not limited and can be arbitrary, as long as the preset area contains the target pixel point; for example, it can be a rectangular area centered on the target pixel point.
S507: form a target area from each of the target pixel points together with its neighborhood pixel points, and obtain the pixel values of all pixel points in each of the target areas;
As an alternative embodiment, after each target pixel point in the specified image region and its neighborhood pixel points are obtained, the target pixel point and its neighborhood pixel points are formed into a target area: each target pixel point corresponds to one target area, and the target area contains both the target pixel point and its neighborhood pixel points, so a specified image region includes multiple target areas. The pixel values of all pixel points included in each target area are then obtained.
S508: determine the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each of the target areas; among the angles formed between each of the preset directions and the line connecting the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each of the target areas, determine the preset direction with the smallest angle as the texture direction corresponding to the target pixel point.
As an alternative embodiment, the pixel values of all pixel points in each target area are compared to determine the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value; each target area has one such pair of pixel points. The pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each target area are connected to obtain a line, which forms an angle with each of the preset directions; the preset direction forming the smallest angle with the line is determined as the texture direction corresponding to the target pixel point. As an illustration: if, in a target area, the line between the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value forms an angle of 20 degrees with a first preset direction, 15 degrees with a second preset direction, and 40 degrees with a third preset direction, then the angle with the second preset direction is the smallest, so the second preset direction is determined as the texture direction of the target pixel point of that target area.
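The third embodiment can be sketched as follows, again representing directions as angles in degrees; comparing angles as undirected lines (so two directions differ by at most 90 degrees) is an assumption the text leaves implicit:

```python
from math import atan2, degrees

def direction_from_extremes(area, preset_dirs):
    """Third embodiment: preset direction closest to the line joining the
    minimum-value and maximum-value pixels of a target area.

    area: list of ((x, y), pixel_value) covering the target area
    preset_dirs: candidate preset directions, as angles in degrees
    """
    (x_lo, y_lo), _ = min(area, key=lambda item: item[1])
    (x_hi, y_hi), _ = max(area, key=lambda item: item[1])
    line = degrees(atan2(y_hi - y_lo, x_hi - x_lo)) % 180

    def angle_between(a, b):
        d = abs(a - b) % 180
        return min(d, 180 - d)  # undirected lines differ by at most 90 deg

    return min(preset_dirs, key=lambda d: angle_between(d, line))
```

With extremes lying on the same row the rule picks the horizontal direction; with extremes on the main diagonal it picks 45 degrees.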
In the fourth embodiment, the texture direction determination method includes step S509:
S509: determine the preset direction corresponding to the largest gradient parameter in the parameter set of each of the target pixel points as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of each of the target pixel points as the texture direction corresponding to that target pixel point.
As an alternative embodiment, the texture direction of a target pixel point can be determined from the largest gradient parameter in its parameter set: for example, the preset direction corresponding to the largest gradient parameter in the parameter set of the target pixel point is directly determined as the texture direction of that target pixel point, or the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point.
In the fifth embodiment, the texture direction determination method includes step S510:
S510: determine the preset direction corresponding to the smallest gradient parameter in the parameter set of each of the target pixel points as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of each of the target pixel points as the texture direction corresponding to that target pixel point.
As an alternative embodiment, the texture direction of a target pixel point can be determined from the smallest gradient parameter in its parameter set: for example, the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point is directly determined as the texture direction of that target pixel point, or the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set is determined as the texture direction corresponding to the target pixel point.
S403: quantize the gradient parameters included in the parameter set of each of the target pixel points to obtain the texture strength corresponding to that target pixel point.
In one embodiment, the gradient parameters included in the parameter set of each target pixel point can be quantized to obtain the texture strength corresponding to that target pixel point. Since each target pixel point has one parameter set, quantizing the parameter sets of all the target pixel points yields as many texture strengths as there are target pixel points: each target pixel point has one texture strength. Target pixel points, parameter sets, and texture strengths are thus in one-to-one correspondence, and the texture strength of a target pixel point is obtained from its parameter set.
Specifically, the gradient parameters included in the parameter set of each target pixel point can be quantized by either of the following two quantization methods: step S40 describes one quantization method, and step S41 describes the other:
S40: quantize the largest gradient parameter in the parameter set of each of the target pixel points, and determine the quantized value as the texture strength corresponding to that target pixel point.
As an alternative embodiment, the parameter set includes gradient parameters of the target pixel point in at least one preset direction; the largest gradient parameter is chosen from the parameter set of each target pixel point, each parameter set having one largest gradient parameter. That largest gradient parameter is then quantized, and the quantized value is determined as the texture strength corresponding to the target pixel point. Note that when a parameter set includes only one gradient parameter, the largest gradient parameter is simply that one gradient parameter.
Specifically, the largest gradient parameter can be quantized as follows. Preset J successively increasing thresholds H1,…,HJ, and let the largest gradient parameter be Vmax. When Hj ≤ Vmax < Hj+1 (j=1,…,J-1), the texture strength of the target pixel point is level j; when Vmax ≥ HJ, the texture strength of the target pixel point is level J. The thresholds H1,…,HJ can be set in many ways, for example Hj = β + C·(j-1)^α for 1 ≤ j ≤ J, where C is a positive real number (e.g. C=20 or 30.5), α is a positive real number (e.g. α=1 or 2), β is a positive real number (e.g. β=T1 or β=20), and J can be 2, 3, 4, etc. The thresholds H1,…,HJ can also be set to form a sequence following some rule; for example, with J=4, the thresholds H1, H2, H3, H4 can be the four preset constants 0, 10, 30, 60.
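The threshold quantization of step S40 can be sketched as follows; returning level 0 when Vmax falls below H1 is an added convention for a case the text does not cover:

```python
def quantize_strength(vmax, thresholds):
    """Map Vmax to a texture-strength level using thresholds H1 <= ... <= HJ.

    Level j (1-based) when Hj <= vmax < Hj+1; level J when vmax >= HJ.
    """
    level = 0
    for j, h in enumerate(thresholds, start=1):
        if vmax >= h:
            level = j  # keep climbing while vmax clears each threshold
    return level
```

With the example thresholds 0, 10, 30, 60, a Vmax of 35 lands in level 3 (30 ≤ 35 < 60).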
S41: when the parameter set of each of the target pixel points includes at least two gradient parameters, quantize the weighted average of the largest gradient parameter and the second-largest gradient parameter in the parameter set of each of the target pixel points, and determine the quantized result as the texture strength corresponding to that target pixel point.
As an alternative embodiment, when the parameter set of each target pixel point includes at least two gradient parameters, for example N gradient parameters with N ≥ 2 (that is, the parameter set includes the gradient parameters corresponding to N preset directions), the largest gradient parameter and the second-largest gradient parameter are obtained from the parameter set, their weighted average is computed, the weighted average is quantized, and the quantized result is determined as the texture strength corresponding to the target pixel point.
Specifically, suppose the largest and second-largest gradient parameters in the parameter set are Q1 and Q2, respectively; then the weighted average S of Q1 and Q2 is:

S = (w3·Q1^κ + w4·Q2^κ)^(1/κ)

where w3 and w4 are weighting factors and κ is a positive real number, for example κ=0.5, κ=1, or κ=3.
The weighted average S is then quantized; the quantization process is the same as that applied to the largest gradient parameter Vmax in step S40 and is not repeated here.
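The weighted average S of step S41 reduces to a one-liner once the two largest gradient parameters are picked out; a sketch:

```python
def top2_weighted_average(gradients, w3=0.5, w4=0.5, kappa=1.0):
    """S = (w3 * Q1^kappa + w4 * Q2^kappa)^(1/kappa), where Q1 and Q2 are
    the largest and second-largest gradient parameters in the set."""
    q1, q2 = sorted(gradients, reverse=True)[:2]
    return (w3 * q1 ** kappa + w4 * q2 ** kappa) ** (1.0 / kappa)
```

The result is then quantized against the thresholds exactly as Vmax is in step S40.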
S404: determine the entropy of the texture direction and texture strength corresponding to each of the target pixel points, and obtain the visual entropy of the specified image region from all of those entropies.
For step S404 of this embodiment, refer to step S102 of the embodiment shown in Fig. 1; it is not repeated here.
In this embodiment of the present invention, as in the embodiment shown in Fig. 1, the visual entropy is obtained from both the texture direction and the texture strength of every target pixel point, so the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer to Fig. 6, a schematic flowchart of another visual entropy acquisition method provided by an embodiment of the present invention. As shown in Fig. 6, the visual entropy acquisition method described in this embodiment includes the following steps:
S600: obtain at least two target pixel points in a specified image region;
For step S600 of this embodiment, refer to step S100 of the embodiment shown in Fig. 1; it is not repeated here.
S601: obtain a first gradient parameter and a second gradient parameter of each of the target pixel points, the first gradient parameter being the gradient parameter of the target pixel point in a first preset direction and the second gradient parameter being the gradient parameter of the target pixel point in a second preset direction, the first preset direction being different from the second preset direction;
As an alternative embodiment, the first preset direction can be the horizontal direction, the vertical direction, or another direction, and likewise the second preset direction can be the horizontal direction, the vertical direction, or another direction. The angle between the first preset direction and the second preset direction can be arbitrary but must not equal 0 degrees, that is, the first preset direction and the second preset direction are different. For example, the first preset direction can be the vertical direction and the second preset direction the horizontal direction; or the first preset direction can be the upper-right 60-degree direction and the second preset direction the lower-right 10-degree direction. The first gradient parameter is the gradient parameter of the target pixel point in the first preset direction, and the second gradient parameter is the gradient parameter of the target pixel point in the second preset direction; the gradient parameters can be computed using gradient operators.
S602: calculate a reference angle of each target pixel point according to its first gradient parameter and second gradient parameter, where the reference angle is the angle between the actual texture direction of the target pixel point and the first preset direction or the second preset direction.
In an optional embodiment, the reference angle θ of a target pixel point may be calculated from its first gradient parameter and second gradient parameter. The reference angle θ is the angle between the actual texture direction of the target pixel point and the first or the second preset direction. Specifically, assume that the first gradient parameter of the target pixel point is V1 and the second gradient parameter is V2; the reference angle θ may then be calculated as follows. When the angle between the first preset direction and the second preset direction is 90 degrees, θ = arctan(V1/V2) or θ = arctan(V2/V1): θ = arctan(V1/V2) expresses the angle between the actual texture direction and the second preset direction, while θ = arctan(V2/V1) expresses the angle between the actual texture direction and the first preset direction. When the angle between the first preset direction and the second preset direction is ψ (ψ < 90°), then θ = arctan(V1·sinψ / (V2 + V1·cosψ)) or θ = ψ − arctan(V1·sinψ / (V2 + V1·cosψ)): the former expresses the angle between the actual texture direction and the second preset direction, the latter the angle between the actual texture direction and the first preset direction.
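As a non-limiting illustrative sketch (not part of the claimed subject matter), the reference-angle calculation above can be written as follows; the function name is invented for the example, the gradient parameters are assumed to be non-negative, and atan2 is used only to guard the 90-degree case against V2 = 0:

```python
import math

def reference_angle(v1, v2, psi_deg=90.0):
    # Angle (degrees) between the actual texture direction and the second
    # preset direction, from non-negative gradient parameters V1 and V2.
    psi = math.radians(psi_deg)
    if psi_deg == 90.0:
        return math.degrees(math.atan2(v1, v2))  # arctan(V1 / V2), safe when V2 == 0
    return math.degrees(math.atan(v1 * math.sin(psi) / (v2 + v1 * math.cos(psi))))
```

For example, equal gradient parameters with perpendicular preset directions yield a 45-degree reference angle.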
S603: compare the reference angle of each target pixel point with at least one preselected reference angle, and determine the preselected direction corresponding to the preselected reference angle with the smallest comparison difference as the texture direction corresponding to the target pixel point, where each preselected reference angle is the angle between its corresponding preselected direction and the first preset direction or the second preset direction.
In an optional embodiment, the at least one preselected reference angle may be set by the user as needed. A preselected reference angle is the angle between the corresponding preselected direction and the first or the second preset direction. It should be noted that the reference direction must be consistent with that of the reference angle of the target pixel points: if the reference angle of a target pixel point is measured relative to the first preset direction, then each preselected reference angle must also be the angle between its preselected direction and the first preset direction.
This is illustrated with the example of Fig. 7. Assume that the 0-degree direction in the figure is the first preset direction and that all preselected reference angles are measured relative to the first preset direction. The preselected directions in the figure are then the 0-degree direction (the horizontal direction, preselected reference angle 0 degrees), the +30-degree direction (preselected reference angle 30 degrees), the +60-degree direction (preselected reference angle 60 degrees), the +90-degree direction (the vertical direction, preselected reference angle 90 degrees), the −30-degree direction (preselected reference angle −30 degrees), the −60-degree direction (preselected reference angle −60 degrees) and the −90-degree direction (preselected reference angle −90 degrees).
The reference angle of each target pixel point is compared with the at least one preselected reference angle, and the preselected direction corresponding to the preselected reference angle with the smallest comparison difference is determined as the texture direction of that target pixel point; in other words, the preselected direction closest to the actual texture direction of the target pixel point becomes its texture direction. Taking the reference angle of a particular target pixel point as an example, assume θ = 20 degrees; then the preselected reference angle of 30 degrees, belonging to the +30-degree preselected direction, has the smallest difference from θ, so the +30-degree direction is determined as the texture direction of that target pixel point. Similarly, assume θ = −20 degrees; then the preselected reference angle of −30 degrees, belonging to the −30-degree preselected direction, has the smallest difference from θ, so the −30-degree direction is determined as the texture direction of that target pixel point.
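A minimal sketch of this nearest-direction selection (illustrative only; the seven preselected reference angles of Fig. 7 are used as the default, and the function name is an assumption):

```python
def nearest_texture_direction(theta, preselected=(-90, -60, -30, 0, 30, 60, 90)):
    # Preselected reference angle (and hence preselected direction, in
    # degrees) whose difference from the reference angle theta is smallest.
    return min(preselected, key=lambda a: abs(a - theta))
```

With θ = 20 degrees this returns the +30-degree direction, matching the worked example above.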
S603: quantize the maximum of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to the target pixel point, or quantize a weighted average of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to the target pixel point.
In an optional embodiment, the first gradient parameter and the second gradient parameter of each target pixel point are quantized. Specifically, either the maximum of the first gradient parameter and the second gradient parameter of each target pixel point is taken, or a weighted average of the first gradient parameter and the second gradient parameter is taken. The maximum may be taken as, for example, V = max(|V1|, |V2|), and the weighted average as, for example, V = (|V1|^φ + |V2|^φ)^(1/φ), where φ is a positive real number, for example φ = 2 or 1.5. The maximum or weighted average V of each target pixel point is then quantized into a level s, which serves as the texture strength corresponding to the target pixel point. The quantization may be uniform quantization, for example s = ⌊V/Q + λ⌋, where Q and λ are constants, optionally Q = 20 and λ = 0.5, and ⌊·⌋ is the floor operation; the quantization may also be uniform quantization with a clamping operation, for example s = min(⌊V/Q + λ⌋, TH), where TH is a positive integer, for example TH = 4 or TH = 5.
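The combination and quantization steps above can be sketched as follows (illustrative only; the floor-based forms s = ⌊V/Q + λ⌋ and s = min(⌊V/Q + λ⌋, TH) are assumed from the stated constants, and the function name is invented):

```python
import math

def texture_strength(v1, v2, q=20.0, lam=0.5, th=4, phi=None):
    # Combine the two gradient parameters: max(|V1|, |V2|) by default, or
    # the power mean (|V1|^phi + |V2|^phi)^(1/phi) when phi is given.
    if phi is None:
        v = max(abs(v1), abs(v2))
    else:
        v = (abs(v1) ** phi + abs(v2) ** phi) ** (1.0 / phi)
    # Uniform quantization with clamping: s = min(floor(V / Q + lambda), TH).
    return min(int(math.floor(v / q + lam)), th)
```

For instance, V1 = 30 and V2 = 10 give V = 30 and level s = 2 with Q = 20, λ = 0.5 and TH = 4.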
S604: determine the entropy of the texture direction and texture strength corresponding to each of the target pixel points, and obtain the visual entropy of the specified image region according to all of the entropies.
For step S604 of this embodiment of the present invention, refer to step S102 of the embodiment shown in Fig. 1; details are not repeated here.
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture direction and texture strength corresponding to each target pixel point is determined, and the visual entropy of the specified image region is obtained according to the entropies of all the target pixel points. Because the acquisition process considers not only the texture direction of each target pixel point but also its texture strength, the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
A specific implementation of a visual entropy acquisition apparatus provided by an embodiment of the present invention is described below.
Refer to Fig. 8, which is a schematic structural diagram of a visual entropy acquisition apparatus provided by an embodiment of the present invention. As shown in Fig. 8, the visual entropy acquisition apparatus described in this embodiment includes an acquisition module 100, a first determining module 101 and a second determining module 102.
The acquisition module 100 is configured to obtain at least two target pixel points in a specified image region.
In an embodiment, the specified image region may be the whole image region of an image or a region within an image. The acquisition module 100 may obtain the target pixel points by selecting N positions An (n = 1, …, N) and their neighborhoods in the specified image region; the positions may be selected regularly or arbitrarily within the specified image region. The pixel at each selected position is then obtained and taken as the target pixel point corresponding to that position. Each target pixel point has a pixel value; the richness and arrangement of the pixel values determine the texture of the specified image region, where the strength of the texture characterizes the richness of the pixel values and the direction of the texture characterizes their arrangement.
The first determining module 101 is configured to determine the texture direction and texture strength corresponding to each of the target pixel points.
In an optional embodiment, the first determining module 101 may determine the texture direction and texture strength corresponding to each target pixel point by first calculating, for a predetermined area centered on the target pixel point, the gradient parameter in at least one preset direction, and then determining the texture direction and texture strength of the target pixel point according to the gradient parameter of the at least one preset direction. It should be noted that a gradient parameter may be the gradient value, or its absolute value, obtained by convolving the predetermined area with a conventional gradient operator such as the Sobel operator, the Prewitt operator or a Gabor operator; it may also be calculated from the pixel values of the pixels in the predetermined area.
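For illustration only, gradient parameters in the horizontal and vertical preset directions could be obtained with the 3×3 Sobel operator roughly as below; the function name and the use of the absolute responses as gradient parameters are assumptions for the example, and the Prewitt or Gabor variants would be analogous:

```python
def sobel_gradients(p, x, y):
    # Absolute horizontal and vertical Sobel responses at (x, y), where
    # p[y][x] is the pixel value; (x, y) must be at least one pixel
    # inside the border of the 2-D list p.
    gx = (p[y - 1][x + 1] + 2 * p[y][x + 1] + p[y + 1][x + 1]
          - p[y - 1][x - 1] - 2 * p[y][x - 1] - p[y + 1][x - 1])
    gy = (p[y + 1][x - 1] + 2 * p[y + 1][x] + p[y + 1][x + 1]
          - p[y - 1][x - 1] - 2 * p[y - 1][x] - p[y - 1][x + 1])
    return abs(gx), abs(gy)
```

On a purely vertical ramp of pixel values the horizontal response is zero and the vertical response is non-zero, as expected.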
The second determining module 102 is configured to determine the entropy of the texture direction and texture strength corresponding to each of the target pixel points, and to obtain the visual entropy of the specified image region according to all of the entropies.
In an optional embodiment, the visual entropy of the specified image region may be calculated as follows. With the texture direction and texture strength obtained for each target pixel point as calculation parameters, the second determining module 102 obtains the entropy of the texture directions and texture strengths of the target pixel points. The entropy here may be the joint entropy of texture direction and texture strength, or the separate entropies of texture direction and texture strength, that is, the entropy of the texture directions and the entropy of the texture strengths. The second determining module 102 then calculates the visual entropy of the specified image region from the entropies over all the pixels. This way of calculating the visual entropy considers not only the texture directions of the target pixel points, that is, the arrangement of the pixel values, but also their texture strengths, that is, the richness of the pixel values, and can therefore truly reflect the complexity of the specified image region.
Further, the second determining module 102 may obtain the visual entropy in the following two implementation modes.
In the first implementation mode, the second determining module 102 is configured to calculate the joint entropy of the texture directions and texture strengths corresponding to all the target pixel points in the specified image region, and to take the joint entropy as the visual entropy of the specified image region.
The visual entropy may be the joint entropy of the texture directions and texture strengths corresponding to the target pixel points. Specifically, assume that the number of target pixel points is N and that the texture directions and texture strengths corresponding to the target pixel points are Dn and Sn (n = 1, …, N, N > 1, N being a positive integer). From these N pairs of texture direction and texture strength, the second determining module 102 calculates the joint entropy of texture direction and texture strength as the visual entropy E of the specified image region, that is:
E = −Σi Σj p(di, sj)·log p(di, sj)
where i and j are positive integers less than or equal to I and J respectively; I is the number of texture direction categories, each texture direction being denoted di; J is the number of texture strength levels, each level being denoted sj; p(di, sj) is the probability that the combination of texture direction di and texture strength sj occurs among the N pairs of texture direction and texture strength, that is, the probability that a pair equals (di, sj); and log is the logarithm operation, whose base may be 2, 10, e (the natural constant), and so on. The specified image region may be an entire image or a region within an image, such as a rectangular region, a circular region or a triangular region.
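A minimal sketch of the joint-entropy computation of the first implementation mode (illustrative only; the base-2 logarithm and the function name are assumptions):

```python
import math
from collections import Counter

def joint_visual_entropy(directions, strengths, base=2):
    # E = -sum over (d_i, s_j) of p(d_i, s_j) * log p(d_i, s_j), where the
    # probabilities are the relative frequencies of the N observed pairs.
    pairs = list(zip(directions, strengths))
    n = len(pairs)
    return -sum((c / n) * math.log(c / n, base)
                for c in Counter(pairs).values())
```

Two equally frequent (direction, strength) pairs give a joint entropy of 1 bit; a region with a single repeated pair gives 0.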
In the second implementation mode, the second determining module 102 is configured to calculate a weighted average of the entropy of the texture directions corresponding to all the target pixel points in the specified image region and the entropy of the texture strengths corresponding to all the target pixel points in the specified image region, and to take the weighted average as the visual entropy of the specified image region.
In this mode, the second determining module 102 first obtains the entropy of the texture directions corresponding to the target pixel points, then obtains the entropy of the texture strengths corresponding to the target pixel points, and finally takes a weighted average of the two entropies to obtain the visual entropy of the specified image region. Taking N target pixel points in the specified image region as an example, with texture directions and texture strengths Dn and Sn (n = 1, …, N, N > 1, N being a positive integer), the second determining module 102 calculates the entropy Ed = −Σi p(di)·log p(di) of the texture directions and the entropy Es = −Σj p(sj)·log p(sj) of the texture strengths, and takes their weighted average as the visual entropy E of the specified image region, that is:
E = (w1·Ed^γ + w2·Es^γ)^(1/γ)
where i and j are positive integers less than or equal to I and J respectively; I is the number of texture direction categories, each texture direction being denoted di; J is the number of texture strength levels, each level being denoted sj; p(di) is the probability that the texture direction equals di among the N texture directions, and p(sj) is the probability that the texture strength equals sj among the N texture strengths; log is the logarithm operation; w1 and w2 are weighting factors with w1, w2 > 0 and w1 + w2 = 1, for example w1 = 0.7 and w2 = 0.3, or w1 = w2 = 0.5; and γ is a positive real number, for example γ = 0.5, γ = 1 or γ = 3.
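The second implementation mode can be sketched as below (illustrative only; base-2 logarithms, equal weights and γ = 1 are assumed defaults, and the function names are invented):

```python
import math
from collections import Counter

def marginal_entropy(values, base=2):
    # Entropy of one attribute (texture direction or texture strength).
    n = len(values)
    return -sum((c / n) * math.log(c / n, base)
                for c in Counter(values).values())

def visual_entropy_weighted(directions, strengths,
                            w1=0.5, w2=0.5, gamma=1.0, base=2):
    # E = (w1 * Ed^gamma + w2 * Es^gamma)^(1 / gamma): weighted power
    # mean of the direction entropy Ed and the strength entropy Es.
    ed = marginal_entropy(directions, base)
    es = marginal_entropy(strengths, base)
    return (w1 * ed ** gamma + w2 * es ** gamma) ** (1.0 / gamma)
```

With two equally frequent directions and two equally frequent strengths, Ed = Es = 1 bit and the visual entropy is 1 under the default weights.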
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture direction and texture strength corresponding to each target pixel point is determined, and the visual entropy of the specified image region is obtained according to the entropies of all the target pixel points. Because the acquisition process considers not only the texture direction of each target pixel point but also its texture strength, the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer also to Fig. 9, which shows another visual entropy acquisition apparatus provided by an embodiment of the present invention. The visual entropy acquisition apparatus shown in Fig. 9 is obtained by optimizing the visual entropy acquisition apparatus shown in Fig. 8, and includes all the modules of the apparatus of Fig. 8; for the acquisition module 100, the first determining module 101 and the second determining module 102, refer to the description of Fig. 8, which is not repeated here. The first determining module 101 may include a first acquisition unit 1020, a determining unit 1021 and a first quantization unit 1022.
The first acquisition unit 1020 is configured to obtain a parameter set of each of the target pixel points, the parameter set including the gradient parameters of the target pixel point in at least one preset direction, where each preset direction corresponds to one gradient parameter.
In an embodiment, the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, where each preset direction corresponds to one gradient parameter and each target pixel point corresponds to one parameter set, so that one target pixel point may correspond to multiple gradient parameters. Specifically, the first acquisition unit 1020 may obtain the parameter set of each target pixel point by calculating, for a region Bn centered on the position An of the target pixel point, the gradient parameters Vn,k (k = 1, …, K) of K preset directions (K ≥ 2, K being a positive integer), where Vn,k is the k-th gradient parameter of position An. The K preset directions may take many forms: for example, the 2 preset directions of the horizontal direction and the vertical direction of the image region; or the 4 preset directions of the horizontal direction, the lower-right 45-degree direction, the vertical direction and the lower-left 45-degree direction of the image region; or other preset directions. The gradient parameter of each preset direction may be the gradient value, or its absolute value, obtained by convolving the region Bn with a conventional gradient operator such as the Sobel operator, the Prewitt operator or a Gabor operator; it may also be calculated from the pixel values of the pixels in the region Bn, for example by the following method, which introduces the calculation of the gradient parameters for the case of four preset directions:
G0°=max(|P(x-1,y)-P(x+1,y)|,|P(x-2,y)-P(x+2,y)|)
G45°=max(|P(x-1,y-1)-P(x+1,y+1)|,|P(x-2,y-2)-P(x+2,y+2)|)
G90°=max(|P(x,y-1)-P(x,y+1)|,|P(x,y-2)-P(x,y+2)|)
G135°=max(|P(x+1,y-1)-P(x-1,y+1)|,|P(x+2,y-2)-P(x-2,y+2)|)
where G0°, G45°, G90° and G135° are the gradient parameters of the 4 preset directions, namely the horizontal direction, the lower-right 45-degree direction, the vertical direction and the lower-left 45-degree direction of the image region; (x, y) is the coordinate of the position An of the target pixel point, in a reference coordinate system whose horizontal axis is the straight line tangent to the horizontal edge of the specified image region, whose vertical axis is the straight line tangent to the vertical edge of the specified image region, and whose origin is the intersection of these two tangent lines; P(x, y) is the pixel value of the pixel with horizontal coordinate x and vertical coordinate y; max(C1, C2) is the larger of C1 and C2; and |C3| is the absolute value of C3.
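A sketch of the four-direction gradient formulas above (illustrative only; p is a plain 2-D list indexed p[y][x], (x, y) must lie at least two pixels inside it, and the anti-diagonal neighbors for G135° are an assumption distinguishing it from the main-diagonal G45°):

```python
def directional_gradients(p, x, y):
    # Gradient parameters of the horizontal, lower-right 45-degree,
    # vertical and lower-left 45-degree preset directions at (x, y).
    g0 = max(abs(p[y][x - 1] - p[y][x + 1]),
             abs(p[y][x - 2] - p[y][x + 2]))
    g45 = max(abs(p[y - 1][x - 1] - p[y + 1][x + 1]),
              abs(p[y - 2][x - 2] - p[y + 2][x + 2]))
    g90 = max(abs(p[y - 1][x] - p[y + 1][x]),
              abs(p[y - 2][x] - p[y + 2][x]))
    g135 = max(abs(p[y - 1][x + 1] - p[y + 1][x - 1]),
               abs(p[y - 2][x + 2] - p[y + 2][x - 2]))
    return g0, g45, g90, g135
```

On a horizontal ramp of pixel values, the vertical gradient parameter is zero while the other three are non-zero, as expected for vertical texture.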
The determining unit 1021 is configured to determine, according to the parameter set of each of the target pixel points, the texture direction corresponding to the target pixel point.
In an embodiment, since the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, that is, multiple gradient parameters, the determining unit 1021 can determine the texture direction corresponding to each target pixel point according to its parameter set; specifically, the determination may be made according to the values of the gradient parameters in the parameter set and the preset directions they correspond to.
Specifically, in an optional embodiment, the determining unit 1021 may determine the preset direction corresponding to the greatest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the greatest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
In another optional embodiment, the determining unit 1021 may determine the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
The first quantization unit 1022 is configured to quantize the gradient parameters included in the parameter set of each of the target pixel points to obtain the texture strength corresponding to the target pixel point.
In an embodiment, the first quantization unit 1022 may quantize the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to that target pixel point. Because each target pixel point has one parameter set, quantizing the parameter set of each target pixel point yields as many texture strengths as there are target pixel points, one texture strength per target pixel point. Target pixel points, parameter sets and texture strengths are thus in one-to-one correspondence, and the texture strength corresponding to a target pixel point is obtained from its parameter set.
Further, the first quantization unit 1022 may quantize the gradient parameters included in the parameter set of each target pixel point in the following two implementation modes.
In the first implementation mode, the first quantization unit 1022 is configured to quantize the greatest gradient parameter in the parameter set of each of the target pixel points and determine the quantized value as the texture strength corresponding to the target pixel point.
In an optional embodiment, the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, and the first quantization unit 1022 selects the greatest gradient parameter from the parameter set of each target pixel point, each parameter set having one greatest gradient parameter. The first quantization unit 1022 then quantizes this greatest gradient parameter and determines the quantized value as the texture strength corresponding to the target pixel point. It should be noted that when the parameter set includes only one gradient parameter, the greatest gradient parameter is simply that one gradient parameter.
Further, the first quantization unit 1022 may quantize the greatest gradient parameter as follows. J successively increasing thresholds H1, …, HJ are preset. Assume the greatest gradient parameter is Vmax: when Hj ≤ Vmax < Hj+1 (j = 1, …, J−1), the texture strength of the target pixel point is rank j; when Vmax ≥ HJ, the texture strength of the target pixel point is rank J. The thresholds H1, …, HJ may be set in various ways, for example Hj = β + C·(j−1)^α for 1 ≤ j ≤ J, where C is a positive real number (for example C = 20 or 30.5), α is a positive real number (for example α = 1 or 2), β is a positive real number (for example β = T1 or β = 20), and J may be 2, 3, 4, and so on. The thresholds H1, …, HJ may also be set to form a sequence following some other rule; for example, with J = 4, the thresholds H1, H2, H3 and H4 may be the four preset constants 0, 10, 30 and 60 respectively.
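The threshold-based rank assignment above can be sketched as follows (illustrative only; the four example thresholds 0, 10, 30 and 60 are taken as defaults, and the function name is invented):

```python
def strength_rank(vmax, thresholds=(0, 10, 30, 60)):
    # Rank j such that H_j <= Vmax < H_{j+1} (1-based), or rank J when
    # Vmax >= H_J; thresholds = (H_1, ..., H_J) is strictly increasing.
    rank = 1
    for j, h in enumerate(thresholds, start=1):
        if vmax >= h:
            rank = j
    return rank
```

For example, Vmax = 25 falls between H2 = 10 and H3 = 30 and receives rank 2.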
In the second implementation mode, the first quantization unit 1022 is configured to, when the parameter set of each of the target pixel points includes at least two gradient parameters, quantize a weighted average of the greatest gradient parameter and the second-greatest gradient parameter in the parameter set of each of the target pixel points, and determine the quantized result as the texture strength corresponding to the target pixel point.
In an optional embodiment, when the parameter set of each target pixel point includes at least two gradient parameters, for example N gradient parameters with N ≥ 2, that is, the gradient parameters of N preset directions, the first quantization unit 1022 obtains the greatest gradient parameter and the second-greatest gradient parameter in the parameter set, takes their weighted average, quantizes the weighted average, and determines the quantized result as the texture strength corresponding to the target pixel point.
Specifically, assume the greatest gradient parameter and the second-greatest gradient parameter in the parameter set are Q1 and Q2 respectively; the weighted average S of Q1 and Q2 is then:
S = (w3·Q1^κ + w4·Q2^κ)^(1/κ)
where w3 and w4 are weighting factors and κ is a positive real number, for example κ = 0.5, κ = 1 or κ = 3.
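A sketch of this weighted power mean of the two greatest gradient parameters (illustrative only; equal weights and κ = 1 are assumed defaults, and the function name is invented):

```python
def two_largest_weighted(params, w3=0.5, w4=0.5, kappa=1.0):
    # S = (w3 * Q1^kappa + w4 * Q2^kappa)^(1/kappa), with Q1 and Q2 the
    # greatest and second-greatest gradient parameters in the set.
    q1, q2 = sorted(params, reverse=True)[:2]
    return (w3 * q1 ** kappa + w4 * q2 ** kappa) ** (1.0 / kappa)
```

With equal weights and κ = 1 this reduces to the arithmetic mean of the two greatest gradient parameters.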
The first quantization unit 1022 then quantizes the weighted average S; the specific quantization process is identical to the quantization of the greatest gradient parameter Vmax in the first implementation mode and is not repeated here.
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture direction and texture strength corresponding to each target pixel point is determined, and the visual entropy of the specified image region is obtained according to the entropies of all the target pixel points. Because the acquisition process considers not only the texture direction of each target pixel point but also its texture strength, the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Refer also to Fig. 10a, which shows another visual entropy acquisition apparatus provided by an embodiment of the present invention. The visual entropy acquisition apparatus shown in Fig. 10a is obtained by optimizing the visual entropy acquisition apparatus shown in Fig. 9, and includes all the modules and units of the apparatus of Fig. 9; for the acquisition module 100, the first determining module 101, the second determining module 102, and the first acquisition unit 1020, determining unit 1021 and first quantization unit 1022 in the first determining module 101, refer to the description of Fig. 9, which is not repeated here. The determining unit 1021 may include a first comparison subunit 10110, a first determination subunit 10111 and a second determination subunit 10112.
The first comparison subunit 10110 is configured to compare the gradient parameters included in the parameter set of each of the target pixel points with a preset threshold.
In an optional embodiment, the first comparison subunit 10110 compares the gradient parameters included in the parameter set of each target pixel point with a preset threshold, the preset threshold being a positive real number, for example 10, 14.14, or 5% of the difference between the maximum pixel value and the minimum pixel value in the specified image region.
The first determination subunit 10111 is configured to, if the gradient parameters included in the parameter set of the target pixel point are all less than the preset threshold, determine the flat direction or any one of the preset directions as the texture direction corresponding to the target pixel point.
In an optional embodiment, if all the gradient parameters in the parameter set of a target pixel point are less than the preset threshold, the first determination subunit 10111 determines a preset default direction as the texture direction corresponding to that target pixel point. It should be noted that the preset default direction may be set by the user. For example, the flat direction may be set as the preset default direction: when all the gradient parameters in the parameter set of a target pixel point are less than the preset threshold, the target pixel point is considered to be in a flat area, and the flat direction is the texture direction corresponding to the target pixel point. Alternatively, any one of the at least one preset direction corresponding to the gradient parameters in the parameter set may be set as the preset default direction: when all the gradient parameters of the target pixel point are less than the preset threshold, the texture direction corresponding to the target pixel point is considered to be that preset direction.
The second determination subunit 10112 is configured to, if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determine the preset direction corresponding to the greatest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point, or determine the direction perpendicular to the preset direction corresponding to the greatest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point.
In an optional embodiment, when at least one gradient parameter in the parameter set of a target pixel point is greater than the preset threshold, the second determination subunit 10112 determines the preset direction corresponding to the greatest gradient parameter in the parameter set of that target pixel point as its texture direction, or determines the direction perpendicular to the preset direction corresponding to the greatest gradient parameter in the parameter set of that target pixel point as its texture direction.
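The flat/threshold decision of the two determination subunits can be sketched together as follows (illustrative only; the 'flat' marker, the degree labels of the preset directions and the default threshold are assumptions for the example):

```python
def texture_direction(grads, directions=(0, 45, 90, 135), threshold=10):
    # All gradient parameters below the preset threshold: flat area.
    if all(g < threshold for g in grads):
        return 'flat'
    # Otherwise: preset direction of the greatest gradient parameter
    # (the perpendicular-direction variant is analogous).
    i = max(range(len(grads)), key=lambda k: grads[k])
    return directions[i]
```

A pixel whose four gradient parameters are all small is classified as flat; one dominant diagonal gradient yields the corresponding diagonal preset direction.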
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture directions and texture strengths corresponding to the target pixel points is determined, and the visual entropy of the specified image region is obtained according to the entropies of all the target pixel points. Because the acquisition process considers not only the texture direction of each target pixel point but also its texture strength, the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Referring also to Figure 10b, Figure 10b shows another visual entropy acquisition apparatus provided in an embodiment of the present invention. The apparatus shown in Figure 10b is obtained by optimizing the visual entropy acquisition apparatus shown in Figure 9 and includes all modules and units of the apparatus described in Figure 9; for the acquisition module 100, the first determining module 101, the second determining module 102, and the first acquisition unit 1010, determining unit 1011, and first quantifying unit 1012 in the first determining module 101, reference may be made to the description of Figure 9, and details are not repeated here. The determining unit 1011 may include a second comparing subunit 10113, a third determination subunit 10114, and a fourth determination subunit 10115.
The second comparing subunit 10113 is configured to compare the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
In an optional embodiment, the second comparing subunit 10113 compares the gradient parameters included in the parameter set of each target pixel point with the preset threshold. The preset threshold is a positive real number; for example, it may be 10, or 14.14, or 5% of the difference between the maximum pixel value and the minimum pixel value in the specified image region.
The third determination subunit 10114 is configured to: if all gradient parameters included in the parameter set of the target pixel point are less than the preset threshold, determine a preset default direction as the texture direction corresponding to the target pixel point, where the preset default direction is the flat direction or any one of the at least one preset direction;
In an optional embodiment, if all gradient parameters in the parameter set of a target pixel point are less than the preset threshold, the third determination subunit 10114 determines a preset default direction as the texture direction corresponding to the target pixel point. It should be noted that the preset default direction may be set by the user. For example, the flat direction may be set as the preset default direction; that is, when all gradient parameters in the parameter set of a target pixel point are less than the preset threshold, the target pixel point is considered to lie in a flat region, and the flat direction serves as the texture direction corresponding to the target pixel point. Alternatively, any one of the at least one preset direction corresponding to the gradient parameters in the parameter set of the target pixel point may be set as the preset default direction; that is, when all gradient parameters of the target pixel point are less than the preset threshold, the texture direction corresponding to the target pixel point is taken to be any one of the at least one preset direction.
The fourth determination subunit 10115 is configured to: if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determine the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point.
In an optional embodiment, when at least one gradient parameter in the parameter set of a target pixel point is greater than the preset threshold, the fourth determination subunit 10115 determines the preset direction corresponding to the smallest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point; alternatively, the fourth determination subunit 10115 determines the direction perpendicular to that preset direction as the texture direction corresponding to the target pixel point.
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture directions and texture strengths corresponding to the target pixel points is determined, and the visual entropy of the specified image region is obtained according to the entropies of all the target pixel points. Because the acquisition process considers not only the texture direction of each target pixel point but also its texture strength, the obtained visual entropy is highly accurate and can truly reflect the complexity of the specified image region.
Referring also to Figure 10c, Figure 10c shows another visual entropy acquisition apparatus provided in an embodiment of the present invention. The apparatus shown in Figure 10c is obtained by optimizing the visual entropy acquisition apparatus shown in Figure 9 and includes all modules and units of the apparatus described in Figure 9; for the acquisition module 100, the first determining module 101, the second determining module 102, and the first acquisition unit 1010, determining unit 1011, and first quantifying unit 1012 in the first determining module 101, reference may be made to the description of Figure 9, and details are not repeated here. The determining unit 1011 may include an obtaining subunit 10116, a composing subunit 10117, and a fifth determination subunit 10118.
The obtaining subunit 10116 is configured to obtain the neighborhood pixel points of each target pixel point, where a neighborhood pixel point is a pixel point within a preset area range centered on the target pixel point;
In an optional embodiment, when determining the texture direction corresponding to a target pixel point, the obtaining subunit 10116 obtains the neighborhood pixel points of each target pixel point. It should be noted that a neighborhood pixel point is a pixel point adjacent to the target pixel point, and must lie within a preset area range centered on the target pixel point. The preset area range may be determined, for example, as a circular area centered on the target pixel point with a preset length as the radius. The shape of the preset area range is not limited and may be arbitrary, as long as the range contains the target pixel point; for example, it may also be a rectangular area centered on the target pixel point.
The composing subunit 10117 is configured to compose the neighborhood pixel points of each target pixel point and the target pixel point into a target area, and obtain the pixel values of all pixel points in each target area;
In an optional embodiment, after each target pixel point in the specified image region and its neighborhood pixel points are obtained, the composing subunit 10117 composes each target pixel point and its neighborhood pixel points into a target area; that is, each target pixel point corresponds to one target area, which contains the target pixel point and its neighborhood pixel points, and a specified image region includes multiple target areas. The composing subunit 10117 also obtains the pixel values of all pixel points included in each target area.
The fifth determination subunit 10118 is configured to determine the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each target area; and, among the angles formed between each preset direction and the line connecting the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each target area, determine the preset direction with the smallest angle as the texture direction corresponding to the target pixel point.
In an optional embodiment, the fifth determination subunit 10118 compares the pixel values of all pixel points in each target area, and determines the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value in each target area; each target area thus has one maximum-value pixel point and one minimum-value pixel point. The fifth determination subunit 10118 connects the maximum-value pixel point and the minimum-value pixel point of each target area; the connecting line forms an angle with each preset direction, and the preset direction corresponding to the smallest of these angles is determined as the texture direction corresponding to the target pixel point. As an example: if, in a target area, the angle between the line connecting the maximum-value and minimum-value pixel points and the first preset direction is 20 degrees, the angle with the second preset direction is 15 degrees, and the angle with the third preset direction is 40 degrees, then the angle with the second preset direction is the smallest, so the second preset direction is determined as the texture direction of the target pixel point of that target area.
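As an illustrative sketch only (not part of the claimed apparatus), the selection performed by the fifth determination subunit can be expressed in Python; the function name and the representation of preset directions as angles in degrees are assumptions:

```python
import numpy as np

def texture_direction_from_extremes(region, preset_angles_deg):
    """Pick the preset direction whose angle to the line joining the
    maximum- and minimum-valued pixels of a target area is smallest."""
    ys, xs = np.unravel_index([region.argmax(), region.argmin()], region.shape)
    dy, dx = float(ys[0] - ys[1]), float(xs[0] - xs[1])
    line_angle = np.degrees(np.arctan2(dy, dx)) % 180.0  # undirected line

    # angular distance between two undirected directions, in [0, 90]
    def diff(a):
        d = abs(line_angle - a) % 180.0
        return min(d, 180.0 - d)

    return min(preset_angles_deg, key=diff)
```

With a 2x2 area whose maximum sits diagonally opposite its minimum, the diagonal preset direction is returned.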
Figure 11 is a schematic structural diagram of another visual entropy acquisition apparatus provided in an embodiment of the present invention. The apparatus shown in Figure 11 is obtained by optimizing the visual entropy acquisition apparatus shown in Figure 8 and includes all modules of the apparatus described in Figure 8; for the acquisition module 100, the first determining module 101, and the second determining module 102, reference may be made to the description of Figure 8, and details are not repeated here. Further, the first determining module 101 may include a second acquisition unit 1013, a computing unit 1014, a comparing and determining unit 1015, and a second quantifying unit 1016.
The second acquisition unit 1013 is configured to obtain the first gradient parameter and the second gradient parameter of each target pixel point, where the first gradient parameter is the gradient parameter of the target pixel point in a first preset direction and the second gradient parameter is the gradient parameter of the target pixel point in a second preset direction; the first preset direction is different from the second preset direction;
In an optional embodiment, the first preset direction may be the horizontal direction, the vertical direction, or another direction, and likewise for the second preset direction. The angle between the first preset direction and the second preset direction may be arbitrary but must not equal 0 degrees; that is, the two preset directions are different. For example, the first preset direction may be the vertical direction and the second preset direction the horizontal direction; or the first preset direction may be the upper-right 60-degree direction and the second preset direction the lower-right 10-degree direction. The first gradient parameter is the gradient parameter of the target pixel point in the first preset direction, and the second gradient parameter is the gradient parameter of the target pixel point in the second preset direction; the second acquisition unit 1013 may compute the gradient parameters using a gradient operator.
The computing unit 1014 is configured to compute the reference angle of each target pixel point according to its first gradient parameter and second gradient parameter, where the reference angle is the angle of the actual texture direction of the target pixel point relative to the first preset direction or the second preset direction;
In an optional embodiment, the computing unit 1014 computes the reference angle θ of a target pixel point from its first gradient parameter and second gradient parameter. The reference angle θ is the angle of the actual texture direction of the target pixel point relative to the first preset direction or the second preset direction. Specifically, assume the first gradient parameter of the target pixel point is V1 and the second gradient parameter is V2; the reference angle θ may then be computed as follows. When the angle between the first preset direction and the second preset direction is 90 degrees, θ = arctan(V1/V2) or θ = arctan(V2/V1); θ = arctan(V1/V2) gives the angle of the actual texture direction relative to the second preset direction, and θ = arctan(V2/V1) gives the angle of the actual texture direction relative to the first preset direction. When the angle between the first preset direction and the second preset direction is ψ (ψ < 90°), the reference angle is θ = arctan(V1·sinψ / (V2 + V1·cosψ)) or θ = ψ − arctan(V1·sinψ / (V2 + V1·cosψ)); the former gives the angle of the actual texture direction relative to the second preset direction, and the latter the angle relative to the first preset direction.
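A minimal sketch of the reference-angle computation (illustrative only; the function name is an assumption, and atan2 is used in place of arctan so the quadrant is preserved):

```python
import math

def reference_angle(v1, v2, psi_deg=90.0):
    """Reference angle (degrees) of the texture direction relative to the
    second preset direction, from gradient parameters v1 and v2 along two
    preset directions separated by psi_deg degrees. For psi_deg == 90 this
    reduces to arctan(v1 / v2)."""
    psi = math.radians(psi_deg)
    return math.degrees(math.atan2(v1 * math.sin(psi), v2 + v1 * math.cos(psi)))
```

For equal gradients and perpendicular preset directions the angle is 45 degrees, as expected from θ = arctan(V1/V2).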
The comparing and determining unit 1015 is configured to compare the reference angle of each target pixel point with at least one preselected reference angle, and determine the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to the target pixel point; each preselected reference angle is the angle of its corresponding preselected direction relative to the first preset direction or the second preset direction;
In an optional embodiment, the at least one preselected reference angle may be set by the user as needed. A preselected reference angle is the angle of the corresponding preselected direction relative to the first preset direction or the second preset direction. It should be noted that the reference direction must be consistent with that used for the reference angle of the target pixel point: if the reference angle of a target pixel point is the angle of its actual texture direction relative to the first preset direction, then each preselected reference angle is also the angle of its preselected direction relative to the first preset direction.
Taking Figure 7 as an example, assume the 0-degree direction in the figure is the first preset direction and all preselected reference angles are measured relative to the first preset direction; the at least one preselected direction in the figure then comprises the 0-degree direction (the horizontal direction, preselected reference angle 0 degrees), the +30-degree direction (preselected reference angle 30 degrees), the +60-degree direction (60 degrees), the +90-degree direction (the vertical direction, 90 degrees), the −30-degree direction (−30 degrees), the −60-degree direction (−60 degrees), and the −90-degree direction (−90 degrees).
The comparing and determining unit 1015 compares the reference angle of each target pixel point with the at least one preselected reference angle, and determines the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to the target pixel point; in other words, the preselected direction closest to the actual texture direction of the target pixel point becomes its texture direction. For example, if the reference angle of a target pixel point is θ = 20 degrees, the preselected reference angle of 30 degrees (the +30-degree direction) has the smallest difference from θ, so the +30-degree direction is determined as the texture direction of that target pixel point. If θ = −20 degrees, the preselected reference angle of −30 degrees (the −30-degree direction) has the smallest difference, so the −30-degree direction is determined as the texture direction.
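The nearest-preselected-direction step can be sketched as follows (illustrative; the function name is an assumption, and the seven angles are those of the Figure 7 example):

```python
def nearest_preselected_direction(theta, preselected_angles):
    """Return the preselected reference angle (degrees) whose difference
    from the reference angle theta is smallest."""
    return min(preselected_angles, key=lambda a: abs(a - theta))

# the seven preselected directions of the Figure 7 example
figure7_angles = [-90, -60, -30, 0, 30, 60, 90]
```

For θ = 20 degrees this yields the +30-degree direction, and for θ = −20 degrees the −30-degree direction, matching the examples above.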
The second quantifying unit 1016 is configured to quantize the maximum of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to the target pixel point, or to quantize a weighted average of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to the target pixel point.
In an optional embodiment, the second quantifying unit 1016 quantizes the first gradient parameter and the second gradient parameter of each target pixel point. Specifically, it may first take the maximum of the first gradient parameter and the second gradient parameter of each target pixel point, for example V = max(|V1|, |V2|), or a weighted average of the first gradient parameter and the second gradient parameter, for example V = (|V1|^φ + |V2|^φ)^(1/φ), where φ is a positive real number such as φ = 2 or φ = 1.5. The second quantifying unit 1016 then quantizes this value V into a level s, which serves as the texture strength corresponding to the target pixel point. The quantization may be uniform, for example s = ⌊V/Q + λ⌋, where Q and λ are constants, optionally Q = 20 and λ = 0.5, and ⌊·⌋ denotes the floor operation; it may also be uniform quantization with clamping, for example s = min(TH, ⌊V/Q + λ⌋), where TH is a positive integer, such as TH = 4 or TH = 5.
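For illustration, the quantization performed by the second quantifying unit might be sketched as below; the function name is an assumption, and the uniform-quantization form s = ⌊V/Q + λ⌋ with clamping is taken from the description above:

```python
import math

def texture_strength(v1, v2, mode="max", phi=2.0, Q=20.0, lam=0.5, TH=4):
    """Quantize the two gradient parameters into a texture-strength level:
    V = max(|v1|, |v2|), or the power mean (|v1|**phi + |v2|**phi)**(1/phi),
    then s = min(TH, floor(V/Q + lam))."""
    if mode == "max":
        v = max(abs(v1), abs(v2))
    else:
        v = (abs(v1) ** phi + abs(v2) ** phi) ** (1.0 / phi)
    return min(TH, math.floor(v / Q + lam))
```

With Q = 20, λ = 0.5 and TH = 4, gradients (30, 40) give level 2 in max mode and level 3 in power-mean mode, while a large gradient such as 200 is clamped to 4.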
Figure 12 is a schematic structural diagram of another visual entropy acquisition apparatus provided in an embodiment of the present invention. As shown in Figure 12, the apparatus includes a memory 200 and a processor 201 connected to the memory 200. The memory 200 is configured to store a set of program code, and the processor 201 is configured to call the program code stored in the memory 200 to perform the following operations:
obtaining at least two target pixel points in a specified image region;
determining the texture direction and texture strength corresponding to each target pixel point;
determining the entropy of the texture direction and texture strength corresponding to each target pixel point, and obtaining the visual entropy of the specified image region according to all the entropies.
Optionally, the specified image region may be an entire image or a region within an image.
Optionally, the method for grain direction and texture strength corresponding to each target pixel points in target pixel points is determined Can be the gradient parameter for calculating the presumptive area centered on target pixel points first at least one preset direction, and Grain direction and texture strength according to corresponding to the gradient parameter of at least one preset direction determines the target pixel points.
Optionally, the visual entropy of the specified image region may be computed by taking the texture directions and texture strengths corresponding to the obtained target pixel points as calculation parameters, and computing the visual entropy of the specified image region according to the calculation parameters and a specific visual entropy calculation formula.
In an optional embodiment, the determining, by the processor 201, of the entropy of the texture direction and texture strength corresponding to each target pixel point, and the obtaining of the visual entropy of the specified image region according to all the entropies, includes:
computing the joint entropy of the texture directions and texture strengths corresponding to all target pixel points in the specified image region, and taking the joint entropy as the visual entropy of the specified image region;
or computing a weighted average of the entropy of the texture directions corresponding to all target pixel points in the specified image region and the entropy of the texture strengths corresponding to all target pixel points in the specified image region, and taking the weighted average as the visual entropy of the specified image region.
Optionally, the visual entropy may be the joint entropy of the texture directions and texture strengths corresponding to the target pixel points. Specifically, assume the number of target pixel points is N; the texture directions and texture strengths corresponding to all target pixel points are Dn and Sn respectively (n = 1, …, N, N > 1, N a positive integer). From these N pairs of texture direction and texture strength, the joint entropy of texture direction and texture strength is computed as the visual entropy E of the specified image region, that is:

E = -Σ_i Σ_j p(d_i, s_j) · log p(d_i, s_j)
where i and j are positive integers less than or equal to I and J respectively; I is the number of texture direction categories, each texture direction being denoted d_i; J is the number of texture strength levels, each level being denoted s_j; p(d_i, s_j) is the probability that the combination of texture direction d_i and texture strength s_j occurs among the N pairs of texture direction and texture strength; and log is the logarithm operator, whose base may be 2, 10, e (the natural constant), or the like. The specified image region may be an entire image or a region within an image, such as a rectangular region, a circular region, or a triangular region.
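The joint-entropy computation can be sketched in Python (illustrative only; the function name is an assumption, and base-2 logarithm is chosen as one of the permitted bases):

```python
import math
from collections import Counter

def joint_entropy(directions, strengths, base=2):
    """E = -sum_ij p(d_i, s_j) * log p(d_i, s_j) over the
    (direction, strength) pairs of the N target pixel points."""
    n = len(directions)
    counts = Counter(zip(directions, strengths))
    return -sum((c / n) * math.log(c / n, base) for c in counts.values())
```

Two equally frequent (direction, strength) combinations give 1 bit; four distinct combinations give 2 bits.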
Optionally, the visual entropy may also be obtained by first obtaining the entropy of the texture directions corresponding to the target pixel points, then obtaining the entropy of the texture strengths corresponding to the target pixel points, and finally taking a weighted average of the two entropies to obtain the visual entropy of the specified image region. Further, taking a specified image region with N target pixel points as an example, the texture directions and texture strengths corresponding to all target pixel points are Dn and Sn respectively (n = 1, …, N, N > 1, N a positive integer); the weighted average of the entropy Ed of the texture directions and the entropy Es of the texture strengths is computed as the visual entropy E of the specified image region, that is:
E = (w1·Ed^γ + w2·Es^γ)^(1/γ), where Ed = -Σ_i p(d_i) · log p(d_i) and Es = -Σ_j p(s_j) · log p(s_j)
where i and j are positive integers less than or equal to I and J respectively; I is the number of texture direction categories, each texture direction being denoted d_i; J is the number of texture strength levels, each level being denoted s_j; p(d_i) is the probability of texture direction d_i among the N texture directions, and p(s_j) is the probability of texture strength s_j among the N texture strengths; log is the logarithm operator; w1 and w2 are weighting factors with w1, w2 > 0 and w1 + w2 = 1, for example w1 = 0.7 and w2 = 0.3, or w1 = w2 = 0.5; and γ is a positive real number, for example γ = 0.5, γ = 1, or γ = 3.
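A sketch of the weighted-average variant, reusing a small marginal-entropy helper (illustrative; the names and the defaults w1 = w2 = 0.5, γ = 1 are assumptions drawn from the examples above):

```python
import math
from collections import Counter

def marginal_entropy(values, base=2):
    """-sum_i p(v_i) * log p(v_i) over the N observed values."""
    n = len(values)
    return -sum((c / n) * math.log(c / n, base)
                for c in Counter(values).values())

def weighted_visual_entropy(directions, strengths,
                            w1=0.5, w2=0.5, gamma=1.0, base=2):
    """E = (w1 * Ed**gamma + w2 * Es**gamma) ** (1/gamma)."""
    ed = marginal_entropy(directions, base)
    es = marginal_entropy(strengths, base)
    return (w1 * ed ** gamma + w2 * es ** gamma) ** (1.0 / gamma)
```

When both marginals are 1 bit, the result is 1 regardless of γ, which is a convenient sanity check.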
In an optional embodiment, the determining, by the processor 201, of the texture direction and texture strength corresponding to each target pixel point includes:
obtaining the parameter set of each target pixel point, where the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, one preset direction corresponding to one gradient parameter;
determining the texture direction corresponding to each target pixel point according to its parameter set;
quantizing the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point.
Optionally, the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, one preset direction corresponding to one gradient parameter and one target pixel point corresponding to one parameter set, so one target pixel point corresponds to multiple gradient parameters. Further, the parameter set of each target pixel point may be obtained as follows: for a region Bn centered on the position An of the target pixel point, compute the gradient parameters Vn,k (k = 1, …, K) of K preset directions (K ≥ 2, K a positive integer), where Vn,k denotes the k-th gradient parameter at position An. The K preset directions may take various forms, for example the two preset directions consisting of the horizontal direction and the vertical direction of the image region, or the four preset directions consisting of the horizontal direction, the lower-right 45-degree direction, the vertical direction, and the lower-left 45-degree direction of the image region, or other preset directions. The gradient parameter of each preset direction may be the gradient value obtained by convolving the region Bn with a common gradient operator such as the Sobel operator, the Prewitt operator, or a Gabor operator, or the absolute value of that gradient value; it may also be computed from the pixel values of the pixel points in the region Bn by the method presented below, which introduces the computation of the gradient parameters using four preset directions as an example:
G0° = max(|P(x-1,y)-P(x+1,y)|, |P(x-2,y)-P(x+2,y)|)
G45° = max(|P(x-1,y-1)-P(x+1,y+1)|, |P(x-2,y-2)-P(x+2,y+2)|)
G90° = max(|P(x,y-1)-P(x,y+1)|, |P(x,y-2)-P(x,y+2)|)
G135° = max(|P(x+1,y+1)-P(x-1,y-1)|, |P(x+2,y+2)-P(x-2,y-2)|)
where G0°, G45°, G90°, and G135° respectively denote the gradient parameters of the four preset directions: the horizontal direction of the image region, the lower-right 45-degree direction of the image region, the vertical direction of the image region, and the lower-left 45-degree direction of the image region; (x, y) are the coordinates of the position An of the target pixel point, in a reference coordinate system whose horizontal axis is the straight line tangent to the horizontal edge of the specified image region, whose vertical axis is the straight line tangent to the vertical edge of the specified image region, and whose origin is the intersection of the two tangent lines; P(x, y) denotes the pixel value of the pixel point with horizontal coordinate x and vertical coordinate y; max(C1, C2) denotes the larger of C1 and C2; and |C3| denotes the absolute value of C3.
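The four directional gradient parameters above can be transcribed directly (a sketch; the function name and the NumPy indexing convention img[y, x] are assumptions, and (x, y) must lie at least two pixels from every border):

```python
import numpy as np

def directional_gradients(img, x, y):
    """G0, G45, G90, G135 at pixel (x, y), per the four formulas above."""
    P = lambda xx, yy: float(img[yy, xx])
    g0   = max(abs(P(x-1, y)   - P(x+1, y)),   abs(P(x-2, y)   - P(x+2, y)))
    g45  = max(abs(P(x-1, y-1) - P(x+1, y+1)), abs(P(x-2, y-2) - P(x+2, y+2)))
    g90  = max(abs(P(x, y-1)   - P(x, y+1)),   abs(P(x, y-2)   - P(x, y+2)))
    g135 = max(abs(P(x+1, y+1) - P(x-1, y-1)), abs(P(x+2, y+2) - P(x-2, y-2)))
    return g0, g45, g90, g135
```

On a horizontal intensity ramp the vertical gradient G90° is zero while the other three pick up the horizontal variation.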
Optionally, since the parameter set includes the gradient parameters of the target pixel point in at least one preset direction, the parameter set contains multiple gradient parameters, and the texture direction corresponding to each target pixel point may be determined according to its obtained parameter set; specifically, the determination may be made according to the magnitude of each gradient parameter value in the parameter set and the preset direction corresponding to each gradient parameter.
Optionally, the gradient parameters included in the parameter set of each target pixel point may be quantized to obtain the texture strength corresponding to the target pixel point. Because each target pixel point has one parameter set, quantizing the parameter set of each target pixel point yields as many texture strengths as there are target pixel points, each target pixel point having one texture strength. Target pixel points, parameter sets, and texture strengths are thus in one-to-one correspondence; that is, the texture strength of a target pixel point can be obtained from its parameter set.
In an optional embodiment, the determining, by the processor 201, of the texture direction corresponding to each target pixel point according to its parameter set includes:
comparing the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
if all gradient parameters included in the parameter set of the target pixel point are less than the preset threshold, determining the flat direction or any one of the preset directions as the texture direction corresponding to the target pixel point;
if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determining the preset direction corresponding to the largest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of the target pixel point as the texture direction corresponding to the target pixel point.
Optionally, the preset threshold may be 10, or 14.14, or 5% of the difference between the maximum pixel value and the minimum pixel value in the specified image region.
Optionally, if all gradient parameters in the parameter set of a target pixel point are less than the preset threshold, a preset default direction is determined as the texture direction corresponding to the target pixel point. It should be noted that the preset default direction may be set by the user. For example, the flat direction may be set as the preset default direction; that is, when all gradient parameters in the parameter set of a target pixel point are less than the preset threshold, the target pixel point is considered to lie in a flat region, and the flat direction serves as the texture direction corresponding to the target pixel point. Alternatively, any one of the at least one preset direction corresponding to the gradient parameters in the parameter set of the target pixel point may be set as the preset default direction; that is, when all gradient parameters of the target pixel point are less than the preset threshold, the texture direction corresponding to the target pixel point is taken to be any one of the at least one preset direction.
Optionally, when at least one gradient parameter in the parameter set of a target pixel point is greater than the preset threshold, the preset direction corresponding to the largest gradient parameter in the parameter set is determined as the texture direction corresponding to that target pixel point; alternatively, the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set is determined as the texture direction corresponding to that target pixel point.
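By way of illustration only (not part of the claimed embodiments), the threshold comparison described above can be sketched in Python. The function name `texture_direction`, the representation of directions as angles in degrees, and the string "flat" for the flat direction are assumptions made for this sketch:

```python
def texture_direction(grads, directions, threshold, use_perpendicular=False):
    # grads[i] is the gradient parameter for preset direction directions[i] (degrees)
    if all(g < threshold for g in grads):
        return "flat"  # flat region: flat direction (or any preset direction)
    # index of the largest gradient parameter in the parameter set
    best = max(range(len(grads)), key=lambda i: grads[i])
    angle = directions[best]
    # optionally take the direction perpendicular to the strongest gradient
    return (angle + 90) % 180 if use_perpendicular else angle
```

For example, with preset directions 0, 45, and 90 degrees and a threshold of 10, gradients (3, 30, 1) yield the 45-degree direction, or 135 degrees in the perpendicular variant.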
As an alternative embodiment, determining, by processor 201, the texture direction corresponding to each target pixel point according to its parameter set includes:
comparing each gradient parameter included in the parameter set of each target pixel point with a preset threshold;
if all gradient parameters included in the parameter set of the target pixel point are smaller than the preset threshold, determining a preconfigured default direction as the texture direction corresponding to that target pixel point, where the preconfigured default direction is the flat direction or any one of the at least one preset direction;
if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determining the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point.
Optionally, when at least one gradient parameter in the parameter set of a target pixel point is greater than the preset threshold, the preset direction corresponding to the smallest gradient parameter in the parameter set is determined as the texture direction corresponding to that target pixel point; alternatively, the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set is determined as the texture direction corresponding to that target pixel point.
As an alternative embodiment, determining, by processor 201, the texture direction corresponding to each target pixel point according to its parameter set includes:
obtaining the neighborhood pixel points of each target pixel point, a neighborhood pixel point being a pixel within a preset area centered on the target pixel point;
forming a target area from each target pixel point and its neighborhood pixel points, and obtaining the pixel values of all pixel points in each target area;
determining the pixel point with the maximum pixel value and the pixel point with the minimum pixel value in each target area; and, among the angles formed between the line connecting these two pixel points and each preset direction, determining the preset direction with the smallest angle as the texture direction corresponding to that target pixel point.
Optionally, the shape of the preset area is not limited and may be arbitrary, provided the area contains the target pixel point; for example, it may be a rectangular region centered on the target pixel point.
Optionally, after the neighborhood pixel points of each target pixel point in the specified image region are obtained, each target pixel point and its neighborhood pixel points form one target area; that is, each target pixel point corresponds to one target area, which contains the target pixel point and its neighborhood pixel points, and one specified image region contains multiple target areas. The pixel values of all pixel points included in each target area are then obtained.
Optionally, the pixel values of all pixel points in each target area are compared to determine the pixel point with the maximum pixel value and the pixel point with the minimum pixel value; each target area has one such maximum-value pixel point and one such minimum-value pixel point. These two pixel points are connected to obtain a line, which forms an angle with each preset direction; the preset direction forming the smallest of these angles is determined as the texture direction corresponding to the target pixel point. As an example, suppose the line between the maximum-value pixel point and the minimum-value pixel point of a target area forms an angle of 20 degrees with the first preset direction, 15 degrees with the second preset direction, and 40 degrees with the third preset direction; the angle with the second preset direction is the smallest, so the second preset direction is determined as the texture direction of the target pixel point of that target area.
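By way of illustration only, the max-to-min line comparison above can be sketched in Python. The pixel representation as (x, y, value) triples, the function name, and the folding of the line angle into [0, 180) are assumptions of this sketch:

```python
import math

def direction_from_extremes(region, directions):
    # region: list of (x, y, value) pixels; directions: preset angles in degrees
    mx = max(region, key=lambda p: p[2])  # pixel with the maximum pixel value
    mn = min(region, key=lambda p: p[2])  # pixel with the minimum pixel value
    # undirected angle of the line joining the two pixels, folded into [0, 180)
    line = math.degrees(math.atan2(mn[1] - mx[1], mn[0] - mx[0])) % 180
    def angular_diff(d):
        a = abs(line - d) % 180
        return min(a, 180 - a)  # smallest angle between the line and direction d
    return min(directions, key=angular_diff)
```

For instance, a target area whose brightest and darkest pixels lie on a diagonal yields the 45-degree preset direction when the candidates are 0, 45, and 90 degrees.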
As an alternative embodiment, determining, by processor 201, the texture direction corresponding to each target pixel point according to its parameter set includes:
determining the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
As an alternative embodiment, determining, by processor 201, the texture direction corresponding to each target pixel point according to its parameter set includes:
determining the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
As an alternative embodiment, quantizing, by processor 201, the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point includes:
quantizing the largest gradient parameter in the parameter set of each target pixel point, and determining the quantized value as the texture strength corresponding to that target pixel point.
Optionally, the parameter set includes the gradient parameters of the target pixel point in at least one preset direction; the largest gradient parameter is selected from the parameter set of each target pixel point, each parameter set having one largest gradient parameter. The largest gradient parameter is then quantized, and the quantized value is determined as the texture strength corresponding to the target pixel point. It should be noted that when a parameter set includes only one gradient parameter, that gradient parameter is the largest gradient parameter.
Further, the quantization of the largest gradient parameter may use J increasing preset thresholds H1, …, HJ. Let the largest gradient parameter be Vmax: when Hj ≤ Vmax < Hj+1 (j = 1, …, J−1), the texture strength of the target pixel point is level j; when Vmax ≥ HJ, the texture strength is level J. The thresholds H1, …, HJ can be set in various ways, for example Hj = β + C·(j−1)^α for 1 ≤ j ≤ J, where C is a positive real number (e.g., C = 20 or 30.5), α is a positive real number (e.g., α = 1 or 2), β is a positive real number (e.g., β = T1 or β = 20), and J may be 2, 3, 4, etc. The thresholds H1, …, HJ may also be set as a number sequence following some other rule; for example, with J = 4, the thresholds H1, H2, H3, H4 may be the four preset constants 0, 10, 30, 60.
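By way of illustration only, the level quantization above can be sketched in Python. The function names are assumptions; returning level 0 for a value below H1 is also an assumption, since that case is not specified in the text:

```python
def make_thresholds(J, beta, C, alpha):
    # H_j = beta + C * (j - 1) ** alpha, for j = 1..J
    return [beta + C * (j - 1) ** alpha for j in range(1, J + 1)]

def quantize_strength(vmax, thresholds):
    # level j when H_j <= vmax < H_{j+1}; level J when vmax >= H_J;
    # level 0 (vmax below H_1) is an assumption of this sketch
    level = 0
    for j, h in enumerate(thresholds, start=1):
        if vmax >= h:
            level = j
    return level
```

With the example constants J = 4 and thresholds 0, 10, 30, 60, a largest gradient of 25 falls in level 2 and a largest gradient of 70 in level 4.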
As an alternative embodiment, quantizing, by processor 201, the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point includes:
when the parameter set of each target pixel point includes at least two gradient parameters, quantizing, for each target pixel point, the weighted average of the largest and second-largest gradient parameters in its parameter set, and determining the quantized result as the texture strength corresponding to that target pixel point.
Optionally, when the parameter set of each target pixel point includes at least two gradient parameters, for example N gradient parameters with N ≥ 2 (i.e., the parameter set includes the gradient parameters corresponding to N preset directions), the largest and second-largest gradient parameters in the set are obtained, their weighted average is computed and quantized, and the quantized result is determined as the texture strength corresponding to the target pixel point.
Specifically, assuming the largest and second-largest gradient parameters in the parameter set are Q1 and Q2 respectively, their weighted average S is:
S = (w3·Q1^κ + w4·Q2^κ)^(1/κ)
where w3 and w4 are weighting factors and κ is a positive real number, e.g., κ = 0.5, κ = 1, or κ = 3.
The weighted average S is then quantized.
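By way of illustration only, the weighted average of the two largest gradient parameters can be sketched in Python; the function name and the default factor values w3 = w4 = 0.5 are assumptions of this sketch:

```python
def weighted_strength(grads, w3=0.5, w4=0.5, kappa=1.0):
    # S = (w3*Q1^kappa + w4*Q2^kappa)^(1/kappa),
    # where Q1 and Q2 are the two largest gradient parameters
    q1, q2 = sorted(grads, reverse=True)[:2]
    return (w3 * q1 ** kappa + w4 * q2 ** kappa) ** (1.0 / kappa)
```

For example, gradients (10, 6, 2) with κ = 1 and equal weights give S = 8.0, which would then be quantized to a texture-strength level.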
As an alternative embodiment, determining, by processor 201, the texture direction and texture strength corresponding to each target pixel point includes:
obtaining the first gradient parameter and the second gradient parameter of each target pixel point, the first gradient parameter being the gradient parameter of the target pixel point in a first preset direction and the second gradient parameter being the gradient parameter of the target pixel point in a second preset direction, the first preset direction being different from the second preset direction;
calculating the reference angle of each target pixel point according to its first and second gradient parameters, the reference angle being the angle between the actual texture direction of the target pixel point and the first preset direction or the second preset direction;
comparing the reference angle of each target pixel point with at least one preselected reference angle, and determining the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to that target pixel point, where a preselected reference angle is the angle between its corresponding preselected direction and the first preset direction or the second preset direction;
quantizing the larger of the first and second gradient parameters of each target pixel point to obtain the texture strength corresponding to the target pixel point, or quantizing the weighted average of the first and second gradient parameters of each target pixel point to obtain the texture strength corresponding to the target pixel point.
Optionally, the first preset direction may be the horizontal direction, the vertical direction, or another direction, and likewise for the second preset direction. The angle between the first and second preset directions may be arbitrary but must not equal 0 degrees; that is, the first preset direction differs from the second preset direction.
Optionally, the reference angle θ of a target pixel point can be calculated from its first gradient parameter V1 and second gradient parameter V2. The reference angle θ is the angle between the actual texture direction of the target pixel point and the first or second preset direction. Specifically, when the angle between the first and second preset directions is 90 degrees, θ = arctan(V1/V2) or θ = arctan(V2/V1); θ = arctan(V1/V2) gives the angle of the actual texture direction relative to the second preset direction, while θ = arctan(V2/V1) gives the angle relative to the first preset direction. When the angle between the first and second preset directions is ψ (ψ < 90°), θ = arctan(V1·sinψ/(V2 + V1·cosψ)) or θ = ψ − arctan(V1·sinψ/(V2 + V1·cosψ)); the former gives the angle relative to the second preset direction, and the latter the angle relative to the first preset direction.
Optionally, the at least one preselected reference angle may be set by the user as needed; a preselected reference angle is the angle between its corresponding preselected direction and the first or second preset direction. It should be noted that the reference direction must be consistent with that used for the reference angle of the target pixel point: if the reference angle of the target pixel point is the angle of its actual texture direction relative to the first preset direction, then each preselected reference angle must also be the angle of its preselected direction relative to the first preset direction.
Taking Fig. 7 as an example, assume the 0-degree direction in the figure is the first preset direction and all preselected reference angles are measured relative to the first preset direction. The preselected directions in the figure are then the 0-degree direction (i.e., horizontal, preselected reference angle 0 degrees), the +30-degree direction (30 degrees), the +60-degree direction (60 degrees), the +90-degree direction (i.e., vertical, 90 degrees), the −30-degree direction (−30 degrees), the −60-degree direction (−60 degrees), and the −90-degree direction (−90 degrees).
The reference angle of each target pixel point is compared with the at least one preselected reference angle, and the preselected direction corresponding to the preselected reference angle with the smallest difference is determined as the texture direction of that target pixel point; in other words, the preselected direction closest to the actual texture direction of the target pixel point becomes its texture direction. For example, if the reference angle of a target pixel point is θ = 20 degrees, the preselected reference angle of 30 degrees (the +30-degree direction) has the smallest difference from θ, so the +30-degree direction is determined as the texture direction of that target pixel point. If θ = −20 degrees, the preselected reference angle of −30 degrees (the −30-degree direction) has the smallest difference from θ, so the −30-degree direction is determined as the texture direction of that target pixel point.
Optionally, quantization is applied to the first and second gradient parameters of each target pixel point. The specific quantization process may take the maximum of the first and second gradient parameters of each target pixel point, or their weighted average. The maximum may be obtained as, for example, V = max(|V1|, |V2|); the weighted average may be obtained as, for example, V = (|V1|^φ + |V2|^φ)^(1/φ), where φ is a positive real number, e.g., φ = 2 or 1.5. V is then quantized to a level s, which serves as the texture strength corresponding to the target pixel point. The specific quantization may use equal-interval quantization based on a floor operation with constants Q and λ (optionally Q = 20, λ = 0.5), or equal-interval quantization with a clamping operation bounded by a positive integer TH (e.g., TH = 4 or TH = 5).
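By way of illustration only, the reference-angle calculation and the nearest-preselected-direction selection above can be sketched in Python for the 90-degree and general ψ cases; the function names and the convention that θ is measured relative to the second preset direction are assumptions of this sketch:

```python
import math

def reference_angle(v1, v2, psi=90.0):
    # angle (degrees) of the actual texture direction relative to the
    # second preset direction; psi is the angle between the two preset directions
    if psi == 90.0:
        return math.degrees(math.atan2(v1, v2))  # arctan(V1/V2) for V2 > 0
    r = math.radians(psi)
    # theta = arctan(V1*sin(psi) / (V2 + V1*cos(psi)))
    return math.degrees(math.atan2(v1 * math.sin(r), v2 + v1 * math.cos(r)))

def nearest_preselected(theta, preselected_angles):
    # preselected direction whose reference angle differs least from theta
    return min(preselected_angles, key=lambda a: abs(a - theta))
```

With the Fig. 7 angles, a reference angle of 20 degrees maps to the +30-degree direction and −20 degrees to the −30-degree direction, matching the worked example above.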
In this embodiment of the present invention, after the texture direction and texture strength corresponding to each target pixel point in the specified image region are obtained, the entropy of the texture directions and texture strengths corresponding to the target pixel points is determined, and the visual entropy of the specified image region is obtained from the entropy of all target pixel points. Because the acquisition process considers not only the texture direction but also the texture strength of each target pixel point, the obtained visual entropy is highly accurate and truly reflects the complexity of the specified image region.
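By way of illustration only, the two ways of combining the per-pixel results into a visual entropy (a joint entropy, or a weighted average of the two separate entropies) can be sketched in Python. The helper names, the use of Shannon entropy in bits, and the default weights are assumptions of this sketch:

```python
from collections import Counter
import math

def joint_entropy(items):
    # Shannon entropy (bits) of the empirical distribution of items,
    # e.g. (direction, strength) pairs for all target pixel points
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def weighted_entropy(pairs, w_dir=0.5, w_str=0.5):
    # alternative: weighted average of the direction entropy and strength entropy
    h_dir = joint_entropy([d for d, _ in pairs])
    h_str = joint_entropy([s for _, s in pairs])
    return w_dir * h_dir + w_str * h_str
```

For example, four target pixel points with (direction, strength) pairs split evenly between two distinct values give a joint entropy of 1 bit.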
Those of ordinary skill in the art will appreciate that all or part of the flows in the above method embodiments can be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the flows of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.
The steps in the methods of the embodiments of the present invention may be reordered, combined, or deleted according to actual needs.
The modules or units in the terminal of the embodiments of the present invention may be combined, divided, or deleted according to actual needs.
Components such as the microcontroller of the embodiments of the present invention may be implemented with a general-purpose integrated circuit (such as a central processing unit, CPU) or with an application-specific integrated circuit (ASIC).
The above disclosure is merely a preferred embodiment of the present invention and certainly cannot limit the scope of the claims of the present invention; equivalent variations made according to the claims of the present invention therefore still fall within the scope covered by the present invention.

Claims (20)

1. A visual entropy acquisition method, characterized in that the method comprises:
obtaining at least two target pixel points in a specified image region;
determining a texture direction and a texture strength corresponding to each of the target pixel points;
determining the entropy of the texture directions and texture strengths corresponding to the target pixel points, and obtaining the visual entropy of the specified image region according to the entropy of all target pixel points;
wherein determining the entropy of the texture directions and texture strengths corresponding to the target pixel points and obtaining the visual entropy of the specified image region according to the entropy of all target pixel points comprises:
calculating a joint entropy of the texture directions and texture strengths corresponding to all target pixel points in the specified image region, and taking the joint entropy as the visual entropy of the specified image region;
or calculating a weighted average of the entropy of the texture directions corresponding to all target pixel points in the specified image region and the entropy of the texture strengths corresponding to all target pixel points in the specified image region, and taking the weighted average as the visual entropy of the specified image region.
2. The method according to claim 1, characterized in that determining the texture direction and texture strength corresponding to each of the target pixel points comprises:
obtaining a parameter set of each target pixel point, the parameter set including gradient parameters of the target pixel point in at least one preset direction, wherein one preset direction corresponds to one gradient parameter;
determining the texture direction corresponding to each target pixel point according to its parameter set;
quantizing the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point.
3. The method according to claim 2, characterized in that determining the texture direction corresponding to each target pixel point according to its parameter set comprises:
comparing the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
if all gradient parameters included in the parameter set of the target pixel point are smaller than the preset threshold, determining the flat direction or any one preset direction as the texture direction corresponding to the target pixel point;
if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determining the preset direction corresponding to the largest gradient parameter in the parameter set as the texture direction corresponding to the target pixel point, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set as the texture direction corresponding to the target pixel point.
4. The method according to claim 2, characterized in that determining the texture direction corresponding to each target pixel point according to its parameter set comprises:
comparing the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
if all gradient parameters included in the parameter set of the target pixel point are smaller than the preset threshold, determining a preconfigured default direction as the texture direction corresponding to the target pixel point, wherein the preconfigured default direction is the flat direction or any one of the at least one preset direction;
if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determining the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to the target pixel point, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to the target pixel point.
5. The method according to claim 2, characterized in that determining the texture direction corresponding to each target pixel point according to its parameter set comprises:
obtaining neighborhood pixel points of each target pixel point, a neighborhood pixel point being a pixel within a preset area centered on the target pixel point;
forming a target area from each target pixel point and its neighborhood pixel points, and obtaining the pixel values of all pixel points in each target area;
determining the pixel point with the maximum pixel value and the pixel point with the minimum pixel value in each target area; and, among the angles formed between the line connecting these two pixel points and each preset direction, determining the preset direction with the smallest angle as the texture direction corresponding to the target pixel point.
6. The method according to claim 2, characterized in that determining the texture direction corresponding to each target pixel point according to its parameter set comprises:
determining the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determining the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
7. The method according to claim 2, characterized in that determining the texture direction corresponding to each target pixel point according to its parameter set comprises:
determining the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or determining the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
8. The method according to claim 2, characterized in that quantizing the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point comprises:
quantizing the largest gradient parameter in the parameter set of each target pixel point, and determining the quantized value as the texture strength corresponding to that target pixel point.
9. The method according to claim 2, characterized in that quantizing the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point comprises:
when the parameter set of each target pixel point includes at least two gradient parameters, quantizing, for each target pixel point, the weighted average of the largest and second-largest gradient parameters in its parameter set, and determining the quantized result as the texture strength corresponding to that target pixel point.
10. The method according to claim 1, characterized in that determining the texture direction and texture strength corresponding to each of the target pixel points comprises:
obtaining a first gradient parameter and a second gradient parameter of each target pixel point, the first gradient parameter being the gradient parameter of the target pixel point in a first preset direction and the second gradient parameter being the gradient parameter of the target pixel point in a second preset direction, the first preset direction being different from the second preset direction;
calculating a reference angle of each target pixel point according to its first and second gradient parameters, the reference angle being the angle between the actual texture direction of the target pixel point and the first preset direction or the second preset direction;
comparing the reference angle of each target pixel point with at least one preselected reference angle, and determining the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to that target pixel point, wherein a preselected reference angle is the angle between its corresponding preselected direction and the first preset direction or the second preset direction;
quantizing the larger of the first and second gradient parameters of each target pixel point to obtain the texture strength corresponding to the target pixel point, or quantizing the weighted average of the first and second gradient parameters of each target pixel point to obtain the texture strength corresponding to the target pixel point.
11. A visual entropy acquisition device, characterized in that the device comprises:
an acquisition module, configured to obtain at least two target pixel points in a specified image region;
a first determining module, configured to determine a texture direction and a texture strength corresponding to each of the target pixel points;
a second determining module, configured to calculate a joint entropy of the texture directions and texture strengths corresponding to all target pixel points in the specified image region, and take the joint entropy as the visual entropy of the specified image region;
or the second determining module is configured to calculate a weighted average of the entropy of the texture directions corresponding to all target pixel points in the specified image region and the entropy of the texture strengths corresponding to all target pixel points in the specified image region, and take the weighted average as the visual entropy of the specified image region.
12. The device according to claim 11, characterized in that the first determining module comprises:
a first acquisition unit, configured to obtain a parameter set of each target pixel point, the parameter set including gradient parameters of the target pixel point in at least one preset direction, wherein one preset direction corresponds to one gradient parameter;
a determining unit, configured to determine the texture direction corresponding to each target pixel point according to its parameter set;
a first quantizing unit, configured to quantize the gradient parameters included in the parameter set of each target pixel point to obtain the texture strength corresponding to the target pixel point.
13. The device according to claim 12, characterized in that the determining unit comprises:
a first comparing subunit, configured to compare the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
a first determination subunit, configured to, if all gradient parameters included in the parameter set of a target pixel point are smaller than the preset threshold, determine the flat direction or any one of the preset directions as the texture direction corresponding to that target pixel point;
a second determination subunit, configured to, if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determine the preset direction corresponding to the largest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point.
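The threshold-and-maximum rule of claim 13 can be sketched as below. This is an illustrative reading only: the `"flat"` label and the `perpendicular` lookup table are hypothetical names, and the claim leaves the threshold value and the set of preset directions open.

```python
FLAT = "flat"  # hypothetical label for the flat (textureless) direction

def texture_direction(gradients, threshold, perpendicular=None):
    """Determine a texture direction from per-direction gradient parameters.

    gradients: dict mapping preset-direction name -> gradient parameter.
    If every gradient parameter is below the threshold, the point is flat;
    otherwise the direction of the largest gradient parameter is chosen,
    or optionally the direction perpendicular to it (supplied as a dict
    mapping each preset direction to its perpendicular direction).
    """
    if all(g < threshold for g in gradients.values()):
        return FLAT
    strongest = max(gradients, key=gradients.get)
    if perpendicular is not None:
        return perpendicular[strongest]
    return strongest
```

Note that the gradient is strongest *across* an edge, which is why the claim offers the perpendicular alternative: the visible texture stripe runs perpendicular to the direction of greatest intensity change.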
14. The device according to claim 12, characterized in that the determining unit comprises:
a second comparing subunit, configured to compare the gradient parameters included in the parameter set of each target pixel point with a preset threshold;
a third determination subunit, configured to, if all gradient parameters included in the parameter set of a target pixel point are smaller than the preset threshold, determine a pre-specified default direction as the texture direction corresponding to that target pixel point, the default direction being the flat direction or any one of the at least one preset direction;
a fourth determination subunit, configured to, if at least one gradient parameter in the parameter set of the target pixel point is greater than the preset threshold, determine the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point, or determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set as the texture direction corresponding to that target pixel point.
15. The device according to claim 12, characterized in that the determining unit comprises:
an obtaining subunit, configured to obtain the neighborhood pixel points of each target pixel point, the neighborhood pixel points being the pixel points within a preset region centered on that target pixel point;
a forming subunit, configured to form a target area from each target pixel point and its neighborhood pixel points, and to obtain the pixel values of all pixel points in each target area;
a fifth determination subunit, configured to determine, in each target area, the pixel point corresponding to the maximum pixel value and the pixel point corresponding to the minimum pixel value; and, among the angles formed between each preset direction and the line connecting those two pixel points, determine the preset direction forming the smallest angle as the texture direction corresponding to that target pixel point.
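The neighborhood rule of claim 15 can be sketched as follows. This is a hedged illustration under stated assumptions: pixel positions are `(x, y)` tuples, preset directions are given as angles in degrees, and directions are treated as undirected lines (an angle difference is folded into [0°, 90°]); none of these representational choices are specified by the claim.

```python
import math

def direction_from_extremes(area, preset_angles):
    """area: dict mapping (x, y) -> pixel value for the target area.
    preset_angles: dict mapping preset-direction name -> angle in degrees.
    Returns the preset direction closest in angle to the line joining the
    maximum-valued and minimum-valued pixel points of the area."""
    x1, y1 = max(area, key=area.get)   # pixel point with the maximum value
    x2, y2 = min(area, key=area.get)   # pixel point with the minimum value
    line_angle = math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0

    def angle_between(a):
        # smallest angle between two undirected lines
        d = abs(line_angle - a) % 180.0
        return min(d, 180.0 - d)

    return min(preset_angles, key=lambda name: angle_between(preset_angles[name]))
```

For a 3x2 area whose brightest and darkest pixels sit on the same row, the connecting line is horizontal, so a horizontal preset direction wins.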
16. The device according to claim 12, characterized in that the determining unit is configured to determine the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or to determine the direction perpendicular to the preset direction corresponding to the largest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
17. The device according to claim 12, characterized in that the determining unit is configured to determine the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point, or to determine the direction perpendicular to the preset direction corresponding to the smallest gradient parameter in the parameter set of each target pixel point as the texture direction corresponding to that target pixel point.
18. The device according to claim 12, characterized in that the first quantizing unit is configured to quantize the largest gradient parameter in the parameter set of each target pixel point, and to determine the quantized value as the texture strength corresponding to that target pixel point.
19. The device according to claim 12, characterized in that the first quantizing unit is configured to, when the parameter set of each target pixel point includes at least two gradient parameters, quantize the weighted average of the largest and second-largest gradient parameters in the parameter set of each target pixel point, and to determine the quantized result as the texture strength corresponding to that target pixel point.
20. The device according to claim 11, characterized in that the first determining module comprises:
a second acquisition unit, configured to obtain the first gradient parameter and the second gradient parameter of each target pixel point, the first gradient parameter being the gradient parameter of the target pixel point in a first preset direction and the second gradient parameter being the gradient parameter of the target pixel point in a second preset direction, the first preset direction being different from the second preset direction;
a computing unit, configured to calculate the reference angle of each target pixel point according to its first gradient parameter and second gradient parameter, the reference angle being the angle between the actual texture direction of the target pixel point and the first preset direction or the second preset direction;
a comparison determining unit, configured to compare the reference angle of each target pixel point with at least one preselected reference angle, and to determine the preselected direction corresponding to the preselected reference angle with the smallest difference as the texture direction corresponding to that target pixel point; each preselected reference angle being the angle between its corresponding preselected direction and the first preset direction or the second preset direction;
a second quantizing unit, configured to quantize the maximum of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to that target pixel point, or to quantize the weighted average of the first gradient parameter and the second gradient parameter of each target pixel point to obtain the texture strength corresponding to that target pixel point.
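The units of claim 20 can be sketched as below. This is an illustrative reading only, assuming the two preset directions are orthogonal (e.g. horizontal and vertical, so `atan2` of the two gradients gives the reference angle); the quantization parameters `levels`, `g_max`, and the weights `w` are hypothetical values the claim does not specify.

```python
import math

def reference_angle(g1, g2):
    """Angle (degrees) of the estimated texture direction relative to the
    first preset direction, from two orthogonal gradient parameters."""
    return math.degrees(math.atan2(g2, g1))

def nearest_preselected(angle, preselected):
    """preselected: dict mapping preselected-direction name -> its reference
    angle in degrees.  Returns the direction with the smallest difference."""
    return min(preselected, key=lambda name: abs(preselected[name] - angle))

def texture_strength(g1, g2, levels=8, g_max=255.0, use_max=True, w=(0.5, 0.5)):
    """Quantize either max(g1, g2) or their weighted average into `levels`
    uniform bins over [0, g_max]."""
    value = max(g1, g2) if use_max else w[0] * g1 + w[1] * g2
    return min(int(value / g_max * levels), levels - 1)
```

Equal horizontal and vertical gradients give a 45° reference angle, which then snaps to whichever preselected direction lies closest.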
CN201410105824.5A 2014-03-20 2014-03-20 A kind of Vision Entropy acquisition methods and device Active CN104933736B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410105824.5A CN104933736B (en) 2014-03-20 2014-03-20 A kind of Vision Entropy acquisition methods and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410105824.5A CN104933736B (en) 2014-03-20 2014-03-20 A kind of Vision Entropy acquisition methods and device

Publications (2)

Publication Number Publication Date
CN104933736A CN104933736A (en) 2015-09-23
CN104933736B true CN104933736B (en) 2018-01-19

Family

ID=54120889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410105824.5A Active CN104933736B (en) 2014-03-20 2014-03-20 A kind of Vision Entropy acquisition methods and device

Country Status (1)

Country Link
CN (1) CN104933736B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101325707A (en) * 2007-06-12 2008-12-17 浙江大学 System for encoding and decoding texture self-adaption video

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102542303A (en) * 2010-12-24 2012-07-04 富士通株式会社 Device and method for generating classifier of specified object in detection image

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101325707A (en) * 2007-06-12 2008-12-17 浙江大学 System for encoding and decoding texture self-adaption video

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
An image segmentation and compression method based on visual entropy; Shan Zhiguang et al.; Journal of University of Science and Technology Beijing; 2000-04-30; Vol. 22, No. 2; pp. 185–189 *
Block-adaptive compressed sensing of images using texture information; Wang Rongfang et al.; Acta Electronica Sinica; 2013-08-31; Vol. 41, No. 8; pp. 1506–1514 *
A computational model of visual attention based on visual entropy; Dou Yan et al.; Acta Optica Sinica; 2009-09-30; Vol. 29, No. 9; pp. 2511–2515 *

Also Published As

Publication number Publication date
CN104933736A (en) 2015-09-23

Similar Documents

Publication Publication Date Title
Koenderink et al. Relief: Pictorial and otherwise
US6868191B2 (en) System and method for median fusion of depth maps
CN102682446B (en) Adaptive combined two-sided filter is used to generate equipment and the method for dense depth map
CN102663747B (en) Stereo image objectivity quality evaluation method based on visual perception
KR101001086B1 (en) Method and apparatus for modeling film grain patterns in the frequency domain
CN109118470B (en) Image quality evaluation method and device, terminal and server
CN106296669B (en) A kind of image quality evaluating method and device
CN108109147A (en) A kind of reference-free quality evaluation method of blurred picture
CN110062234A (en) A kind of perception method for video coding based on the just discernable distortion in region
CN113822982A (en) Human body three-dimensional model construction method and device, electronic equipment and storage medium
WO2021248966A1 (en) Point cloud quality assessment method, encoder, decoder, and storage medium
CN102595185A (en) Stereo image quality objective evaluation method
CN112784874B (en) Binocular vision stereo matching method and device, electronic equipment and storage medium
CN101976444A (en) Pixel type based objective assessment method of image quality by utilizing structural similarity
CN102740114A (en) Non-parameter evaluation method for subjective quality of video
WO2010102913A1 (en) Blur measurement in a block-based compressed image
CN110674925B (en) No-reference VR video quality evaluation method based on 3D convolutional neural network
CN114792345B (en) Calibration method based on monocular structured light system
CN103281537A (en) Compression method and device for dynamic range of image
WO2021102948A1 (en) Image processing method and device
CN111709977A (en) Binocular depth learning method based on adaptive unimodal stereo matching cost filtering
CN104933736B (en) A kind of Vision Entropy acquisition methods and device
CN102982532B (en) Stereo image objective quality evaluation method base on matrix decomposition
Hall Subjective evaluation of a perceptual quality metric
CN105430397B (en) A kind of 3D rendering Quality of experience Forecasting Methodology and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant