CN102903107B - Three-dimensional picture quality objective evaluation method based on feature fusion - Google Patents

Three-dimensional picture quality objective evaluation method based on feature fusion

Info

Publication number
CN102903107B
CN102903107B CN201210357956.8A CN201210357956A
Authority
CN
China
Prior art keywords
org
pixel
dis
picture
coordinate position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210357956.8A
Other languages
Chinese (zh)
Other versions
CN102903107A (en)
Inventor
邵枫
段芬芳
蒋刚毅
郁梅
李福翠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Qizhen Information Technology Service Co.,Ltd.
Original Assignee
Ningbo University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo University filed Critical Ningbo University
Priority to CN201210357956.8A priority Critical patent/CN102903107B/en
Publication of CN102903107A publication Critical patent/CN102903107A/en
Application granted granted Critical
Publication of CN102903107B publication Critical patent/CN102903107B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an objective quality evaluation method for stereoscopic images based on feature fusion. In the method, the cyclopean image of the original undistorted stereoscopic image and the cyclopean image of the distorted stereoscopic image to be evaluated are first computed. The mean and standard deviation of the pixel values around each pixel in the two cyclopean images are then calculated to obtain an objective evaluation metric value for each pixel in the cyclopean image of the distorted stereoscopic image to be evaluated. Next, the saliency maps of the two cyclopean images and the distortion map between them are computed and used to fuse the per-pixel objective evaluation metric values, yielding the objective image quality evaluation prediction value of the distorted stereoscopic image to be evaluated. The method has the advantages that the cyclopean images obtained in this way simulate the binocular stereoscopic fusion process well, and that fusing the saliency maps and the distortion map effectively improves the correlation between the objective evaluation results and subjective perception.

Description

An objective quality evaluation method for stereoscopic images based on feature fusion
Technical field
The present invention relates to an image quality evaluation method, and in particular to an objective quality evaluation method for stereoscopic images based on feature fusion.
Background technology
With the rapid development of image coding and stereoscopic display technology, stereoscopic image technology has received increasingly wide attention and application and has become a current research hotspot. Stereoscopic image technology exploits the binocular parallax principle of the human eye: the two eyes independently receive the left and right viewpoint images of the same scene, and the brain fuses them into binocular parallax, so that a stereoscopic image with a sense of depth and realism is perceived. Owing to the acquisition system and to storage, compression and transmission equipment, a stereoscopic image inevitably suffers a series of distortions, and compared with a single-channel image it must guarantee the image quality of both channels at the same time, so quality assessment of stereoscopic images is of great significance. However, effective objective methods for evaluating stereoscopic image quality are currently lacking. Therefore, establishing an effective objective stereoscopic image quality evaluation model is of great importance.
Existing objective quality evaluation methods for stereoscopic images simply apply planar image quality evaluation methods directly to stereoscopic image quality. However, the process by which the left and right viewpoint images of a stereoscopic image are fused to produce depth perception is not a simple superposition of the two viewpoint images, and it is difficult to describe with a simple mathematical model. Therefore, how to effectively simulate binocular stereoscopic fusion within the evaluation process, and how to extract effective feature information for fusing the evaluation results so that the objective results better conform to the human visual system, are problems that need to be studied and solved in the objective quality evaluation of stereoscopic images.
Summary of the invention
The technical problem to be solved by the present invention is to provide an objective quality evaluation method for stereoscopic images based on feature fusion that can effectively improve the correlation between objective evaluation results and subjective perception.
The technical solution adopted by the present invention to solve the above technical problem is an objective quality evaluation method for stereoscopic images based on feature fusion, characterized in that its processing procedure is as follows. First, according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the original undistorted stereoscopic image, together with the disparity image between those left and right viewpoint images, the cyclopean map of the original undistorted stereoscopic image is obtained; likewise, according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the distorted stereoscopic image to be evaluated, together with the disparity image between the left and right viewpoint images of the original undistorted stereoscopic image, the cyclopean map of the distorted stereoscopic image to be evaluated is obtained. Secondly, from the mean and standard deviation of the pixel values of each pixel in the two cyclopean maps, the objective evaluation metric value of each pixel in the cyclopean map of the distorted stereoscopic image to be evaluated is obtained. Then, from the amplitude and phase of the cyclopean map of the original undistorted stereoscopic image, the corresponding saliency map is obtained, and from the amplitude and phase of the cyclopean map of the distorted stereoscopic image to be evaluated, the corresponding saliency map is obtained. Next, according to the two saliency maps and the distortion map between the two cyclopean maps, the objective evaluation metric values of the pixels in the cyclopean map of the distorted stereoscopic image to be evaluated are fused to obtain the objective image quality evaluation prediction value of the distorted stereoscopic image to be evaluated. Finally, the objective image quality evaluation prediction values of distorted stereoscopic images with different distortion types and distortion levels are obtained according to the above processing procedure.
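As an editorial illustration only (nothing in this sketch is part of the patent disclosure, and all function names are placeholders introduced here), the processing procedure described above can be read as the following Python outline; the helper functions are sketched step by step alongside the embodiment further below.

```python
def evaluate_distorted_stereo_pair(L_org, R_org, L_dis, R_dis):
    """Overall pipeline sketch. The helpers (log_gabor_amplitude,
    block_matching_disparity, cyclopean_map, per_pixel_quality,
    spectral_residual_saliency, pooled_quality) are the illustrative
    per-step sketches given with the embodiment below; none of these
    names come from the patent itself."""
    # Step 2: filter-response amplitudes, disparity of the original pair, cyclopean maps
    GE_L_org, GE_R_org = log_gabor_amplitude(L_org), log_gabor_amplitude(R_org)
    GE_L_dis, GE_R_dis = log_gabor_amplitude(L_dis), log_gabor_amplitude(R_dis)
    d_org = block_matching_disparity(L_org, R_org)
    CM_org = cyclopean_map(L_org, R_org, GE_L_org, GE_R_org, d_org)
    CM_dis = cyclopean_map(L_dis, R_dis, GE_L_dis, GE_R_dis, d_org)
    # Step 3: per-pixel objective evaluation metric Q_image
    Q_image = per_pixel_quality(CM_org, CM_dis)
    # Step 4: saliency maps of the two cyclopean maps
    SM_org = spectral_residual_saliency(CM_org)
    SM_dis = spectral_residual_saliency(CM_dis)
    # Steps 5-6: distortion map and feature-fusion pooling into the final score Q
    return pooled_quality(Q_image, SM_org, SM_dis, CM_org, CM_dis)
```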
The concrete steps of the feature-fusion-based objective quality evaluation method for stereoscopic images of the present invention are:
① Let S_org be the original undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated; denote the left and right viewpoint images of S_org as {L_org(x,y)} and {R_org(x,y)}, and the left and right viewpoint images of S_dis as {L_dis(x,y)} and {R_dis(x,y)}, where (x,y) denotes the coordinate position of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W and H denote the width and height of the left and right viewpoint images, and L_org(x,y), R_org(x,y), L_dis(x,y) and R_dis(x,y) denote the pixel values at coordinate position (x,y) in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, respectively;
② According to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, obtain the amplitude of each pixel in these four images; then, from the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}, and from the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}, where CM_org(x,y) and CM_dis(x,y) denote the pixel values at coordinate position (x,y) in {CM_org(x,y)} and {CM_dis(x,y)};
③ According to the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x,y) in {CM_dis(x,y)} is denoted Q_image(x,y);
④ According to the amplitude and phase of {CM_org(x,y)}, calculate the saliency map of {CM_org(x,y)}, denoted {SM_org(x,y)}, and according to the amplitude and phase of {CM_dis(x,y)}, calculate the saliency map of {CM_dis(x,y)}, denoted {SM_dis(x,y)}, where SM_org(x,y) and SM_dis(x,y) denote the pixel values at coordinate position (x,y) in {SM_org(x,y)} and {SM_dis(x,y)};
⑤ Calculate the distortion map between {CM_org(x,y)} and {CM_dis(x,y)}, denoted {DM(x,y)}; the pixel value at coordinate position (x,y) in {DM(x,y)} is denoted DM(x,y), DM(x,y) = (CM_org(x,y) − CM_dis(x,y))²;
⑥ According to {SM_org(x,y)}, {SM_dis(x,y)} and {DM(x,y)}, fuse the objective evaluation metric values of the pixels in {CM_dis(x,y)} to obtain the objective image quality evaluation prediction value of S_dis, denoted Q:
Q = [ Σ_{(x,y)∈Ω} Q_image(x,y) × SM(x,y) / Σ_{(x,y)∈Ω} SM(x,y) ]^γ × [ Σ_{(x,y)∈Ω} Q_image(x,y) × DM(x,y) / Σ_{(x,y)∈Ω} DM(x,y) ]^β,
where Ω denotes the pixel domain, SM(x,y) = max(SM_org(x,y), SM_dis(x,y)), max() is the maximum function, and γ and β are weight coefficients;
⑦ Take n original undistorted stereoscopic images and establish a set of distorted stereoscopic images under different distortion types and distortion levels; this set contains several distorted stereoscopic images. Use a subjective quality assessment method to obtain the difference mean opinion score of each distorted stereoscopic image in the set, denoted DMOS, DMOS = 100 − MOS, where MOS denotes the mean opinion score, DMOS ∈ [0, 100], n ≥ 1;
⑧ Following the operations of steps ① to ⑥ for calculating the objective image quality evaluation prediction value of S_dis, calculate the objective image quality evaluation prediction value of each distorted stereoscopic image in the set of distorted stereoscopic images.
The detailed process of step ② is as follows:
②-1. Filter {L_org(x,y)} to obtain the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each of its pixels; the even-symmetric frequency response of the pixel at coordinate position (x,y) at scale α and orientation θ is denoted e_{α,θ}(x,y), and the odd-symmetric frequency response is denoted o_{α,θ}(x,y), where α denotes the scale factor of the filter used, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter used, 1 ≤ θ ≤ 4;
②-2. From the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, calculate the amplitude of each pixel in {L_org(x,y)}; the amplitude of the pixel at coordinate position (x,y) is denoted GE_org^L(x,y), GE_org^L(x,y) = Σ_{θ=1}^{4} Σ_{α=1}^{4} √( e_{α,θ}(x,y)² + o_{α,θ}(x,y)² );
②-3. Following the operations of steps ②-1 to ②-2 for obtaining the amplitude of each pixel in {L_org(x,y)}, obtain in the same manner the amplitude of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}; the amplitudes of the pixels at coordinate position (x,y) in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} are denoted GE_org^R(x,y), GE_dis^L(x,y) and GE_dis^R(x,y), respectively;
②-4. Use a block matching algorithm to calculate the disparity image between {L_org(x,y)} and {R_org(x,y)}, denoted {d_org^L(x,y)}, where d_org^L(x,y) denotes the pixel value at coordinate position (x,y) in {d_org^L(x,y)};
②-5. From the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}; the pixel value at coordinate position (x,y) in {CM_org(x,y)} is denoted CM_org(x,y), CM_org(x,y) = [ GE_org^L(x,y) × L_org(x,y) + GE_org^R(x − d_org^L(x,y), y) × R_org(x − d_org^L(x,y), y) ] / [ GE_org^L(x,y) + GE_org^R(x − d_org^L(x,y), y) ], where GE_org^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_org(x,y)}, and R_org(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_org(x,y)};
②-6. From the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}; the pixel value at coordinate position (x,y) in {CM_dis(x,y)} is denoted CM_dis(x,y), CM_dis(x,y) = [ GE_dis^L(x,y) × L_dis(x,y) + GE_dis^R(x − d_org^L(x,y), y) × R_dis(x − d_org^L(x,y), y) ] / [ GE_dis^L(x,y) + GE_dis^R(x − d_org^L(x,y), y) ], where GE_dis^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_dis(x,y)}, and R_dis(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_dis(x,y)}.
The filter used to filter {L_org(x,y)} in step ②-1 is a log-Gabor filter.
The detailed process of step ③ is as follows:
③-1. Calculate the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}; the mean and standard deviation for the pixel at coordinate position (x1,y1) in {CM_org(x,y)} are denoted μ_org(x1,y1) and σ_org(x1,y1), and those for the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} are denoted μ_dis(x1,y1) and σ_dis(x1,y1):
μ_org(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_org(x2,y2) / M,
σ_org(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_org(x2,y2) − μ_org(x1,y1))² / M ),
μ_dis(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_dis(x2,y2) / M,
σ_dis(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_dis(x2,y2) − μ_dis(x1,y1))² / M ),
where 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H, N(x1,y1) denotes the 8 × 8 neighborhood window centered on the pixel at coordinate position (x1,y1), M denotes the number of pixels in N(x1,y1), and CM_org(x2,y2) and CM_dis(x2,y2) denote the pixel values of the pixel at coordinate position (x2,y2) in {CM_org(x,y)} and {CM_dis(x,y)};
③-2. From the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} is denoted Q_image(x1,y1), Q_image(x1,y1) = [ 4 × (μ_org(x1,y1) × μ_dis(x1,y1)) × (σ_org(x1,y1) × σ_dis(x1,y1)) + C ] / [ (μ_org(x1,y1)² + μ_dis(x1,y1)²) × (σ_org(x1,y1)² + σ_dis(x1,y1)²) + C ], where C is a control parameter.
The detailed process of step ④ is as follows:
④-1. Apply a discrete Fourier transform to {CM_org(x,y)} to obtain its amplitude and phase, denoted {M_org(u,v)} and {A_org(u,v)} respectively, where u and v denote the coordinates along the width and height of the transform-domain amplitude and phase, 1 ≤ u ≤ W, 1 ≤ v ≤ H, M_org(u,v) denotes the amplitude at coordinate position (u,v) in {M_org(u,v)}, and A_org(u,v) denotes the phase value at coordinate position (u,v) in {A_org(u,v)};
④-2. Calculate the amplitude of the high-frequency component of {M_org(u,v)}, denoted {R_org(u,v)}; the amplitude of the high-frequency component at coordinate position (u,v) is denoted R_org(u,v), R_org(u,v) = log(M_org(u,v)) − h_m(u,v) * log(M_org(u,v)), where log() is the natural logarithm (base e, e = 2.718281828…), "*" is the convolution operator, and h_m(u,v) denotes an m × m mean filter;
④-3. Apply an inverse discrete Fourier transform to {R_org(u,v)} and {A_org(u,v)} and take the resulting inverse-transform image as the saliency map of {CM_org(x,y)}, denoted {SM_org(x,y)}, where SM_org(x,y) denotes the pixel value at coordinate position (x,y) in {SM_org(x,y)};
④-4. Following the operations of steps ④-1 to ④-3 for obtaining the saliency map of {CM_org(x,y)}, obtain in the same manner the saliency map of {CM_dis(x,y)}, denoted {SM_dis(x,y)}, where SM_dis(x,y) denotes the pixel value at coordinate position (x,y) in {SM_dis(x,y)}.
Compared with the prior art, the invention has the following advantages:
1) By computing the cyclopean map of the original undistorted stereoscopic image and that of the distorted stereoscopic image to be evaluated, and evaluating the cyclopean map of the distorted stereoscopic image directly, the method effectively simulates the binocular stereoscopic fusion process and avoids linearly weighting separate objective evaluation metric values of the left and right viewpoint images.
2) By computing the saliency maps of the two cyclopean maps and the distortion map between them, and using them to fuse the objective evaluation metric values of the pixels in the cyclopean map of the distorted stereoscopic image to be evaluated, the evaluation results conform better to the human visual system, which effectively improves the correlation between the objective evaluation results and subjective perception.
Accompanying drawing explanation
Fig. 1 is the overall implementation block diagram of the method of the invention;
Fig. 2a is the left viewpoint image of the Akko stereoscopic image (size 640 × 480);
Fig. 2b is the right viewpoint image of the Akko stereoscopic image (size 640 × 480);
Fig. 3a is the left viewpoint image of the Altmoabit stereoscopic image (size 1024 × 768);
Fig. 3b is the right viewpoint image of the Altmoabit stereoscopic image (size 1024 × 768);
Fig. 4a is the left viewpoint image of the Balloons stereoscopic image (size 1024 × 768);
Fig. 4b is the right viewpoint image of the Balloons stereoscopic image (size 1024 × 768);
Fig. 5a is the left viewpoint image of the Doorflower stereoscopic image (size 1024 × 768);
Fig. 5b is the right viewpoint image of the Doorflower stereoscopic image (size 1024 × 768);
Fig. 6a is the left viewpoint image of the Kendo stereoscopic image (size 1024 × 768);
Fig. 6b is the right viewpoint image of the Kendo stereoscopic image (size 1024 × 768);
Fig. 7a is the left viewpoint image of the LeaveLaptop stereoscopic image (size 1024 × 768);
Fig. 7b is the right viewpoint image of the LeaveLaptop stereoscopic image (size 1024 × 768);
Fig. 8a is the left viewpoint image of the Lovebierd1 stereoscopic image (size 1024 × 768);
Fig. 8b is the right viewpoint image of the Lovebierd1 stereoscopic image (size 1024 × 768);
Fig. 9a is the left viewpoint image of the Newspaper stereoscopic image (size 1024 × 768);
Fig. 9b is the right viewpoint image of the Newspaper stereoscopic image (size 1024 × 768);
Fig. 10a is the left viewpoint image of the Puppy stereoscopic image (size 720 × 480);
Fig. 10b is the right viewpoint image of the Puppy stereoscopic image (size 720 × 480);
Fig. 11a is the left viewpoint image of the Soccer2 stereoscopic image (size 720 × 480);
Fig. 11b is the right viewpoint image of the Soccer2 stereoscopic image (size 720 × 480);
Fig. 12a is the left viewpoint image of the Horse stereoscopic image (size 720 × 480);
Fig. 12b is the right viewpoint image of the Horse stereoscopic image (size 720 × 480);
Fig. 13a is the left viewpoint image of the Xmas stereoscopic image (size 640 × 480);
Fig. 13b is the right viewpoint image of the Xmas stereoscopic image (size 640 × 480);
Fig. 14 is the scatter plot of the objective image quality evaluation prediction values versus the difference mean opinion scores for the distorted stereoscopic images in the distorted stereoscopic image set.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings and embodiments.
The objective quality evaluation method for stereoscopic images based on feature fusion proposed by the present invention, whose overall implementation block diagram is shown in Fig. 1, has the following processing procedure. First, according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the original undistorted stereoscopic image, together with the disparity image between those left and right viewpoint images, the cyclopean map of the original undistorted stereoscopic image is obtained; likewise, according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the distorted stereoscopic image to be evaluated, together with the disparity image between the left and right viewpoint images of the original undistorted stereoscopic image, the cyclopean map of the distorted stereoscopic image to be evaluated is obtained. Secondly, from the mean and standard deviation of the pixel values of each pixel in the two cyclopean maps, the objective evaluation metric value of each pixel in the cyclopean map of the distorted stereoscopic image to be evaluated is obtained. Then, from the amplitude and phase of the cyclopean map of the original undistorted stereoscopic image, the corresponding saliency map is obtained, and from the amplitude and phase of the cyclopean map of the distorted stereoscopic image to be evaluated, the corresponding saliency map is obtained. Next, according to the two saliency maps and the distortion map between the two cyclopean maps, the objective evaluation metric values of the pixels in the cyclopean map of the distorted stereoscopic image to be evaluated are fused to obtain the objective image quality evaluation prediction value of the distorted stereoscopic image to be evaluated. Finally, the objective image quality evaluation prediction values of distorted stereoscopic images with different distortion types and distortion levels are obtained according to the above processing procedure.
The method of the invention specifically comprises the following steps:
① Let S_org be the original undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated; denote the left and right viewpoint images of S_org as {L_org(x,y)} and {R_org(x,y)}, and the left and right viewpoint images of S_dis as {L_dis(x,y)} and {R_dis(x,y)}, where (x,y) denotes the coordinate position of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W and H denote the width and height of the left and right viewpoint images, and L_org(x,y), R_org(x,y), L_dis(x,y) and R_dis(x,y) denote the pixel values at coordinate position (x,y) in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, respectively.
② According to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, obtain the amplitude of each pixel in these four images; then, from the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}, and from the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}, where CM_org(x,y) and CM_dis(x,y) denote the pixel values at coordinate position (x,y) in {CM_org(x,y)} and {CM_dis(x,y)}.
In the present embodiment, the detailed process of step ② is as follows:
②-1. Filter {L_org(x,y)} to obtain the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each of its pixels; the even-symmetric frequency response of the pixel at coordinate position (x,y) at scale α and orientation θ is denoted e_{α,θ}(x,y), and the odd-symmetric frequency response is denoted o_{α,θ}(x,y), where α denotes the scale factor of the filter used, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter used, 1 ≤ θ ≤ 4.
Here, the filter used to filter {L_org(x,y)} is a log-Gabor filter.
②-2. From the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, calculate the amplitude of each pixel in {L_org(x,y)}; the amplitude of the pixel at coordinate position (x,y) is denoted GE_org^L(x,y), GE_org^L(x,y) = Σ_{θ=1}^{4} Σ_{α=1}^{4} √( e_{α,θ}(x,y)² + o_{α,θ}(x,y)² ).
②-3. Following the operations of steps ②-1 to ②-2 for obtaining the amplitude of each pixel in {L_org(x,y)}, obtain in the same manner the amplitude of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}; the amplitudes of the pixels at coordinate position (x,y) in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} are denoted GE_org^R(x,y), GE_dis^L(x,y) and GE_dis^R(x,y), respectively. For example, the amplitude of each pixel in {L_dis(x,y)} is obtained as follows: 1) filter {L_dis(x,y)} to obtain the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each of its pixels, denoted e′_{α,θ}(x,y) and o′_{α,θ}(x,y) for the pixel at coordinate position (x,y), where α denotes the scale factor of the filter used, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter used, 1 ≤ θ ≤ 4; 2) from these responses, calculate the amplitude of each pixel in {L_dis(x,y)}; the amplitude of the pixel at coordinate position (x,y) is denoted GE_dis^L(x,y), GE_dis^L(x,y) = Σ_{θ=1}^{4} Σ_{α=1}^{4} √( e′_{α,θ}(x,y)² + o′_{α,θ}(x,y)² ).
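As an illustration of steps ②-1 to ②-3, a frequency-domain log-Gabor filter bank with 4 scales and 4 orientations could be implemented roughly as sketched below; the patent does not specify the filter's centre frequencies or bandwidths, so those parameter values are assumptions.

```python
import numpy as np

def log_gabor_amplitude(img, n_scales=4, n_orients=4,
                        min_wavelength=3.0, mult=2.0,
                        sigma_f=0.55, sigma_theta=0.4):
    """Amplitude GE(x,y): sum over 4 scales and 4 orientations of
    sqrt(e^2 + o^2), where e and o are the even- and odd-symmetric
    responses (real and imaginary parts of the complex log-Gabor
    response). The filter parameters are illustrative assumptions."""
    H, W = img.shape
    fy = np.fft.fftfreq(H)[:, None]
    fx = np.fft.fftfreq(W)[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    radius[0, 0] = 1.0                       # avoid log(0) at the DC term
    theta = np.arctan2(-fy, fx)
    spectrum = np.fft.fft2(img.astype(np.float64))
    GE = np.zeros((H, W))
    for s in range(n_scales):
        f0 = 1.0 / (min_wavelength * mult ** s)          # centre frequency of scale s
        radial = np.exp(-(np.log(radius / f0)) ** 2 / (2 * np.log(sigma_f) ** 2))
        radial[0, 0] = 0.0                               # zero out the DC component
        for t in range(n_orients):
            angle = t * np.pi / n_orients
            dtheta = np.arctan2(np.sin(theta - angle), np.cos(theta - angle))
            angular = np.exp(-dtheta ** 2 / (2 * sigma_theta ** 2))
            resp = np.fft.ifft2(spectrum * radial * angular)
            even, odd = resp.real, resp.imag             # even / odd symmetric responses
            GE += np.sqrt(even ** 2 + odd ** 2)
    return GE
```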
②-4. Use a block matching algorithm to calculate the disparity image between {L_org(x,y)} and {R_org(x,y)}, denoted {d_org^L(x,y)}, where d_org^L(x,y) denotes the pixel value at coordinate position (x,y) in {d_org^L(x,y)}.
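The patent names a block matching algorithm for step ②-4 without fixing its parameters; a minimal sum-of-absolute-differences search over horizontal shifts, with an assumed block size and search range, might look like this.

```python
import numpy as np

def block_matching_disparity(L, R, block=8, max_disp=32):
    """Minimal SAD block matching: for each block of the left image, search
    horizontal shifts d in [0, max_disp] in the right image and keep the d
    with the smallest sum of absolute differences. Block size and search
    range are assumptions, not values from the patent."""
    H, W = L.shape
    disp = np.zeros((H, W), dtype=np.int32)
    for y in range(0, H - block + 1, block):
        for x in range(0, W - block + 1, block):
            ref = L[y:y + block, x:x + block].astype(np.float64)
            best_d, best_cost = 0, np.inf
            for d in range(0, min(max_disp, x) + 1):   # right view is sampled at x - d
                cand = R[y:y + block, x - d:x - d + block].astype(np.float64)
                cost = np.abs(ref - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y:y + block, x:x + block] = best_d
    return disp
```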
②-5. From the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}; the pixel value at coordinate position (x,y) in {CM_org(x,y)} is denoted CM_org(x,y), CM_org(x,y) = [ GE_org^L(x,y) × L_org(x,y) + GE_org^R(x − d_org^L(x,y), y) × R_org(x − d_org^L(x,y), y) ] / [ GE_org^L(x,y) + GE_org^R(x − d_org^L(x,y), y) ], where GE_org^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_org(x,y)}, and R_org(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_org(x,y)}.
②-6. From the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}; the pixel value at coordinate position (x,y) in {CM_dis(x,y)} is denoted CM_dis(x,y), CM_dis(x,y) = [ GE_dis^L(x,y) × L_dis(x,y) + GE_dis^R(x − d_org^L(x,y), y) × R_dis(x − d_org^L(x,y), y) ] / [ GE_dis^L(x,y) + GE_dis^R(x − d_org^L(x,y), y) ], where GE_dis^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_dis(x,y)}, and R_dis(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_dis(x,y)}.
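Steps ②-5 and ②-6 reduce to a pixel-wise weighted fusion of the two views, with the amplitudes as weights and the right view sampled at the disparity-compensated column x − d_org^L(x,y); a direct numpy sketch follows (the eps safeguard is an assumption, not part of the patent text).

```python
import numpy as np

def cyclopean_map(L, R, GE_L, GE_R, disp, eps=1e-12):
    """CM(x,y) = (GE_L(x,y)*L(x,y) + GE_R(x-d,y)*R(x-d,y))
                 / (GE_L(x,y) + GE_R(x-d,y)),  with d = d_org^L(x,y).
    eps guards against a zero denominator."""
    H, W = L.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    xr = np.clip(xs - disp.astype(np.int64), 0, W - 1)   # disparity-compensated column x - d
    R_shift = R[ys, xr].astype(np.float64)
    GE_R_shift = GE_R[ys, xr]
    num = GE_L * L + GE_R_shift * R_shift
    den = GE_L + GE_R_shift + eps
    return num / den
```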
③ According to the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x,y) in {CM_dis(x,y)} is denoted Q_image(x,y), and the set of the objective evaluation metric values of all pixels in {CM_dis(x,y)} is denoted {Q_image(x,y)}.
In the present embodiment, the detailed process of step ③ is as follows:
③-1. Calculate the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}; the mean and standard deviation for the pixel at coordinate position (x1,y1) in {CM_org(x,y)} are denoted μ_org(x1,y1) and σ_org(x1,y1), and those for the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} are denoted μ_dis(x1,y1) and σ_dis(x1,y1):
μ_org(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_org(x2,y2) / M,
σ_org(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_org(x2,y2) − μ_org(x1,y1))² / M ),
μ_dis(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_dis(x2,y2) / M,
σ_dis(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_dis(x2,y2) − μ_dis(x1,y1))² / M ),
where 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H, N(x1,y1) denotes the 8 × 8 neighborhood window centered on the pixel at coordinate position (x1,y1), M denotes the number of pixels in N(x1,y1), and CM_org(x2,y2) and CM_dis(x2,y2) denote the pixel values of the pixel at coordinate position (x2,y2) in {CM_org(x,y)} and {CM_dis(x,y)}.
③-2. From the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} is denoted Q_image(x1,y1), Q_image(x1,y1) = [ 4 × (μ_org(x1,y1) × μ_dis(x1,y1)) × (σ_org(x1,y1) × σ_dis(x1,y1)) + C ] / [ (μ_org(x1,y1)² + μ_dis(x1,y1)²) × (σ_org(x1,y1)² + σ_dis(x1,y1)²) + C ], where C is a control parameter; in the present embodiment, C = 0.01 is used.
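The local statistics of step ③-1 can be approximated with an 8 × 8 sliding mean filter, after which the metric of step ③-2 is an elementwise expression; the sketch below assumes scipy is available and treats window centring at the image borders loosely.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def per_pixel_quality(CM_org, CM_dis, win=8, C=0.01):
    """Q_image = (4*mu_org*mu_dis*sigma_org*sigma_dis + C) /
                 ((mu_org^2 + mu_dis^2) * (sigma_org^2 + sigma_dis^2) + C),
    with mu/sigma the local mean/std over an 8x8 neighborhood window (C = 0.01)."""
    def local_stats(img):
        mu = uniform_filter(img, size=win)
        mu2 = uniform_filter(img * img, size=win)
        sigma = np.sqrt(np.maximum(mu2 - mu * mu, 0.0))   # clip tiny negative values
        return mu, sigma
    mu_o, sig_o = local_stats(CM_org.astype(np.float64))
    mu_d, sig_d = local_stats(CM_dis.astype(np.float64))
    num = 4.0 * mu_o * mu_d * sig_o * sig_d + C
    den = (mu_o ** 2 + mu_d ** 2) * (sig_o ** 2 + sig_d ** 2) + C
    return num / den
```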
④ According to the spectral redundancy property of {CM_org(x,y)}, i.e. its amplitude and phase, calculate the saliency map of {CM_org(x,y)}, denoted {SM_org(x,y)}, and according to the spectral redundancy property of {CM_dis(x,y)}, i.e. its amplitude and phase, calculate the saliency map of {CM_dis(x,y)}, denoted {SM_dis(x,y)}, where SM_org(x,y) and SM_dis(x,y) denote the pixel values at coordinate position (x,y) in {SM_org(x,y)} and {SM_dis(x,y)}.
In the present embodiment, the detailed process of step ④ is as follows:
④-1. Apply a discrete Fourier transform to {CM_org(x,y)} to obtain its amplitude and phase, denoted {M_org(u,v)} and {A_org(u,v)} respectively, where u and v denote the coordinates along the width and height of the transform-domain amplitude and phase, 1 ≤ u ≤ W, 1 ≤ v ≤ H, M_org(u,v) denotes the amplitude at coordinate position (u,v) in {M_org(u,v)}, and A_org(u,v) denotes the phase value at coordinate position (u,v) in {A_org(u,v)}.
④-2. Calculate the amplitude of the high-frequency component of {M_org(u,v)}, denoted {R_org(u,v)}; the amplitude of the high-frequency component at coordinate position (u,v) is denoted R_org(u,v), R_org(u,v) = log(M_org(u,v)) − h_m(u,v) * log(M_org(u,v)), where log() is the natural logarithm (base e, e = 2.718281828…), "*" is the convolution operator, and h_m(u,v) denotes an m × m mean filter; in the present embodiment, m = 3 is used.
④-3. Apply an inverse discrete Fourier transform to {R_org(u,v)} and {A_org(u,v)} and take the resulting inverse-transform image as the saliency map of {CM_org(x,y)}, denoted {SM_org(x,y)}, where SM_org(x,y) denotes the pixel value at coordinate position (x,y) in {SM_org(x,y)}.
④-4. Following the operations of steps ④-1 to ④-3 for obtaining the saliency map of {CM_org(x,y)}, obtain in the same manner the saliency map of {CM_dis(x,y)}, denoted {SM_dis(x,y)}, where SM_dis(x,y) denotes the pixel value at coordinate position (x,y) in {SM_dis(x,y)}.
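Steps ④-1 to ④-4 follow the spectral-residual saliency model; a compact numpy version with m = 3 might read as follows. The squaring of the inverse transform is a common implementation choice and an assumption here, since the patent only says the inverse-transform image is taken as the saliency map.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spectral_residual_saliency(CM, m=3):
    """Saliency via the spectral residual: amplitude M(u,v) and phase A(u,v)
    of the DFT, residual R = log M - h_m * log M (h_m: m x m mean filter),
    then the inverse DFT of exp(R + j*A) gives the saliency map."""
    F = np.fft.fft2(CM.astype(np.float64))
    M = np.abs(F)                                  # amplitude spectrum
    A = np.angle(F)                                # phase spectrum
    logM = np.log(M + 1e-12)                       # small offset to avoid log(0)
    R = logM - uniform_filter(logM, size=m)        # spectral residual (m x m mean filter)
    sal = np.abs(np.fft.ifft2(np.exp(R + 1j * A))) ** 2
    return sal
```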
⑤ Calculate the distortion map between {CM_org(x,y)} and {CM_dis(x,y)}, denoted {DM(x,y)}; the pixel value at coordinate position (x,y) in {DM(x,y)} is denoted DM(x,y), DM(x,y) = (CM_org(x,y) − CM_dis(x,y))².
⑥ According to {SM_org(x,y)}, {SM_dis(x,y)} and {DM(x,y)}, fuse the objective evaluation metric values of the pixels in {CM_dis(x,y)} to obtain the objective image quality evaluation prediction value of S_dis, denoted Q: Q = [ Σ_{(x,y)∈Ω} Q_image(x,y) × SM(x,y) / Σ_{(x,y)∈Ω} SM(x,y) ]^γ × [ Σ_{(x,y)∈Ω} Q_image(x,y) × DM(x,y) / Σ_{(x,y)∈Ω} DM(x,y) ]^β, where Ω denotes the pixel domain, SM(x,y) = max(SM_org(x,y), SM_dis(x,y)), max() is the maximum function, and γ and β are weight coefficients; in the present embodiment, γ = 1.601 and β = 0.501 are used.
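Steps ⑤ and ⑥ then reduce to a few array operations over the whole pixel domain Ω, using the exponents γ = 1.601 and β = 0.501 given above; a sketch:

```python
import numpy as np

def pooled_quality(Q_image, SM_org, SM_dis, CM_org, CM_dis,
                   gamma=1.601, beta=0.501):
    """Fuse the per-pixel metric with the saliency maps and the distortion map:
    Q = (sum(Q_image*SM)/sum(SM))^gamma * (sum(Q_image*DM)/sum(DM))^beta,
    where SM = max(SM_org, SM_dis) and DM = (CM_org - CM_dis)^2."""
    SM = np.maximum(SM_org, SM_dis)
    DM = (CM_org - CM_dis) ** 2
    term_sal = np.sum(Q_image * SM) / np.sum(SM)
    term_dis = np.sum(Q_image * DM) / np.sum(DM)
    return term_sal ** gamma * term_dis ** beta
```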
⑦ Take n original undistorted stereoscopic images and establish a set of distorted stereoscopic images under different distortion types and distortion levels; this set contains several distorted stereoscopic images. Use a subjective quality assessment method to obtain the difference mean opinion score of each distorted stereoscopic image in the set, denoted DMOS, DMOS = 100 − MOS, where MOS denotes the mean opinion score, DMOS ∈ [0, 100], n ≥ 1.
In the present embodiment, the 12 (n = 12) undistorted stereoscopic images formed by the image pairs of Fig. 2a and Fig. 2b, Fig. 3a and Fig. 3b, Fig. 4a and Fig. 4b, Fig. 5a and Fig. 5b, Fig. 6a and Fig. 6b, Fig. 7a and Fig. 7b, Fig. 8a and Fig. 8b, Fig. 9a and Fig. 9b, Fig. 10a and Fig. 10b, Fig. 11a and Fig. 11b, Fig. 12a and Fig. 12b, and Fig. 13a and Fig. 13b are used to establish a set of distorted stereoscopic images under different distortion types and distortion levels. This set contains 252 distorted stereoscopic images of 4 distortion types in total: 60 distorted by JPEG compression, 60 by JPEG2000 compression, 60 by Gaussian blur, and 72 by H.264 coding.
⑧ Following the operations of steps ① to ⑥ for calculating the objective image quality evaluation prediction value of S_dis, calculate the objective image quality evaluation prediction value of each distorted stereoscopic image in the set of distorted stereoscopic images.
The correlation between the objective image quality evaluation prediction values obtained in this embodiment and the difference mean opinion scores is analysed for the 252 distorted stereoscopic images derived from the 12 undistorted stereoscopic images of Fig. 2a to Fig. 13b under different degrees of JPEG compression, JPEG2000 compression, Gaussian blur and H.264 coding distortion. Four objective criteria commonly used to assess image quality evaluation methods are used as evaluation indicators: the Pearson linear correlation coefficient under a nonlinear regression condition (PLCC), the Spearman rank-order correlation coefficient (SROCC), the Kendall rank-order correlation coefficient (KROCC) and the root mean squared error (RMSE). PLCC and RMSE reflect the accuracy of the objective evaluation model for distorted stereoscopic images, while SROCC and KROCC reflect its monotonicity. The objective prediction values calculated by the method of the invention are fitted with a five-parameter logistic function; higher PLCC, SROCC and KROCC values and a lower RMSE indicate better correlation between the objective method and the difference mean opinion scores. The Pearson, Spearman and Kendall correlation coefficients and the root mean squared errors between the objective prediction values of the distorted stereoscopic images and the subjective scores, obtained with and without the method of the invention, are compared in Tables 1, 2, 3 and 4. As can be seen from Tables 1 to 4, the correlation between the final objective prediction values obtained with the method of the invention and the difference mean opinion scores is very high, showing that the objective evaluation results agree well with human subjective perception and demonstrating the effectiveness of the method.
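The four evaluation indicators and the five-parameter logistic fitting mentioned above can be reproduced with scipy; the particular logistic form used below is the one commonly adopted in image quality assessment studies and is an assumption, since the patent does not write it out. The inputs are 1-D numpy arrays of objective scores and DMOS values.

```python
import numpy as np
from scipy import stats
from scipy.optimize import curve_fit

def evaluation_indicators(Q_obj, dmos):
    """PLCC and RMSE after a five-parameter logistic mapping of the objective
    scores, plus SROCC and KROCC computed on the raw scores."""
    def logistic5(x, b1, b2, b3, b4, b5):
        return b1 * (0.5 - 1.0 / (1.0 + np.exp(b2 * (x - b3)))) + b4 * x + b5
    p0 = [np.max(dmos), 1.0, np.mean(Q_obj), 1.0, np.mean(dmos)]
    params, _ = curve_fit(logistic5, Q_obj, dmos, p0=p0, maxfev=20000)
    fitted = logistic5(Q_obj, *params)
    plcc = stats.pearsonr(fitted, dmos)[0]
    srocc = stats.spearmanr(Q_obj, dmos)[0]
    krocc = stats.kendalltau(Q_obj, dmos)[0]
    rmse = np.sqrt(np.mean((fitted - dmos) ** 2))
    return plcc, srocc, krocc, rmse
```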
Fig. 14 shows the scatter plot of the objective image quality evaluation prediction values versus the difference mean opinion scores for the distorted stereoscopic images in the set; the more concentrated the scatter points, the better the consistency between the objective evaluation results and subjective perception. As can be seen from Fig. 14, the scatter plot obtained with the method of the invention is quite concentrated, and the goodness of fit with the subjective assessment data is high.
Table 1 Comparison of the Pearson correlation coefficients between the objective image quality evaluation prediction values of the distorted stereoscopic images and the subjective scores, obtained with and without the method of the invention
Table 2 Comparison of the Spearman rank-order correlation coefficients between the objective image quality evaluation prediction values of the distorted stereoscopic images and the subjective scores, obtained with and without the method of the invention
Table 3 Comparison of the Kendall rank-order correlation coefficients between the objective image quality evaluation prediction values of the distorted stereoscopic images and the subjective scores, obtained with and without the method of the invention
Table 4 Comparison of the root mean squared errors between the objective image quality evaluation prediction values of the distorted stereoscopic images and the subjective scores, obtained with and without the method of the invention

Claims (6)

1. An objective quality evaluation method for stereoscopic images based on feature fusion, characterized in that its processing procedure is as follows: first, according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the original undistorted stereoscopic image, together with the disparity image between those left and right viewpoint images, obtain the cyclopean map of the original undistorted stereoscopic image; according to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in the left and right viewpoint images of the distorted stereoscopic image to be evaluated, together with the disparity image between the left and right viewpoint images of the original undistorted stereoscopic image, obtain the cyclopean map of the distorted stereoscopic image to be evaluated; secondly, according to the mean and standard deviation of the pixel values of each pixel in the two cyclopean maps, obtain the objective evaluation metric value of each pixel in the cyclopean map of the distorted stereoscopic image to be evaluated; then, according to the amplitude and phase of the cyclopean map of the original undistorted stereoscopic image, obtain the corresponding saliency map, and according to the amplitude and phase of the cyclopean map of the distorted stereoscopic image to be evaluated, obtain the corresponding saliency map; next, according to the two saliency maps and the distortion map between the two cyclopean maps, fuse the objective evaluation metric values of the pixels in the cyclopean map of the distorted stereoscopic image to be evaluated to obtain the objective image quality evaluation prediction value of the distorted stereoscopic image to be evaluated; finally, obtain the objective image quality evaluation prediction values of distorted stereoscopic images with different distortion types and distortion levels according to the above processing procedure.
2. The objective quality evaluation method for stereoscopic images based on feature fusion according to claim 1, characterized in that it specifically comprises the following steps:
① Let S_org be the original undistorted stereoscopic image and S_dis the distorted stereoscopic image to be evaluated; denote the left and right viewpoint images of S_org as {L_org(x,y)} and {R_org(x,y)}, and the left and right viewpoint images of S_dis as {L_dis(x,y)} and {R_dis(x,y)}, where (x,y) denotes the coordinate position of a pixel in the left and right viewpoint images, 1 ≤ x ≤ W, 1 ≤ y ≤ H, W and H denote the width and height of the left and right viewpoint images, and L_org(x,y), R_org(x,y), L_dis(x,y) and R_dis(x,y) denote the pixel values at coordinate position (x,y) in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, respectively;
② According to the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}, obtain the amplitude of each pixel in these four images; then, from the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}, and from the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values of the disparity image between {L_org(x,y)} and {R_org(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}, where CM_org(x,y) and CM_dis(x,y) denote the pixel values at coordinate position (x,y) in {CM_org(x,y)} and {CM_dis(x,y)};
③ According to the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x,y) in {CM_dis(x,y)} is denoted Q_image(x,y);
④ According to the amplitude and phase of {CM_org(x,y)}, calculate the saliency map of {CM_org(x,y)}, denoted {SM_org(x,y)}, and according to the amplitude and phase of {CM_dis(x,y)}, calculate the saliency map of {CM_dis(x,y)}, denoted {SM_dis(x,y)}, where SM_org(x,y) and SM_dis(x,y) denote the pixel values at coordinate position (x,y) in {SM_org(x,y)} and {SM_dis(x,y)};
⑤ Calculate the distortion map between {CM_org(x,y)} and {CM_dis(x,y)}, denoted {DM(x,y)}; the pixel value at coordinate position (x,y) in {DM(x,y)} is denoted DM(x,y), DM(x,y) = (CM_org(x,y) − CM_dis(x,y))²;
⑥ According to {SM_org(x,y)}, {SM_dis(x,y)} and {DM(x,y)}, fuse the objective evaluation metric values of the pixels in {CM_dis(x,y)} to obtain the objective image quality evaluation prediction value of S_dis, denoted Q: Q = [ Σ_{(x,y)∈Ω} Q_image(x,y) × SM(x,y) / Σ_{(x,y)∈Ω} SM(x,y) ]^γ × [ Σ_{(x,y)∈Ω} Q_image(x,y) × DM(x,y) / Σ_{(x,y)∈Ω} DM(x,y) ]^β, where Ω denotes the pixel domain, SM(x,y) = max(SM_org(x,y), SM_dis(x,y)), max() is the maximum function, and γ and β are weight coefficients;
⑦ Take n original undistorted stereoscopic images and establish a set of distorted stereoscopic images under different distortion types and distortion levels; this set contains several distorted stereoscopic images. Use a subjective quality assessment method to obtain the difference mean opinion score of each distorted stereoscopic image in the set, denoted DMOS, DMOS = 100 − MOS, where MOS denotes the mean opinion score, DMOS ∈ [0, 100], n ≥ 1;
⑧ Following the operations of steps ① to ⑥ for calculating the objective image quality evaluation prediction value of S_dis, calculate the objective image quality evaluation prediction value of each distorted stereoscopic image in the set of distorted stereoscopic images.
3. The objective quality evaluation method for stereoscopic images based on feature fusion according to claim 2, characterized in that the detailed process of step ② is as follows:
②-1. Filter {L_org(x,y)} to obtain the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each of its pixels; the even-symmetric frequency response of the pixel at coordinate position (x,y) at scale α and orientation θ is denoted e_{α,θ}(x,y), and the odd-symmetric frequency response is denoted o_{α,θ}(x,y), where α denotes the scale factor of the filter used, 1 ≤ α ≤ 4, and θ denotes the orientation factor of the filter used, 1 ≤ θ ≤ 4;
②-2. From the even-symmetric and odd-symmetric frequency responses at different scales and orientations of each pixel in {L_org(x,y)}, calculate the amplitude of each pixel in {L_org(x,y)}; the amplitude of the pixel at coordinate position (x,y) is denoted GE_org^L(x,y), GE_org^L(x,y) = Σ_{θ=1}^{4} Σ_{α=1}^{4} √( e_{α,θ}(x,y)² + o_{α,θ}(x,y)² );
②-3. Following the operations of steps ②-1 to ②-2 for obtaining the amplitude of each pixel in {L_org(x,y)}, obtain in the same manner the amplitude of each pixel in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)}; the amplitudes of the pixels at coordinate position (x,y) in {R_org(x,y)}, {L_dis(x,y)} and {R_dis(x,y)} are denoted GE_org^R(x,y), GE_dis^L(x,y) and GE_dis^R(x,y), respectively;
②-4. Use a block matching algorithm to calculate the disparity image between {L_org(x,y)} and {R_org(x,y)}, denoted {d_org^L(x,y)}, where d_org^L(x,y) denotes the pixel value at coordinate position (x,y) in {d_org^L(x,y)};
②-5. From the amplitudes of the pixels in {L_org(x,y)} and {R_org(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_org, denoted {CM_org(x,y)}; the pixel value at coordinate position (x,y) in {CM_org(x,y)} is denoted CM_org(x,y), CM_org(x,y) = [ GE_org^L(x,y) × L_org(x,y) + GE_org^R(x − d_org^L(x,y), y) × R_org(x − d_org^L(x,y), y) ] / [ GE_org^L(x,y) + GE_org^R(x − d_org^L(x,y), y) ], where GE_org^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_org(x,y)}, and R_org(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_org(x,y)};
②-6. From the amplitudes of the pixels in {L_dis(x,y)} and {R_dis(x,y)} and the pixel values in {d_org^L(x,y)}, calculate the cyclopean map of S_dis, denoted {CM_dis(x,y)}; the pixel value at coordinate position (x,y) in {CM_dis(x,y)} is denoted CM_dis(x,y), CM_dis(x,y) = [ GE_dis^L(x,y) × L_dis(x,y) + GE_dis^R(x − d_org^L(x,y), y) × R_dis(x − d_org^L(x,y), y) ] / [ GE_dis^L(x,y) + GE_dis^R(x − d_org^L(x,y), y) ], where GE_dis^R(x − d_org^L(x,y), y) denotes the amplitude of the pixel at coordinate position (x − d_org^L(x,y), y) in {R_dis(x,y)}, and R_dis(x − d_org^L(x,y), y) denotes the pixel value of that pixel in {R_dis(x,y)}.
4. The objective quality evaluation method for stereoscopic images based on feature fusion according to claim 3, characterized in that the filter used to filter {L_org(x,y)} in step ②-1 is a log-Gabor filter.
5. The objective quality evaluation method for stereoscopic images based on feature fusion according to any one of claims 2 to 4, characterized in that the detailed process of step ③ is as follows:
③-1. Calculate the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}; the mean and standard deviation for the pixel at coordinate position (x1,y1) in {CM_org(x,y)} are denoted μ_org(x1,y1) and σ_org(x1,y1), and those for the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} are denoted μ_dis(x1,y1) and σ_dis(x1,y1):
μ_org(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_org(x2,y2) / M, σ_org(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_org(x2,y2) − μ_org(x1,y1))² / M ), μ_dis(x1,y1) = Σ_{(x2,y2)∈N(x1,y1)} CM_dis(x2,y2) / M, σ_dis(x1,y1) = √( Σ_{(x2,y2)∈N(x1,y1)} (CM_dis(x2,y2) − μ_dis(x1,y1))² / M ), where 1 ≤ x1 ≤ W, 1 ≤ y1 ≤ H, N(x1,y1) denotes the 8 × 8 neighborhood window centered on the pixel at coordinate position (x1,y1), M denotes the number of pixels in N(x1,y1), and CM_org(x2,y2) and CM_dis(x2,y2) denote the pixel values of the pixel at coordinate position (x2,y2) in {CM_org(x,y)} and {CM_dis(x,y)};
③-2. From the mean and standard deviation of the pixel values of each pixel in {CM_org(x,y)} and {CM_dis(x,y)}, calculate the objective evaluation metric value of each pixel in {CM_dis(x,y)}; the objective evaluation metric value of the pixel at coordinate position (x1,y1) in {CM_dis(x,y)} is denoted Q_image(x1,y1), Q_image(x1,y1) = [ 4 × (μ_org(x1,y1) × μ_dis(x1,y1)) × (σ_org(x1,y1) × σ_dis(x1,y1)) + C ] / [ (μ_org(x1,y1)² + μ_dis(x1,y1)²) × (σ_org(x1,y1)² + σ_dis(x1,y1)²) + C ], where C is a control parameter.
6. The objective evaluation method for stereo image quality based on feature fusion according to claim 5, characterized in that the detailed process of step 4. is:
4.-1, perform a discrete Fourier transform (DFT) on {CM_org(x, y)} to obtain its amplitude and phase, denoted as {M_org(u, v)} and {A_org(u, v)} respectively, where u denotes the coordinate in the width direction of the amplitude or phase in the transform domain, v denotes the coordinate in the height direction of the amplitude or phase in the transform domain, 1 ≤ u ≤ W, 1 ≤ v ≤ H, M_org(u, v) denotes the amplitude of the pixel at coordinate position (u, v) in {M_org(u, v)}, and A_org(u, v) denotes the phase value of the pixel at coordinate position (u, v) in {A_org(u, v)};
4.-2, calculate the amplitude of the high-frequency component of {M_org(u, v)}, denoted as {R_org(u, v)}, and denote the amplitude of the high-frequency component of the pixel at coordinate position (u, v) in {R_org(u, v)} as R_org(u, v), R_org(u, v) = log(M_org(u, v)) - h_m(u, v) * log(M_org(u, v)), where log() is the logarithm to the base e, e = 2.718281828, "*" is the convolution operator, and h_m(u, v) denotes an m × m mean filter;
4.-3, perform an inverse discrete Fourier transform according to {R_org(u, v)} and {A_org(u, v)}, and take the resulting inverse-transformed image as the saliency map of {CM_org(x, y)}, denoted as {SM_org(x, y)}, where SM_org(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {SM_org(x, y)};
4.-4, obtain the saliency map of {CM_dis(x, y)} in the same manner as steps 4.-1 to 4.-3 obtain the saliency map of {CM_org(x, y)}, denoted as {SM_dis(x, y)}, where SM_dis(x, y) denotes the pixel value of the pixel at coordinate position (x, y) in {SM_dis(x, y)}.
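Steps 4.-1 to 4.-3 follow a spectral-residual construction: the log-amplitude spectrum is smoothed with an m × m mean filter, the residual is recombined with the original phase, and the inverse DFT yields the saliency map. The sketch below reproduces this; the filter size m = 3 and the squaring of the inverse transform follow the common spectral-residual formulation and are assumptions, not values stated in the claim.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def saliency_map(cm, m=3):
        # Steps 4.-1 to 4.-3: DFT -> amplitude M(u, v) and phase A(u, v),
        # spectral residual R(u, v) = log M - h_m * log M, then inverse DFT.
        spectrum = np.fft.fft2(cm)
        magnitude = np.abs(spectrum)
        phase = np.angle(spectrum)
        log_mag = np.log(magnitude + 1e-12)
        residual = log_mag - uniform_filter(log_mag, size=m)   # h_m is an m x m mean filter
        recon = np.fft.ifft2(np.exp(residual + 1j * phase))
        return np.abs(recon) ** 2   # squaring follows the usual spectral-residual recipe

    # Step 4.-4: apply the same function to CM_dis to obtain SM_dis.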
CN201210357956.8A 2012-09-24 2012-09-24 Three-dimensional picture quality objective evaluation method based on feature fusion Active CN102903107B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210357956.8A CN102903107B (en) 2012-09-24 2012-09-24 Three-dimensional picture quality objective evaluation method based on feature fusion

Publications (2)

Publication Number Publication Date
CN102903107A CN102903107A (en) 2013-01-30
CN102903107B true CN102903107B (en) 2015-07-08

Family

ID=47575320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210357956.8A Active CN102903107B (en) 2012-09-24 2012-09-24 Three-dimensional picture quality objective evaluation method based on feature fusion

Country Status (1)

Country Link
CN (1) CN102903107B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103200420B (en) * 2013-03-19 2015-03-25 宁波大学 Three-dimensional picture quality objective evaluation method based on three-dimensional visual attention
CN103281556B (en) * 2013-05-13 2015-05-13 宁波大学 Objective evaluation method for stereo image quality on the basis of image decomposition
CN103369348B (en) * 2013-06-27 2015-03-25 宁波大学 Three-dimensional image quality objective evaluation method based on regional importance classification
CN106960432B (en) * 2017-02-08 2019-10-25 宁波大学 A kind of no reference stereo image quality evaluation method
CN107945151B (en) * 2017-10-26 2020-01-21 宁波大学 Repositioning image quality evaluation method based on similarity transformation
CN108694705B (en) * 2018-07-05 2020-12-11 浙江大学 Multi-frame image registration and fusion denoising method
CN109903273B (en) * 2019-01-30 2023-03-17 武汉科技大学 Stereo image quality objective evaluation method based on DCT domain characteristics

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101378519A (en) * 2008-09-28 2009-03-04 宁波大学 Method for evaluating quality-lose referrence image quality base on Contourlet transformation
CN101610425A (en) * 2009-07-29 2009-12-23 清华大学 A kind of method and apparatus of evaluating stereo image quality
CN102170581A (en) * 2011-05-05 2011-08-31 天津大学 Human-visual-system (HVS)-based structural similarity (SSIM) and characteristic matching three-dimensional image quality evaluation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4817246B2 (en) * 2006-07-31 2011-11-16 KDDI Corporation Objective video quality evaluation system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Quality Assessment of Stereoscopic Images; Alexandre Benoit, et al.; 2008-10-14; pp. 1-13 *
Asymmetric distortion based on wavelet image fusion; Zhou Wujie, et al.; Opto-Electronic Engineering; 2011-11-30; Vol. 38, No. 11; pp. 100-105 *

Also Published As

Publication number Publication date
CN102903107A (en) 2013-01-30

Similar Documents

Publication Publication Date Title
CN102903107B (en) Three-dimensional picture quality objective evaluation method based on feature fusion
CN102333233B (en) Stereo image quality objective evaluation method based on visual perception
CN102547368B (en) Objective evaluation method for quality of stereo images
CN104036501A (en) Three-dimensional image quality objective evaluation method based on sparse representation
CN103413298B (en) A kind of objective evaluation method for quality of stereo images of view-based access control model characteristic
CN102708567B (en) Visual perception-based three-dimensional image quality objective evaluation method
CN103136748B (en) The objective evaluation method for quality of stereo images of a kind of feature based figure
CN104243976A (en) Stereo image objective quality evaluation method
CN105282543B (en) Total blindness three-dimensional image quality objective evaluation method based on three-dimensional visual perception
CN102843572B (en) Phase-based stereo image quality objective evaluation method
CN104394403B (en) A kind of stereoscopic video quality method for objectively evaluating towards compression artefacts
CN104658001A (en) Non-reference asymmetric distorted stereo image objective quality assessment method
CN104408716A (en) Three-dimensional image quality objective evaluation method based on visual fidelity
CN103200420B (en) Three-dimensional picture quality objective evaluation method based on three-dimensional visual attention
CN104240248A (en) Method for objectively evaluating quality of three-dimensional image without reference
CN104902268B (en) Based on local tertiary mode without with reference to three-dimensional image objective quality evaluation method
CN104036502A (en) No-reference fuzzy distorted stereo image quality evaluation method
CN102663747A (en) Stereo image objectivity quality evaluation method based on visual perception
CN102999912B (en) A kind of objective evaluation method for quality of stereo images based on distortion map
CN104811691A (en) Stereoscopic video quality objective evaluation method based on wavelet transformation
CN105357519A (en) Quality objective evaluation method for three-dimensional image without reference based on self-similarity characteristic
CN103369348B (en) Three-dimensional image quality objective evaluation method based on regional importance classification
CN105574901A (en) General reference-free image quality evaluation method based on local contrast mode
CN102999911B (en) Three-dimensional image quality objective evaluation method based on energy diagrams
CN106791822A (en) It is a kind of based on single binocular feature learning without refer to stereo image quality evaluation method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20191218

Address after: Room 1020, Nanxun Science and Technology Pioneering Park, No. 666 Chaoyang Road, Nanxun District, Huzhou City, Zhejiang Province, 313000

Patentee after: Huzhou You Yan Intellectual Property Service Co., Ltd.

Address before: 315211 Zhejiang Province, Ningbo Jiangbei District Fenghua Road No. 818

Patentee before: Ningbo University

TR01 Transfer of patent right

Effective date of registration: 20201229

Address after: 213001 3rd floor, Jinhu innovation center, No.8 Taihu Middle Road, Xinbei District, Changzhou City, Jiangsu Province

Patentee after: Jiangsu Qizhen Information Technology Service Co.,Ltd.

Address before: 313000 Room 1020, Science and Technology Pioneer Park, 666 Chaoyang Road, Nanxun Town, Nanxun District, Huzhou, Zhejiang.

Patentee before: Huzhou You Yan Intellectual Property Service Co.,Ltd.