CN105208387B - Adaptive intra-prediction mode selection method for HEVC - Google Patents

Adaptive intra-prediction mode selection method for HEVC

Info

Publication number
CN105208387B
CN105208387B · CN201510675511.8A · CN201510675511A
Authority
CN
China
Prior art keywords
estimated
sad
pixel
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510675511.8A
Other languages
Chinese (zh)
Other versions
CN105208387A (en)
Inventor
朱威
张训华
沈吉龙
杨洋
陈朋
郑雅羽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT
Priority to CN201510675511.8A priority Critical patent/CN105208387B/en
Publication of CN105208387A publication Critical patent/CN105208387A/en
Application granted granted Critical
Publication of CN105208387B publication Critical patent/CN105208387B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The present invention relates to an adaptive intra-prediction mode selection method for HEVC, comprising the following steps: (1) input a PU to be estimated and establish the set of actually available intra-prediction modes; (2) compute, for all pixels in the PU to be estimated, the sums of absolute differences (SAD) with spatially adjacent pixels in different directions; (3) determine the texture-direction characteristic of the PU from these directional SADs; (4) determine the coarse-level mode search range according to the texture-direction characteristic; (5) establish the rate-distortion optimization candidate mode set from the coarse-level mode search range and the set of actually available intra-prediction modes; (6) select the optimal intra-prediction mode. The invention first narrows the coarse-level mode search range according to the texture-direction feature of the PU to be estimated and then reduces the number of candidate modes entering rate-distortion optimization, so the computational complexity of HEVC intra-prediction mode selection is significantly reduced while good rate-distortion performance is maintained.

Description

Adaptive intra-prediction mode selection method for HEVC
Technical field
The present invention relates to the field of digital video coding, and in particular to an adaptive intra-prediction mode selection method for HEVC.
Background technology
With the rapid development of multimedia technology, video data of various resolutions (standard definition, high definition and ultra-high definition) keeps emerging, and the transmission and storage of video data face enormous challenges. To meet the demands of video compression and transmission, the Joint Collaborative Team on Video Coding (JCT-VC), organized by ISO/IEC and ITU-T, formulated the new-generation High Efficiency Video Coding standard (HEVC/H.265). At the same video quality, HEVC reduces the bit rate by about half compared with the previous-generation standard H.264 (see G. J. Sullivan, J.-R. Ohm, W.-J. Han, and T. Wiegand, "Overview of the high efficiency video coding (HEVC) standard," IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 12, pp. 1649-1668, Dec. 2012), i.e., the coding efficiency is doubled, but the computational complexity increases several-fold. Although HEVC uses the same hybrid coding framework as earlier video coding standards, it introduces new coding tools in many respects, such as quad-tree partitioning of coding tree units (Coding Tree Unit, CTU), multi-angle intra-prediction modes and inter-prediction modes with multiple partitioning shapes. To code an image more flexibly, HEVC defines three kinds of units: the coding unit (Coding Unit, CU), the prediction unit (Prediction Unit, PU) and the transform unit (Transform Unit, TU). PU prediction in HEVC includes inter prediction and intra prediction; PUs in I frames and IDR frames use only intra prediction, while PUs in other frame types may use both intra and inter prediction. To improve the compression efficiency of intra prediction, HEVC predicts a PU from the available spatially adjacent reconstructed pixels around it and supports up to 35 intra-prediction modes (see J. Lainema, F. Bossen, W.-J. Han, J. Min, and K. Ugur, "Intra coding of the HEVC standard," IEEE Transactions on Circuits and Systems for Video Technology, vol. 22, no. 12, pp. 1792-1801, Dec. 2012). Among all HEVC intra-prediction modes, the Planar mode (number 0) and the DC mode (number 1) are suited to flat regions, and numbers 2 to 34 correspond to 33 angular prediction modes; among these, mode 10 predicts along the horizontal (rightward) direction, mode 26 predicts along the vertical (downward) direction, mode 18 predicts along the diagonal toward the lower right, mode 2 predicts along the diagonal toward the upper right, and mode 34 predicts along the diagonal toward the lower left.
In the HEVC test model HM, the intra-prediction process first performs a rough mode decision (Rough Mode Decision, RMD), which pre-screens prediction modes by computing the sum of absolute transformed differences (Sum of Absolute Transformed Difference, SATD) of the PU residual after a Hadamard transform: for PUs of size 4×4 and 8×8, 8 prediction modes are chosen as candidates, and for PUs of size 16×16, 32×32 and 64×64, 3 prediction modes are chosen as candidates (see L. Zhao, L. Zhang, X. Zhao, S. Ma, D. Zhao, W. Gao, "Further encoder improvement for intra mode decision (JCTVC-D283)," Proceedings of the JCT-VC 4th meeting, pp. 1-4, Jan. 2011). Rate-distortion optimization (Rate Distortion Optimization, RDO) is then applied (see T. Wiegand, H. Schwarz, A. Joch, F. Kossentini, G. J. Sullivan, "Rate-constrained coder control and comparison of video coding standards," IEEE Transactions on Circuits and Systems for Video Technology, 2003, 13(7): 688-703), and the candidate mode with the minimum rate-distortion cost is selected as the optimal intra-prediction mode of the PU. The intra-prediction modes of HEVC are richer than those of H.264 and better suited to coding high-resolution video, but they also increase the computational complexity of HEVC intra coding.
Several fast HEVC intra-prediction mode selection methods already exist. The patent with application number 201210138816.1 reduces the number of candidate prediction modes in RMD to 2-5 for PUs of size 4×4, 8×8, 16×16 and 32×32. Qi Meibin et al. proposed a fast HEVC intra mode decision method based on image texture direction and spatial correlation (see Qi Meibin, Zhu Guanghui, Yang Yanfang, Jiang Jianguo, "HEVC intra-prediction mode selection using texture and spatial correlation," Journal of Image and Graphics, 2014, 19(8), 1119-1125); that method uses the PU texture direction obtained with the Sobel operator to build a list of candidate prediction modes, and adds the optimal intra-prediction modes of spatially adjacent PUs to this list. The patent with application number 201410842187.X provides a method for accelerating HEVC intra-prediction mode selection: if the PU has a homogeneous texture, the first prediction mode chosen by RMD is directly taken as the optimal intra-prediction mode; otherwise the first two prediction modes chosen by RMD are divided into three cases to accelerate mode selection. Different from the above methods that reduce the number of candidate prediction modes, the patent with application number 201410024635.5 proposes a fast SATD-based HEVC intra-prediction method that reduces the complexity of HEVC intra prediction by terminating CU partitioning early: a set of adaptive thresholds is computed from the SATD, and if the SATD of the current CU is below the given threshold, CU partitioning is terminated. The patent with application number 201310445775.5, on the one hand, decides CU partitioning according to texture complexity and, on the other hand, removes from the candidate prediction mode list those prediction modes that, according to the PU texture features, are least likely to become the optimal mode.
Summary of the invention
To effectively reduce the computational complexity of HEVC intra prediction while maintaining the rate-distortion performance, the present invention provides an adaptive intra-prediction mode selection method for HEVC.
The technical solution adopted to solve the above technical problem is as follows:
An adaptive intra-prediction mode selection method for HEVC, said method comprising the following steps:
(1) Input a PU to be estimated and establish the set of actually available intra-prediction modes:
According to the available spatially adjacent reconstructed pixels around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra-prediction mode, all actually available intra-prediction modes are selected for the PU to be estimated and form the set Ω; that is, for each HEVC intra-prediction mode, if the spatially adjacent reconstructed pixels required by that mode for intra prediction exist around the PU, the mode is added to Ω.
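As an illustration of step (1), the following Python sketch (not part of the patent; the mapping from mode number to the neighbouring regions it reads from is a simplified assumption, and all names are chosen freely) builds the set Ω by checking, for each intra-prediction mode, whether the neighbouring reconstructed regions it would need exist around the PU.

```python
# Illustrative sketch of step (1): build the set of actually available
# intra-prediction modes (Omega). The mode-to-region mapping below is a
# simplified assumption, not taken from the HEVC specification.
PLANAR, DC = 0, 1
ANGULAR = range(2, 35)                # angular modes 2..34

def needed_regions(mode):
    """Neighbouring reconstructed regions a mode is assumed to read from."""
    if mode in (PLANAR, DC):
        return {"left", "above"}
    if mode < 10:                     # modes 2..9 lean towards the lower left
        return {"left", "below_left"}
    if mode <= 26:                    # modes 10..26 use left and/or above pixels
        return {"left", "above"}
    return {"above", "above_right"}   # modes 27..34 lean towards the upper right

def build_available_modes(available_regions):
    """available_regions: set of regions whose reconstructed pixels exist
    around the current PU, e.g. {'left', 'above'}. Returns Omega."""
    return {m for m in [PLANAR, DC, *ANGULAR]
            if needed_regions(m) <= available_regions}

# Example: a PU on the top picture row has no reconstructed pixels above it.
print(sorted(build_available_modes({"left", "below_left"})))
```

For such a PU, only the modes that rely solely on left and below-left neighbours (modes 2 to 9 under this simplified mapping) would remain in Ω.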
(2) Compute, for all pixels in the PU to be estimated, the sums of absolute differences with spatially adjacent pixels in different directions:
For the 33 angular modes among the HEVC intra-prediction modes, the texture-direction characteristic of the PU to be estimated is correlated with the angular prediction mode that the PU finally selects. Therefore, the texture-direction characteristic of the PU can be determined by computing the sums of absolute differences between the pixels in the PU and their spatially adjacent pixels, so that an intra-prediction mode can be selected quickly.
First, when the angular prediction mode number 18 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the lower right, the sum of absolute differences SAD_LU between all pixels in the PU and their upper-left adjacent pixels is computed, as shown in formula (1):
SAD_LU = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y−1)|   (1)
In formula (1), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y) is the value of the pixel at coordinate (x, y) in the PU, where x is the horizontal coordinate, y is the vertical coordinate, and both are integers from 0 to N−1 inside the PU; the pixel at coordinate (x−1, y−1) lies to the upper left of the pixel at (x, y). The pixel at coordinate (−1, −1) is the upper-left neighbour of the pixel at (0, 0); the pixel at (0, 0) is the top-left corner pixel of the PU to be estimated, the pixel at (0, N−1) is its bottom-left corner pixel, the pixel at (N−1, 0) is its top-right corner pixel, and the pixel at (N−1, N−1) is its bottom-right corner pixel. The pixels with y = 0 form the top boundary of the PU, the pixels with x = 0 its left boundary, the pixels with x = N−1 its right boundary, and the pixels with y = N−1 its bottom boundary.
Similarly, when the angular prediction mode number 26 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the vertical (downward) direction, the sum of absolute differences SAD_U between all pixels in the PU and their directly-above adjacent pixels is computed, as shown in formula (2):
SAD_U = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x, y−1)|   (2)
In formula (2), N, p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x, y−1) lies directly above the pixel at (x, y).
When the angular prediction mode number 34 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the lower left, the sum of absolute differences SAD_RU between all pixels in the PU and their upper-right adjacent pixels is computed, as shown in formula (3):
SAD_RU = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x+1, y−1)|   (3)
In formula (3), N, p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x+1, y−1) lies to the upper right of the pixel at (x, y).
When the angular prediction mode number 10 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the horizontal (rightward) direction, the sum of absolute differences SAD_L between all pixels in the PU and their left adjacent pixels is computed, as shown in formula (4):
SAD_L = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y)|   (4)
In formula (4), N, p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x−1, y) lies to the left of the pixel at (x, y).
When the angular prediction mode number 2 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the upper right, the sum of absolute differences SAD_LB between all pixels in the PU and their lower-left adjacent pixels is computed, as shown in formula (5):
SAD_LB = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y+1)|   (5)
In formula (5), N, p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x−1, y+1) lies to the lower left of the pixel at (x, y).
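As an illustration of step (2), here is a minimal Python sketch (not from the patent; function and variable names are illustrative) that computes the five directional sums of absolute differences of formulas (1)-(5) for one N×N PU. In the patent each SAD is computed only when the corresponding angular mode (18, 26, 34, 10 or 2) is present in Ω; that gating is omitted here for brevity.

```python
import numpy as np

def directional_sads(frame, x0, y0, N):
    """Directional SADs of formulas (1)-(5) for the N x N PU whose top-left
    pixel sits at (x0, y0) in 'frame' (indexed frame[y, x]); assumes a
    1-pixel border of valid pixels exists around the PU inside the frame."""
    def p(x, y):                      # p(x, y) in PU-local coordinates
        return frame[y0 + y, x0 + x].astype(np.int64)

    xs, ys = np.meshgrid(np.arange(N), np.arange(N))
    cur = p(xs, ys)
    return {
        "SAD_LU": int(np.abs(cur - p(xs - 1, ys - 1)).sum()),  # formula (1)
        "SAD_U":  int(np.abs(cur - p(xs,     ys - 1)).sum()),  # formula (2)
        "SAD_RU": int(np.abs(cur - p(xs + 1, ys - 1)).sum()),  # formula (3)
        "SAD_L":  int(np.abs(cur - p(xs - 1, ys    )).sum()),  # formula (4)
        "SAD_LB": int(np.abs(cur - p(xs - 1, ys + 1)).sum()),  # formula (5)
    }

# Example on random data: an 8x8 PU embedded in a 10x10 frame.
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(10, 10))
print(directional_sads(frame, x0=1, y0=1, N=8))
```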
(3) Determine the texture-direction characteristic of the PU to be estimated from the sums of absolute differences of spatially adjacent pixels in different directions:
First, a branch is selected according to the number of sums of absolute differences (SADs) computed in step (2): if fewer than 3 SADs were computed in step (2), step (5) is executed; otherwise the SADs computed in step (2) are first sorted in ascending order, and the three smallest SADs are denoted SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2 in turn; the texture of the PU to be estimated is then classified from these three smallest SADs, as shown in formula (6):
Class = 0, if SAD_MIN-0 > α × SAD_MIN-2
Class = 1, else if SAD_MIN-0 < β × SAD_MIN-1
Class = 2, else if SAD_MIN-1 < γ × SAD_MIN-2
Class = 3, otherwise   (6)
In formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively flat, 1 means the texture shows a clear horizontal, vertical or diagonal direction, 2 means the texture shows some other angular direction, and 3 means the texture is complex. The parameters α, β and γ adjust the relations between the SAD_MIN-i (i = 0, 1, 2); α is set to 0.9-1.0, and β and γ are set to 0.6-1.0.
Then, from the texture class Class computed by formula (6) and the computed SADs, the texture-direction characteristic of the PU to be estimated is obtained, as shown in Table 1. In Table 1, the 0-degree direction is the horizontal (rightward) direction, the π/2 direction is the vertical (downward) direction, the π/4 direction is the 45-degree lower-right direction, the −π/4 direction is the 45-degree upper-right direction, and the 3π/4 direction is the 45-degree lower-left direction. When the texture class Class equals 0, the texture-direction characteristic of the PU to be estimated is recorded as relatively flat texture. When Class equals 1, the texture-direction characteristic is recorded, according to whether SAD_MIN-0 equals SAD_LU, SAD_U, SAD_RU, SAD_L or SAD_LB, as texture in the π/4 direction, texture in the π/2 direction, texture in the 3π/4 direction, texture in the 0-degree direction or texture in the −π/4 direction, respectively. When Class equals 2, the texture-direction characteristic is determined by whether SAD_MIN-0 and SAD_MIN-1 are two SAD values of adjacent directions among SAD_LU, SAD_U, SAD_RU, SAD_L and SAD_LB: (a) if SAD_LU equals SAD_MIN-0 and SAD_U equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_U equals SAD_MIN-0, the texture-direction characteristic is recorded as texture in the [π/4, π/2] direction range; (b) if SAD_U equals SAD_MIN-0 and SAD_RU equals SAD_MIN-1, or SAD_U equals SAD_MIN-1 and SAD_RU equals SAD_MIN-0, it is recorded as texture in the [π/2, 3π/4] direction range; (c) if SAD_LU equals SAD_MIN-0 and SAD_L equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_L equals SAD_MIN-0, it is recorded as texture in the [0, π/4] direction range; (d) if SAD_L equals SAD_MIN-0 and SAD_LB equals SAD_MIN-1, or SAD_L equals SAD_MIN-1 and SAD_LB equals SAD_MIN-0, it is recorded as texture in the [−π/4, 0] direction range; (e) in all other cases, the texture-direction characteristic is recorded as a complex texture direction. When Class equals 3, the texture-direction characteristic of the PU to be estimated is recorded as a complex texture direction.
Table 1. Texture-direction characteristic of the PU to be estimated
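To make steps (2)-(3) concrete, the following Python sketch (not from the patent; α = 0.95 and β = γ = 0.9 are the values used in the embodiment, and all names are illustrative) classifies the texture according to formula (6) and maps it to a direction or direction range following the rules above.

```python
# Illustrative sketch of step (3): texture classification per formula (6)
# and the direction rules summarised in Table 1.
SINGLE = {"SAD_LU": "pi/4", "SAD_U": "pi/2", "SAD_RU": "3pi/4",
          "SAD_L": "0", "SAD_LB": "-pi/4"}                   # Class 1 directions
ADJACENT = {frozenset(("SAD_LU", "SAD_U")): "[pi/4, pi/2]",
            frozenset(("SAD_U", "SAD_RU")): "[pi/2, 3pi/4]",
            frozenset(("SAD_LU", "SAD_L")): "[0, pi/4]",
            frozenset(("SAD_L", "SAD_LB")): "[-pi/4, 0]"}    # Class 2 ranges

def texture_direction(sads, alpha=0.95, beta=0.9, gamma=0.9):
    """sads: dict of the directional SADs actually computed in step (2)."""
    if len(sads) < 3:
        return "skip_to_step_5"                    # fewer than 3 SADs computed
    ordered = sorted(sads.items(), key=lambda kv: kv[1])
    (n0, s0), (n1, s1), (_, s2) = ordered[:3]      # SAD_MIN-0 .. SAD_MIN-2
    if s0 > alpha * s2:                            # formula (6): Class = 0
        return "flat"
    if s0 < beta * s1:                             # Class = 1: one dominant direction
        return SINGLE[n0]
    if s1 < gamma * s2:                            # Class = 2: two close directions
        return ADJACENT.get(frozenset((n0, n1)), "complex")  # case (e) if non-adjacent
    return "complex"                               # Class = 3
```

For example, a PU with a strongly vertical texture would typically have SAD_U much smaller than the other SADs, and texture_direction would return "pi/2".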
(4) Determine the coarse-level mode search range from the texture-direction characteristic:
According to the texture-direction characteristic of the PU to be estimated, the kinds of candidate prediction modes are reduced; the prediction modes retained after this adjustment form the coarse-level mode search range S, where the prediction modes in S are set according to the texture-direction characteristic of the PU to be estimated, as shown in Table 2 below:
Table 2. Prediction modes in S
(5) Establish the rate-distortion optimization candidate mode set from the coarse-level search range and Ω:
The prediction-mode search range obtained from step (4) may still contain many prediction modes in some cases, so the prediction modes need to be screened further before step (6) selects the optimal prediction mode with the rate-distortion optimization technique.
The SATD-cost mode search range Ψ is determined first: if the current step is reached from step (4), Ψ is the intersection of the coarse-level mode search range S obtained in step (4) and the set Ω obtained in step (1); if the current step is reached from step (3), Ω is assigned to Ψ directly.
Then the HEVC intra prediction of each prediction mode in Ψ is carried out, and the SATD cost J of the prediction residual is computed, as shown in formula (7):
J = SATD + λ × R   (7)
where J denotes the cost, SATD denotes the sum of absolute differences of the residual signal after the Hadamard transform, λ denotes the Lagrange multiplier, and R denotes the number of bits required to encode the mode selection.
Each prediction mode is then ranked in ascending order of its SATD cost J, and the rate-distortion optimization candidate mode set Φ is built from the ranked prediction modes: if the mode ranked 1st is the DC or Planar mode, only the top 1 mode is added to Φ; if the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the top 2 modes are added to Φ; if the top 2 modes are both adjacent angular modes, only the top 2 modes are added to Φ; if the top 2 modes are non-adjacent angular modes, the top 2 modes are first added to Φ and then the angular modes adjacent to these 2 modes are also added to Φ; in the other cases, the top 3 modes are added to Φ for PUs to be estimated of size 16×16, 32×32 and 64×64, and the top 8 modes are added to Φ for PUs to be estimated of size 4×4 and 8×8.
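As an illustration of step (5), the following Python sketch (not from the patent; the SATD computation itself is omitted and all names are placeholders) sorts the modes of Ψ by their SATD cost J = SATD + λ·R and then builds the RDO candidate set Φ according to the rules above.

```python
def build_rdo_candidates(costs, pu_size):
    """costs: dict {mode_number: J}, J = SATD + lambda * R, for every mode in Psi.
    pu_size: PU width/height N. Returns the candidate list Phi for RDO."""
    PLANAR, DC = 0, 1
    ranked = sorted(costs, key=costs.get)           # modes by ascending SATD cost J
    if not ranked:
        return []
    m0 = ranked[0]
    m1 = ranked[1] if len(ranked) > 1 else None

    def is_angular(m):
        return m is not None and 2 <= m <= 34

    if m0 in (DC, PLANAR):                          # 1st mode is DC or Planar
        return ranked[:1]
    if is_angular(m0) and m1 in (DC, PLANAR):       # 1st angular, 2nd DC/Planar
        return ranked[:2]
    if is_angular(m0) and is_angular(m1):
        if abs(m0 - m1) == 1:                       # two adjacent angular modes
            return ranked[:2]
        phi = ranked[:2]                            # two non-adjacent angular modes:
        for m in (m0 - 1, m0 + 1, m1 - 1, m1 + 1):  # also add their angular neighbours
            if 2 <= m <= 34 and m not in phi:
                phi.append(m)
        return phi
    return ranked[:8] if pu_size in (4, 8) else ranked[:3]   # remaining cases
```

For example, with costs = {26: 1200, 25: 1250, 0: 1500} and pu_size = 8, the two best modes 26 and 25 are adjacent angular modes, so Φ = [26, 25].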
(6) Select the optimal intra-prediction mode:
The candidate mode with the minimum rate-distortion cost is selected, using the rate-distortion optimization technique, from the candidate mode set Φ obtained in step (5) as the optimal intra-prediction mode of the PU to be estimated, completing intra-prediction mode selection for the PU.
The technical concept of the present invention is as follows: first, the SADs between all pixels in the PU and their spatially adjacent pixels in different directions are computed to obtain the texture features of the PU and determine its texture-direction characteristic; then, according to the texture-direction characteristic, the coarse-level mode search range is established, reducing the number of candidate modes entering the coarse-level mode search; finally, the rate-distortion optimization candidate mode set is established, further reducing the number of candidate modes that finally enter rate-distortion optimization.
Compared with the prior art, the present invention has the following advantages:
The present invention proposes an adaptive intra-prediction mode selection method for HEVC. Compared with the prior art, the method has the following features and advantages: first, the texture-direction feature of the PU is classified by comparing the sums of absolute differences of the original pixels inside the PU in different directions; the search range of the prediction modes is then narrowed according to the texture-direction feature; finally, the number of candidate modes entering rate-distortion optimization is reduced according to the ranking of the prediction modes by SATD cost. While keeping good rate-distortion performance, the present invention can significantly reduce the computational complexity of intra-prediction mode selection in both intra-coded frames and inter-coded frames of HEVC.
Brief description of the drawings
Fig. 1 is the basic flow chart of the method of the present invention.
Fig. 2 shows the 33 angular prediction modes of HEVC intra coding.
Embodiment
The present invention is described in detail below with reference to the embodiment and the accompanying drawings, but the present invention is not limited thereto.
As shown in Fig. 1, an adaptive intra-prediction mode selection method for HEVC comprises the following steps (a high-level sketch of the whole pipeline is given after the list):
(1) Input a PU to be estimated and establish the set of actually available intra-prediction modes;
(2) Compute, for all pixels in the PU to be estimated, the sums of absolute differences with spatially adjacent pixels in different directions;
(3) Determine the texture-direction characteristic of the PU to be estimated from the directional sums of absolute differences;
(4) Determine the coarse-level mode search range from the texture-direction characteristic;
(5) Establish the rate-distortion optimization candidate mode set from the coarse-level mode search range and the set of actually available intra-prediction modes;
(6) Select the optimal intra-prediction mode.
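Purely as an overview, the following Python sketch (illustrative only; coarse_search_range, satd_cost and rd_cost are hypothetical placeholders, and the helpers sketched in the description above are reused) shows how steps (1)-(6) fit together for one PU.

```python
def select_intra_mode(pu_ctx):
    """Illustrative driver for steps (1)-(6); pu_ctx is a placeholder object
    bundling the PU pixels, its neighbourhood and the encoder state."""
    omega = build_available_modes(pu_ctx.available_regions)        # step (1)
    sads = directional_sads(pu_ctx.frame, pu_ctx.x0, pu_ctx.y0,    # step (2)
                            pu_ctx.size)
    direction = texture_direction(sads)                            # step (3)
    if direction == "skip_to_step_5":
        psi = omega                                                # Psi = Omega
    else:
        s = coarse_search_range(direction)                         # step (4), Table 2
        psi = s & omega                                            # Psi = S ∩ Omega
    costs = {m: satd_cost(pu_ctx, m) for m in psi}                 # J = SATD + lambda*R
    phi = build_rdo_candidates(costs, pu_ctx.size)                 # step (5)
    return min(phi, key=lambda m: rd_cost(pu_ctx, m))              # step (6): RDO
```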
Step (1) specifically includes:
According to the available spatially adjacent reconstructed pixels around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra-prediction mode, all actually available intra-prediction modes are selected for the PU to be estimated and form the set Ω; that is, for each HEVC intra-prediction mode, if the spatially adjacent reconstructed pixels required by that mode for intra prediction exist around the PU, the mode is added to Ω.
Step (2) specifically includes:
First, when the angular prediction mode number 18 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the lower right, the sum of absolute differences SAD_LU between all pixels in the PU and their upper-left adjacent pixels is computed, as shown in formula (1).
In formula (1), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y) is the value of the pixel at coordinate (x, y) in the PU, where x is the horizontal coordinate, y is the vertical coordinate, and both are integers from 0 to N−1 inside the PU; the pixel at coordinate (x−1, y−1) lies to the upper left of the pixel at (x, y). The pixel at coordinate (−1, −1) is the upper-left neighbour of the pixel at (0, 0); the pixel at (0, 0) is the top-left corner pixel of the PU to be estimated, the pixel at (0, N−1) its bottom-left corner pixel, the pixel at (N−1, 0) its top-right corner pixel, and the pixel at (N−1, N−1) its bottom-right corner pixel. In the HEVC standard, the prediction directions of the angular prediction modes with different numbers are shown in Fig. 2.
Similarly, when the angular prediction mode number 26 exists in Ω, i.e., the PU can use the angular mode that predicts along the vertical (downward) direction, the sum of absolute differences SAD_U between all pixels in the PU and their directly-above adjacent pixels is computed, as shown in formula (2); there the pixel at coordinate (x, y−1) lies directly above the pixel at (x, y).
When the angular prediction mode number 34 exists in Ω, i.e., the PU can use the angular mode that predicts along the diagonal toward the lower left, the sum of absolute differences SAD_RU between all pixels in the PU and their upper-right adjacent pixels is computed, as shown in formula (3); there the pixel at coordinate (x+1, y−1) lies to the upper right of the pixel at (x, y).
When the angular prediction mode number 10 exists in Ω, i.e., the PU can use the angular mode that predicts along the horizontal (rightward) direction, the sum of absolute differences SAD_L between all pixels in the PU and their left adjacent pixels is computed, as shown in formula (4); there the pixel at coordinate (x−1, y) lies to the left of the pixel at (x, y).
When the angular prediction mode number 2 exists in Ω, i.e., the PU can use the angular mode that predicts along the diagonal toward the upper right, the sum of absolute differences SAD_LB between all pixels in the PU and their lower-left adjacent pixels is computed, as shown in formula (5); there the pixel at coordinate (x−1, y+1) lies to the lower left of the pixel at (x, y).
Step (3) specifically includes:
First, a branch is selected according to the number of SADs computed in step (2): if fewer than 3 SADs were computed in step (2), step (5) is executed; otherwise the SADs computed in step (2) are first sorted in ascending order, the three smallest SADs are denoted SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2 in turn, and the texture of the PU to be estimated is classified from these three smallest SADs, as shown in formula (6).
In formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively flat, 1 means the texture shows a clear horizontal, vertical or diagonal direction, 2 means the texture shows some other angular direction, and 3 means the texture is complex. The parameters α, β and γ adjust the relations between the SAD_MIN-i (i = 0, 1, 2); α is set to 0.9-1.0 and β and γ are set to 0.6-1.0; in this embodiment α is set to 0.95 and β and γ are set to 0.9.
The texture-direction characteristic of the PU to be estimated is then obtained from the texture class Class given by formula (6) and the computed SADs, as shown in Table 1, where the 0-degree direction is the horizontal (rightward) direction, the π/2 direction is the vertical (downward) direction, the π/4 direction is the 45-degree lower-right direction, the −π/4 direction is the 45-degree upper-right direction, and the 3π/4 direction is the 45-degree lower-left direction. When the texture class Class equals 2, the texture-direction characteristic is determined by whether SAD_MIN-0 and SAD_MIN-1 are two SAD values of adjacent directions among SAD_LU, SAD_U, SAD_RU, SAD_L and SAD_LB: (a) if SAD_LU equals SAD_MIN-0 and SAD_U equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_U equals SAD_MIN-0, the texture-direction characteristic is recorded as texture in the [π/4, π/2] direction range; (b) if SAD_U equals SAD_MIN-0 and SAD_RU equals SAD_MIN-1, or SAD_U equals SAD_MIN-1 and SAD_RU equals SAD_MIN-0, it is recorded as texture in the [π/2, 3π/4] direction range; (c) if SAD_LU equals SAD_MIN-0 and SAD_L equals SAD_MIN-1, or SAD_LU equals SAD_MIN-1 and SAD_L equals SAD_MIN-0, it is recorded as texture in the [0, π/4] direction range; (d) if SAD_L equals SAD_MIN-0 and SAD_LB equals SAD_MIN-1, or SAD_L equals SAD_MIN-1 and SAD_LB equals SAD_MIN-0, it is recorded as texture in the [−π/4, 0] direction range; (e) in all other cases, the texture-direction characteristic is recorded as a complex texture direction. When Class equals 3, the texture-direction characteristic of the PU to be estimated is recorded as a complex texture direction.
Table 1. Texture-direction characteristic of the PU to be estimated
Step (4) specifically includes:
According to the texture-direction characteristic of the PU to be estimated, the kinds of candidate prediction modes are reduced; the prediction modes retained after this adjustment form the coarse-level mode search range S, where the prediction modes in S are set according to the texture-direction characteristic of the PU to be estimated, as shown in Table 2 below:
Table 2. Prediction modes in S
Step (5) specifically includes:
The SATD-cost mode search range Ψ is determined first: if the current step is reached from step (4), Ψ is the intersection of the coarse-level mode search range S obtained in step (4) and the set Ω obtained in step (1); if the current step is reached from step (3), Ω is assigned to Ψ directly.
The HEVC intra prediction of each prediction mode in Ψ is then carried out, and the SATD cost of the prediction residual is computed, as shown in formula (7):
J = SATD + λ × R   (7)
where J denotes the cost, SATD denotes the sum of absolute differences of the residual signal after the Hadamard transform, λ denotes the Lagrange multiplier, and R denotes the number of bits required to encode the mode selection.
Each prediction mode is then ranked in ascending order of its SATD cost J, and the rate-distortion optimization candidate mode set Φ is built from the ranked prediction modes: if the mode ranked 1st is the DC or Planar mode, only the top 1 mode is added to Φ; if the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the top 2 modes are added to Φ; if the top 2 modes are both adjacent angular modes, only the top 2 modes are added to Φ; if the top 2 modes are non-adjacent angular modes, the top 2 modes are first added to Φ and then the angular modes adjacent to these 2 modes are also added to Φ; in the other cases, the top 3 modes are added to Φ for PUs to be estimated of size 16×16, 32×32 and 64×64, and the top 8 modes are added to Φ for PUs to be estimated of size 4×4 and 8×8.
Step (6) specifically includes:
The candidate mode with the minimum rate-distortion cost is selected, using the rate-distortion optimization technique, from the candidate mode set Φ obtained in step (5) as the optimal intra-prediction mode of the PU to be estimated, completing intra-prediction mode selection for the PU.

Claims (4)

1. An adaptive intra-prediction mode selection method for HEVC, characterized in that said selection method comprises the following steps:
(1) Input a PU to be estimated and establish the set of actually available intra-prediction modes:
According to the available spatially adjacent reconstructed pixels around the PU to be estimated and the spatially adjacent reconstructed pixels required by each HEVC intra-prediction mode, all actually available intra-prediction modes are selected for the PU to be estimated and form the set Ω;
(2) Compute, for all pixels in the PU to be estimated, the sums of absolute differences with spatially adjacent pixels in different directions:
When the angular prediction mode number 18 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the lower right, the sum of absolute differences SAD_LU between all pixels in the PU and their upper-left adjacent pixels is computed, as shown in formula (1):
SAD_LU = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y−1)|   (1)
In formula (1), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y) is the value of the pixel at coordinate (x, y) in the PU, where x is the horizontal coordinate, y is the vertical coordinate, and both are integers greater than or equal to 0 and less than N inside the PU; the pixel at coordinate (x−1, y−1) lies to the upper left of the pixel at (x, y);
When the angular prediction mode number 26 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the vertical (downward) direction, the sum of absolute differences SAD_U between all pixels in the PU and their directly-above adjacent pixels is computed, as shown in formula (2):
SAD_U = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x, y−1)|   (2)
In formula (2), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x, y−1) lies directly above the pixel at (x, y);
When the angular prediction mode number 34 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the lower left, the sum of absolute differences SAD_RU between all pixels in the PU and their upper-right adjacent pixels is computed, as shown in formula (3):
SAD_RU = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x+1, y−1)|   (3)
In formula (3), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x+1, y−1) lies to the upper right of the pixel at (x, y);
When the angular prediction mode number 10 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the horizontal (rightward) direction, the sum of absolute differences SAD_L between all pixels in the PU and their left adjacent pixels is computed, as shown in formula (4):
SAD_L = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y)|   (4)
In formula (4), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x−1, y) lies to the left of the pixel at (x, y);
When the angular prediction mode number 2 exists in Ω, i.e., the PU to be estimated can use the angular mode that predicts along the diagonal toward the upper right, the sum of absolute differences SAD_LB between all pixels in the PU and their lower-left adjacent pixels is computed, as shown in formula (5):
SAD_LB = Σ_{x=0..N−1} Σ_{y=0..N−1} |p(x, y) − p(x−1, y+1)|   (5)
In formula (5), the size of the PU to be estimated is N×N (N = 4, 8, 16, 32, 64); p(x, y), x and y are defined as in formula (1); the pixel at coordinate (x−1, y+1) lies to the lower left of the pixel at (x, y);
(3) Determine the texture-direction characteristic of the PU to be estimated from the sums of absolute differences of spatially adjacent pixels in different directions:
A branch is first selected according to the number of sums of absolute differences (SADs) computed in step (2): if fewer than 3 SADs were computed in step (2), step (5) is executed directly; otherwise the SADs computed in step (2) are sorted in ascending order and the texture-direction characteristic of the PU to be estimated is classified;
(4) Determine the coarse-level mode search range from the texture-direction characteristic;
(5) Establish the rate-distortion optimization candidate mode set from the coarse-level mode search range and Ω;
(6) Select the optimal intra-prediction mode:
The candidate mode with the minimum rate-distortion cost is selected, using the rate-distortion optimization technique, from the candidate mode set obtained in step (5) as the optimal intra-prediction mode of the PU to be estimated, completing intra-prediction mode selection for the PU to be estimated.
2. The adaptive intra-prediction mode selection method for HEVC according to claim 1, characterized in that in said step (3), the three smallest SADs are denoted SAD_MIN-0, SAD_MIN-1 and SAD_MIN-2 in turn, and the texture of the PU to be estimated is classified from these three smallest SADs, as shown in formula (6):
Class = 0, if SAD_MIN-0 > α × SAD_MIN-2
Class = 1, else if SAD_MIN-0 < β × SAD_MIN-1
Class = 2, else if SAD_MIN-1 < γ × SAD_MIN-2
Class = 3, otherwise   (6)
In formula (6), Class denotes the texture class of the PU to be estimated: a value of 0 means the texture of the PU is relatively flat, 1 means the texture shows a clear horizontal, vertical or diagonal direction, 2 means the texture shows some other angular direction, and 3 means the texture is complex; the parameters α, β and γ adjust the relations between the SAD_MIN-i (i = 0, 1, 2), where α is set to 0.9-1.0 and β and γ are set to 0.6-1.0;
The texture-direction characteristic of the PU to be estimated is then obtained from the texture class Class given by formula (6) and the computed SADs, as shown in Table 1,
Table 1. Texture-direction characteristic of the PU to be estimated
where the 0-degree direction is the horizontal (rightward) direction, the π/2 direction is the vertical (downward) direction, the π/4 direction is the 45-degree lower-right direction, the −π/4 direction is the 45-degree upper-right direction, and the 3π/4 direction is the 45-degree lower-left direction.
3. The adaptive intra-prediction mode selection method for HEVC according to claim 1, characterized in that in said step (4), according to the texture-direction characteristic of the PU to be estimated obtained in step (3), the kinds of candidate prediction modes are reduced and the prediction modes retained after this adjustment form the coarse-level mode search range S, as shown in Table 2 below,
Table 2. Prediction modes in S
where the prediction modes in S are set according to the texture-direction characteristic of the PU to be estimated.
4. The adaptive intra-prediction mode selection method for HEVC according to claim 1, characterized in that in step (5), the SATD-cost mode search range Ψ is determined first: if the current step is reached from step (4), Ψ is the intersection of the coarse-level mode search range S obtained in step (4) and the set Ω obtained in step (1); if the current step is reached from step (3), Ω is assigned to Ψ directly; the HEVC intra prediction of each prediction mode in Ψ is then carried out and the SATD cost of the prediction residual is computed; the prediction modes are then ranked in ascending order of their SATD cost J, and the rate-distortion optimization candidate mode set Φ is built from the ranked prediction modes: if the mode ranked 1st is the DC or Planar mode, only the top 1 mode is added to Φ; if the mode ranked 1st is an angular mode and the mode ranked 2nd is the DC or Planar mode, only the top 2 modes are added to Φ; if the top 2 modes are adjacent angular modes, only the top 2 modes are added to Φ; if the top 2 modes are non-adjacent angular modes, the top 2 modes are first added to Φ and then the angular modes adjacent to these 2 modes are also added to Φ; in the other cases, the top 3 modes are added to Φ for PUs to be estimated of size 16×16, 32×32 and 64×64, and the top 8 modes are added to Φ for PUs to be estimated of size 4×4 and 8×8.
CN201510675511.8A 2015-10-16 2015-10-16 Adaptive intra-prediction mode selection method for HEVC Active CN105208387B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510675511.8A CN105208387B (en) 2015-10-16 2015-10-16 Adaptive intra-prediction mode selection method for HEVC

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510675511.8A CN105208387B (en) 2015-10-16 2015-10-16 Adaptive intra-prediction mode selection method for HEVC

Publications (2)

Publication Number Publication Date
CN105208387A CN105208387A (en) 2015-12-30
CN105208387B true CN105208387B (en) 2018-03-13

Family

ID=54955775

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510675511.8A Active CN105208387B (en) 2015-10-16 2015-10-16 Adaptive intra-prediction mode selection method for HEVC

Country Status (1)

Country Link
CN (1) CN105208387B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105812825B (en) * 2016-05-10 2019-02-26 中山大学 A kind of packet-based image encoding method
CN106331726B (en) * 2016-09-23 2019-10-22 优酷网络技术(北京)有限公司 A kind of infra-frame prediction decoding method and device based on HEVC
CN110213576B (en) * 2018-05-03 2023-02-28 腾讯科技(深圳)有限公司 Video encoding method, video encoding device, electronic device, and storage medium
CN109618162B (en) * 2018-10-26 2021-04-13 西安科锐盛创新科技有限公司 Post-selection prediction method in bandwidth compression
CN109413435B (en) * 2018-10-26 2020-10-16 苏州市吴越智博大数据科技有限公司 Prediction method based on video compression
CN109510996B (en) * 2018-10-26 2021-05-11 西安科锐盛创新科技有限公司 Post-selection prediction method in bandwidth compression
CN109640092A (en) * 2018-10-26 2019-04-16 西安科锐盛创新科技有限公司 Rear selection prediction technique in bandwidth reduction
CN109660793B (en) * 2018-10-26 2021-03-16 西安科锐盛创新科技有限公司 Prediction method for bandwidth compression
CN109361922B (en) * 2018-10-26 2020-10-30 西安科锐盛创新科技有限公司 Predictive quantization coding method
CN109618169B (en) * 2018-12-25 2023-10-27 中山大学 Intra-frame decision method, device and storage medium for HEVC

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102665079A (en) * 2012-05-08 2012-09-12 北方工业大学 Adaptive fast intra prediction mode decision for high efficiency video coding (HEVC)
CN103517069A (en) * 2013-09-25 2014-01-15 北京航空航天大学 HEVC intra-frame prediction quick mode selection method based on texture analysis
CN103763570A (en) * 2014-01-20 2014-04-30 华侨大学 Rapid HEVC intra-frame prediction method based on SATD
CN104581152A (en) * 2014-12-25 2015-04-29 同济大学 HEVC intra-frame prediction mode decision accelerating method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An adaptive fast intra mode decision in HEVC; Mengmeng Zhang et al.; Image Processing (ICIP), 2012 19th IEEE International Conference on; 2012-10-03 *
HEVC intra-prediction mode selection using texture and spatial correlation; Qi Meibin; Journal of Image and Graphics; 2014-08-16 (No. 8) *

Also Published As

Publication number Publication date
CN105208387A (en) 2015-12-30

Similar Documents

Publication Publication Date Title
CN105208387B (en) Adaptive intra-prediction mode selection method for HEVC
CN104754357B (en) Intraframe coding optimization method and device based on convolutional neural networks
CN101964906B (en) Rapid intra-frame prediction method and device based on texture characteristics
CN106961606B (en) HEVC intra-frame coding mode selection method based on texture division characteristics
CN106534846B (en) A kind of screen content and natural contents divide and fast encoding method
CN107277509B (en) A kind of fast intra-frame predicting method based on screen content
CN103517069A (en) HEVC intra-frame prediction quick mode selection method based on texture analysis
CN107509076B (en) A kind of Encoding Optimization towards ultra high-definition video
CN103220522A (en) Method and apparatus for encoding video, and method and apparatus for decoding video
CN103118262B (en) Rate distortion optimization method and device, and video coding method and system
CN108259897A (en) A kind of intraframe coding optimization method based on deep learning
CN101820546A (en) Intra-frame prediction method
CN104811729B (en) A kind of video multi-reference frame coding method
CN106688238A (en) Improved reference pixel selection and filtering for intra coding of depth map
CN101309421A (en) Intra-frame prediction mode selection method
CN104284186A (en) Fast algorithm suitable for HEVC standard intra-frame prediction mode judgment process
CN105681797A (en) Prediction residual based DVC-HEVC (Distributed Video Coding-High Efficiency Video Coding) video transcoding method
CN111586405B (en) Prediction mode rapid selection method based on ALF filtering in multifunctional video coding
CN108769696A (en) A kind of DVC-HEVC video transcoding methods based on Fisher discriminates
CN110351552B (en) Fast coding method in video coding
CN101287125A (en) Fast mode selection method in frame
CN102547257B (en) Method for obtaining optimal prediction mode and device
CN109151467B (en) Screen content coding inter-frame mode rapid selection method based on image block activity
CN103313055B (en) A kind of chroma intra prediction method based on segmentation and video code and decode method
CN104811730A (en) Video image intra-frame encoding unit texture analysis and encoding unit selection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant