CN110166775B - Intra-frame prediction method, encoder and storage device - Google Patents

Intra-frame prediction method, encoder and storage device

Info

Publication number
CN110166775B
Authority
CN
China
Prior art keywords
prediction mode
joint
prediction
intra
block
Prior art date
Legal status
Active
Application number
CN201910556689.9A
Other languages
Chinese (zh)
Other versions
CN110166775A (en)
Inventor
曾飞洋
江东
林聚财
殷俊
方诚
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN201910556689.9A priority Critical patent/CN110166775B/en
Publication of CN110166775A publication Critical patent/CN110166775A/en
Application granted granted Critical
Publication of CN110166775B publication Critical patent/CN110166775B/en

Classifications

    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/103 Selection of coding mode or of prediction mode
    • H04N19/159 Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
    • H04N19/176 Adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/186 Adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H04N19/50 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding

Abstract

The invention discloses an intra-frame prediction method, an encoder and a storage device. The intra prediction method includes: selecting an independent prediction mode of a current coding block from a plurality of candidate intra-frame prediction modes; if the current coding block meets the joint prediction condition, determining a joint prediction mode of the current coding block, wherein the joint prediction mode comprises an intra-frame prediction mode of at least one coded block; and if the joint prediction mode meets the preset condition, selecting the joint prediction mode as the intra-frame prediction mode of the current coding block. By the method, the accuracy of intra-frame prediction can be improved.

Description

Intra-frame prediction method, encoder and storage device
Technical Field
The present application relates to the field of video coding, and in particular, to an intra prediction method, an encoder, and a storage device.
Background
Because the amount of video image data is large, it usually needs to be encoded and compressed before transmission or storage, and the encoded data is called a video code stream. Subject to hardware and other constraints, such as limited storage space and limited transmission bandwidth, the encoder should keep the video code stream as small as possible.
Video coding mainly comprises video acquisition, prediction, transform and quantization, and entropy coding. Prediction is divided into intra-frame prediction and inter-frame prediction, which are used to remove spatial redundancy and temporal redundancy, respectively.
Intra-frame prediction predicts the pixel values of the current block from the values of previously coded reference pixels surrounding it within the same frame. Current intra prediction modes include DC, Planar, and various angular modes. For a given intra prediction mode, the reference pixels corresponding to the pixels of the current coding block are located along the direction indicated by that mode, and the prediction cost of the mode is calculated from the values of those reference pixels. Repeating this process for a number of intra prediction modes yields their prediction costs, and the mode with the minimum prediction cost is selected as the intra prediction mode of the current coding block. Only a single intra prediction mode is considered in this process, so its ability to remove spatial redundancy is limited.
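For illustration only, and not as part of the patent text, the following Python sketch shows the kind of per-mode cost computation described above for a single, simplified DC-like mode: the block is predicted as the mean of assumed reference pixels and the SAD against the original block serves as the prediction cost. The function name and the use of SAD are assumptions.

```python
import numpy as np

def dc_prediction_cost(block: np.ndarray, ref_pixels: np.ndarray) -> float:
    """Simplified DC-style intra prediction cost (illustrative sketch).

    block      : original pixels of the current coding block (H x W)
    ref_pixels : 1-D array of reconstructed reference pixels around the block
    Returns the SAD between the DC prediction and the original block.
    """
    # DC mode predicts every pixel of the block as the mean of the reference pixels.
    prediction = np.full_like(block, ref_pixels.mean(), dtype=np.float64)
    # Sum of absolute differences is used here as a simple prediction cost.
    return float(np.abs(block.astype(np.float64) - prediction).sum())
```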
Disclosure of Invention
The application provides an intra-frame prediction method, an encoder and a storage device, which can solve the problem of limited removal effect of spatial redundancy in the intra-frame prediction process in the related art.
In order to solve the technical problem, the application adopts a technical scheme that: selecting an independent prediction mode of a current coding block from a plurality of candidate intra-frame prediction modes; if the current coding block meets the joint prediction condition, determining a joint prediction mode of the current coding block, wherein the joint prediction mode comprises an intra-frame prediction mode of at least one coded block; and if the joint prediction mode meets the preset condition, selecting the joint prediction mode as the intra-frame prediction mode of the current coding block.
In order to solve the above technical problem, the present application adopts another technical solution that: there is provided an encoder comprising a processor for executing instructions to implement the aforementioned intra prediction method.
In order to solve the above technical problem, the present application adopts another technical solution that: there is provided a memory device storing instructions that, when executed, implement the aforementioned intra prediction method.
The beneficial effects of this application are as follows: an independent prediction mode of the current coding block is selected from a plurality of candidate intra prediction modes; if the current coding block meets the joint prediction condition, a joint prediction mode of the current coding block is determined, the joint prediction mode comprising the independent prediction mode of the current coding block and the intra prediction mode of at least one coded block; and if the joint prediction mode meets the preset condition, the joint prediction mode is selected as the intra prediction mode of the current coding block. Because a joint prediction mode is introduced into the intra prediction process instead of being limited to a single intra prediction mode, the possibility of more accurate intra prediction for the current coding block is increased and the removal of spatial redundancy is improved.
Drawings
FIG. 1 is a flowchart illustrating a first embodiment of an intra prediction method according to the present invention;
FIG. 2 is a flow chart illustrating a method for calculating a joint prediction cost using a joint prediction mode according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a second embodiment of the intra prediction method according to the present invention;
FIG. 4 is a diagram illustrating the current block and the adjacent coded blocks L and A in a specific application example of the first set of joint prediction conditions and selection modes;
FIG. 5 is a diagram illustrating the coded blocks and the current block of the current frame in a specific application example of the second set of joint prediction conditions and selection modes;
FIG. 6 is a schematic diagram of an embodiment of an encoder of the present invention;
FIG. 7 is a schematic structural diagram of a memory device according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in this application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any indication of the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise. All directional indications (such as up, down, left, right, front, and rear … …) in the embodiments of the present application are only used to explain the relative positional relationship between the components, the movement, and the like in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indication is changed accordingly. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
FIG. 1 is a flowchart illustrating a first embodiment of an intra prediction method according to the present invention. It should be noted that the order of the steps is not limited to the flow shown in FIG. 1, provided that substantially the same result is obtained. As shown in FIG. 1, the present embodiment includes:
S1: An independent prediction mode of the current coding block is selected from a plurality of candidate intra prediction modes.
The current coding block, i.e. the block currently to be encoded, may be referred to simply as the current block. In some cases, a coding block may be referred to as a Coding Unit (CU). If luminance and chrominance are coded separately, as in YUV coding, the current coding block may be a luma block or a chroma block.
The prediction cost of each candidate intra prediction mode can be calculated, and the candidate intra prediction mode with the minimum prediction cost is selected as the independent prediction mode. The candidate intra prediction modes include DC, Planar, and some or all of the angular modes. The prediction cost of a given intra prediction mode refers to the cost of performing intra prediction on the current coding block with that mode; a rate-distortion cost or a similar metric may be used as the prediction cost.
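A minimal sketch of this minimum-cost selection, assuming a generic cost function supplied elsewhere (the names select_independent_mode and cost_fn are illustrative assumptions, not from the patent):

```python
from typing import Callable, Sequence, Tuple

def select_independent_mode(
    candidate_modes: Sequence[int],
    cost_fn: Callable[[int], float],
) -> Tuple[int, float]:
    """Return the candidate intra prediction mode with the smallest cost.

    candidate_modes : mode indices (e.g. DC, Planar, angular modes)
    cost_fn         : maps a mode index to its prediction cost (e.g. an RD cost)
    """
    best_mode = min(candidate_modes, key=cost_fn)
    return best_mode, cost_fn(best_mode)
```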
S2: If the current coding block meets the joint prediction condition, determine the joint prediction mode of the current coding block.
In an embodiment of the present invention, the joint prediction condition may include at least one of the following three conditions:
(1) there are adjacent coded blocks to the current coded block and the adjacent coded blocks employ intra prediction.
(2) The intra prediction mode of at least some of the neighboring encoded blocks is different from the independent prediction mode.
(3) The current coding block is not the first coding block of the current frame.
Optionally, the joint prediction condition may further include: the size of the current coding block is within a first range and/or the size of the adjacent coded block is within a second range.
The first/second range may be bounded by a single threshold, leaving the other side unbounded; for example, the range may be everything greater than a preset threshold, or everything less than a preset threshold. The range may also be bounded by two thresholds. The threshold values, and whether the thresholds themselves belong to the range, can be determined according to actual needs.
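For illustration, one possible check combining the three conditions above with the optional size ranges might look like the sketch below. The BlockInfo fields, the helper name and the default range limits are assumptions for the example, not values taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class BlockInfo:
    width: int
    height: int
    is_intra: bool                 # whether the block was intra coded
    intra_mode: Optional[int] = None

def meets_joint_prediction_condition(
    current: BlockInfo,
    independent_mode: int,
    neighbours: List[BlockInfo],
    is_first_block: bool,
    first_range: Tuple[int, int] = (16, 128),    # assumed size limits
    second_range: Tuple[int, int] = (8, 128),    # assumed size limits
) -> bool:
    # (1) at least one adjacent coded block exists and uses intra prediction
    intra_neighbours = [n for n in neighbours if n.is_intra]
    if not intra_neighbours:
        return False
    # (2) at least some neighbours use a mode different from the independent mode
    if all(n.intra_mode == independent_mode for n in intra_neighbours):
        return False
    # (3) the current block is not the first coding block of the frame
    if is_first_block:
        return False
    # optional: size of the current block within the first range
    if not (first_range[0] <= min(current.width, current.height) and
            max(current.width, current.height) <= first_range[1]):
        return False
    # optional: size of the adjacent coded blocks within the second range
    if not all(second_range[0] <= min(n.width, n.height) and
               max(n.width, n.height) <= second_range[1]
               for n in intra_neighbours):
        return False
    return True
```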
If the current coding block meets the joint prediction condition, a preset selection mode can be adopted to determine the joint prediction mode of the current coding block, and the joint prediction mode comprises an intra-frame prediction mode of at least one coded block. The encoded blocks in this embodiment all belong to the current frame, i.e. belong to the same frame as the current encoded block. The selection of the joint prediction mode may be matched to the joint prediction conditions.
In a specific embodiment of the present invention, the joint prediction mode may include intra prediction modes of N encoded blocks adjacent to the current encoded block, where N is a positive integer. The position of adjacent coded blocks may be fixed.
In another embodiment of the present invention, the joint prediction mode may include an intra prediction mode of N matched blocks, where the N matched blocks are N candidate encoded blocks of the plurality of candidate encoded blocks having the smallest rate of difference from the current encoded block. The candidate encoded blocks include part or all of the encoded blocks of the current frame.
The difference rate may be derived from the pixel difference and/or the texture similarity between the candidate coded block and the current coding block. Specifically, a pixel difference index between the two blocks may be used as the difference rate, a texture similarity metric between them may be used as the difference rate, or the difference rate may be calculated from both the pixel difference index and the texture similarity metric.
Calculating the pixel difference index requires that the candidate coded block and the current coding block have the same size. Specifically, the pixel value error at each pixel position between the candidate coded block and the current coding block may be calculated, and the pixel difference index is then computed from the errors of all pixels; the index may be the Sum of Absolute Differences (SAD), the Mean Squared Error (MSE), the Mean Absolute Difference (MAD), the Number of Threshold Differences (NTD), or the like. The smaller the pixel difference index, the smaller the pixel difference between the candidate coded block and the current coding block and the higher their similarity.
A similarity measure between the texture features of the candidate coded block and those of the current coding block may be computed as the texture similarity metric, for example a Euclidean distance or a Mahalanobis distance. The smaller the texture similarity metric, the higher the degree of texture similarity between the candidate coded block and the current coding block. Computing the texture similarity metric does not require the candidate coded block and the current coding block to have the same size.
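A hedged sketch of how a difference rate might be assembled from a pixel difference index (SAD) and a texture similarity metric (Euclidean distance between assumed texture feature vectors); the weighted combination and the weights are assumptions, since the patent leaves the exact combination open.

```python
import numpy as np

def pixel_sad(candidate: np.ndarray, current: np.ndarray) -> float:
    """SAD-based pixel difference index; the two blocks must have the same size."""
    return float(np.abs(candidate.astype(np.int64) - current.astype(np.int64)).sum())

def texture_distance(cand_features: np.ndarray, cur_features: np.ndarray) -> float:
    """Euclidean distance between texture feature vectors (block sizes may differ)."""
    return float(np.linalg.norm(cand_features - cur_features))

def difference_rate(candidate: np.ndarray, current: np.ndarray,
                    cand_features: np.ndarray, cur_features: np.ndarray,
                    w_pixel: float = 0.5, w_texture: float = 0.5) -> float:
    # One assumed way of combining the two indices into a single difference rate.
    return (w_pixel * pixel_sad(candidate, current) +
            w_texture * texture_distance(cand_features, cur_features))
```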
S3: If the joint prediction mode meets the preset condition, select the joint prediction mode as the intra prediction mode of the current coding block.
Whether to select the joint prediction mode as the intra prediction mode of the current coding block can be judged based on the joint prediction cost. The joint prediction cost is the prediction cost for predicting the current coding block by adopting a joint prediction mode.
Specifically, the preset condition may include that the joint prediction cost is smaller than the independent prediction cost, and/or that the joint prediction cost is smaller than a preset threshold. In addition, the preset condition may further include that the independent prediction cost is greater than a specified threshold, and the like.
If the joint prediction mode does not meet the preset condition, the independent prediction mode can be selected as the intra-frame prediction mode of the current coding block.
After the intra-frame prediction mode of the current coding block is determined, the current coding block can be coded to obtain a code stream of the current coding block. The code stream of the current coding block can comprise a joint prediction mark, and the joint prediction mark is used for indicating whether the intra-frame prediction mode of the current coding block is a joint prediction mode. The current coding block may be encoded using the independent prediction mode prior to determining the joint prediction flag.
Alternative forms of the joint prediction flag include, but are not limited to, the following three:
(i) the joint prediction flag comprises a joint prediction syntax element, and the value of the joint prediction syntax element is used for indicating whether the intra-frame prediction mode of the current coding block is a joint prediction mode.
For example, a 1-bit syntax element COMBINE_PRED is added as the joint prediction flag. COMBINE_PRED equal to TRUE indicates that the joint prediction mode is used; COMBINE_PRED equal to FALSE indicates that it is not.
(ii) The joint prediction flag includes a joint prediction syntax element, and in case that an intra prediction mode of the current coding block is a joint prediction mode, the joint prediction flag further includes the joint prediction mode.
For example, a 1-bit syntax element COMBINE_PRED is added as the joint prediction flag. COMBINE_PRED equal to TRUE indicates that the joint prediction mode is used; COMBINE_PRED equal to FALSE indicates that it is not. When COMBINE_PRED is TRUE, the joint prediction modes M_com1, M_com2, …, M_comN are also encoded.
(iii) The joint prediction flag comprises the intra prediction mode number of the current coding block, where, in the case that the intra prediction mode of the current coding block is the joint prediction mode, the intra prediction mode number of the current coding block is different from the numbers of all candidate intra prediction modes. The intra prediction mode numbers corresponding to the joint prediction flag may be newly added. For example, if the existing intra prediction modes (including DC, Planar, and the various angular modes) are numbered 0 to 70, the newly added numbers 71 and 72 serve as the joint prediction flag: 71 indicates that the joint prediction mode is used, and 72 indicates that it is not.
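By way of a hedged illustration of option (ii) only, the sketch below uses a toy stand-in for a bitstream writer; the BitstreamWriter class and its write_bit/write_mode methods are assumptions and do not correspond to any actual codec API.

```python
from typing import List

class BitstreamWriter:
    """Toy stand-in for an entropy coder; it simply collects symbols in a list."""
    def __init__(self) -> None:
        self.symbols: List[int] = []
    def write_bit(self, bit: int) -> None:
        self.symbols.append(bit)
    def write_mode(self, mode: int) -> None:
        self.symbols.append(mode)

def write_joint_prediction_flag(bs: BitstreamWriter,
                                use_joint: bool,
                                joint_modes: List[int]) -> None:
    # 1-bit syntax element COMBINE_PRED, as in option (ii) above.
    bs.write_bit(1 if use_joint else 0)
    if use_joint:
        # When COMBINE_PRED is TRUE, the joint prediction modes
        # M_com1 ... M_comN are also written.
        for mode in joint_modes:
            bs.write_mode(mode)
```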
Before judging whether the joint prediction mode meets the preset condition, the joint prediction cost can be calculated by using the joint prediction mode. As shown in fig. 2, in an embodiment of the present invention, calculating the joint prediction cost by using the joint prediction mode may include:
S31: Calculate the joint prediction value of the current coding block using the joint prediction mode.
The joint prediction value of each pixel in the current coding block is a weighted average of multiple prediction values of that pixel. If the weights are normalized, i.e. all weights sum to 1, the joint prediction value can also be regarded as a weighted sum of the multiple prediction values. The multiple prediction values include the prediction values obtained by predicting the pixel with the joint prediction mode, i.e. with the intra prediction modes of the coded blocks it contains. Optionally, the multiple prediction values may further include the prediction value obtained by predicting the pixel with the independent prediction mode.
For example, the prediction value obtained by predicting pixel a in the current coding block with the independent prediction mode M_0 is Pred_0. The joint prediction mode includes the intra prediction modes of N coded blocks, denoted M_com1, M_com2, …, M_comN. The prediction values obtained by predicting pixel a with the joint prediction mode are Pred_1, Pred_2, …, Pred_N. The joint prediction value Pred_com of pixel a is then:

Pred_com = α*Pred_0 + β_1*Pred_1 + β_2*Pred_2 + … + β_N*Pred_N

where α is the weight of Pred_0 and β_i is the weight of Pred_i, with 0 <= α <= 1, 0 <= β_i <= 1, and α + β_1 + … + β_N = 1. The values of α and β_i can be determined according to actual needs.
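A small array-based sketch of this weighted combination (the function name is an assumption; the weights in the usage note are purely illustrative):

```python
import numpy as np
from typing import Sequence

def joint_prediction(pred_independent: np.ndarray,
                     preds_joint: Sequence[np.ndarray],
                     alpha: float,
                     betas: Sequence[float]) -> np.ndarray:
    """Pred_com = alpha*Pred_0 + beta_1*Pred_1 + ... + beta_N*Pred_N.

    The weights are expected to satisfy alpha + sum(betas) == 1.
    """
    assert len(preds_joint) == len(betas)
    combined = alpha * pred_independent.astype(np.float64)
    for beta, pred in zip(betas, preds_joint):
        combined += beta * pred.astype(np.float64)
    return combined
```

For instance, with two coded-block modes one might call joint_prediction(p0, [p1, p2], alpha=0.5, betas=[0.25, 0.25]); these weight values are only an example of a normalized choice.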
S32: Calculate the joint prediction cost using the joint prediction value of the current coding block.
The residual corresponding to the joint prediction mode is obtained by subtracting the joint prediction value of the current coding block from its original pixel values, and the prediction cost calculated from this residual is taken as the joint prediction cost.
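Continuing the sketch, and assuming for illustration that the SAD of the residual is used as the cost (a rate-distortion cost could be substituted), the joint prediction cost could be computed as:

```python
import numpy as np

def joint_prediction_cost(original: np.ndarray, joint_pred: np.ndarray) -> float:
    """Residual-based cost: SAD of (original - joint prediction), as an assumed metric."""
    residual = original.astype(np.float64) - joint_pred
    return float(np.abs(residual).sum())
```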
Through the implementation of the embodiment, a joint prediction mode is introduced in the intra-frame prediction process, and is not limited to a single intra-frame prediction mode, so that the possibility of more accurate intra-frame prediction on the current coding block is improved, and the removal effect of spatial redundancy is improved.
The complete intra prediction process based on joint prediction is illustrated in the following with reference to the accompanying drawings.
The second embodiment of the intra prediction method of the present invention is a further extension of the first embodiment of the intra prediction method of the present invention, and the same parts are not described again. As shown in fig. 3, the second embodiment of the intra prediction method of the present invention comprises:
S11: Acquire the reference pixels of the current coding block.
S12: the reference pixels are filtered.
In some cases, S12 may be omitted.
S13: an independent prediction mode of a current coding block is selected from a plurality of candidate intra prediction modes, and an independent prediction cost is calculated based on the independent prediction mode.
The candidate intra prediction mode may be determined according to the type (whether a luminance block or a chrominance block), size, etc. of the current coding block.
S14: Judge whether the current coding block meets the joint prediction condition.
If the current coding block meets the joint prediction condition, jumping to S15; otherwise, the process jumps to S19.
S15: a joint prediction mode for a current coding block is determined.
The manner in which the joint prediction mode is selected may be matched to the joint prediction condition. Two sets of joint prediction conditions and their corresponding selection modes are exemplified below.
A first group:
a. an adjacent coded block of the current coding block exists, and the adjacent coded block uses intra prediction;
b. the intra prediction mode of at least a portion of the adjacent coded blocks is different from the independent prediction mode;
c. the size of the current coding block is within a first range.
d. The size of adjacent coded blocks is within a second range.
The joint prediction condition must include a and b, and whether c or d is included can be determined according to actual needs.
The corresponding selection mode is: the intra prediction modes of N coded blocks adjacent to the current coding block are taken as the joint prediction mode.
A specific application example of the first group is given below. If the adjacent coded block L to the left of the current block exists, L also uses intra prediction, and the prediction mode M_L of L is not equal to the independent prediction mode M_0 of the current block, then when the following size condition is satisfied, the prediction mode M_L of L and the prediction mode M_A of A are selected as the joint prediction mode.
When the current block is a luma block, the size condition is that the width and height of the current block are both greater than or equal to 32 and the width and height of L are both greater than or equal to 16. When the current block is a chroma block, the size condition is that the width and height of the current block are both greater than or equal to 16 and the width and height of L are both greater than or equal to 8.
The positions of L and A are shown in FIG. 4. L is the coded block containing the pixel immediately to the left of the bottom-left point BL of the current block, and A is the coded block containing the pixel immediately above the top-right point TR of the current block.
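Purely as an illustration of the size condition in this first-group example, the following predicate mirrors the thresholds stated above; the function name is an assumption.

```python
def first_group_size_condition(cur_w: int, cur_h: int,
                               l_w: int, l_h: int,
                               is_luma: bool) -> bool:
    """Size condition from the first-group application example."""
    if is_luma:
        # luma: current block at least 32x32 and L at least 16x16
        return cur_w >= 32 and cur_h >= 32 and l_w >= 16 and l_h >= 16
    # chroma: current block at least 16x16 and L at least 8x8
    return cur_w >= 16 and cur_h >= 16 and l_w >= 8 and l_h >= 8
```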
Second group:
e. the current coding block is not the first coding block of the current frame.
f. The size of the current coding block is within a first range.
The joint prediction condition must include e, and whether f is included or not can be determined according to actual needs.
The corresponding selection mode is: N matching blocks are selected from a plurality of candidate coded blocks, and the intra prediction modes of the N matching blocks are used as the joint prediction mode.
A specific application example of the second group is given below. If the current block is not the first coding block of the current frame and the following size condition is satisfied, the SAD of the pixel differences is used as the difference rate to find the matching blocks (this process may also be described as searching for matching blocks according to the SAD minimum criterion), and the intra prediction modes of the matching blocks are used as the joint prediction mode.
When the current block is a luma block, the size condition is that the width and height of the current block are both greater than or equal to 16. When the current block is a chroma block, the size condition is that the width and height of the current block are both greater than or equal to 8.
The following illustrates how matching blocks are searched according to the SAD minimum criterion. As shown in fig. 5, the coded blocks of the current frame lie to the left of and above the current block and are numbered 1-15. Since the pixel difference requires the candidate coded blocks to have the same size as the current block, the available candidate coded blocks are 8, 11 and 13. The SAD is calculated as:

SAD_k = Σ_x Σ_y | s_k(x, y) - c(x, y) |

where k denotes the sequence number of the candidate coded block, s_k and c denote the pixel values of the candidate coded block and the current block, respectively, and x and y denote the horizontal and vertical coordinates of the corresponding positions within the candidate coded block and the current block.
The 2 coded blocks among 8, 11 and 13 with the smallest SAD are selected as the matching blocks, and their intra prediction modes are adopted as the joint prediction modes M_com1 and M_com2.
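A minimal sketch of this SAD-minimum matching-block search, assuming the candidate blocks are supplied as NumPy arrays keyed by their sequence numbers (the function name is an assumption):

```python
import numpy as np
from typing import Dict, List

def find_matching_blocks(current: np.ndarray,
                         candidates: Dict[int, np.ndarray],
                         n: int = 2) -> List[int]:
    """Return the sequence numbers of the n candidate coded blocks with the
    smallest SAD against the current block; only same-sized candidates count."""
    sads = {}
    for idx, cand in candidates.items():
        if cand.shape != current.shape:
            continue  # the pixel difference requires equal block sizes
        sads[idx] = np.abs(cand.astype(np.int64) -
                           current.astype(np.int64)).sum()
    return sorted(sads, key=sads.get)[:n]
```

For the example of fig. 5 one would pass the pixel arrays of the coded blocks (blocks of a different size are skipped automatically) and n = 2, obtaining the two matching blocks whose modes serve as M_com1 and M_com2.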
S16: Calculate the joint prediction cost using the joint prediction mode.
S17: Judge whether the joint prediction cost is smaller than the independent prediction cost.
In this embodiment, the preset condition is that the joint prediction cost is smaller than the independent prediction cost. If the joint prediction cost is less than the independent prediction cost, jumping to S18; otherwise, the process jumps to S19.
S18: Select the joint prediction mode as the intra prediction mode of the current coding block.
S19: the independent prediction mode is selected as the intra prediction mode of the current coding block.
S20: Encode the current coding block to obtain the code stream of the current block.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an encoder according to an embodiment of the present invention. As shown in fig. 6, the encoder 30 includes a processor 31.
The processor 31 may also be referred to as a CPU (Central Processing Unit). The processor 31 may be an integrated circuit chip having signal processing capabilities. The processor 31 may also be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The encoder may further comprise a memory (not shown) for storing instructions and data required for the processor 31 to operate.
The processor 31 is configured to execute instructions to implement the methods provided by any of the embodiments of the intra prediction method of the present invention described above and any non-conflicting combinations.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a memory device according to an embodiment of the invention. The memory device 40 of an embodiment of the present invention stores instructions that, when executed, implement the methods provided by any of the embodiments of the intra prediction methods of the present invention, as well as any non-conflicting combinations thereof. The instructions may form a program file stored in the storage device in the form of a software product, so as to enable a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage device includes media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, or terminal equipment such as a computer, server, mobile phone or tablet.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The above embodiments are merely examples and are not intended to limit the scope of the present disclosure, and all modifications, equivalents, and flow charts using the contents of the specification and drawings of the present disclosure or those directly or indirectly applied to other related technical fields are intended to be included in the scope of the present disclosure.

Claims (13)

1. An intra prediction method, comprising:
selecting an independent prediction mode of a current coding block from a plurality of candidate intra-frame prediction modes;
if the current coding block meets the joint prediction condition, determining a joint prediction mode of the current coding block, wherein the joint prediction mode comprises an intra-frame prediction mode of at least one coded block;
if the joint prediction mode meets the preset condition, selecting the joint prediction mode as an intra-frame prediction mode of the current coding block;
the preset condition comprises that joint prediction cost is smaller than independent prediction cost and/or the joint prediction cost is smaller than a preset threshold value, wherein the joint prediction cost is prediction cost for predicting the current coding block by adopting the joint prediction mode, and the independent prediction cost is prediction cost for predicting the current coding block by adopting the independent prediction mode;
if the joint prediction mode meets the preset condition, selecting the joint prediction mode as the intra-frame prediction mode of the current coding block further comprises: calculating a joint predicted value of the current coding block by using the joint prediction mode, wherein the joint predicted value of each pixel in the current coding block is a weighted average value of a plurality of predicted values of the pixel, and the plurality of predicted values comprise predicted values obtained by predicting the pixel by using the joint prediction mode; and calculating the joint prediction cost by using the joint prediction value of the current coding block.
2. The method of claim 1,
the joint prediction condition comprises at least one of the following three conditions:
a coded block adjacent to the current coded block exists, and the adjacent coded block adopts intra-frame prediction;
intra prediction modes of at least a portion of the neighboring encoded blocks are different from the independent prediction modes;
the current encoding block is not the first encoding block of the current frame.
3. The method of claim 2,
the joint prediction condition further comprises: the size of the current encoding block is within a first range and/or the size of the adjacent encoded block is within a second range.
4. The method of claim 1,
the joint prediction mode includes intra prediction modes of N coded blocks adjacent to the current coded block, where N is a positive integer.
5. The method of claim 1,
the joint prediction mode includes an intra prediction mode of N matched blocks, the N matched blocks being N candidate coded blocks of a plurality of candidate coded blocks having a smallest rate of difference with the current coded block, N being a positive integer.
6. The method of claim 5,
the disparity rate is derived based on pixel disparity and/or texture similarity between the candidate encoded block and the current encoded block.
7. The method of claim 6,
if the parameter used for calculating the difference rate comprises the pixel difference, the size of the candidate coded block is the same as that of the current coded block.
8. The method of claim 5,
the candidate encoded blocks include part or all of the encoded blocks of the current frame.
9. The method of claim 1, wherein the plurality of prediction values further comprises a prediction value obtained by predicting the pixel using the independent prediction mode.
10. The method of claim 1, further comprising:
and coding the current coding block to obtain a code stream of the current coding block, wherein the code stream of the current coding block comprises a joint prediction mark, and the joint prediction mark is used for indicating whether an intra-frame prediction mode of the current coding block is the joint prediction mode.
11. The method of claim 10,
the joint prediction flag comprises a joint prediction syntax element, and the value of the joint prediction syntax element is used for indicating whether the intra-frame prediction mode of the current coding block is the joint prediction mode; or
The joint prediction flag comprises the joint prediction syntax element, and in the case that an intra prediction mode of the current coding block is the joint prediction mode, the joint prediction flag further comprises the joint prediction mode; or
The joint prediction mark comprises an intra-frame prediction mode sequence number of the current coding block, wherein under the condition that the intra-frame prediction mode of the current coding block is the joint prediction mode, the intra-frame prediction mode sequence number of the current coding block is different from the sequence numbers of all candidate intra-frame prediction modes.
12. An encoder, comprising a processor and a memory coupled to the processor, the memory storing instructions for execution by the processor to implement the method of any one of claims 1-11.
13. A storage device storing instructions that, when executed by a processor, implement the method of any of claims 1-11.
CN201910556689.9A 2019-06-25 2019-06-25 Intra-frame prediction method, encoder and storage device Active CN110166775B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910556689.9A CN110166775B (en) 2019-06-25 2019-06-25 Intra-frame prediction method, encoder and storage device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910556689.9A CN110166775B (en) 2019-06-25 2019-06-25 Intra-frame prediction method, encoder and storage device

Publications (2)

Publication Number Publication Date
CN110166775A CN110166775A (en) 2019-08-23
CN110166775B true CN110166775B (en) 2021-05-11

Family

ID=67627030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910556689.9A Active CN110166775B (en) 2019-06-25 2019-06-25 Intra-frame prediction method, encoder and storage device

Country Status (1)

Country Link
CN (1) CN110166775B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111950587B (en) * 2020-07-02 2024-04-16 北京大学深圳研究生院 Intra-frame coding block dividing processing method and hardware device
CN111741299B (en) * 2020-07-09 2022-03-25 腾讯科技(深圳)有限公司 Method, device and equipment for selecting intra-frame prediction mode and storage medium
WO2023050370A1 (en) * 2021-09-30 2023-04-06 Oppo广东移动通信有限公司 Intra-frame prediction method, decoder, coder, and coding/decoding system
CN114938449B (en) * 2022-07-20 2023-10-27 浙江大华技术股份有限公司 Intra-frame prediction method, image encoding method, image decoding method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101677480B1 (en) * 2010-09-07 2016-11-21 에스케이 텔레콤주식회사 Method and Apparatus for Encoding/Decoding of Video Data Using Efficient Selection of Intra Prediction Mode Set
CN102547257B (en) * 2010-12-10 2014-04-02 联芯科技有限公司 Method for obtaining optimal prediction mode and device
CN102685474B (en) * 2011-03-10 2014-11-05 华为技术有限公司 Encoding and decoding method of prediction modes, encoding and decoding device and network system
CN103220506B (en) * 2012-01-19 2015-11-25 华为技术有限公司 A kind of decoding method and equipment
CN109889827B (en) * 2019-04-11 2021-01-29 腾讯科技(深圳)有限公司 Intra-frame prediction coding method and device, electronic equipment and computer storage medium

Also Published As

Publication number Publication date
CN110166775A (en) 2019-08-23

Similar Documents

Publication Publication Date Title
CN110166775B (en) Intra-frame prediction method, encoder and storage device
US10523965B2 (en) Video coding method, video decoding method, video coding apparatus, and video decoding apparatus
US11394999B2 (en) Method, device, and system for determining prediction weight for merge mode
CN110446044B (en) Linear model prediction method, device, encoder and storage device
US11979579B2 (en) Method and apparatus for encoding or decoding video data in FRUC mode with reduced memory accesses
US8649436B2 (en) Methods for efficient implementation of skip/direct modes in digital video compression algorithms
US9510010B2 (en) Method for decoding images based upon partition information determinations and apparatus for decoding using same
US20220210436A1 (en) Method for acquiring motion vectors, prediction method and device
CN109862353B (en) Chroma block prediction mode acquisition method and device, coder-decoder and storage device
CN110087083B (en) Method for selecting intra chroma prediction mode, image processing apparatus, and storage apparatus
CN111357290A (en) Video image processing method and device
CN110719467B (en) Prediction method of chrominance block, encoder and storage medium
CN111586415B (en) Video coding method, video coding device, video coder and storage device
CN112565769B (en) Block division method, inter-frame prediction method, video coding method and related device
CN112804525B (en) IBC mode intra block copy prediction method, device, medium and equipment
CN110166774B (en) Intra-frame prediction method, video encoding method, video processing apparatus, and storage medium
CN105828084B (en) HEVC (high efficiency video coding) inter-frame coding processing method and device
CN111586416A (en) Video coding method, video coding device, video coder and storage device
CN111108749A (en) Encoding method, decoding method, encoding device, and decoding device
CN113542768B (en) Motion search method, motion search device and computer-readable storage medium
KR102076781B1 (en) Method of adaptive intra prediction mode encoding and apparatus for the same, and method of decoding and apparatus for the same
CN109587496B (en) Skip block distinguishing method, encoder, electronic device and readable storage medium
KR20200077497A (en) Method of adaptive intra prediction mode encoding and apparatus for the same, and method of decoding and apparatus for the same
KR20200015684A (en) Method of adaptive intra prediction mode encoding and apparatus for the same, and method of decoding and apparatus for the same
CN113301337A (en) Coding and decoding method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant