CN117041604A - Deblocking filtering method and related device - Google Patents

Deblocking filtering method and related device

Info

Publication number
CN117041604A
CN117041604A
Authority
CN
China
Prior art keywords
block
video frames
edge position
block edge
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311073016.0A
Other languages
Chinese (zh)
Inventor
郑佳臻
葛维
胡均浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Unisoc Chongqing Technology Co Ltd
Original Assignee
Unisoc Chongqing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Unisoc Chongqing Technology Co Ltd filed Critical Unisoc Chongqing Technology Co Ltd
Priority to CN202311073016.0A priority Critical patent/CN117041604A/en
Publication of CN117041604A publication Critical patent/CN117041604A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/80: Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/85: Methods or arrangements using pre-processing or post-processing specially adapted for video compression
    • H04N19/86: Methods or arrangements involving reduction of coding artifacts, e.g. of blockiness
    • H04N19/865: Methods or arrangements involving reduction of coding artifacts, with detection of the former encoding block subdivision in decompressed video

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application discloses a deblocking filtering method and a related device. The method comprises: dividing each of one or more video frames into a plurality of 8×8 blocks, and determining a block edge position in each video frame based on each block; determining an artificial block edge position in the one or more video frames based on the number of occurrences of each block edge position in the one or more video frames; and filtering the next frame of the one or more video frames using the artificial block edge position. By adopting the embodiments of the application, blocking artifacts in the video frames corresponding to a video can be reduced, thereby improving video quality.

Description

Deblocking filtering method and related device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a deblocking filtering method and a related device.
Background
A deblocking filter (DBF) is a filter that reduces visual defects appearing at block boundaries. These visual defects, also known as blocking artifacts, are mainly produced by block-based encoding and decoding in video codecs and appear as artificial boundaries (artificial blocking).
Therefore, how to use a deblocking filter to reduce the influence of blocking artifacts and thereby improve video quality is an important issue in the decoding process.
Disclosure of Invention
The embodiment of the application provides a deblocking filtering method and a related device, which can reduce blocking effect in a plurality of video frames corresponding to video, thereby improving video quality.
In a first aspect, an embodiment of the present application provides a deblocking filtering method, including:
dividing each video frame of the one or more video frames into a plurality of 8 x 8 blocks, and determining a block edge position in each video frame based on each block;
determining artificial block edge locations in one or more video frames based on the number of occurrences of each block edge location in the one or more video frames;
and filtering the next frame of the one or more video frames by utilizing the artificial block edge position.
In the embodiments of the application, the electronic device can determine the artificial block edge position in one or more video frames and use it to filter the next frame of the one or more video frames, so that blocking artifacts in the video frames corresponding to the video can be reduced and the video quality improved.
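As a concrete illustration of the filtering step, the sketch below smooths the pixel pair straddling every vertical block boundary at a known artificial edge offset. This is a toy stand-in, not the patent's filter: the function name, the blending rule, and the default strength are all assumptions.

```python
def filter_artificial_edges(frame, edge_offset, strength=0.5, block=8):
    """Smooth the pixel pair straddling every vertical block boundary at
    edge_offset (0..7). 'frame' is a list of rows (lists of pixel values).
    Each boundary pixel is blended toward the average across the edge;
    the blending rule is an illustrative assumption."""
    out = [list(row) for row in frame]
    width = len(out[0])
    for row in out:
        for x in range(edge_offset, width - 1, block):
            avg = (row[x] + row[x + 1]) / 2.0
            row[x] = (1 - strength) * row[x] + strength * avg
            row[x + 1] = (1 - strength) * row[x + 1] + strength * avg
    return out
```

With strength 1.0, the two boundary pixels of a 0/100 step are both pulled to their common average of 50, while pixels away from the boundary are untouched.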
In an alternative embodiment, determining block edge locations in each video frame on a per block basis includes: determining a first difference amount between information corresponding to each two adjacent parallel lines in information corresponding to a plurality of parallel lines in each block in each video frame; obtaining a second difference between the information corresponding to each two adjacent parallel lines based on the first difference and the first weight between the information corresponding to each two adjacent parallel lines; determining a block edge position of each block in each video frame based on the second difference amount and the first difference amount; the block edge locations in each video frame are determined based on the edge locations of each block in each video frame.
In an alternative embodiment, determining artificial block edge locations in the one or more video frames based on the number of occurrences of each block edge location in the one or more video frames comprises: determining, in the one or more video frames, the first block edge position with the largest number of occurrences and the second block edge position with the next largest number of occurrences; if the quantization result of the first block edge position is at least N times the quantization result of the second block edge position, and the number of edges produced by the first block corresponding to the first block edge position is greater than or equal to a preset value, determining the first block edge position to be an artificial block edge position, where N is greater than or equal to 2.
In an alternative embodiment, filtering a next frame of the one or more video frames using the artificial block edge locations includes: determining a dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames; the next frame of the one or more video frames is filtered based on the dynamic filter strength and the artificial block edge position.
In an alternative embodiment, determining the dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames includes: determining a third difference amount between every two adjacent parallel lines based on information corresponding to the plurality of parallel lines in a last frame of the one or more video frames; determining the movement degree of the picture in the last frame based on the third difference amount and the inter-frame difference amount; the dynamic filter strength is determined based on the degree of movement of the picture in the last frame.
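A minimal sketch of the dynamic-strength idea, assuming a simple normalization: the inter-frame difference amount is divided by the intra-frame line detail of the last frame to estimate the degree of movement, which is then mapped to a filter strength. The function names, the mapping, and the defaults are illustrative assumptions, not the patent's formulas.

```python
def mean_abs_diff(a, b):
    """Mean absolute difference between two equally sized frames (lists of rows)."""
    total = sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))
    return total / (len(a) * len(a[0]))

def line_detail(frame):
    """Stand-in for the third difference amount: mean absolute difference
    between adjacent parallel lines (rows) of the frame."""
    diffs = [mean_abs_diff([r1], [r2]) for r1, r2 in zip(frame, frame[1:])]
    return sum(diffs) / len(diffs) if diffs else 0.0

def dynamic_filter_strength(last_frame, next_frame, base=0.5, cap=1.0):
    """Movement degree = inter-frame change normalized by intra-frame detail;
    stronger motion maps to stronger filtering (the mapping is an assumption)."""
    inter = mean_abs_diff(last_frame, next_frame)
    intra = line_detail(last_frame)
    movement = inter / max(intra, 1e-6)
    return min(cap, base * (1.0 + movement))
```

A static picture keeps the base strength; a large inter-frame change on a flat picture saturates at the cap, compensating for the weaker motion prediction of moving pictures noted in the background.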
In a second aspect, an embodiment of the present application provides a deblocking filter apparatus, including:
a determining unit, configured to divide each video frame of the one or more video frames into a plurality of 8×8 blocks, and determine a block edge position in each video frame based on each block;
a determining unit, further configured to determine an artificial block edge position in one or more video frames based on the number of occurrences of each block edge position in the one or more video frames;
and the processing unit is used for carrying out filtering processing on the next frame of the one or more video frames by utilizing the edge position of the artificial block.
Optional embodiments and advantageous effects of the deblocking filtering apparatus are described in the related content of the first aspect above and are not repeated here.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the processor and the memory are interconnected; the memory is configured to store a computer program comprising program instructions, and the processor is configured to invoke the program instructions to implement the method according to any of the optional embodiments of the first aspect.
In a fourth aspect, an embodiment of the present application provides a chip, where the chip includes a processor, and the processor performs a method related to any optional implementation manner of the first aspect. Optionally, the chip may further include a memory, and a computer program or instructions stored on the memory, and executed by the processor to implement the method according to any of the optional embodiments of the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip module, including a transceiver component and a chip, where the chip includes a processor, and the processor performs a method related to any optional embodiment of the first aspect. Optionally, the chip may further include a memory, and a computer program or instructions stored on the memory, and executed by the processor to implement the method according to any of the optional embodiments of the first aspect.
In a sixth aspect, an embodiment of the present application provides a computer readable storage medium storing a computer program, the computer program comprising program instructions which, when executed by a computer, implement a method according to any of the alternative embodiments of the first aspect.
In a seventh aspect, embodiments of the present application provide a computer program product comprising a computer program or program instructions which, when executed, implement a method according to any of the alternative embodiments of the first aspect described above.
Drawings
FIG. 1 is a schematic diagram of deblocking filtering using a vertical deblocking filter according to an embodiment of the present application;
FIG. 2 is a flowchart of a deblocking filtering method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a block detection and judgment process using a vertical deblocking filter according to an embodiment of the present application;
FIG. 4 is a schematic diagram of marking line numbers at block edge positions according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a filtering process using a vertical deblocking filter according to an embodiment of the present application;
FIG. 6 is a diagram showing the dynamic filtering strength when filtering with a vertical deblocking filter according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a filtering process using a horizontal deblocking filter according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a block detection and judgment process using a horizontal deblocking filter according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a filtering process using a horizontal deblocking filter according to an embodiment of the present application;
FIG. 10 is a diagram showing the dynamic filtering strength when filtering with a horizontal deblocking filter according to an embodiment of the present application;
FIG. 11 is a schematic structural diagram of a deblocking filtering apparatus according to an embodiment of the present application;
FIG. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the described embodiments of the application may be combined with other embodiments.
It should be noted that, in the present application, "first", "second", "third", etc. are used for distinguishing similar objects and not necessarily for describing a particular sequential or chronological order. Furthermore, the term "include" and any variations thereof are intended to cover a non-exclusive inclusion. For example, a process, method, software, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
In the application, "equal to" may be used together with either "less than" or "greater than", but not with both at the same time. When used together with "less than", it applies to the technical scheme adopted for "less than"; when used together with "greater than", it applies to the technical scheme adopted for "greater than".
In the present application, "and/or" is merely an association relationship describing an association object, and indicates that three relationships may exist, for example, a and/or B may indicate: a exists alone, A and B exist together, and B exists alone. In this context, the character "/" indicates that the front and rear associated objects are an "or" relationship.
"At least one of" and the like means any combination of the listed items, including any combination of a single item or multiple items. For example, at least one (one) of a, b or c may represent: a; b; c; a and b; a and c; b and c; or a, b and c, where each of a, b and c may itself be an element or a collection comprising one or more elements.
The term "at least one" in the present application means one or more, and "plurality" means two or more. Descriptions such as "first" and "second" in the embodiments of the present application are used only to illustrate and distinguish the objects being described; they imply no ordering and no limit on the number of objects, and do not constitute any limitation of the embodiments. For example, a first identifier and a second identifier merely distinguish identifiers corresponding to different vehicles, and do not indicate that the two identifiers are the same or different.
In the present disclosure, "exemplary", "in some embodiments", "in other embodiments", and the like are used to indicate an example, instance, or illustration. Any embodiment or design described herein as "exemplary" should not be construed as preferred or advantageous over other embodiments or designs. Rather, use of these words is intended to present concepts in a concrete fashion.
"Of", "corresponding", and "associated" may sometimes be used interchangeably in the present application; it should be noted that the intended meanings are consistent when the distinction is not emphasized. Likewise, "communication" and "transmission" may sometimes be mixed in embodiments of the present application; the intended meanings are consistent when the distinction is not emphasized. For example, a transmission may include sending and/or receiving, whether used as a noun or a verb.
In the present application, the electronic device may be a terminal device. A terminal device is a device having a wireless communication function, and may be also referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), an access terminal device, a vehicle-mounted terminal device, an industrial control terminal device, a UE unit, a UE station, a mobile station, a remote terminal device, a mobile device, a UE terminal device, a wireless communication device, an intelligent terminal device, a UE agent, a UE apparatus, or the like. The terminal device may be fixed or mobile.
Alternatively, the terminal device may be deployed on land, including indoors or outdoors, hand-held, wearable or vehicle-mounted; can be deployed on the water surface (such as ships, etc.); but also may be deployed in the air (e.g., aircraft, balloons, satellites, etc.).
It should be noted that the terminal device may support at least one wireless communication technology, such as long term evolution (LTE), new radio (NR), 6G, or a next-generation wireless communication technology. For example, the terminal device may be a mobile phone, tablet (pad), desktop, notebook, all-in-one computer, in-vehicle device, virtual reality (VR) terminal device, augmented reality (AR) terminal device, a wireless terminal in industrial control, a wireless terminal in self-driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with wireless communication functionality, a computing device or other processing device connected to a wireless modem, a wearable device, a terminal in a next-generation communication system such as an NR network, or a terminal in a future evolved public land mobile network (PLMN), etc.
Further, the terminal device may further include a device having a transceiver function, such as a chip system. The chip system may include a chip and may also include other discrete devices.
First, some concepts related to the embodiments of the present application will be briefly described.
1. Blocking effect
In block-based codecs such as advanced video coding (AVC, also referred to as H.264 or MPEG-4 Part 10) and high efficiency video coding (HEVC, also referred to as H.265), which use intra-prediction (Intra-Prediction) or transform (Transform) coding, video frames are divided into blocks for coding and compression, whether macroblocks in AVC or coding tree units in HEVC. During decoding and restoration, part of the information is lost because of the compression rate, so discontinuous transitions (block distortion) appear at the edges of each block; this phenomenon is called the blocking effect (and its locations, artificial block edge positions). Block-based codecs include, but are not limited to, AVC and HEVC.
The visual discontinuities created by these block edges are referred to as artificial boundaries (artificial blocking), and their causes fall into two types. The first is a source signal (Source) carrying residual blocking artifacts: the residual (Residual) refers to the prediction distortion introduced by H.264/H.265 encoding and decoding, namely the quantization and inverse-quantization error caused by the discrete cosine transform (DCT) when a block is compressed. The second is motion compensation: adjacent blocks in the same picture may be affected by picture movement, which causes discontinuity in block prediction.
2. Deblocking filter
The deblocking filter is a filter used to reduce the visual defects produced at block boundaries; these visual defects may also be referred to as blocking artifacts.
In general, in the coding process of a block-based codec (e.g., AVC or HEVC), a systematic loop filter is used so that decoded macroblocks can be well restored. However, for video file formats such as MPEG-4/MPEG-2/VC-1/RMVB, slight block distortion (blocking distortion) may remain after a macroblock is decoded, so a deblocking filter can more effectively eliminate the block edges between decoded blocks to improve the appearance of the decoded picture, and even further improve the compression ratio of the original block.
The deblocking filter has three parts: boundary strength computation (Boundary Strength Computation), boundary decision (Boundary Decision), and filter application (Filter Implementation).
In the present application, deblocking filters are classified into a deblocking filter in a vertical direction and a deblocking filter in a horizontal direction. The deblocking filter in the vertical direction may be referred to as a vertical deblocking filter (vertical deblocking filter), and the deblocking filter in the horizontal direction may be referred to as a horizontal deblocking filter (horizontal deblocking filter).
In the present application, the deblocking filter is functionally divided into block edge detection and decision (dbk_det) and filter application (dbk_fil).
Block-based video codecs produce blocking artifacts when decoding video. At present, a deblocking filter can be used to reduce the blocking artifacts produced at block boundaries in a video frame. However, with this approach the complexity of boundary analysis and boundary judgment is high, and the motion compensation for moving pictures in video frames is weak, so the filtering effect is poor and the video quality of the corresponding video frames suffers. In addition, such a deblocking filter cannot be applied to pictures restored by video codecs of various resolutions and formats to eliminate blocking artifacts in the encoded and restored pictures. For example, for a picture with a resolution of 720×480, no deblocking filter for AVC or HEVC can be applied to handle the blocking artifacts in the picture.
The embodiment of the application provides a deblocking filtering method and a related device. The method comprises: dividing each of one or more video frames into a plurality of 8×8 blocks, and determining a block edge position in each video frame based on each block; determining an artificial block edge position in the one or more video frames based on the number of occurrences of each block edge position in the one or more video frames; and filtering the next frame of the one or more video frames using the artificial block edge position. That is, after determining the artificial block edge position in one or more video frames, the electronic device may use that position to filter the same position in the next frame of the one or more video frames. In this way, blocking artifacts in the one or more video frames can be reduced, thereby improving video quality. In addition, because block edge detection is performed after dividing the one or more video frames at an 8×8 block interval, the method can be effectively applied to pictures of different resolutions restored by video coding.
Referring to FIG. 1, FIG. 1 is a schematic diagram of deblocking filtering using a vertical deblocking filter according to an embodiment of the present application. In FIG. 1, the abscissa indicates the video frames, and vsync on the ordinate indicates the start of each video frame. frame0 (vdet) and frame0 (vfil) represent, respectively, block edge detection and judgment and filter application in the current frame of the plurality of video frames; frame1 (vfil) represents filter application in the next frame after the current frame; vdi_dbk_vdet represents detection and judgment of block edges in the vertical direction; vdi_dbk_vfil represents filter application in the vertical direction. As shown in FIG. 1, the electronic device may determine statistics of block edge positions for each block in one or more video frames and determine, based on the statistics, whether an artificial block edge is present in the one or more video frames. If an artificial block edge is present, the electronic device may apply dynamic estimation to filter at the block edge position (block index) of the next frame of the one or more video frames based on the artificial block edge position. In this way, blocking artifacts in the one or more video frames can be reduced, thereby improving video quality.
The deblocking filtering method provided by the embodiment of the present application is described in detail below.
Referring to fig. 2, fig. 2 is a flowchart of a deblocking filtering method according to an embodiment of the present application. As shown in fig. 2, the deblocking filtering method may include, but is not limited to, the following steps:
s201, dividing each video frame in one or more video frames into a plurality of 8 x 8 blocks, and determining the block edge position in each video frame according to each block.
An 8×8 block yields a better filtering effect in the subsequent steps: with a larger block, the filtering effect over the whole picture is poor in low-resolution scenes, while a smaller block cannot be reconstructed with multi-line filtering and restores poorly. Each 8×8 block contains 64 pixels.
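The 8×8 division itself can be sketched as follows (a plain-Python illustration; the patent does not prescribe this representation, and partial tiles at the frame edges are simply dropped here for brevity):

```python
def split_into_blocks(frame, block=8):
    """Split a frame (list of rows) into block x block tiles, row-major,
    dropping partial tiles at the right and bottom edges for simplicity."""
    h, w = len(frame), len(frame[0])
    blocks = []
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            blocks.append([row[x:x + block] for row in frame[y:y + block]])
    return blocks
```

For a 16×24 frame this yields six 8×8 tiles of 64 pixels each, matching the block size the method operates on.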
In an alternative embodiment, the electronic device determining the block edge position in each video frame from each block may include: determining a first difference amount between information corresponding to each two adjacent parallel lines in information corresponding to a plurality of parallel lines in each block in each video frame; obtaining a second difference between the information corresponding to each two adjacent parallel lines based on the first difference and the first weight between the information corresponding to each two adjacent parallel lines; determining a block edge position of each block in each video frame based on the second difference amount and the first difference amount; the block edge locations in each video frame are determined based on the edge locations of each block in each video frame.
Taking the vertical deblocking filter as an example, the process by which the electronic device determines the block edge position of each block in each video frame is described with reference to FIG. 3. FIG. 3 is a schematic diagram of a block detection and judgment process using a vertical deblocking filter according to an embodiment of the present application. Block edge quantization 301 is described first; it is used to determine where the block edge of a block is located. As shown in FIG. 3, assuming that the electronic device is to determine the block edge position of block 1 in video frame 1, the plurality of parallel lines in block 1 includes two rows of parallel lines cl2_y and cl1_y preceding the center line cur_y, and three rows of parallel lines nxt_y, nxt2_y and nxt3_y following the center line cur_y. When determining the block edge position of block 1, the electronic device may first determine the first difference amount u_ypcp between the information corresponding to cl2_y and cl1_y, the first difference amount u_ypcn between the information corresponding to cl1_y and cur_y, the first difference amount u_ypn12 between the information corresponding to cur_y and nxt_y, and the first difference amount u_ypn23 between the information corresponding to nxt2_y and nxt3_y. Then, the first difference amount u_ypcp is adjusted by means of the first weight to obtain a second difference amount u_mpcp; u_ypcn is adjusted to obtain u_mpcn; u_ypn12 is adjusted to obtain u_mpn12; u_ypn23 is adjusted to obtain u_mpn23; and the second difference amounts are compared with the first difference amounts.
In this way, the difference at a block edge is an integer multiple higher than the first difference amount between two normally adjacent parallel lines, and at that point the block edge position of block 1 can be determined (edge detect valid & qualify).
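The comparison of first and second difference amounts can be illustrated roughly as below, assuming the "first weight" acts as a multiplier so that a boundary qualifies only when its difference exceeds the weighted differences of the other adjacent line pairs. The weight value and the function names are assumptions for illustration:

```python
def line_diff(a, b):
    """First difference amount between two parallel lines: sum of |a - b|."""
    return sum(abs(x - y) for x, y in zip(a, b))

def is_block_edge(lines, k, weight=2):
    """Declare a block edge between lines k and k+1 when its first difference
    exceeds the weighted (second) difference of every other adjacent pair.
    'weight' plays the role of the patent's first weight (an assumption)."""
    firsts = [line_diff(a, b) for a, b in zip(lines, lines[1:])]
    seconds = [weight * d for d in firsts]  # second difference amounts
    return all(firsts[k] > seconds[i] for i in range(len(seconds)) if i != k)
```

A flat run of lines followed by a sharp step qualifies as an edge at the step, while the flat pairs do not.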
The following explains how the electronic device determines the block edge position of each video frame based on the block edge position of each block, with reference to the statistics of block edge positions and the determination of whether an edge is an artificial block edge 302 in FIG. 3. In this part of FIG. 3:
• block_edge_det refers to the block edge position distinguished by the difference between lines, marking which line is detected as the block edge of the block;
• edge_idx_counter refers to marking each line with a block number using a freely counted line number;
• s2d_vdet refers to a buffer that is divided into 5 lines or 7 lines in the case of vertical deblocking filtering; in the 5-line processing mode, because there are two sets of reference line output values, the buffer achieves an effect equivalent to the 7-line processing mode;
• u_sort refers to counting the detected block edges by line number;
• u_max2th refers to comparing the block edge position with the highest occurrence count against the block edge position with the next highest occurrence count, and outputting a correct block position number (block_valid);
• edgev refers to the block edge of the current block;
• multi-frame check means that, under the statistics of block edge positions over the pictures of multiple video frames, several fixed block edge positions are determined, which improves stability;
• blockv_offset refers to the number of the block edge, of which there is only one per video frame;
• blockv_offset_valid refers to a marked block edge position, of which there may be several in a video frame.
In this part, the electronic device may use the block edge position (edge detect valid & qualify) of each block determined by the block edge quantization 301 in fig. 3 to perform block edge detection and judgment, and collect statistics of the video frame information of each block so as to determine the block edge position of each video frame.
In summary, the electronic device may divide each video frame into 8×8 blocks, and number the columns of each block input into the deblocking filter from 0 to 7 (u_start) so as to locate where a block edge occurs within each 8×8 block. Then, the electronic device may count, according to the block edge position of each block obtained in the block edge quantization 301, how many times each block edge position recurs in each video frame, so as to determine the block edge position of each video frame.
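The per-frame counting step can be sketched as a histogram over the 0..7 intra-block column numbering. The function name and the modulo-8 folding of absolute column indices are illustrative assumptions; they reproduce the effect of the u_sort statistic described above, where columns 0, 8, 16, 24, ... all fold onto section 0.

```python
from collections import Counter

def frame_edge_histogram(edge_columns, block_width=8):
    """Count how often each intra-block column position (0..7) is detected
    as a block edge across a frame (a sketch of the u_sort statistic).

    `edge_columns` holds the absolute column indices where the per-block
    edge detection fired; taking each index modulo the 8-pixel block width
    folds all 8x8 blocks onto the same 0..7 numbering.
    """
    return Counter(col % block_width for col in edge_columns)
```

For example, edges detected at absolute columns 0, 8, 16 and 24 all count toward section 0, so an 8-pixel-periodic artificial grid shows up as a single dominant bin.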
S202, determining the artificial block edge positions in one or more video frames based on the occurrence times of each block edge position in the one or more video frames.
In an alternative embodiment, the electronic device determining the artificial block edge positions in one or more video frames based on the number of occurrences of each block edge position in the one or more video frames may include: determining a first block edge position with the largest number of occurrences and a second block edge position with the second-largest number of occurrences in the one or more video frames; and, if the quantization result of the first block edge position is N or more times the quantization result of the second block edge position, and the number of edges generated at the first block edge position is greater than or equal to a preset value, determining the first block edge position as an artificial block edge position, where N is greater than or equal to 2.
That is, when the electronic device determines the artificial block edge positions in one or more video frames based on the number of occurrences of each block edge position, it can compare the two block edge positions with the highest occurrence counts in the one or more video frames; when the quantization result of the most frequent block edge position is several times higher than that of the second most frequent one, and enough edges have been generated there, the most frequent block edge position can be regarded as an artificial block edge. Here, "enough edges" means that the number of occurrences of the most frequent edge position is greater than a threshold. For example, the electronic device may first determine the most frequent block edge position 1 and the second most frequent block edge position 2 among the 8×8 blocks. When the quantization result of block edge position 1 is N times (e.g., 8 times) or more that of block edge position 2, and the number of occurrences of block edge position 1 is greater than or equal to the preset value, the electronic device may determine that block edge position 1 is an artificial block edge position.
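A hedged sketch of this u_max2th style decision follows. The function name, the default ratio N of 2, and the minimum-count threshold are placeholders for illustration, not the patent's register values.

```python
def is_artificial_edge(histogram, n=2, min_count=16):
    """Sketch of the max-vs-runner-up decision described above.

    `histogram` maps each block edge position to its occurrence count.
    The most frequent position is reported as an artificial block edge
    only when its count is at least `n` times the runner-up's count AND
    exceeds the minimum occurrence threshold `min_count`; otherwise None.
    """
    ranked = sorted(histogram.items(), key=lambda kv: kv[1], reverse=True)
    if not ranked:
        return None
    pos1, cnt1 = ranked[0]                      # highest occurrence count
    cnt2 = ranked[1][1] if len(ranked) > 1 else 0   # second highest
    if cnt1 >= n * cnt2 and cnt1 >= min_count:
        return pos1                              # block_valid position
    return None
```

The two conditions play different roles: the ratio test rejects frames where several positions fire comparably (natural texture), while the absolute threshold rejects frames where even the winner fired too rarely to be a real compression grid.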
In the application, the block edge positions are divided into 8 sections for the u_sort statistics. Fig. 4 is a schematic diagram of the columns at marked block edge positions according to an embodiment of the present application. As shown in fig. 4, the parallel lines corresponding to 0, 8, 16 and 24 in the vertical direction are the lines at which artificial block edges are detected; therefore, among the 8 sections encoded 0 to 7, they are all classified into section 0. In fig. 4, the bolded black borders represent an artificial block edge that repeats a very large number of times.
Optionally, as the resolution of video frames continues to increase, the number of lines stored after u_sort can be reduced by using cross-block statistics. For example, the original per-8×8-block statistics are changed so that 2 consecutive 8×8 blocks are counted together before the information is stored. In this way, half of the static random-access memory (SRAM) storage can be saved while achieving an effect similar to that of the original 8×8-block algorithm.
S203, filtering the next frame of one or more video frames by utilizing the edge position of the artificial block.
In an alternative embodiment, the electronic device performing filtering processing on a next frame of the one or more video frames using the artificial block edge position may include: determining a dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames; and filtering the next frame of the one or more video frames based on the dynamic filter strength and the artificial block edge position. In this way, by determining the degree of movement of the picture in the last frame from the inter-frame difference, the dynamic filter strength can better compensate for moving pictures in the one or more video frames, so the filtering effect can be set accurately and the video quality improved.
In this embodiment, the electronic device determining the dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames may include: determining a third difference amount between every two adjacent parallel lines based on information corresponding to the plurality of parallel lines in a last frame of the one or more video frames; determining the movement degree of the picture in the last frame based on the third difference amount and the inter-frame difference amount; the dynamic filter strength is determined based on the degree of movement of the picture in the last frame.
In this embodiment, filtering the next frame based on the dynamic filter strength and the artificial block edge position in the current frame includes: filtering the next frame of the one or more video frames based on the third amount of difference, the dynamic filter strength, and the artificial block edge position.
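The motion-dependent strength selection above can be sketched as follows. Everything numeric here is an assumption for illustration: the three motion thresholds, the per-mode strength peaks, and the rule that the applied strength shrinks with the local line difference are placeholders, not the patent's register values.

```python
def dynamic_filter_strength(prv, cur, nxt, field_diff_cnt,
                            th_normal=64, th_high=32, th_low=96,
                            high_motion=5000, low_motion=500):
    """Sketch of choosing the dynamic filter strength from picture motion.

    `prv`, `cur`, `nxt` are the information of three adjacent parallel
    lines in the last frame; `field_diff_cnt` is the inter-frame
    difference amount. All thresholds and peaks are illustrative.
    """
    # Third differences between every two adjacent parallel lines.
    u_diff_cp = abs(cur - prv)
    u_diff_cn = abs(nxt - cur)
    # Classify the picture's degree of movement from the inter-frame
    # difference: high-speed, nearly still, or normal-speed motion.
    if field_diff_cnt >= high_motion:
        peak = th_high       # high-speed motion: suppress filtering more
    elif field_diff_cnt <= low_motion:
        peak = th_low        # nearly still: allow stronger filtering
    else:
        peak = th_normal
    # The strength actually applied never exceeds the mode's peak and
    # shrinks as the local line difference grows.
    local = max(u_diff_cp, u_diff_cn)
    return max(0, peak - local)
```

A still picture with flat lines gets the full low-motion peak, while a fast-moving picture with large local differences gets its strength driven to 0, i.e. the filter is effectively disabled there.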
Referring to fig. 5, fig. 5 is a schematic diagram illustrating a filtering process using a vertical deblocking filter according to an embodiment of the present application. As shown in fig. 5, it is assumed that the plurality of parallel lines in the last frame of the one or more video frames includes the center line cur_yi, an upper adjacent parallel line prv_yi of cur_yi, and a lower adjacent parallel line nxt_yi of cur_yi. The electronic device may first determine the difference u_diff_cp between the information corresponding to cur_yi and prv_yi, and the difference u_diff_cn between the information corresponding to cur_yi and nxt_yi. Next, the degree of motion of the picture in the last frame is determined based on u_diff_cp, u_diff_cn, the inter-frame difference field_diff_cnt, and the artificial block edge position edge, where field_diff_cnt reflects the quantized motion level of each video frame and can be used to select the normal filter strength (*_fil_th1) or the high-difference filter suppression start value (*_fil_th2). Then, the electronic device can adjust the initial dynamic filter strength based on the degree of movement of the picture in the last frame to obtain the current dynamic filter strength gain control. Finally, the next frame of the one or more video frames is filtered based on the current dynamic filter strength gain control, the artificial block edge position edge, and a 3-tap low-pass filter (3-tap Low Pass Filter, 3-tap LPF) or a 5-tap low-pass filter (5-tap LPF), so as to output the information filep_final of the filtered next frame in the vertical direction (i.e., the vertical-direction information filtered through the 3-tap LPF or 5-tap LPF).
The filter coefficients of the 3-tap LPF correspond to the information of the 3 parallel lines prv_yi, cur_yi and nxt_yi, respectively; the filter coefficients of the 5-tap LPF correspond to the information of the 5 parallel lines prv2_yi, prv_yi, cur_yi, nxt_yi and nxt2_yi, respectively.
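The patent names the tap positions but not the coefficient values, so the two low-pass filters might be sketched with simple symmetric integer kernels; the [1, 2, 1]/4 and [1, 2, 2, 2, 1]/8 weights below are assumptions chosen only to make the example concrete.

```python
def lpf_3tap(prv, cur, nxt):
    """3-tap low-pass filter across three adjacent parallel lines.
    The [1, 2, 1]/4 kernel is an illustrative choice, not from the patent."""
    return (prv + 2 * cur + nxt) // 4

def lpf_5tap(prv2, prv, cur, nxt, nxt2):
    """5-tap variant over five parallel lines, with an assumed
    [1, 2, 2, 2, 1]/8 kernel."""
    return (prv2 + 2 * prv + 2 * cur + 2 * nxt + nxt2) // 8
```

Both kernels preserve flat regions exactly (a constant input returns unchanged) while smoothing a step across the center line, which is the behavior a deblocking low-pass stage needs.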
Alternatively, the operation modes of the vertical deblocking filter may be classified into the following 3 categories with reference to the degree of picture movement in the one or more video frames: a stationary normal mode, a high-speed movement mode, and a low-speed movement mode. Each operation mode has its own normal filter strength (also referred to as the dynamic filter strength peak, *_fil_th1) and high-difference filter suppression start value (*_fil_th2).
Referring to fig. 6, fig. 6 is a schematic diagram of the dynamic filter strength when filtering with the vertical deblocking filter according to an embodiment of the present application. As shown in fig. 6, the abscissa vdiff represents the difference amount between adjacent lines, and the ordinate gain represents the dynamic filter strength. Here, *_fil_th1 refers to the dynamic filter strength peak; *_fil_th2 refers to the filter suppression start value for high differences; reg_vdi_dbk_fil_slope means that in the high-difference case the filtering effect on adjacent lines becomes less apparent as the difference grows, and when the difference reaches a certain degree the gain value is 0, where slope represents adjustment slope 1, and the larger slope 1 is, the faster the value falls from *_fil_th1; reg_vdi_dbk_fil_slope2 means that in the low-difference case the filtering effect on adjacent lines becomes less apparent as the difference shrinks, and as the difference grows the gain value gradually increases, where slope2 represents adjustment slope 2, and the larger slope 2 is, the faster the gain value reaches *_fil_th1; reg_vdi_dbk_fil_vdiff_th3 refers to the lowest criterion given to the gain value when the difference is very small (e.g., 0), so that the gain exhibits a stable linear change toward *_fil_th1 with slope 2; reg_vdi_dbk_fil_vdiff_th4 refers to the difference margin value: when the difference is greater than this value, the gain value starts to decrease; reg_vdi_dbk_vdiff refers to the value of *_fil_th1 when the picture moves at normal speed; reg_vdi_dbk_vdiff_hmot refers to the value of *_fil_th1 when the picture moves at high speed; reg_vdi_dbk_vdiff_lmot refers to the value of *_fil_th1 when the picture moves at low speed.
*_fil_th1 and *_fil_th2 mean that difference amounts exceeding a certain strength can be limited to different degrees in different operation modes. As shown in fig. 6, when the difference amount is very large, the dynamic filter strength gain value between two adjacent parallel lines is 0, in which case the electronic device finally outputs only the result of the center line cur_yi.
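The piecewise-linear curve of fig. 6 can be sketched as follows. All thresholds and slopes here are illustrative stand-ins for the reg_vdi_dbk_* registers described above, and the exact shape between the marked points is an assumption: a floor for tiny differences, a rising ramp with slope 2 capped at the peak, and a falling ramp with slope 1 past the suppression start.

```python
def dbk_fil_gain(vdiff, th1=64, th2=128, th3=8, slope1=2, slope2=4):
    """Sketch of the dynamic-strength curve of fig. 6.

    vdiff : difference amount between adjacent lines (abscissa)
    th1   : dynamic filter strength peak (*_fil_th1)
    th2   : high-difference suppression start value (*_fil_th2)
    th3   : lowest criterion (floor) for very small differences
    """
    if vdiff <= th3:
        return th3                       # floor for tiny differences
    if vdiff < th2:
        rising = th3 + slope2 * (vdiff - th3)
        return min(th1, rising)          # ramp up, capped at the peak
    falling = th1 - slope1 * (vdiff - th2)
    return max(0, falling)               # suppress high-difference filtering
```

At vdiff = 0 the gain sits at the floor, in the mid range it holds the mode's peak, and for very large differences it reaches 0, matching the statement above that only the center line is then output.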
Optionally, the electronic device may control the dynamic filter strength of the vertical deblocking filter by means of the edge difference amount, so that some artifacts caused by high edge differences due to picture movement can be effectively suppressed.
Optionally, the electronic device may also converge the dynamic filter strength of the vertical deblocking filter by means of the low difference of an edge: when the difference between the center line cur_yi and the adjacent parallel lines is small to a certain extent, the weight of the adjacent lines is reduced, so that the output is mainly that of the center line cur_yi.
Therefore, with the embodiment of the present application, once the artificial block edge positions in one or more video frames are determined, the electronic device can perform a secondary repair on the areas corresponding to those positions, that is, filter the next frame of the one or more video frames using the artificial block edge positions, so that the blocking effect in the plurality of video frames of the video can be reduced and the video quality improved.
In addition, in the embodiment of the present application, the 8×8 block interval is used to divide one or more video frames into blocks before determining the block edge positions, which can be effectively applied to video codecs of different resolutions and different formats.
The above description describes the deblocking filtering method provided by the embodiment of the present application by taking a vertical deblocking filter as an example, and the following description describes the deblocking filtering method provided by the embodiment of the present application by taking a horizontal deblocking filter as an example.
Referring to fig. 7, fig. 7 is a schematic diagram of a filtering process performed by a horizontal deblocking filter according to an embodiment of the present application. As shown in fig. 7, the horizontal deblocking filter is mainly divided into block detection and decision (dbk_hdet), dynamic weight control and filter application (dbk_hfil_luma) when performing filtering.
The block detection and judgment part may be divided into two sub-parts. The first sub-part is gfx_dbk_hdet, which is used to judge whether a blocking effect exists in an input video frame (dbk_din) (block_pre_offset_valid) and to send the judged block edge position (block_pre_offset) to the next frame for re-judgment. Specifically, the electronic device may split the input video frame to be detected into a plurality of lines and input them to gfx_dbk_hdet in parallel to determine whether the input video frame has a blocking effect. In fig. 7, gfx_dbk_pre_hdet_b0, …, gfx_dbk_pre_hdet_b7 refer to the statistical counts of the block edges numbered b0, …, b7, respectively. Since the horizontal deblocking filter system supports three scale sizes, gfx_dbk_hdet also determines which mode the system is currently in (mode_det_res) and whether the current operation mode is correct (mode_valid).
The second sub-portion is gfx_dbk_hdet_sel, which reconfirms the block edge position (block_offset) without errors in the case that the first sub-portion determines that the current system is correct, and determines whether there is a blocking effect (block_offset_valid) by determining the stability of continuous frame information in a plurality of video frames.
gfx_dbk_hfil_iso mainly performs data and timing control; dbk_dout refers to the output result obtained after the blocks detected in the horizontal direction are filtered.
Referring to fig. 8, fig. 8 is a schematic diagram of a block detection and determination process using a horizontal deblocking filter according to an embodiment of the present application. As shown in fig. 8, the electronic device may perform block edge quantization to determine the block edge position of the current frame among the plurality of video frames. In doing so, the electronic device may first look for adjacent edges using four consecutive points in the horizontal direction (y2, y1, y0, yp1, where y0 is the center point). Next, the absolute value (ABS) of the difference between the information corresponding to each two adjacent points is calculated; for example, the absolute difference between the information corresponding to y2 and y1 is diff_y12, that between y1 and y0 is diff_y01, and that between y0 and yp1 is diff_yp01. Then, the electronic device can amplify the differences of the neighboring point pairs, e.g., diff_yp01 to yp01_mul_fac and diff_y12 to y12_mul_fac. The block edges are also limited by a boundary range from reg_gfx_blockedgeh_minth to reg_gfx_blockedgeh_maxth, and the electronic device may determine whether the absolute difference diff_y01 around the center point falls within this boundary range (y01_edge indicates that diff_y01 falls within the upper and lower bounds). Then, if diff_y01 is also greater than or equal to the amplified neighboring differences yp01_mul_fac and y12_mul_fac, the center point y0 is indicated as the block edge (real_edge indicates the determined edge position of the current block).
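The horizontal test of fig. 8 can be sketched as below. The magnification factor and the two boundary values stand in for the *_mul_fac and reg_gfx_blockedgeh_minth/maxth registers and are illustrative; requiring the center difference to dominate both amplified neighbor differences is a conservative reading of the comparison described above.

```python
def detect_horizontal_edge(y2, y1, y0, yp1, mul=4, min_th=10, max_th=200):
    """Sketch of the horizontal block-edge test of fig. 8.

    y2, y1, y0, yp1 : four consecutive points, y0 being the center point.
    mul             : illustrative amplification of the neighbor diffs.
    min_th, max_th  : illustrative boundary range for a valid edge.
    """
    diff_y12 = abs(y2 - y1)
    diff_y01 = abs(y1 - y0)       # difference around the centre point y0
    diff_yp01 = abs(y0 - yp1)
    # y01_edge: centre difference lies within the allowed boundary range.
    y01_edge = min_th <= diff_y01 <= max_th
    # real_edge: centre difference dominates the amplified neighbours.
    return y01_edge and diff_y01 >= mul * diff_y12 and diff_y01 >= mul * diff_yp01
```

The boundary range is what keeps the detector from firing on genuine picture content: a step too small is noise, and a step too large is more likely a real object boundary than a compression artifact.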
After determining the edge position of the current block, the electronic device may collect statistics of the block edge positions and determine whether the block edge is an artificial block edge. Optionally, the electronic device may refer to the foregoing description of the block edge position statistics and the determination of whether the block edge is an artificial block edge 302 in fig. 3, which will not be repeated here.
Referring to fig. 9, fig. 9 is a schematic diagram illustrating a filtering process using a horizontal deblocking filter according to an embodiment of the present application. As shown in fig. 9, assume that four consecutive points in the horizontal direction are y2, y1, y0 and yp1, where y0 is the center point. The electronic device may first calculate, using the detected block edge position (i.e., the artificial block edge position determined in fig. 8) and y2, y1, y0, yp1, the difference between the information corresponding to each two adjacent points; for example, the difference between the information corresponding to yp1 and y0 is diff01_abs, that between y0 and y1 is diff12_abs, and that between y1 and y2 is diff23_abs. Then diff01_abs, diff12_abs and diff23_abs are input to a low-pass filter (LPF) to obtain the integrated absolute difference abshdiff between adjacent points. Next, the degree of motion of the picture in the current frame is judged in combination with the inter-frame difference field_diff_cnt, and the initial dynamic filter strength is adjusted based on that degree of motion to obtain the current dynamic filter strength (Filter Gain Control). Finally, the next frame of the one or more video frames is filtered based on the current dynamic filter strength and the artificial block edge position.
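The last step, applying the strength, can be sketched as a blend between the center point and a low-passed value. The blend form, the 3-tap kernel, and the gain range are assumptions for illustration: gain 0 leaves the center point untouched, and the maximum gain outputs the fully filtered value.

```python
def apply_dbk_filter(y1, y0, yp1, gain, gain_max=64):
    """Sketch of applying the dynamic filter strength at a detected edge.

    y1, y0, yp1 : the centre point y0 and its two horizontal neighbours.
    gain        : current dynamic filter strength, 0..gain_max.
    The [1, 2, 1]/4 low-pass kernel and gain_max are illustrative.
    """
    filtered = (y1 + 2 * y0 + yp1) // 4          # assumed 3-tap low-pass value
    # Linear blend: gain=0 keeps y0, gain=gain_max keeps the filtered value.
    return (gain * filtered + (gain_max - gain) * y0) // gain_max
```

This matches the behavior stated for the extremes: with gain 0 (very high difference) only the center point is output, and with full gain the smoothed value replaces it.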
Referring to fig. 10, fig. 10 is a schematic diagram of the dynamic filter strength when filtering with the horizontal deblocking filter according to an embodiment of the present application. As shown in fig. 10, the abscissa abshdiff represents the absolute value of the difference between adjacent points, and the ordinate gain1 represents the dynamic filter strength. Here, reg_gfx_dbk_hdiff_th1 (*_th1) refers to the peak value of the dynamic filter strength gain1; reg_gfx_dbk_hdiff_th2 (*_th2) refers to the filter suppression start value for high differences; after the difference exceeds this value, gain1 starts to decrease gradually; reg_gfx_dbk_hdiff_th3 refers to the lowest criterion given to gain1 when the difference is very small (e.g., 0), so that gain1 exhibits a stable linear change toward *_th1 with slope 3; reg_gfx_dbk_hdiff_th4 refers to a difference threshold value; when the difference is greater than it, gain1 reaches a stable value; reg_gfx_dbk_hdiff_slope1 means that in the high-difference case the filtering effect on adjacent points becomes less apparent as the difference grows, and when the difference reaches a certain degree gain1 is 0, where slope1 represents adjustment slope 1, and the larger slope 1 is, the faster the value falls from *_th1; reg_gfx_dbk_hdiff_slope3 means that in the low-difference case the filtering effect on adjacent points becomes less apparent as the difference shrinks, and as the difference grows gain1 gradually increases, where slope3 represents adjustment slope 3, and the larger slope 3 is, the faster gain1 reaches *_th1; reg_gfx_dbk_hdiff_lmot_th1 refers to the value of *_th1 when the picture moves at low speed; reg_gfx_dbk_hdiff_th1 refers to the value of *_th1 when the
picture moves at normal speed; reg_gfx_dbk_hdiff_hmot_th1 refers to the value of *_th1 when the picture moves at high speed; reg_gfx_dbk_hdiff_lmot_th2 refers to the value of *_th2 when the picture moves at low speed; reg_gfx_dbk_hdiff_th2 refers to the value of *_th2 when the picture moves at normal speed; reg_gfx_dbk_hdiff_hmot_th2 refers to the value of *_th2 when the picture moves at high speed. The smaller the difference, the higher the similarity between adjacent points, so the smaller the dynamic filter strength gain1 needs to be, and the output of the neighboring points can be emphasized. When the difference amount is very large, the filter strength gain1 between two adjacent points is 0, in which case the electronic device finally outputs only the result of the center point y0. In this way, some influence caused by high edge differences due to picture movement can be effectively suppressed.
Optionally, the operation modes of the horizontal deblocking filter can likewise be classified into the following 3 categories with reference to the degree of picture movement in the plurality of video frames: a stationary normal mode, a high-speed movement mode, and a low-speed movement mode. Each mode has its own normal filter strength (also referred to as the dynamic filter strength peak, *_th1) and high-difference filter suppression start value (*_th2). Here, *_th1 and *_th2 mean that difference amounts exceeding a certain strength can be limited to different degrees in different operation modes.
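The per-mode threshold selection can be sketched as a small lookup. The three mode names and all numeric values are illustrative placeholders for the reg_gfx_dbk_hdiff_{lmot_,hmot_,}th1/th2 registers; only the ordering (strongest filtering when still, earliest suppression at high speed) follows the description above.

```python
def select_thresholds(motion):
    """Sketch of picking the strength peak (th1) and suppression start
    (th2) for the three operating modes. Values are illustrative only."""
    table = {
        "still": (96, 192),   # stationary normal mode: strongest filtering
        "low":   (80, 160),   # low-speed movement
        "high":  (48, 96),    # high-speed movement: suppressed earliest
    }
    return table[motion]
```

A faster-moving picture thus gets both a lower peak and an earlier suppression start, so real motion detail is filtered less aggressively than a static compression grid.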
Alternatively, the electronic device may control the dynamic filtering strength when filtering the horizontal deblocking filter by the amount of difference of the edges.
Referring to fig. 11, fig. 11 is a schematic structural diagram of a deblocking filter device according to an embodiment of the present application. As shown in fig. 11, the deblocking filtering apparatus may include, but is not limited to, a determining unit 1101, a processing unit 1102.
In an alternative embodiment, the deblocking filtering apparatus is configured to perform operations of the electronic device in the deblocking filtering method described above with reference to fig. 2, such as: a determining unit 1101, configured to divide each video frame of the one or more video frames into a plurality of 8×8 blocks, and determine a block edge position in each video frame based on each block; a determining unit 1101, configured to determine an artificial block edge position in one or more video frames based on the number of occurrences of each block edge position in the one or more video frames; and a processing unit 1102, configured to perform filtering processing on a next frame of the one or more video frames by using the edge position of the artificial block. Optionally, the determining unit 1101 and the processing unit 1102 may also perform the relevant operations of the terminal device in various alternative embodiments of the deblocking filtering method described in fig. 2, which are not described in detail herein. It can be appreciated that the beneficial effects that can be achieved by the deblocking filter apparatus provided by the embodiments of the present application can refer to the description of the related deblocking filter method embodiments, and are not described herein again.
Referring to fig. 12, fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device includes a processor 1201, a memory 1202, and a communication bus connecting the processor 1201 and the memory 1202.
The electronic device may also include a communication interface that may be used to receive and transmit data.
Memory 1202 includes, but is not limited to, random access memory (random access memory, RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), or portable read-only memory (compact disc read-only memory, CD-ROM); the memory 1202 is used for storing the program code to be executed and the data to be transmitted.
The processor 1201 may be one or more central processing units (Central Processing Unit, CPU), and in the case where the processor 1201 is one CPU, the CPU may be a single-core CPU or a multi-core CPU. The processor may also be other general purpose processors, digital signal processors (digital signal processing, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field programmable gate array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. Wherein the general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
In an alternative embodiment, processor 1201 may be configured to execute computer program or instructions 1203 stored in memory 1202 to cause the electronic device to perform operations in the deblocking filtering method described above with respect to fig. 2, such as: dividing each video frame of the one or more video frames into a plurality of 8 x 8 blocks, and determining a block edge position in each video frame based on each block; determining artificial block edge locations in one or more video frames based on the number of occurrences of each block edge location in the one or more video frames; and filtering the next frame of the one or more video frames by utilizing the artificial block edge position. Optionally, the processor 1201 may be configured to execute a computer program or instructions 1203 stored in the memory 1202, so that the electronic device may also perform relevant operations in various alternative embodiments of the deblocking filtering method described in fig. 2, which are not described in detail herein. It can be appreciated that the specific implementation of the processor 1201 in the electronic device and the beneficial effects that can be achieved in the embodiment of the present application can refer to the description of the related deblocking filtering method embodiment, and are not repeated here.
The embodiment of the application also provides a chip, which comprises: a processor, a memory and a computer program or instructions stored on the memory, wherein the processor executes the computer program or instructions to carry out the steps described in the above method embodiments.
The embodiment of the application also provides a chip module, which comprises a receiving and transmitting assembly and a chip, wherein the chip comprises a processor, a memory and a computer program or instructions stored on the memory, and the processor executes the computer program or instructions to realize the steps described in the embodiment of the method.
The present application also provides a computer-readable storage medium storing a computer program or instructions for signal processing, which when executed cause a computer to implement some or all of the steps described in any of the method embodiments above.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program or instructions which, when executed, implement some or all of the steps described in any of the method embodiments above. The computer program product or instructions may be a software installation package.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
In the foregoing embodiments, the embodiments of the present application have emphasis on each of the descriptions of the embodiments, and any multiple embodiments may be combined, and for portions of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and the division of elements, such as those described above, is merely a logical function division, and may be implemented in other manners, such as multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, or may be in electrical or other forms.
The units described above as separate components may or may not be physically separate, and components shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present application.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated unit may be implemented in hardware or as a software functional unit. That is, for each apparatus and product described in the above embodiments, a unit/module may be a software unit/module, a hardware unit/module, or partly a software unit/module and partly a hardware unit/module. For example, for each device or integrated chip of the application, each unit/module it contains may be implemented in hardware such as a circuit, or at least some units/modules may be implemented as a software program running on a processor integrated inside the chip, with the remaining units/modules implemented in hardware such as a circuit. For each device and product corresponding to or integrated with a chip module, each unit/module it contains may be implemented in hardware such as a circuit, different units/modules may be located in the same component (e.g., a chip or a circuit unit) or in different components of the chip module, or at least some units/modules may be implemented as a software program running on a processor integrated inside the chip module, with the remaining units/modules implemented in hardware such as a circuit. For each device or product applied to a terminal, each unit/module it contains may be implemented in hardware such as a circuit, different units/modules may be located in the same component (e.g., a chip or a circuit unit) or in different components of the terminal, or at least some units/modules may be implemented as a software program running on a processor integrated inside the terminal, with the remaining units/modules implemented in hardware such as a circuit.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, or in software instructions executed by a processor. The software instructions may consist of corresponding software elements that may be stored in a USB flash drive, random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a magnetic disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in a terminal device or a network device. Alternatively, the processor and the storage medium may reside as discrete components in a terminal device or network device.
The integrated units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, a TRP, etc.) to perform all or part of the steps of the methods of the embodiments of the present application.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented, in whole or in part, in software, hardware, firmware, or any combination thereof. When the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable memory. Based on such an understanding, the technical solution of the application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be implemented in the form of a computer software product. The computer software product is stored in a memory and includes one or more computer instructions for causing a computer device (which may be a personal computer, a server, a TRP, etc.) to perform all or part of the steps of the methods of the embodiments of the application. The computer device may also be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media.
The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
The foregoing embodiments illustrate the principles and implementations of the present application in some detail and are not intended to limit the scope of the embodiments of the application. Those skilled in the art may make changes to the specific embodiments and the application scope in accordance with the ideas of the present application, and the content of this description should therefore not be construed as limiting the present application. That is, the foregoing description covers only specific implementations of the embodiments of the present application and does not limit their scope; any modifications, equivalent substitutions, improvements, and the like made on the basis of the technical solutions of the embodiments of the present application shall fall within the scope of the embodiments of the present application.

Claims (10)

1. A deblocking filtering method, the method comprising:
dividing each video frame in one or more video frames into a plurality of 8 x 8 blocks, and determining the block edge position in each video frame based on each block;
determining artificial block edge locations in the one or more video frames based on the number of occurrences of each block edge location in the one or more video frames;
and filtering a next frame of the one or more video frames using the artificial block edge position.
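The three steps of claim 1 can be sketched as follows. This is a minimal illustration under assumed data layouts (grayscale frames as 2-D arrays, vertical block edges only, occurrence counts as the selection criterion); the function names, the visibility threshold, and the `min_count` parameter are hypothetical and not taken from the patent.

```python
from collections import Counter

BLOCK = 8  # claim 1 divides each frame into 8 x 8 blocks


def candidate_edge_columns(frame):
    """Return column indices that lie on 8-pixel block boundaries and
    show a visible luminance jump (a crude per-frame edge test)."""
    h = len(frame)
    w = len(frame[0])
    edges = []
    for x in range(BLOCK, w, BLOCK):
        # mean absolute difference across the boundary column pair
        jump = sum(abs(row[x] - row[x - 1]) for row in frame) / h
        if jump > 4:  # hypothetical visibility threshold
            edges.append(x)
    return edges


def artificial_edges(frames, min_count=2):
    """Count how often each boundary column is flagged across the frames
    and keep positions seen at least `min_count` times (claim 1's
    occurrence-based selection, simplified)."""
    counts = Counter()
    for frame in frames:
        counts.update(candidate_edge_columns(frame))
    return sorted(x for x, n in counts.items() if n >= min_count)
```

For example, two frames with a sharp step at column 8 yield `[8]`, since the boundary is flagged in both frames; a position flagged in only one frame is discarded as noise rather than a recurring coding artifact.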
2. The method of claim 1, wherein determining the block edge location in each video frame on a per block basis comprises:
determining a first difference amount between information corresponding to each two adjacent parallel lines in information corresponding to a plurality of parallel lines in each block in each video frame;
obtaining a second difference between the information corresponding to each two adjacent parallel lines based on the first difference and the first weight between the information corresponding to each two adjacent parallel lines;
determining a block edge position of each block in each video frame based on the second difference amount and the first difference amount;
and determining the block edge position in each video frame based on the block edge position of each block in each video frame.
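One way to read claim 2, sketched under assumptions: the "parallel lines" are the rows of an 8 x 8 block, the first difference amount is the mean absolute difference between adjacent rows, and the second difference amount is the first difference scaled by a weight; a block edge is declared where the difference clearly dominates the other row boundaries. The weight value and the dominance test below are illustrative, not taken from the patent.

```python
def row_differences(block):
    """First difference amounts: mean absolute difference between each
    pair of adjacent rows (claim 2's adjacent parallel lines)."""
    diffs = []
    for a, b in zip(block, block[1:]):
        diffs.append(sum(abs(p - q) for p, q in zip(a, b)) / len(a))
    return diffs


def block_edge_row(block, weight=1.5):
    """Second difference amount = weight * first difference; flag the
    row boundary whose difference dominates all the others."""
    d1 = row_differences(block)
    d2 = [weight * d for d in d1]
    peak = max(range(len(d2)), key=d2.__getitem__)
    rest = [d for i, d in enumerate(d1) if i != peak]
    # an edge exists only if the peak clearly exceeds the other rows
    if not rest or d1[peak] > 2 * max(rest) + 1:
        return peak + 1  # boundary between rows `peak` and `peak + 1`
    return None
```

A block whose top four rows are dark and bottom four bright reports an edge at row 4; a flat block reports none, since no boundary dominates.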
3. The method of claim 1 or 2, wherein the determining artificial block edge locations in the one or more video frames based on the number of occurrences of each block edge location in the one or more video frames comprises:
determining a first block edge position that occurs most frequently and a second block edge position that occurs second most frequently in the one or more video frames;
and if the quantization result of the first block edge position is at least N times the quantization result of the second block edge position, and the number of edges generated by the first block corresponding to the first block edge position is greater than or equal to a preset value, determining the first block edge position as an artificial block edge position, where N is greater than or equal to 2.
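Claim 3's selection test can be sketched as below, assuming the "quantization result" of a block edge position is simply its occurrence count over the analysed frames; `n` and the `preset` edge-count threshold are parameters corresponding to claim 3's N and preset value, and the function name is hypothetical.

```python
from collections import Counter


def select_artificial_edge(edge_positions, n=2, preset=3):
    """edge_positions: flat list of block edge positions gathered from
    one or more video frames.  Return the most frequent position if it
    occurs at least n times as often as the runner-up AND at least
    `preset` times in total, else None (claim 3, simplified)."""
    counts = Counter(edge_positions).most_common(2)
    if not counts:
        return None
    first_pos, first_n = counts[0]
    second_n = counts[1][1] if len(counts) > 1 else 0
    if first_n >= n * max(second_n, 1) and first_n >= preset:
        return first_pos
    return None
```

The two-sided test reflects the claim's intent: a position must both dominate the runner-up (so it is a recurring grid boundary, not scene content) and occur often enough in absolute terms before it is treated as an artificial edge.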
4. The method according to claim 1 or 2, wherein said filtering a next frame of said one or more video frames using said artificial block edge position comprises:
determining a dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames;
and filtering a next frame of the one or more video frames based on the dynamic filtering strength and the artificial block edge position.
5. The method of claim 4, wherein the determining the dynamic filter strength based on an amount of inter-frame difference between a last frame of the one or more video frames and a next frame of the one or more video frames comprises:
determining a third difference amount between every two adjacent parallel lines based on information corresponding to the plurality of parallel lines in the last frame of the one or more video frames;
determining the movement degree of the picture in the last frame based on the third difference amount and the inter-frame difference amount;
and determining dynamic filtering strength based on the movement degree of the picture in the last frame.
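Claims 4 and 5 can be combined into one sketch: the inter-frame difference between the last analysed frame and the next frame gives a motion degree, which is mapped to a filter strength. This simplifies claim 5 (the third difference amount within the last frame is folded into a single motion measure), and the linear mapping with its `base`, `gain`, and `cap` parameters is an assumption, not the patent's formula.

```python
def motion_degree(prev_frame, next_frame):
    """Mean absolute inter-frame difference as a crude motion measure
    (claim 5's inter-frame difference amount, simplified)."""
    total = count = 0
    for row_a, row_b in zip(prev_frame, next_frame):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            count += 1
    return total / count


def dynamic_strength(prev_frame, next_frame, base=1.0, gain=0.05, cap=4.0):
    """Filter more aggressively when the picture moves more: strength
    grows linearly with motion and is clamped to `cap`."""
    return min(base + gain * motion_degree(prev_frame, next_frame), cap)
```

A static scene keeps the strength at its base value, avoiding needless blurring of stable detail, while heavy motion pushes the strength to the cap, where blocking artifacts are most visible and detail loss is least noticeable.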
6. A deblocking filtering apparatus, the apparatus comprising:
a determining unit, configured to divide each video frame in one or more video frames into a plurality of 8×8 blocks, and determine a block edge position in each video frame according to each block;
the determining unit is further configured to determine an artificial block edge position in the one or more video frames according to the number of occurrences of each block edge position in the one or more video frames;
and a processing unit, configured to filter a next frame of the one or more video frames using the artificial block edge position.
7. An electronic device comprising a processor and a memory, the processor and the memory being interconnected, wherein the memory is adapted to store a computer program comprising program instructions, the processor being adapted to invoke the program instructions to perform the method of any of claims 1 to 5.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions which, when executed by a computer, cause the computer to perform the method according to any of claims 1 to 5.
9. A chip, characterized in that it comprises a processor that performs the method according to any of claims 1 to 5.
10. A chip module comprising a transceiver component and a chip comprising a processor, the processor performing the method of any one of claims 1 to 5.
CN202311073016.0A 2023-08-24 2023-08-24 Deblocking filtering method and related device Pending CN117041604A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311073016.0A CN117041604A (en) 2023-08-24 2023-08-24 Deblocking filtering method and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311073016.0A CN117041604A (en) 2023-08-24 2023-08-24 Deblocking filtering method and related device

Publications (1)

Publication Number Publication Date
CN117041604A true CN117041604A (en) 2023-11-10

Family

ID=88644834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311073016.0A Pending CN117041604A (en) 2023-08-24 2023-08-24 Deblocking filtering method and related device

Country Status (1)

Country Link
CN (1) CN117041604A (en)

Similar Documents

Publication Publication Date Title
TWI674790B (en) Method of picture data encoding and decoding and apparatus
KR20180105294A (en) Image compression device
US10887614B2 (en) Adaptive thresholding for computer vision on low bitrate compressed video streams
EP2941755A1 (en) Method and apparatus of reducing random noise in digital video streams
CN108184118A (en) Cloud desktop contents encode and coding/decoding method and device, system
US7697763B2 (en) Data compression method and apparatus thereof
US20090016623A1 (en) Image processing device, image processing method and program
US20140169473A1 (en) Texture sensitive temporal filter based on motion estimation
CN101835041A (en) Image processing equipment and method
US20220116664A1 (en) Loop filtering method and device
WO2022261838A1 (en) Residual encoding method and apparatus, video encoding method and device, and system
Gandam et al. An efficient post-processing adaptive filtering technique to rectifying the flickering effects
US20150110191A1 (en) Video encoding method and apparatus, and video decoding method and apparatus performing motion compensation
CN112655212B (en) Video coding optimization method and device and computer storage medium
CN102484726B (en) Block Artifact Reducer
CN117041604A (en) Deblocking filtering method and related device
CN101453559B (en) Noise detection method and apparatus for video signal
US8831354B1 (en) System and method for edge-adaptive and recursive non-linear filtering of ringing effect
Lee et al. Impulse Noise Immune Bayer Image Compression with Direction Estimation for Imaging Sensor
US10405002B2 (en) Low complexity perceptual visual quality evaluation for JPEG2000 compressed streams
US20210321119A1 (en) Image and video data processing method and system
KR101979492B1 (en) Method for adaptive scene change detection based on resolution and apparatus for the same
CN113453007A (en) Method for improving monitoring scene H264 coding efficiency
US7711203B2 (en) Impulsive noise removal using maximum and minimum neighborhood values
CN115474055B (en) Video encoding method, encoder, medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination