CN113099232B - Video decoding method, apparatus, electronic device and computer storage medium - Google Patents

Video decoding method, apparatus, electronic device and computer storage medium

Info

Publication number
CN113099232B
CN113099232B CN202110320486.7A CN202110320486A
Authority
CN
China
Prior art keywords
layer
alpha
color
frame
stream
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110320486.7A
Other languages
Chinese (zh)
Other versions
CN113099232A (en)
Inventor
眭世晨
刘华平
许崝
林勇平
徐贤锋
陈舜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Netease Cloud Music Technology Co Ltd
Original Assignee
Hangzhou Netease Cloud Music Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Netease Cloud Music Technology Co Ltd filed Critical Hangzhou Netease Cloud Music Technology Co Ltd
Priority to CN202110320486.7A priority Critical patent/CN113099232B/en
Publication of CN113099232A publication Critical patent/CN113099232A/en
Application granted granted Critical
Publication of CN113099232B publication Critical patent/CN113099232B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/20 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H04N19/21 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding with binary alpha-plane coding for video objects, e.g. context-based arithmetic encoding [CAE]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/30 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The application discloses a video decoding method, an apparatus, an electronic device, and a computer storage medium. The video decoding method comprises the following steps: receiving a video stream to be decoded, and invoking a color layer decoder and an Alpha layer decoder respectively when it is determined that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; acquiring the color layer encoded stream in the video stream, and decoding it with the color layer decoder to obtain a color value; acquiring the Alpha layer encoded stream in the video stream, and decoding it with the Alpha layer decoder to obtain an Alpha value; and obtaining a display image according to the color value and the Alpha value. By adding an Alpha layer decoder to the existing color layer decoder architecture, the method decodes the Alpha layer frame data of an HEVC Video With Alpha format video file, thereby supporting the semitransparent characteristic of the video.

Description

Video decoding method, apparatus, electronic device and computer storage medium
Technical Field
The present invention relates to the field of video encoding and decoding technologies, and in particular, to a video decoding method, apparatus, electronic device, and computer storage medium.
Background
This section is intended to provide a background or context for embodiments of the present application that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
In the field of image codec technology, the Alpha Channel is a parameter that controls image transparency and translucency. For PNG and WebP format images, the transparency and translucency of each pixel can be determined by appending an Alpha value to that pixel's RGB value. At display time, the Alpha value determines the mixing proportion between the pixel RGB values of the displayed image and those of the background image, so that the displayed image appears semitransparent over the background image.
In the field of video codec technology, a video with semitransparent characteristics can be obtained by applying an Alpha channel to each image frame of the video. The HEVC video format extension standard HEVC Video With Alpha, which supports the Alpha channel, stores the Alpha channel values in an additional layer (namely the Alpha layer) independent of the color layer; at display time, the pixel values of the Alpha layer are read as the corresponding Alpha values for display blending.
Most related decoding schemes support only the decoding of a color layer; a solution is still needed for realizing general video decoding of the HEVC Video With Alpha format on the basis of an existing decoding architecture.
Disclosure of Invention
In view of the foregoing technical problems, an improved method is needed that implements general video decoding for the HEVC Video With Alpha format on the basis of an existing decoding architecture.
In a first aspect, an embodiment of the present application provides a video decoding method, including:
receiving a video stream to be decoded, and calling a color layer decoder and an Alpha layer decoder respectively when the video stream is determined to simultaneously comprise a color layer coded stream and an Alpha layer coded stream;
acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value;
acquiring an Alpha layer coded stream in the video stream, and decoding the Alpha layer coded stream by the Alpha layer decoder to obtain an Alpha value;
and obtaining a display image according to the color value and the Alpha value.
Thus, by adding an Alpha layer decoder on the basis of the existing color layer decoder, the present application enables decoding of HEVC Video With Alpha format video with support for the semitransparent property. If the color layer decoder and the Alpha layer decoder are software decoders, the technical solution of the present application can be applied to electronic devices of different types, different models, and different operating systems (such as Windows, Android, Linux, macOS, iOS, and the like), and is not limited to electronic devices of specific models or specific operating systems.
Optionally, determining that the video stream includes both color layer data and Alpha layer data includes:
acquiring a video parameter set from the video stream, and parsing an extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
In this way, whether an Alpha layer encoded stream exists in the video stream is determined through the extension data identifier in the video parameter set.
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
Thus, by establishing the color layer reference frame buffer and the Alpha layer reference frame buffer, the reference frames corresponding to the current frame can be buffered, enabling the decoding of each layer.
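The two reference frame buffers can be sketched as independent bounded queues, one per layer, so that inter-frame prediction in one layer never references frames from the other. This is a hypothetical illustration; the class name, the buffer size of 16, and the API are invented here (HEVC bounds the decoded picture buffer size per level, but the exact bound is not specified in this document):

```python
from collections import deque


class LayerDecoderState:
    """Per-layer decoder state with its own reference-frame buffer."""

    def __init__(self, max_refs: int = 16):
        # Bounded buffer: appending beyond max_refs evicts the oldest reference.
        self.ref_frames = deque(maxlen=max_refs)

    def push_reference(self, frame):
        self.ref_frames.append(frame)


# One independent state per layer, as described above.
color_state = LayerDecoderState()
alpha_state = LayerDecoderState()
```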
Optionally, after invoking the color layer decoder and the Alpha layer decoder, the method further comprises:
acquiring a sequence parameter set from the video stream, and determining whether it is a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining whether it is a color layer image parameter set or an Alpha layer image parameter set according to a data layer flag parameter in the image parameter set;
initializing the color layer decoder with the color layer sequence parameter set and the color layer image parameter set;
and initializing the Alpha layer decoder by using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
In this way, the color layer decoder and the Alpha layer decoder are each initialized with the parameter sets of their own layer, facilitating the subsequent decoding of color layer frame data and Alpha layer frame data according to the parameters in those parameter sets.
Optionally, obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
In this way, the color layer frame data can be distinguished from the Alpha layer frame data by the data layer flag parameter nuh_layer_id, facilitating decoding using a corresponding decoder.
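A minimal dispatch sketch, assuming the nuh_layer_id conventions listed later in this document (0 for the color layer, the VPS extension layer id for the Alpha layer); the function and parameter names are illustrative, not the patent's implementation:

```python
def route_nal_unit(nuh_layer_id: int, vps_extension_layer_id: int = 1) -> str:
    """Route a NAL unit (SPS, PPS, or frame data) by its nuh_layer_id.

    nuh_layer_id == 0 -> color layer decoder;
    nuh_layer_id == the extension layer id from the VPS -> Alpha layer decoder.
    Any other value is treated as an error by this sketch.
    """
    if nuh_layer_id == 0:
        return "color"
    if nuh_layer_id == vps_extension_layer_id:
        return "alpha"
    raise ValueError(f"unexpected nuh_layer_id {nuh_layer_id}")
```

The same routing applies to parameter sets and to frame data, which is why one helper suffices for both steps above.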
Optionally, the video decoding method further comprises:
when the data layer flag parameter is parsed as empty or invalid, acquiring the frame sequence number of the frame to be decoded;
if the frame sequence number is a new frame sequence number, determining that the frame to be decoded is color layer frame data;
and if the frame sequence number is a repeated frame sequence number, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to a frame flag parameter in the frame to be decoded.
Optionally, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter in the frame to be decoded includes:
when the frame to be decoded is determined to be a starting data packet according to the frame flag parameter, determining that the frame to be decoded is the Alpha layer frame data;
and when the frame to be decoded is determined not to be a starting data packet according to the frame flag parameter, determining that the frame to be decoded is the color layer frame data.
In this way, by distinguishing color layer frame data from Alpha layer frame data through the frame sequence number and the frame flag parameter, decoding failures caused by erroneous data layer flag parameters can be avoided.
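The fallback above can be sketched as follows; `classify_frame`, its arguments, and the use of a plain set to track already-seen frame sequence numbers are all illustrative assumptions:

```python
def classify_frame(seq_no: int, seen: set, is_first_packet: bool) -> str:
    """Classify a frame whose data layer flag parameter was empty or invalid.

    A new frame sequence number means a color layer frame. A repeated
    sequence number is the second copy of that frame: per the scheme above,
    it is Alpha layer data if marked as a starting data packet, otherwise
    color layer data.
    """
    if seq_no not in seen:
        seen.add(seq_no)
        return "color"
    return "alpha" if is_first_packet else "color"
```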
Optionally, the Alpha layer frame data is in a YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires the Y component data from the Alpha layer frame data and decodes it to obtain the Alpha value.
In this way, by discarding the U component and V component data in the Alpha layer frame data, the decoding speed of the Alpha layer decoder can be increased.
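For a planar YUV 4:2:0 buffer, keeping only the Y component amounts to taking the first width x height bytes and discarding the chroma planes that follow. A hypothetical helper, assuming a packed planar layout (the layout and function name are assumptions, not the patent's implementation):

```python
def alpha_from_yuv420(frame: bytes, width: int, height: int) -> bytes:
    """Extract the Y plane of an Alpha layer frame as per-pixel Alpha values.

    In a planar YUV 4:2:0 buffer, the first width*height bytes are the Y
    plane; the U and V planes that follow are constant (128) for Alpha
    layer frames, as described in this document, and can be discarded.
    """
    return frame[: width * height]
```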
Optionally, the obtaining a display image according to the color value and the Alpha value includes:
outputting the color value and the Alpha value to a graphics processor;
and determining, by the graphics processor, the mixing mode for each pixel according to its color value and Alpha value, and performing the mixing according to the pixel positions.
Thus, the semitransparent characteristic of the video is realized by having the graphics processor mix the color values and the Alpha values.
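The mixing step corresponds to the standard "over" blend. A CPU-side sketch for a single pixel, purely for illustration (the GPU would perform this per fragment; the function is not the patent's implementation):

```python
def blend_pixel(fg: tuple, bg: tuple, alpha: int) -> tuple:
    """Blend a foreground RGB pixel over a background RGB pixel.

    alpha is in [0, 255], where 255 means fully opaque foreground.
    Each channel is mixed as a*fg + (1-a)*bg, which is the proportion
    described above for semitransparent display.
    """
    a = alpha / 255.0
    return tuple(round(a * f + (1.0 - a) * b) for f, b in zip(fg, bg))
```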
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer encoded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer coded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines a color value and an Alpha value for each pixel based on the decoding channel flag parameter.
In this way, the graphics processor can subsequently distinguish the color values from the Alpha values.
Optionally, the video decoding method further comprises:
invoking a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value;
and displaying an image according to the color value.
In this way, decoding compatibility for video files without Alpha layers is achieved by decoding video files that do not include Alpha layer encoded streams using a color layer decoder.
In a second aspect, an embodiment of the present application provides a video decoding apparatus, including:
an initialization unit, configured to receive a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
a color layer decoding unit, configured to obtain a color layer encoded stream in the video stream, and decode the color layer encoded stream by using the color layer decoder to obtain a color value;
an Alpha layer decoding unit, configured to obtain an Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream by using the Alpha layer decoder to obtain an Alpha value;
and the display unit is used for obtaining a display image according to the color value and the Alpha value.
Optionally, determining that the video stream includes both color layer data and Alpha layer data includes:
acquiring a video parameter set from the video stream, and parsing an extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
Optionally, after invoking the color layer decoder and the Alpha layer decoder, the method further comprises:
acquiring a sequence parameter set from the video stream, and determining whether it is a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining whether it is a color layer image parameter set or an Alpha layer image parameter set according to a data layer flag parameter in the image parameter set;
initializing the color layer decoder with the color layer sequence parameter set and the color layer image parameter set;
and initializing the Alpha layer decoder by using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
Optionally, obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
Optionally, the video decoding device further comprises:
a data judging unit, configured to acquire the frame sequence number of the frame to be decoded when the data layer flag parameter is parsed as empty or invalid;
if the frame sequence number is a new frame sequence number, determine that the frame to be decoded is color layer frame data;
and if the frame sequence number is a repeated frame sequence number, determine whether the frame to be decoded is color layer frame data or Alpha layer frame data according to a frame flag parameter in the frame to be decoded.
Optionally, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame flag parameter in the frame to be decoded includes:
when the frame to be decoded is determined to be a starting data packet according to the frame flag parameter, determining that the frame to be decoded is the Alpha layer frame data;
and when the frame to be decoded is determined not to be a starting data packet according to the frame flag parameter, determining that the frame to be decoded is the color layer frame data.
Optionally, the Alpha layer frame data is in a YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires the Y component data from the Alpha layer frame data and decodes it to obtain the Alpha value.
Optionally, the obtaining a display image according to the color value and the Alpha value includes:
outputting the color value and the Alpha value to a graphics processor;
and determining, by the graphics processor, the mixing mode for each pixel according to its color value and Alpha value, and performing the mixing according to the pixel positions.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer encoded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines the color value and the Alpha value of each pixel based on the decoding channel flag parameter.
Optionally, the initialization unit is further configured to invoke a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
the display unit is also used for displaying images according to the color values;
and when the initialization unit determines that the video stream does not include an Alpha layer encoded stream, the color layer decoding unit acquires the color layer encoded stream in the video stream, and the color layer decoder decodes the color layer encoded stream to obtain the color value.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the video decoding method.
In a fourth aspect, an embodiment of the present application provides a computer storage medium storing a computer program for implementing the video decoding method.
According to the video decoding method, the video decoding device, the electronic equipment and the computer storage medium, the Alpha layer decoder is added to the existing color layer decoder framework, so that the Alpha layer frame data of the HEVC Video With Alpha format video file can be decoded, and the semitransparent characteristic of a video is supported.
Drawings
The above, as well as additional purposes, features, and advantages of exemplary embodiments of the present application will become readily apparent from the following detailed description when read in conjunction with the accompanying drawings. Several embodiments of the present application are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which:
fig. 1 is an application scenario schematic diagram of a video decoding method provided in an embodiment of the present application;
fig. 2 is a flowchart of a video decoding method according to an embodiment of the present application;
fig. 3 is a schematic flow chart for determining whether the video stream includes both color layer data and Alpha layer data according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of initializing a decoder using a parameter set according to an embodiment of the present application;
FIG. 5 is a flow chart of judging whether frame data is color layer frame data or Alpha layer frame data according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a process for decoding Alpha layer frame data according to an embodiment of the present disclosure;
FIG. 7 is a flow chart of displaying decoded video using a graphics processor according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a video decoding device according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a computer program product according to an embodiment of the present application.
Detailed Description
The principles and spirit of the present application will be described below with reference to several exemplary embodiments. It should be understood that these embodiments are presented merely to enable one skilled in the art to better understand and practice the present application and are not intended to limit the scope of the present application in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Those skilled in the art will appreciate that embodiments of the present application may be implemented as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the following forms, namely: complete hardware, complete software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software.
In this document, it should be understood that any number of elements in the drawings is for illustration and not limitation, and that any naming is used only for distinction and not for any limitation.
For ease of understanding, the terms referred to in the embodiments of the present application are explained below:
(I) Video codec:
video encoding refers to converting a file in an original video format into a file in another video format by means of compression technology. The most important codec standards for video streams are H.263, H.264, and H.265 from the ITU-T, Motion JPEG (M-JPEG) from the Joint Photographic Experts Group, the MPEG series standards from the Moving Picture Experts Group of the international standards organizations, and formats such as VPx and AV1 specified by companies including Google and Netflix. Video decoding is the inverse operation of video encoding.
(II) High efficiency video coding (High Efficiency Video Coding, HEVC):
HEVC is an advanced video coding format released in 2012 with the design objective of replacing the H.264 coding format. HEVC achieves a higher compression rate and better compression quality than H.264, and supports 4K resolution, 10-12 bit HDR, and 3D video formats.
In the HEVC standard, there are a number of different parameter sets in a video stream for setting codec parameters. Among these are video parameter sets (Video Parameter Set, VPS), sequence parameter sets (Sequence Parameter Set, SPS), picture parameter sets (Picture Parameter Set, PPS), etc. The parameter sets relevant to the present application are described below:
(1)VPS:
the layers of a given video sequence refer to the same VPS.
The VPS contains information mainly including:
(1) syntax elements shared by multiple layers and operation points;
(2) key information of the operation points, such as profile and level;
(3) other operation point characteristic information not belonging to the SPS, such as hypothetical reference decoder (Hypothetical Reference Decoder, HRD) parameters.
(2)SPS:
The information contained in the SPS mainly comprises:
(1) image format information.
(2) Encoding parameter information.
(3) Reference image related information.
(4) Profile, tier and level related parameters.
(5) Temporal hierarchy information.
(6) Video usability information (VUI).
(7) Other information, including the VPS number referenced by the current SPS, the SPS identification number, and SPS extension information.
(3)PPS:
The information contained in PPS mainly includes:
(1) usability flags of coding tools.
(2) Quantization process related syntax elements.
(3) Tile related syntax elements.
(4) Deblocking filtering related syntax elements.
(5) Control information in the slice header.
(6) Other information that may be shared when encoding an image.
(III) HEVC Video With Alpha Standard:
the HEVC Video With Alpha standard is an HEVC video format extension standard supporting Alpha channels. The standard stores Alpha channel values in an additional layer (i.e., alpha layer) independent of the color layer, and obtains pixel values of the Alpha layer as corresponding Alpha values for display blending during display.
For video files with an Alpha layer, part of the standard's coding scheme is as follows:
(1) VPS part:
nuh_layer_id = 0.
vps_extension ≠ 0, indicating that the video stream contains an additional extension layer.
(2) SPS portion:
(1) nuh_layer_id=0, indicating that this SPS is a color layer SPS.
(2) nuh_layer_id=1, indicating that this SPS is an Alpha layer SPS.
(3) PPS portion:
(1) nuh_layer_id=0, indicating that this PPS is a color layer PPS.
(2) nuh_layer_id=1, indicating that this PPS is an Alpha layer PPS.
(4) Frame data portion:
(1) nuh_layer_id=0, indicating that this frame data is color layer frame data.
The color layer frame data is consistent with the common HEVC format video coding mode.
(2) nuh_layer_id=vps_extension, indicating that this frame data is Alpha layer frame data.
Alpha layer frame data is encoded using the YUV color model format, where the Y component data carries the Alpha values, and U = 128, V = 128.
(IV) RGB color model:
a color standard that obtains a wide variety of colors by superimposing the three color channels Red, Green, and Blue. This standard covers almost all colors perceptible to human vision and is one of the most widely used color systems at present.
(V) YUV color model:
a color standard that obtains a wide variety of colors by superimposing the three color channels Y, U, and V. Y represents luminance (Luminance or Luma); U and V represent chrominance (Chroma), describing the color and saturation of the image.
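For reference, one common conversion between the two models is the BT.601 full-range variant sketched below; several coefficient conventions exist, so the exact constants here are an assumption for illustration rather than the conversion used by any particular decoder:

```python
def yuv_to_rgb(y: int, u: int, v: int) -> tuple:
    """Convert one YUV sample (0-255 per channel) to RGB.

    Uses the common BT.601 full-range formulas; U and V are centered
    on 128, which is why U = V = 128 (as in Alpha layer frames above)
    yields a pure gray level equal to Y.
    """
    def clamp(x):
        return max(0, min(255, round(x)))

    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    return clamp(r), clamp(g), clamp(b)
```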
The principles and spirit of the present application are explained in detail below with reference to several representative embodiments thereof.
Summary of the Invention
Among related technical solutions for decoding HEVC Video With Alpha format video, most simply ignore the Alpha layer data and decode only the color layer data, so the semitransparent property of the video cannot be realized.
In order to solve the above-mentioned problems, the present application provides a video decoding method, including: receiving a video stream to be decoded, and calling a color layer decoder and an Alpha layer decoder respectively when the video stream is determined to simultaneously comprise a color layer coded stream and an Alpha layer coded stream; acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value; acquiring an Alpha layer coded stream in the video stream, and decoding the Alpha layer coded stream by the Alpha layer decoder to obtain an Alpha value; and obtaining a display image according to the color value and the Alpha value. According to the video decoding method, the Alpha layer decoder is added to the existing color layer decoder architecture, so that Alpha layer frame data of a HEVC Video With Alpha format video file can be decoded, and the semitransparent characteristic of a video is supported.
Having described the basic principles of the present application, various non-limiting embodiments of the present application are specifically described below.
Application scene overview
Referring to fig. 1, an application scenario diagram of a video decoding method according to an embodiment of the present application is shown. The application scenario shown in fig. 1 includes terminal devices 101 and 102. Terminal devices 101 and 102 include, but are not limited to, electronic devices such as desktop computers, mobile phones, mobile computers, tablet computers, media players, smart wearable devices, smart televisions, vehicle-mounted devices, and personal digital assistants (Personal Digital Assistant, PDA). The terminal devices 101 and 102 may be two different devices or the same device; the description here concerns only the correspondence between encoding and decoding.
The terminal device 101 is configured to encode an original video file according to HEVC Video With Alpha to obtain a video file having the semitransparent characteristic, and to deliver the video file to the terminal device 102. Data transmission methods between the terminal device 101 and the terminal device 102 include, but are not limited to, a network connection, reading from a removable storage medium, and the like.
The terminal device 102 is configured to receive the video file and convert the video file into a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value; acquiring an Alpha layer coded stream in the video stream, and decoding the Alpha layer coded stream by the Alpha layer decoder to obtain an Alpha value; and obtaining a display image according to the color value and the Alpha value.
It should be noted that the above application scenario is only shown for the convenience of understanding the spirit and principles of the present application, and embodiments of the present application are not limited in any way in this respect. Rather, embodiments of the present application may be applied to any scenario where applicable.
Exemplary method
Referring to fig. 2, an embodiment of the present application provides a video decoding method, including:
S201, receiving a video stream to be decoded;
S202, judging whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if so, executing step S203;
S203, respectively calling a color layer decoder and an Alpha layer decoder;
S204, acquiring a color layer encoded stream in the video stream, and decoding the color layer encoded stream by the color layer decoder to obtain a color value;
S205, acquiring an Alpha layer encoded stream in the video stream, and decoding the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value;
S206, obtaining a display image according to the color value and the Alpha value.
In the implementation process, the color layer decoder and the Alpha layer decoder may be hardware decoders or software decoders.
Thus, by adding an Alpha layer decoder on top of the existing color layer decoder, the present application enables decoding of HEVC Video With Alpha format video with support for the semitransparent characteristic. If the color layer decoder and the Alpha layer decoder are software decoders, the technical scheme of the present application can be applied to electronic devices of different types, different models, and different operating systems (such as Windows, Android, Linux, macOS, iOS, and the like), and is not limited to electronic devices of a specific model or a specific operating system.
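As an illustration, the dispatch logic of steps S201 to S208 can be sketched in Python. The `_Decoder` class and the packet lists below are hypothetical stand-ins for real HEVC decoders and bitstream packets, not the actual implementation:

```python
class _Decoder:
    """Hypothetical stand-in for an HEVC layer decoder (color or Alpha)."""

    def decode(self, packets):
        # Pretend each packet decodes to one frame's worth of values.
        return list(packets)


def decode_video_stream(has_alpha, color_packets, alpha_packets=()):
    """Sketch of steps S201-S208: call one or two decoders depending on
    whether the stream carries an Alpha layer (step S202)."""
    color_decoder = _Decoder()                               # S203 / S207
    color_values = color_decoder.decode(color_packets)       # S204
    if has_alpha:
        alpha_decoder = _Decoder()                           # S203
        alpha_values = alpha_decoder.decode(alpha_packets)   # S205
        # S206: pair each color value with its Alpha value for display.
        return list(zip(color_values, alpha_values))
    return color_values                                      # S208: opaque display
```

Whether `has_alpha` is true is decided by the VPS check described below (steps S301 to S304).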
Optionally, the video decoding method further comprises:
when it is determined that the video stream does not include an Alpha layer encoded stream, step S207 is performed;
S207, calling a color layer decoder;
subsequently, step S204 is performed;
S208, displaying an image according to the color value obtained in step S204.
In this way, decoding compatibility for video files without Alpha layers is achieved by decoding video files that do not include Alpha layer encoded streams using a color layer decoder.
Optionally, as shown in fig. 3, in step S202, determining whether the video stream includes both color layer data and Alpha layer data includes:
S301, acquiring a VPS from the video stream;
S302, parsing the extension data identifier vps_extension in the VPS;
S303, if an extension data identifier vps_extension corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
S304, if no extension data identifier vps_extension corresponding to the Alpha layer is parsed, determining that the video stream does not include an Alpha layer encoded stream.
According to the coding standard of HEVC Video With Alpha, when vps_extension ≠ 0, the video stream contains an additional layer independent of the color layer. In this application, if vps_extension is not equal to 0, it is determined that the video stream includes both the color layer encoded stream and the Alpha layer encoded stream. When vps_extension = 0, no additional layer independent of the color layer is included in the video stream; in this case it is determined that the Alpha layer encoded stream is not included in the video stream.
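The check of steps S301 to S304 reduces to a simple predicate. In this sketch the VPS is represented as an already-parsed dict, which is an assumption for illustration, not real bitstream syntax:

```python
def stream_has_alpha_layer(vps: dict) -> bool:
    """Steps S301-S304 as a predicate: a nonzero vps_extension in the
    parsed VPS signals an additional layer independent of the color
    layer (here, the Alpha layer); vps_extension == 0 (or absent)
    means the stream carries only the color layer."""
    return vps.get("vps_extension", 0) != 0
```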
Optionally, in step S203, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
In the encoding process, the encoding is generally performed by using an inter-frame prediction mode, that is, a part of image frames in an original video file will refer to information of other image frames (reference frames) and will encode changes relative to the reference frames. Correspondingly, in the decoding process, the image information of the current frame needs to be decoded according to the information of the reference frame. Thus, by establishing the color layer reference frame buffer and the Alpha layer reference frame buffer, the reference frame corresponding to the current frame can be cached, thereby realizing decoding.
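A minimal sketch of the two per-layer reference frame buffers follows. This is only an illustration of the caching idea; real HEVC decoded-picture-buffer management (reference picture sets, reordering, output timing) is far more involved:

```python
from collections import deque


class ReferenceFrameBuffer:
    """Minimal per-layer reference-frame buffer: decoded frames are kept
    so later inter-predicted frames can reference them; the oldest frame
    is evicted once capacity is reached."""

    def __init__(self, capacity: int = 4):
        self._frames = deque(maxlen=capacity)

    def push(self, frame):
        self._frames.append(frame)  # newly decoded frame becomes a reference

    def latest(self):
        return self._frames[-1]     # most recent reference frame


# One buffer per layer, created when the decoders are called:
color_refs = ReferenceFrameBuffer()
alpha_refs = ReferenceFrameBuffer()
```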
If the video stream does not include the Alpha layer encoded stream, in step S207, a color layer reference frame buffer is created when the color layer decoder is called, where the color layer reference frame buffer is used to buffer the color layer reference frame obtained in the process of decoding the color layer encoded stream.
Before the formal decoding operation is performed, the decoders need to be initialized with the parameter sets corresponding to the color layer and the Alpha layer, respectively.
Optionally, as shown in fig. 4, in step S203, after the color layer decoder and the Alpha layer decoder are invoked, the method further includes:
S401, acquiring an SPS from the video stream;
S402, determining whether the SPS is a color layer SPS or an Alpha layer SPS according to the data layer flag parameter nuh_layer_id in the SPS;
S403, acquiring a PPS from the video stream;
S404, determining whether the PPS is a color layer PPS or an Alpha layer PPS according to the data layer flag parameter nuh_layer_id of the PPS;
S405, initializing the color layer decoder with the color layer SPS and the color layer PPS;
S406, initializing the Alpha layer decoder with the Alpha layer SPS and the Alpha layer PPS.
In the implementation process, if the video stream includes both the color layer encoded stream and the Alpha layer encoded stream, the video stream contains at least two SPSs and two PPSs. If nuh_layer_id = 0, the corresponding SPS or PPS is a color layer parameter set; if nuh_layer_id = 1, the corresponding SPS or PPS is an Alpha layer parameter set.
In this way, the color layer decoder and the Alpha layer decoder are initialized through the parameter sets corresponding to the color layer and the Alpha layer respectively, so that the subsequent decoding of the color layer frame data and the Alpha layer frame data according to the parameters in the parameter sets is facilitated.
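The sorting of steps S401 to S406 can be sketched as follows. The dict entries stand in for parsed SPS/PPS structures and are assumptions for illustration only:

```python
def split_parameter_sets(param_sets):
    """Sketch of steps S401-S406: sort parsed SPS/PPS entries into
    per-layer groups by the data layer flag parameter nuh_layer_id
    (0 = color layer, 1 = Alpha layer), ready for initializing each
    decoder."""
    layers = {"color": [], "alpha": []}
    for ps in param_sets:
        if ps["nuh_layer_id"] == 0:
            layers["color"].append(ps)   # color layer SPS/PPS (S402/S404)
        elif ps["nuh_layer_id"] == 1:
            layers["alpha"].append(ps)   # Alpha layer SPS/PPS (S402/S404)
        else:
            raise ValueError("unexpected nuh_layer_id in parameter set")
    return layers
```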
If the video stream does not include the Alpha layer coded stream, after invoking the color layer decoder, the method further includes:
and acquiring SPS and PPS from the video stream, and initializing the color layer decoder by directly utilizing the SPS and the PPS.
After the parameter sets of the color layer and the Alpha layer are acquired and decoder parameter initialization is completed, the frame data in the video stream still need to be distinguished so that each type can be decoded by its corresponding decoder.
Optionally, as shown in fig. 5, acquiring the color layer encoded stream in the video stream in step S204 and acquiring the Alpha layer encoded stream in the video stream in step S205 include:
S501, acquiring frame data in the video stream;
S502, determining whether the frame data is color layer frame data or Alpha layer frame data according to the data layer flag parameter nuh_layer_id of the frame data; executing step S503 when the frame data is determined to be color layer frame data, and executing step S504 when the frame data is determined to be Alpha layer frame data;
S503, sending the color layer frame data to the color layer decoder;
S504, sending the Alpha layer frame data to the Alpha layer decoder.
According to the coding standard of HEVC Video With Alpha, when the nuh_layer_id of frame data is 0, the frame data is color layer frame data; when the nuh_layer_id of frame data equals the vps_extension value, the frame data is Alpha layer frame data.
In this way, the color layer frame data can be distinguished from the Alpha layer frame data by the data layer flag parameter nuh_layer_id, facilitating decoding using a corresponding decoder.
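The routing rule of steps S501 to S504 can be sketched directly from the two conditions above (the string labels are illustrative only):

```python
def route_frame(nuh_layer_id: int, vps_extension: int) -> str:
    """Steps S501-S504: nuh_layer_id == 0 routes the frame to the color
    layer decoder; nuh_layer_id equal to the vps_extension value routes
    it to the Alpha layer decoder; anything else is flagged for the
    fallback path of steps S505-S507."""
    if nuh_layer_id == 0:
        return "color"
    if nuh_layer_id == vps_extension:
        return "alpha"
    return "unknown"   # null/illegal value -> fallback (S505)
```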
Optionally, as shown in fig. 5, the video decoding method further includes:
in step S502, when the data layer flag parameter nuh_layer_id is parsed as null or an illegal value, step S505 is executed.
In the implementation process, the data layer flag parameter nuh_layer_id may be an illegal value when nuh_layer_id ≠ 0 and nuh_layer_id ≠ vps_extension; in this case, the frame data cannot be determined to be color layer frame data or Alpha layer frame data from the data layer flag parameter nuh_layer_id.
S505, acquiring a frame sequence number frame_num of a frame to be decoded of the frame data;
S506, judging whether the frame sequence number frame_num is a new frame sequence number; if the frame_num is a new frame sequence number, determining that the frame to be decoded is color layer frame data, and executing step S503; if the frame_num is a repeated frame sequence number, executing step S507;
S507, determining whether the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame marking parameter first_slice_in_frame in the frame to be decoded, and correspondingly executing step S503 or step S504.
Specifically, determining the frame to be decoded as color layer frame data or Alpha layer frame data according to the frame marking parameter first_slice_in_frame in the frame to be decoded includes:
when the frame to be decoded is determined to be a starting data packet according to the frame marking parameter first_slice_in_frame, determining that the frame to be decoded is the Alpha layer frame data;
and when the frame to be decoded is not the initial data packet according to the frame marking parameter first_slice_in_frame, determining the frame to be decoded as the color layer frame data.
In a specific implementation process, when a frame flag parameter first_slice_in_frame=1 in the frame to be decoded, determining the frame to be decoded as a start data packet; when the frame mark parameter first_slice_in_frame=0 in the frame to be decoded, determining that the frame to be decoded is not a start data packet.
In this way, by distinguishing color layer frame data from Alpha layer frame data using the frame sequence number frame_num, or using a combination of the frame sequence number frame_num and the frame marking parameter first_slice_in_frame, decoding failures caused by errors in the data layer flag parameter nuh_layer_id can be avoided.
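The fallback of steps S505 to S507 can be sketched as a small routing function; the caller is assumed to maintain the set of frame numbers already seen:

```python
def route_frame_fallback(frame_num, seen_frame_nums, first_slice_in_frame):
    """Steps S505-S507: when nuh_layer_id is null or illegal, a new
    frame_num means color layer frame data (S506); a repeated frame_num
    is Alpha layer frame data if first_slice_in_frame == 1 (a starting
    data packet), and color layer frame data otherwise (S507)."""
    if frame_num not in seen_frame_nums:      # S506: new frame number
        seen_frame_nums.add(frame_num)
        return "color"
    # S507: repeated frame number -> inspect first_slice_in_frame
    return "alpha" if first_slice_in_frame == 1 else "color"
```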
According to the coding standard of HEVC Video With Alpha, the Alpha layer frame data is in a YUV color model format.
Optionally, as shown in fig. 6, in step S205, the Alpha layer decoder decoding the Alpha layer encoded stream to obtain an Alpha value includes:
S601, the Alpha layer decoder acquiring Y color direction data from the Alpha layer frame data;
S602, the Alpha layer decoder decoding the Y color direction data to obtain the Alpha value.
By discarding the data of the U color direction and the V color direction in the Alpha layer frame data, the decoding speed of the Alpha layer decoder can be increased by about 30%.
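The idea of steps S601 and S602 can be sketched as follows. A real decoder operates on the coded stream rather than on raw planes; the plane arguments here are an assumption made to show only the keep-Y/discard-chroma step:

```python
def alpha_from_yuv_frame(y_plane, u_plane, v_plane):
    """Steps S601-S602 sketched: the Alpha layer is coded as YUV with
    the Alpha value carried in the Y plane, while U = V = 128 carry no
    information, so only the Y plane is kept and decoded."""
    del u_plane, v_plane   # chroma planes carry no Alpha information
    return y_plane         # each Y sample is directly an Alpha value
```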
Optionally, as shown in fig. 7, in step S206, obtaining a display image according to the color value and the Alpha value includes:
S703, outputting the color value and the Alpha value to a graphics processor (Graphics Processing Unit, GPU);
S704, determining a mixing mode of the color value and the Alpha value by the GPU according to the color value and the Alpha value of each pixel, and mixing the color value and the Alpha value according to the position of the pixel.
As an optional implementation manner, the step S703 of outputting the color value and the Alpha value to the GPU specifically includes:
and respectively carrying out texture sampling on the color value and the Alpha value to obtain two textures, and outputting the two textures to a GPU.
As another alternative embodiment, the step S703 of outputting the color value and the Alpha value to the GPU specifically includes:
and performing texture sampling on the color value and the Alpha value together to obtain a single texture, and outputting the texture to the GPU.
In the implementation process, if in step S703 the color values and the Alpha values are sampled separately and then output to the GPU, then in step S704 the two textures are first combined by a shader (Shader) into a four-dimensional matrix containing both the color values and the Alpha values, and then mixed. There are many ways to mix the color value and the Alpha value in step S704, and the present application is not limited in this respect. One specific mixing scheme is as follows:
C_R = A_R × Alpha + B_R × (1 − Alpha)
C_G = A_G × Alpha + B_G × (1 − Alpha)
C_B = A_B × Alpha + B_B × (1 − Alpha)
wherein Alpha is normalized to the range [0, 1], A represents a pixel of an image frame of the video decoded by the video decoding method of the present application, B represents a pixel of an image frame of the background video, C represents a pixel of the finally mixed display image frame, and the subscripts R, G, B represent the red, green, and blue tristimulus values of the corresponding image, respectively.
In this way, the GPU mixes the color value and the Alpha value, so that the semitransparent characteristic of the video is realized.
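The patent leaves the blend mode open; one concrete scheme (standard Alpha compositing per R/G/B channel, with A over B) can be sketched on the CPU for clarity, even though step S704 runs on the GPU. The function name and the 0-255 Alpha range are assumptions for illustration:

```python
def blend_pixel(a_rgb, b_rgb, alpha):
    """One possible mixing of step S704: composite a decoded foreground
    pixel A over a background pixel B per R/G/B channel, with the Alpha
    value normalized from the 0-255 range to [0, 1]."""
    t = alpha / 255.0
    return tuple(round(a * t + b * (1.0 - t)) for a, b in zip(a_rgb, b_rgb))
```

At alpha = 255 the foreground fully covers the background; at alpha = 0 only the background shows; intermediate values yield the semitransparent effect.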
Accordingly, if the video stream does not include the Alpha layer encoded stream, in step S208, displaying the image according to the color value specifically includes: and outputting the color value to a GPU, and displaying by the GPU according to the color value of each pixel.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
S701, after the color layer decoder decodes the color layer coded stream to obtain a color value, generating a decoding channel flag parameter output flag corresponding to the color layer;
and/or S702, after the Alpha layer decoder decodes the Alpha layer coded stream to obtain an Alpha value, generating a decoding channel flag parameter output flag corresponding to the Alpha layer;
in the step S704, the GPU determines a color value and an Alpha value of each pixel according to the decoding channel flag parameter output flag.
In this way, the subsequent graphics processor is facilitated to distinguish between the color values and the Alpha values.
Exemplary apparatus
Having described the method of the exemplary embodiments of the present application, a description is next given of the video decoding apparatus of the exemplary embodiments of the present application.
Fig. 8 is a schematic structural diagram of a video decoding apparatus 800 according to an embodiment of the present application. In one embodiment, the video decoding apparatus 800 includes: an initialization unit 801, a color layer decoding unit 802, an Alpha layer decoding unit 803, and a display unit 804.
An initialization unit 801, configured to receive a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
a color layer decoding unit 802, configured to obtain a color layer encoded stream in the video stream, and decode the color layer encoded stream by the color layer decoder to obtain a color value;
an Alpha layer decoding unit 803, configured to obtain an Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream by using the Alpha layer decoder to obtain an Alpha value;
and a display unit 804, configured to obtain a display image according to the color value and the Alpha value.
Optionally, determining that the video stream includes both color layer data and Alpha layer data includes:
acquiring a video parameter set from the video stream, and parsing the extension data identifier in the video parameter set;
and if an extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
Optionally, when the color layer decoder and the Alpha layer decoder are called, the method further includes:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
Optionally, after invoking the color layer decoder and the Alpha layer decoder, the method further comprises:
acquiring a sequence parameter set from the video stream, and determining the sequence parameter set as a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer mark parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining the image parameter set to be a color layer image parameter set or an Alpha layer image parameter set according to data layer mark parameters in the image parameter set;
Initializing the color layer decoder with the color layer sequence parameter set and the color layer image parameter set;
and initializing the Alpha layer decoder by using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
Optionally, obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream includes:
acquiring frame data in the video stream, and determining that the frame data is color layer frame data or Alpha layer frame data according to data layer mark parameters of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
Optionally, the video decoding device further comprises:
a data judging unit (not shown in fig. 8) for obtaining a frame number of a frame to be decoded of the frame data when the data layer flag parameter is parsed to be null or illegal;
if the frame sequence number is a new frame sequence number, determining that the frame to be decoded is color layer frame data;
and if the frame sequence number is the repeated frame sequence number, determining that the frame to be decoded is color layer frame data or Alpha layer frame data according to the frame marking parameter in the frame to be decoded.
Optionally, determining, according to the frame marking parameter in the frame to be decoded, that the frame to be decoded is color layer frame data or Alpha layer frame data includes:
When the frame to be decoded is determined to be a starting data packet according to the frame marking parameter, determining that the frame to be decoded is the Alpha layer frame data;
and when the frame to be decoded is determined not to be a starting data packet according to the frame marking parameter, determining the frame to be decoded to be the color layer frame data.
Optionally, the Alpha layer frame data is in a YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires Y color direction data from the Alpha layer frame data and decodes the Y color direction data to obtain the Alpha value.
Optionally, the obtaining a display image according to the color value and the Alpha value includes:
outputting the color value and the Alpha value to a graphics processor;
and determining a mixing mode of the color value and the Alpha value according to the color value and the Alpha value of each pixel by the graphic processor, and mixing the color value and the Alpha value according to the position of the pixel.
Optionally, before outputting the color value and the Alpha value to a graphics processor, the method further includes:
after the color layer decoder decodes the color layer coded stream to obtain a color value, generating a decoding channel mark parameter corresponding to a color layer;
And/or after the Alpha layer decoder decodes the Alpha layer coded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines a color value and an Alpha value for each pixel based on the decoding channel flag parameter.
Optionally, the initializing unit 801 is further configured to invoke a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
the display unit 804 is further configured to display an image according to the color value;
when the initialization unit 801 determines that the video stream does not include an Alpha layer encoded stream, the color layer decoding unit 802 obtains a color layer encoded stream in the video stream, and the color layer decoder decodes the color layer encoded stream to obtain a color value.
Based on the same inventive concept as the video decoding method, the embodiment of the application also provides an electronic device, which can be specifically an electronic device such as a desktop computer, a mobile phone, a mobile computer, a tablet computer, a media player, an intelligent wearable device, an intelligent television, a vehicle-mounted device, a personal digital assistant (Personal Digital Assistant, PDA) and the like.
The electronic device 900 is described below with reference to fig. 9. The electronic device shown in fig. 9 is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present invention.
As shown in fig. 9, the electronic device 900 may be embodied in the form of a general purpose computing device, which may be, for example, a terminal device, and the components of the electronic device 900 may include, but are not limited to: the at least one processing unit 910, the at least one storage unit 920, and a bus 930 connecting the different system components (including the storage unit 920 and the processing unit 910).
The processing unit 910 may be a general-purpose processor such as a central processing unit (Central Processing Unit, CPU), digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present application. The general purpose processor may be a microprocessor or any conventional processor or the like. The steps of a method disclosed in connection with the embodiments of the present application may be embodied directly in a hardware processor for execution, or in a combination of hardware and software modules in the processor for execution.
Bus 930 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, and a local bus using any of a variety of bus architectures.
The storage unit 920 may include readable media in the form of volatile Memory, such as random access Memory (Random Access Memory, RAM) 921 and/or cache Memory 922, and may further include Read-Only Memory (ROM) 923.
The storage unit 920 may also include a program/utility 925 having a set (at least one) of program modules 924, such program modules 924 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The program modules 924, when executed by the processing unit 910, cause the processing unit 910 to perform various steps in the video decoding method according to various exemplary embodiments of the present application described in the section "exemplary methods" above in this specification.
For example, the processing unit 910 may perform the video decoding method shown in fig. 2, including: S201, receiving a video stream to be decoded; S202, determining whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if yes, S203, respectively calling a color layer decoder and an Alpha layer decoder; S204, acquiring a color layer encoded stream in the video stream, and decoding the color layer encoded stream by the color layer decoder to obtain a color value; S205, acquiring an Alpha layer encoded stream in the video stream, and decoding the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value; S206, obtaining a display image according to the color value and the Alpha value.
The electronic device 900 may also communicate with one or more external devices 940 (e.g., keyboard, pointing device, etc.), one or more devices that enable a user to interact with the electronic device 900, and/or any devices (e.g., routers, modems, etc.) that enable the electronic device 900 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 950. Also, the electronic device 900 may communicate with one or more networks such as a local area network (Local Area Network, LAN), a wide area network (Wide Area Network, WAN) and/or a public network such as the internet via the network adapter 960. As shown, the network adapter 960 communicates with other modules of the electronic device 900 over the bus 930. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with the electronic device 900, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Exemplary program product
The present embodiments provide a computer readable storage medium storing computer program instructions for use in the above-described electronic device, which contains a program for executing the video decoding method in any of the exemplary embodiments of the present application.
The computer storage media described above can be any available media or data storage device that can be accessed by a computer, including, but not limited to, magnetic storage (e.g., floppy disks, hard disks, magnetic tape, magneto-optical disks (MOs), etc.), optical storage (e.g., CD, DVD, BD, HVD, etc.), and semiconductor storage (e.g., ROM, EPROM, EEPROM, nonvolatile storage (NAND FLASH), solid State Disk (SSD)), etc.
In some possible implementations, as shown in fig. 10, various aspects of the present application may also be implemented as a computer program product 1000 comprising program code; when the computer program product is run on an electronic device, the program code causes the electronic device to perform the steps of the video decoding method according to the various exemplary embodiments of the present application described in the "exemplary method" section above. For example, the electronic device may perform the video decoding method shown in fig. 2, including: S201, receiving a video stream to be decoded; S202, determining whether the video stream includes both a color layer encoded stream and an Alpha layer encoded stream; if yes, S203, respectively calling a color layer decoder and an Alpha layer decoder; S204, acquiring a color layer encoded stream in the video stream, and decoding the color layer encoded stream by the color layer decoder to obtain a color value; S205, acquiring an Alpha layer encoded stream in the video stream, and decoding the Alpha layer encoded stream by the Alpha layer decoder to obtain an Alpha value; S206, obtaining a display image according to the color value and the Alpha value.
The computer program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium would include the following: an electrical connection having one or more wires, a portable disk, a hard disk, random Access Memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disk read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer program product for video decoding according to embodiments of the present application may employ a portable compact disc read-only memory (CD-ROM) and include program code and may run on a server device. However, the program product of the present application is not limited thereto, and in this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable signal medium may include a data signal propagated in baseband or as part of a carrier wave with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., via the Internet using an Internet service provider).
It should be noted that although several units or sub-units of the apparatus are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more of the elements described above may be embodied in one element in accordance with embodiments of the present application. Conversely, the features and functions of one unit described above may be further divided into a plurality of units to be embodied.
Furthermore, although the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps.
While the spirit and principles of this application have been described with reference to several particular embodiments, it is to be understood that this application is not limited to the particular embodiments disclosed, nor does the division into aspects imply that features in these aspects cannot be combined to advantage; such division is adopted merely for convenience of description. The application is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (22)

1. A video decoding method, comprising:
receiving a video stream to be decoded, and calling a color layer decoder and an Alpha layer decoder respectively when the video stream is determined to simultaneously comprise a color layer coded stream and an Alpha layer coded stream;
acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value;
acquiring an Alpha layer coded stream in the video stream, and decoding the Alpha layer coded stream by the Alpha layer decoder to obtain an Alpha value;
obtaining a display image according to the color value and the Alpha value;
when the data layer flag parameter of frame data in the video stream is parsed to be null or illegal, and the frame number of the frame to be decoded is a repeated frame number, if the frame to be decoded is determined to be a start data packet according to a frame flag parameter in the frame to be decoded, determining that the frame to be decoded is Alpha layer frame data; and if the frame to be decoded is determined not to be a start data packet according to the frame flag parameter, determining that the frame to be decoded is color layer frame data.
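The fallback classification described in the last paragraph of claim 1, together with the new-frame-number case of claim 6, can be sketched as follows; the field names and the classifier itself are illustrative assumptions, not the actual bitstream syntax:

```python
# Sketch of the layer-determination fallback (field names are illustrative).
# When the data layer flag is null or illegal, a repeated frame number plus
# the start-of-packet flag marks Alpha layer data, since the Alpha frame
# reuses the frame number of the color frame it accompanies.

def classify_frame(frame, seen_frame_numbers):
    flag = frame.get("layer_flag")
    if flag in ("color", "alpha"):
        return flag  # normal path: flag is present and legal
    # Fallback path: flag is null or illegal.
    if frame["frame_number"] in seen_frame_numbers:
        # Repeated frame number: start packet => Alpha layer, else color layer.
        return "alpha" if frame["is_start_packet"] else "color"
    return "color"  # new frame number => color layer frame data
```
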
2. The method of claim 1, wherein determining that the video stream includes both color layer data and Alpha layer data comprises:
acquiring a video parameter set from the video stream, and parsing an extension data identifier in the video parameter set;
and if the extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
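A minimal sketch of this check, assuming the video parameter set has already been parsed into a dictionary and that "alpha_layer" stands in for the extension (expansion) data identifier of the Alpha layer:

```python
# Sketch of claim 2: detect the Alpha layer via the extension data
# identifier in the video parameter set (names are illustrative).

def stream_has_alpha_layer(video_parameter_set):
    # If the VPS extension list carries the Alpha layer identifier, the
    # stream contains both a color layer and an Alpha layer encoded stream.
    return "alpha_layer" in video_parameter_set.get("extension_ids", [])
```
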
3. The method of claim 1, wherein when invoking the color layer decoder and Alpha layer decoder, further comprising:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
4. The method of claim 1, further comprising, after invoking the color layer decoder and the Alpha layer decoder:
acquiring a sequence parameter set from the video stream, and determining the sequence parameter set to be a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining the image parameter set to be a color layer image parameter set or an Alpha layer image parameter set according to a data layer flag parameter in the image parameter set;
initializing the color layer decoder with the color layer sequence parameter set and the color layer image parameter set;
and initializing the Alpha layer decoder by using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
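The routing of parameter sets described in claim 4 can be sketched as follows; the keys ("layer_flag", "kind", "payload") are hypothetical stand-ins for the parsed fields, and real decoder initialization is reduced to collecting each layer's SPS/PPS pair:

```python
# Sketch of claim 4: route sequence/image parameter sets by their data
# layer flag and gather each decoder's own initialization pair.

def dispatch_parameter_sets(parameter_sets):
    init_args = {"color": {}, "alpha": {}}
    for ps in parameter_sets:
        layer = ps["layer_flag"]   # "color" or "alpha"
        kind = ps["kind"]          # "sps" (sequence) or "pps" (image/picture)
        init_args[layer][kind] = ps["payload"]
    # Each decoder would then be initialized from its own sps/pps pair.
    return init_args
```
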
5. The method of claim 1, wherein obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream comprises:
acquiring frame data in the video stream, and determining the frame data to be color layer frame data or Alpha layer frame data according to a data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
6. The method as recited in claim 5, further comprising:
and when the data layer flag parameter is parsed to be null or illegal, and the frame number of the frame to be decoded of the frame data is a new frame number, determining that the frame to be decoded is color layer frame data.
7. The method of claim 5, wherein the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires Y-component data from the Alpha layer frame data and decodes the Y-component data to obtain the Alpha value.
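A sketch of claim 7's extraction of Alpha values from the Y component, assuming a hypothetical planar YUV 4:2:0 buffer in which the Y plane occupies the first width x height bytes:

```python
# Sketch of claim 7: the Alpha layer frame is carried in YUV format, and
# only the Y (luma) component is meaningful - it holds the Alpha values.
# The planar 4:2:0 layout assumed here is illustrative.

def alpha_from_yuv_frame(frame_bytes, width, height):
    # In planar YUV 4:2:0 the first width*height bytes are the Y plane;
    # the chroma planes that follow are ignored for the Alpha layer.
    y_plane = frame_bytes[: width * height]
    return list(y_plane)  # one Alpha value (0-255) per pixel
```
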
8. The method of claim 1, wherein said deriving a display image from said color values and said Alpha values comprises:
outputting the color value and the Alpha value to a graphics processor;
and determining, by the graphics processor, a mixing mode for the color value and the Alpha value according to the color value and the Alpha value of each pixel, and mixing the color value and the Alpha value according to the pixel position.
9. The method of claim 8, wherein prior to outputting the color value and the Alpha value to a graphics processor, further comprising:
after the color layer decoder decodes the color layer coded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer coded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines a color value and an Alpha value for each pixel based on the decoding channel flag parameter.
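The per-pixel mixing of claims 8-9 can be illustrated with a plain source-over blend; the actual blend-mode selection by the graphics processor is implementation-specific, and the background argument is an assumption of this example:

```python
# Sketch of the per-pixel mixing in claims 8-9: blend each decoded color
# value with the Alpha value at the same pixel position over a background
# (a simple "source-over" blend; not the patented blend-mode selection).

def blend_pixel(color, alpha, background):
    a = alpha / 255.0  # normalize the 8-bit Alpha value
    return tuple(round(c * a + b * (1 - a)) for c, b in zip(color, background))
```
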
10. The method as recited in claim 1, further comprising:
invoking a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
acquiring a color layer coded stream in the video stream, and decoding the color layer coded stream by the color layer decoder to obtain a color value;
and displaying an image according to the color value.
11. A video decoding apparatus, comprising:
an initialization unit, configured to receive a video stream to be decoded, and call a color layer decoder and an Alpha layer decoder when determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream;
a color layer decoding unit, configured to obtain a color layer encoded stream in the video stream, and decode the color layer encoded stream by using the color layer decoder to obtain a color value;
an Alpha layer decoding unit, configured to obtain an Alpha layer encoded stream in the video stream, and decode the Alpha layer encoded stream by using the Alpha layer decoder to obtain an Alpha value;
a display unit for obtaining a display image according to the color value and the Alpha value;
when the data layer flag parameter of frame data in the video stream is parsed to be null or illegal, and the frame number of the frame to be decoded is a repeated frame number, if the frame to be decoded is determined to be a start data packet according to a frame flag parameter in the frame to be decoded, determining that the frame to be decoded is Alpha layer frame data; and if the frame to be decoded is determined not to be a start data packet according to the frame flag parameter, determining that the frame to be decoded is color layer frame data.
12. The apparatus of claim 11, wherein determining that the video stream includes both color layer data and Alpha layer data comprises:
acquiring a video parameter set from the video stream, and parsing an extension data identifier in the video parameter set;
and if the extension data identifier corresponding to the Alpha layer is parsed, determining that the video stream includes both a color layer encoded stream and an Alpha layer encoded stream.
13. The apparatus of claim 11, wherein the invoking the color layer decoder and the Alpha layer decoder further comprises:
creating a color layer reference frame buffer area, wherein the color layer reference frame buffer area is used for buffering a color layer reference frame obtained in the process of decoding the color layer coded stream;
and creating an Alpha layer reference frame buffer zone, wherein the Alpha layer reference frame buffer zone is used for buffering the Alpha layer reference frame obtained in the process of decoding the Alpha layer coded stream.
14. The apparatus of claim 11, further comprising, after invoking the color layer decoder and the Alpha layer decoder:
acquiring a sequence parameter set from the video stream, and determining the sequence parameter set to be a color layer sequence parameter set or an Alpha layer sequence parameter set according to a data layer flag parameter in the sequence parameter set;
acquiring an image parameter set from the video stream, and determining the image parameter set to be a color layer image parameter set or an Alpha layer image parameter set according to a data layer flag parameter in the image parameter set;
initializing the color layer decoder with the color layer sequence parameter set and the color layer image parameter set;
and initializing the Alpha layer decoder by using the Alpha layer sequence parameter set and the Alpha layer image parameter set.
15. The apparatus of claim 11, wherein obtaining a color layer encoded stream and an Alpha layer encoded stream in the video stream comprises:
acquiring frame data in the video stream, and determining the frame data to be color layer frame data or Alpha layer frame data according to a data layer flag parameter of the frame data;
sending the color layer frame data to the color layer decoder;
and sending the Alpha layer frame data to the Alpha layer decoder.
16. The apparatus as recited in claim 15, further comprising:
and the data judging unit is used for determining that the frame to be decoded is color layer frame data when the data layer flag parameter is parsed to be null or illegal and the frame number of the frame to be decoded of the frame data is a new frame number.
17. The apparatus of claim 15, wherein the Alpha layer frame data is in YUV color model format;
the Alpha layer decoder decodes the Alpha layer encoded stream to obtain an Alpha value, including:
and the Alpha layer decoder acquires Y-component data from the Alpha layer frame data and decodes the Y-component data to obtain the Alpha value.
18. The apparatus of claim 11, wherein said deriving a display image from said color values and said Alpha values comprises:
outputting the color value and the Alpha value to a graphics processor;
and determining, by the graphics processor, a mixing mode for the color value and the Alpha value according to the color value and the Alpha value of each pixel, and mixing the color value and the Alpha value according to the pixel position.
19. The apparatus of claim 18, wherein prior to outputting the color value and the Alpha value to a graphics processor, further comprising:
after the color layer decoder decodes the color layer coded stream to obtain a color value, generating a decoding channel flag parameter corresponding to the color layer;
and/or after the Alpha layer decoder decodes the Alpha layer coded stream to obtain an Alpha value, generating a decoding channel flag parameter corresponding to the Alpha layer;
wherein the graphics processor determines a color value and an Alpha value for each pixel based on the decoding channel flag parameter.
20. The apparatus of claim 11, wherein the initialization unit is further configured to invoke a color layer decoder when it is determined that the video stream does not include an Alpha layer encoded stream;
the display unit is also used for displaying images according to the color values;
wherein, when the initialization unit determines that the video stream does not include an Alpha layer encoded stream, the color layer decoding unit acquires the color layer encoded stream in the video stream, and the color layer decoder decodes the color layer encoded stream to obtain the color value.
21. An electronic device, comprising: a processor and a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the video decoding method of any of claims 1-10.
22. A computer storage medium storing a computer program which, when executed by a processor, implements the video decoding method according to any one of claims 1-10.
CN202110320486.7A 2021-03-25 2021-03-25 Video decoding method, apparatus, electronic device, and computer storage medium Active CN113099232B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110320486.7A CN113099232B (en) 2021-03-25 2021-03-25 Video decoding method, apparatus, electronic device, and computer storage medium


Publications (2)

Publication Number Publication Date
CN113099232A CN113099232A (en) 2021-07-09
CN113099232B true CN113099232B (en) 2024-01-19

Family

ID=76669913

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110320486.7A Active CN113099232B (en) 2021-03-25 2021-03-25 Video decoding the method is a device(s) electronic device and computing machine storage medium

Country Status (1)

Country Link
CN (1) CN113099232B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015101173A1 (en) * 2013-12-30 2015-07-09 Mediatek Singapore Pte. Ltd. Method and apparatus for scaling parameter coding for inter-component residual prediction
CN106105220A (en) * 2014-01-07 2016-11-09 诺基亚技术有限公司 Method and apparatus for video coding and decoding
CN107071514A (en) * 2017-04-08 2017-08-18 腾讯科技(深圳)有限公司 A kind of photograph document handling method and intelligent terminal
WO2018184483A1 (en) * 2017-04-08 2018-10-11 腾讯科技(深圳)有限公司 Picture file processing method and system, and storage medium
CN110113615A (en) * 2018-02-01 2019-08-09 腾讯科技(深圳)有限公司 Image encoding method, device, calculates equipment and storage medium at coding/decoding method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160034259A (en) * 2013-07-18 2016-03-29 엘지전자 주식회사 Method and apparatus for processing video signal


Also Published As

Publication number Publication date
CN113099232A (en) 2021-07-09

Similar Documents

Publication Publication Date Title
US11012489B2 (en) Picture file processing method, picture file processing device, and storage medium
JP7053722B2 (en) A method and apparatus for signaling a picture / video format of an LDR picture within a bitstream and a picture / video format of a decoded HDR picture obtained from this LDR picture and an illumination picture.
US20200036983A1 (en) Picture file procesing method and apparatus, and storage medium
JP6472429B2 (en) Method, apparatus and system for determining LUMA values
US8446961B2 (en) Color gamut scalability techniques
US10194150B2 (en) Method and device for coding image, and method and device for decoding image
US8520734B1 (en) Method and system for remotely communicating a computer rendered image sequence
RU2693185C2 (en) Encoding method and colour conversion decoding method and corresponding devices
JP7086587B2 (en) Method and device for reconstructing image data from decoded image data
TWI707309B (en) Method, system and storage medium for processing image file
KR102523233B1 (en) Method and device for decoding a color picture
KR102617258B1 (en) Image processing method and apparatus
CN101340587A (en) Method for encoding input image, method and apparatus for displaying an encoded image
WO2018153802A1 (en) Method and device for reconstructing an hdr image
JP7043164B2 (en) Methods and Devices for Encoding Both High Dynamic Range Frames and Impose Low Dynamic Range Frames
US11967345B2 (en) System and method for rendering key and fill video streams for video processing
WO2017063168A1 (en) Image coding method and apparatus, and image processing device
WO2023142127A1 (en) Coding and decoding methods and apparatuses, device, and storage medium
EP1850290B1 (en) Image processing apparatus and method for preventing degradation of image quality when bit format of image is converted
CN107660340A (en) The method and corresponding device of color mapping are carried out to vision signal
CN113099232B (en) Video decoding method, apparatus, electronic device, and computer storage medium
TW201138476A (en) Joint scalar embedded graphics coding for color images
US10484640B2 (en) Low power video composition using a stream out buffer
US10034007B2 (en) Non-subsampled encoding techniques
TW202408245A (en) Coding method and apparatus, decoding method and apparatus, and coder, decoder and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant