CN107682018B - Decoding method and device - Google Patents

Info

Publication number: CN107682018B (application CN201710893279.4A)
Authority: CN (China)
Prior art keywords: symbol, decoding, symbols, address, code
Legal status: Active
Original language: Chinese (zh)
Other versions: CN107682018A
Inventors: 马传文 (Ma Chuanwen), 钟炎培 (Zhong Yanpei)
Assignee (current and original): Xian Wanxiang Electronics Technology Co Ltd

Application filed by Xian Wanxiang Electronics Technology Co Ltd
Priority to CN201710893279.4A
Publication of CN107682018A
Application granted
Publication of CN107682018B

Classifications

    • H: ELECTRICITY
    • H03: ELECTRONIC CIRCUITRY
    • H03M: CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00: Conversion of a code where information is represented by a given sequence or number of digits to a code where the same, similar or subset of information is represented by a different sequence or number of digits
    • H03M7/30: Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/40: Conversion to or from variable length codes, e.g. Shannon-Fano code, Huffman code, Morse code

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Compression, Expansion, Code Conversion, And Decoders (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The disclosure provides a decoding method and a decoding device in the field of electronic information technology, which can solve the problems of low decoding efficiency and high resource consumption in the Huffman decoding process. The specific technical scheme is as follows: acquiring a target bit stream, where the target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm; calculating a codeword for each of the at least one symbol according to the code length information; generating a decoding table from the codeword of each symbol; and decoding the data information according to the decoding table to obtain the original data. The invention is used for Huffman decoding.

Description

Decoding method and device
Technical Field
The present disclosure relates to the field of electronic information technologies, and in particular, to a decoding method and apparatus.
Background
Huffman coding is a prefix coding technique in which codewords are constructed according to the probability of occurrence of characters: frequently occurring symbols receive shorter codes, while rarely occurring symbols receive longer codes.
Huffman decoding generally takes one of two forms: static Huffman decoding and dynamic Huffman decoding. In static Huffman decoding, a general character probability distribution is estimated for a specific application scenario, a fixed binary tree and Huffman code table are constructed, and decoding then proceeds against that fixed code table. In dynamic Huffman decoding, several code tables, designed for the different code lengths, are consulted simultaneously during decoding, and the correct decoded value is finally selected.
However, in implementing the above decoding, static Huffman decoding works from a fixed code table and has low decoding efficiency, while dynamic Huffman decoding can adapt to different code lengths but consumes too many resources because a separate code table must be designed for each code length.
Disclosure of Invention
The embodiment of the disclosure provides a decoding method and a decoding device, which can solve the problems of low decoding efficiency and high resource consumption in the Huffman decoding process. The technical scheme is as follows:
according to a first aspect of embodiments of the present disclosure, there is provided a decoding method, including:
acquiring a target bit stream, wherein the target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm;
calculating a codeword of each symbol in the at least one symbol according to the code length information;
generating a decoding table according to the code word of each symbol;
and decoding the data information according to the decoding table to obtain original data.
For data information encoded with the Huffman algorithm, the codeword of each symbol is calculated directly and a single code table is generated; because code tables need not be designed separately for each code length, the resources consumed during decoding are reduced. Moreover, since the decoding table is computed from the target bit stream itself, it adapts to that stream and decoding efficiency is improved.
In one embodiment, calculating a codeword for each of at least one symbol based on the code length information includes:
grouping at least one symbol according to the code length of the at least one symbol, and grouping symbols with the same code length into a group;
the code words for each set of symbols are calculated separately.
Grouping the symbols by code length and computing each group's codewords separately makes processing more efficient and calculation faster.
In one embodiment, separately computing the codeword for each set of symbols comprises:
respectively calculating the codeword of the first symbol in each group of symbols according to the canonical Huffman algorithm;
and respectively calculating the code word of each group of symbols according to the code word of the first symbol in each group of symbols.
Because the codewords within each group of a canonical Huffman code follow a fixed pattern, the codewords of the symbols can be computed quickly.
In one embodiment, generating a decoding table from the codeword for each symbol includes:
respectively determining the address of each symbol in each group of symbols;
and generating a decoding table according to the address of each symbol.
Each symbol corresponds to a unique address, so the symbol can be looked up directly in the decoding table by its address, which improves decoding efficiency.
In one embodiment, separately determining the address of each symbol in each set of symbols comprises:
determining the address of the first symbol in the mth group of symbols, wherein m is an integer greater than 0;
according to the formula K_n = K_1 + (P_n - P_0), calculating the addresses of the symbols other than the first symbol, where K_n denotes the address of the nth symbol in the mth group of symbols, K_1 denotes the address of the first symbol, P_n denotes the codeword of the nth symbol, P_0 denotes the codeword of the first symbol, and n is an integer greater than 0.
The address of each group of symbols is calculated according to a formula, the corresponding relation between the address and the code word is clear, and the decoding efficiency is higher.
In one embodiment, decoding the data information according to the decoding table to obtain the original data includes:
dividing the data information into at least one character segment according to the code length information;
calculating an address of each of the at least one character segment;
and decoding each character segment according to the decoding table and the address of each character segment to obtain original data.
Because the decoding table is calculated from the code length information, the symbol can be determined during decoding directly from the address of a character segment, so the data information is decoded with improved efficiency.
According to a second aspect of the embodiments of the present disclosure, there is provided a decoding apparatus including:
the device comprises an acquisition module, a decoding module and a decoding module, wherein the acquisition module is used for acquiring a target bit stream, the target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm;
the code word module is used for calculating the code word of each symbol in at least one symbol according to the code length information;
the code table module is used for generating a decoding table according to the code word of each symbol;
and the decoding module is used for decoding the data information according to the decoding table to obtain the original data.
In one embodiment, the codeword module includes a grouping sub-module and a calculation sub-module;
the grouping submodule is used for grouping at least one symbol according to the code length of the at least one symbol and grouping the symbols with the same code length into a group;
and the calculation submodule is used for calculating the code word of each group of symbols respectively.
In one embodiment, the calculation sub-module is configured to calculate the codeword of the first symbol in each group of symbols according to the canonical Huffman algorithm, and to calculate the codewords of each group of symbols from the codeword of the first symbol in that group.
In one embodiment, the code table module includes an address submodule and a mapping submodule;
the address submodule is used for respectively determining the address of each symbol in each group of symbols;
and the mapping submodule is used for generating a decoding table according to the address of each symbol.
In one embodiment, the address submodule is configured to determine the address of the first symbol in the mth group of symbols, where m is an integer greater than 0, and to calculate the addresses of the symbols other than the first symbol according to the formula K_n = K_1 + (P_n - P_0), where K_n denotes the address of the nth symbol in the mth group of symbols, K_1 denotes the address of the first symbol, P_n denotes the codeword of the nth symbol, P_0 denotes the codeword of the first symbol, and n is an integer greater than 0.
In one embodiment, the decoding module includes a segmentation sub-module, a character segment sub-module, and an original data sub-module;
the segmentation submodule is used for dividing the data information into at least one character segment according to the code length information;
a character segment submodule for calculating an address of each of at least one character segment;
and the original data submodule is used for decoding each character segment according to the decoding table and the address of each character segment to obtain original data.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart of a decoding method provided by an embodiment of the present disclosure;
fig. 2 is a flowchart of a decoding method according to another embodiment of the disclosure;
FIG. 3 is a schematic diagram of a symbol grouping provided by an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a codeword provided by an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a decoding table provided in the embodiment of the present disclosure;
fig. 6 is a schematic diagram of segmentation information provided by an embodiment of the present disclosure;
fig. 7 is a block diagram of a decoding apparatus provided in an embodiment of the present disclosure;
fig. 8 is a block diagram of a decoding apparatus provided in an embodiment of the present disclosure;
fig. 9 is a block diagram of a decoding apparatus provided in an embodiment of the present disclosure;
fig. 10 is a block diagram of a decoding apparatus according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The embodiment of the present disclosure provides a decoding method, as shown in fig. 1, the decoding method includes the following steps:
101. a target bitstream is obtained.
The target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm. Optionally, the data information may be generated by encoding with the canonical Huffman algorithm.
It should be noted that a symbol is a unit of the original data. For example, if the original data is an article, the symbols include the Chinese characters, letters, numbers, punctuation, and so on in it. As another example, if the original data is a frame of image, the symbols include the color value components of each pixel, such as RGB (red, green, blue) components. Of course, these are merely examples.
102. A codeword for each of the at least one symbol is calculated based on the code length information.
A codeword is the binary string produced by encoding a symbol according to the Huffman algorithm.
In one embodiment, calculating a codeword for each of at least one symbol based on the code length information includes: grouping at least one symbol according to the code length of the at least one symbol, and grouping symbols with the same code length into a group; the code words for each set of symbols are calculated separately.
Grouping the symbols by code length and computing each group's codewords separately makes processing more efficient and calculation faster.
In one embodiment, separately computing the codeword for each set of symbols comprises: respectively calculating the code word of the first symbol in each group of symbols according to a normal Huffman algorithm; and respectively calculating the code word of each group of symbols according to the code word of the first symbol in each group of symbols.
Specifically, within each group of symbols of a canonical Huffman code, the codeword of the next symbol is obtained by adding 1 to the codeword of the previous symbol, and the codeword of the first symbol of the next group is obtained by shifting the codeword of the last symbol of the previous group left by one bit and then adding 1. Because the codewords within each group of a canonical Huffman code follow this fixed pattern, they can be computed quickly.
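The group-by-group codeword rule described above can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the symbol names and code lengths are hypothetical examples, and the shift amount is generalized to the code-length difference between groups (the text describes the one-bit case).

```python
def canonical_codewords(code_lengths):
    """code_lengths: dict symbol -> code length in bits.
    Returns dict symbol -> codeword string, per the rule in the text:
    within a group, next codeword = previous codeword + 1; the first
    codeword of the next group = previous group's last codeword shifted
    left (here by the length difference), plus 1."""
    # Group symbols by code length, shortest group first.
    by_len = {}
    for sym, length in code_lengths.items():
        by_len.setdefault(length, []).append(sym)
    codes = {}
    prev_code, prev_len = None, None
    for length in sorted(by_len):
        for sym in sorted(by_len[length]):
            if prev_code is None:
                code = 0                                   # first symbol overall: all zeros
            elif length == prev_len:
                code = prev_code + 1                       # same group: previous codeword + 1
            else:
                code = (prev_code << (length - prev_len)) + 1  # new group: shift, then add 1
            codes[sym] = format(code, '0{}b'.format(length))
            prev_code, prev_len = code, length
    return codes

# Hypothetical code lengths: A and B are 2 bits; C, D, E are 3 bits.
print(canonical_codewords({'A': 2, 'B': 2, 'C': 3, 'D': 3, 'E': 3}))
```

With these lengths the sketch yields codewords 00, 01, 011, 100, 101 for A through E, matching the worked example later in the description.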
103. And generating a decoding table according to the code word of each symbol.
The decoding table is used for indicating the corresponding relation between the symbols and the code words.
In one embodiment, generating a decoding table from the codeword for each symbol includes: respectively determining the address of each symbol in each group of symbols; and generating a decoding table according to the address of each symbol.
In this embodiment, the decoding table indicates the correspondence between addresses and symbols; each symbol corresponds to a unique address, so symbols can be looked up directly in the decoding table by address, which improves decoding efficiency.
In one embodiment, separately determining the address of each symbol in each set of symbols comprises: determining the address of the first symbol in the mth group of symbols, where m is an integer greater than 0; and, according to the formula K_n = K_1 + (P_n - P_0), calculating the addresses of the symbols other than the first symbol, where K_n denotes the address of the nth symbol in the mth group of symbols, K_1 denotes the address of the first symbol, P_n denotes the codeword of the nth symbol, P_0 denotes the codeword of the first symbol, and n is an integer greater than 0.
The address of each group of symbols is calculated according to a formula, the corresponding relation between the address and the code word is clear, and the decoding efficiency is higher.
104. And decoding the data information according to the decoding table to obtain original data.
In one embodiment, decoding the data information according to the decoding table to obtain the original data includes: dividing the data information into at least one character segment according to the code length information; calculating an address of each of the at least one character segment; and decoding each character segment according to the decoding table and the address of each character segment to obtain original data.
Because the decoding table is calculated from the code length information, the symbol can be determined during decoding directly from the address of a character segment, so the data information is decoded with improved efficiency.
According to the decoding method provided by the embodiment of the disclosure, for data information encoded with the Huffman algorithm, the codeword of each symbol is calculated directly and a single code table is generated; because code tables need not be designed separately for each code length, the resources consumed during decoding are reduced. Moreover, since the decoding table is computed from the target bit stream itself, the method adapts to the target bit stream and improves decoding efficiency.
Based on the decoding method described in the embodiment corresponding to fig. 1, another embodiment of the present disclosure provides a decoding method, as shown in fig. 2, including the following steps:
201. a target bitstream is obtained.
The target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to the canonical Huffman algorithm.
202. And grouping at least one symbol according to the code length information, and grouping the symbols with the same code length into a group.
As shown in Fig. 3, a schematic diagram of symbol grouping according to an embodiment of the present disclosure, the symbols A, B, C, D, and E are taken as an example: symbols A and B have a code length of 2, and symbols C, D, and E have a code length of 3.
And respectively calculating the codewords of each group of symbols according to the canonical Huffman algorithm.
First, the codeword of the first symbol in each group is calculated according to the canonical Huffman algorithm; then the codewords of the remaining symbols in each group are derived from the codeword of that group's first symbol. Within each group of a canonical Huffman code, the codeword of the next symbol is obtained by adding 1 to the codeword of the previous symbol, and the codeword of the first symbol of the next group is obtained by shifting the codeword of the last symbol of the previous group left by one bit and then adding 1. Because the codewords within each group follow this fixed pattern, they can be computed quickly.
As shown in Fig. 4, a schematic diagram of a codeword according to an embodiment of the present disclosure: the codeword of symbol A is 00; symbol B has the same code length as symbol A, so adding 1 to A's codeword gives B's codeword, 01. Symbol C is the first symbol of the group with code length 3, so shifting B's codeword left by one bit and adding 1 gives C's codeword, 011; adding 1 to C's codeword gives D's codeword, 100; similarly, E's codeword is 101.
204. The address of each symbol in each set of symbols is determined separately.
Specifically, the address of the first symbol in the mth group of symbols may be determined, where m is an integer greater than 0; then, according to the formula K_n = K_1 + (P_n - P_0), the addresses of the symbols other than the first symbol are calculated, where K_n denotes the address of the nth symbol in the mth group of symbols, K_1 denotes the address of the first symbol, P_n denotes the codeword of the nth symbol, P_0 denotes the codeword of the first symbol, and n is an integer greater than 0.
205. And generating a decoding table according to the address of each symbol.
The decoding table is used for indicating the corresponding relation between the symbols and the addresses.
As shown in Fig. 5, a schematic diagram of a decoding table according to an embodiment of the disclosure: the address of the first symbol A of the group with code length 2 is determined to be 0, and the address of the first symbol C of the group with code length 3 is the address of A plus the number of symbols in the first group, so the address of C is 2. Within each group, the address of each subsequent symbol is the address of the previous symbol plus 1.
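Steps 204 and 205 can be sketched together: compute each symbol's address with the formula K_n = K_1 + (P_n - P_0) and collect the address-to-symbol decoding table. This is a hypothetical illustration using the Fig. 4 codewords; the function name and data layout are assumptions, not part of the patent.

```python
def build_decoding_table(codewords):
    """codewords: dict symbol -> codeword string.
    Returns dict address -> symbol (the decoding table)."""
    # Group symbols by code length, shortest group first.
    groups = {}
    for sym, code in codewords.items():
        groups.setdefault(len(code), []).append((sym, code))
    table = {}
    base = 0                                   # K_1: address of the current group's first symbol
    for length in sorted(groups):
        group = sorted(groups[length], key=lambda sc: sc[1])
        first_code = int(group[0][1], 2)       # P_0: codeword of the group's first symbol
        for sym, code in group:
            # K_n = K_1 + (P_n - P_0)
            table[base + (int(code, 2) - first_code)] = sym
        base += len(group)                     # next group starts after this group's symbols
    return table

# Codewords from the Fig. 4 example.
print(build_decoding_table({'A': '00', 'B': '01', 'C': '011', 'D': '100', 'E': '101'}))
```

For the Fig. 4 codewords this reproduces the Fig. 5 addresses: A at 0, B at 1, C at 2, D at 3, E at 4.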
206. The data information is divided into at least one character segment according to the code length information.
In one embodiment, the codewords of all symbols may be padded with trailing zeros so that each codeword is extended to the longest code length, and the data information is then segmented according to these padded codewords.
As shown in Fig. 6, a schematic diagram of segmentation information provided by an embodiment of the present disclosure: the longest code length is 3, so all codewords are padded to 3 bits; codeword 00 of symbol A becomes 000, and codeword 01 of symbol B becomes 010.
It is determined which padded codewords the first 3 bits of the data information fall between, which in turn determines how many bits form the first character segment. For example, if the first 3 bits are 001, which lies between 000 and 010, the first two bits form a character segment, and the 3rd to 5th bits are examined next; if the first 3 bits are 011, equal to the codeword of symbol C, the first 3 bits form a character segment, and the 4th to 6th bits are examined next. In this way, the data information is divided into at least one character segment.
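The segmentation of step 206 can be sketched as follows: pad every codeword with trailing zeros to the longest code length, then at each position compare the 3-bit window against the padded codewords, and let the matched codeword's true length decide how many bits the segment consumes. This is a hedged illustration assuming the Fig. 4 codewords; the sample bitstream is invented for demonstration.

```python
def split_segments(bits, codewords):
    """bits: string of '0'/'1'; codewords: dict symbol -> codeword string.
    Returns the list of character segments (unpadded codewords)."""
    max_len = max(len(c) for c in codewords.values())
    # (padded codeword value, true code length), sorted by padded value,
    # e.g. '01' -> value of '010' with length 2.
    padded = sorted((int(c.ljust(max_len, '0'), 2), len(c)) for c in codewords.values())
    segments, i = [], 0
    while i < len(bits):
        window = int(bits[i:i + max_len].ljust(max_len, '0'), 2)
        # The largest padded codeword not exceeding the window determines
        # the true length of the current character segment.
        length = max((v, l) for v, l in padded if v <= window)[1]
        segments.append(bits[i:i + length])
        i += length
    return segments

codes = {'A': '00', 'B': '01', 'C': '011', 'D': '100', 'E': '101'}
# Illustrative bitstream: codewords of A (00), C (011), and D (100) concatenated.
print(split_segments('00011100', codes))
```

For the bits 00011100 this yields the segments 00, 011, and 100, i.e. the codewords of A, C, and D.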
207. An address of each of the at least one character segment is calculated.
Each character segment produced by the segmentation is the Huffman codeword of one symbol, and the address of each character segment in the at least one character segment is calculated. The calculation mirrors the address calculation of step 204: taking a target character segment (any one of the character segments) as an example, the address of the target character segment is (the codeword of the target character segment minus the codeword of the first symbol) plus the address of the first symbol, where the first symbol is the first symbol of the group to which the target character segment belongs.
208. And decoding each character segment according to the decoding table and the address of each character segment to obtain original data.
The decoding table records the corresponding relation between the symbols and the addresses, and according to the addresses of the character segments, the corresponding symbols can be found through the decoding table, so that the data information is decoded.
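Steps 207 and 208 can be sketched end to end: compute each segment's address from (segment codeword - group's first codeword) + address of the group's first symbol, then read the symbol out of the decoding table. A hypothetical sketch; the function name and sample data are assumptions, with codewords and addresses taken from the Fig. 4 and Fig. 5 examples.

```python
def decode_segments(segments, codewords, table):
    """segments: list of codeword strings; codewords: dict symbol -> codeword;
    table: dict address -> symbol. Returns the decoded string."""
    # Per code length: (first codeword value, address of the group's first symbol).
    group_info, base = {}, 0
    for length in sorted({len(c) for c in codewords.values()}):
        group = sorted(int(c, 2) for c in codewords.values() if len(c) == length)
        group_info[length] = (group[0], base)
        base += len(group)
    out = []
    for seg in segments:
        first_code, first_addr = group_info[len(seg)]
        addr = (int(seg, 2) - first_code) + first_addr   # step 207: segment address
        out.append(table[addr])                          # step 208: decoding-table lookup
    return ''.join(out)

codes = {'A': '00', 'B': '01', 'C': '011', 'D': '100', 'E': '101'}
table = {0: 'A', 1: 'B', 2: 'C', 3: 'D', 4: 'E'}
print(decode_segments(['00', '011', '100'], codes, table))
```

For the segments 00, 011, 100 this recovers the original symbols A, C, D.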
According to the decoding method provided by the embodiment of the disclosure, for data information encoded with the Huffman algorithm, the codeword of each symbol is calculated directly and a single code table is generated; because code tables need not be designed separately for each code length, the resources consumed during decoding are reduced. Moreover, since the decoding table is computed from the target bit stream itself, the method adapts to the target bit stream and improves decoding efficiency.
Based on the above embodiments corresponding to fig. 1 and fig. 2, an embodiment of the present disclosure provides a decoding apparatus for performing the decoding method described in the above embodiments corresponding to fig. 1 and fig. 2, and as shown in fig. 7, the decoding apparatus 70 includes: an obtaining module 701, a code word module 702, a code table module 703 and a decoding module 704;
an obtaining module 701, configured to obtain a target bit stream, where the target bit stream includes code length information and data information, the code length information includes a code length of at least one symbol, and the data information is generated by encoding original data according to a huffman algorithm;
a codeword module 702, configured to calculate a codeword for each symbol in at least one symbol according to the code length information;
a code table module 703, configured to generate a decoding table according to the code word of each symbol;
and the decoding module 704 is configured to decode the data information according to the decoding table to obtain original data.
In one embodiment, as shown in fig. 8, the codeword module 702 includes a grouping sub-module 7021 and a calculation sub-module 7022;
a grouping submodule 7021, configured to group at least one symbol according to a code length of the at least one symbol, and group symbols having the same code length into a group;
a calculating sub-module 7022 is used to calculate the code word of each group of symbols respectively.
In one embodiment, the calculating submodule 7022 is configured to calculate the codeword of the first symbol in each group of symbols according to the canonical Huffman algorithm, and to calculate the codewords of each group of symbols from the codeword of the first symbol in that group.
In one embodiment, as shown in fig. 9, code table module 703 includes an address submodule 7031 and a mapping submodule 7032;
an address sub-module 7031 for determining the address of each symbol in each group of symbols, respectively;
and a mapping submodule 7032 for generating a decoding table according to the address of each symbol.
In one embodiment, the address submodule 7031 is configured to determine the address of the first symbol in the mth group of symbols, where m is an integer greater than 0, and to calculate the addresses of the symbols other than the first symbol according to the formula K_n = K_1 + (P_n - P_0), where K_n denotes the address of the nth symbol in the mth group of symbols, K_1 denotes the address of the first symbol, P_n denotes the codeword of the nth symbol, P_0 denotes the codeword of the first symbol, and n is an integer greater than 0.
In one embodiment, as shown in fig. 10, decoding module 704 includes a segmentation sub-module 7041, a character segment sub-module 7042, and an original data sub-module 7043;
a segmentation sub-module 7041 configured to divide the data information into at least one character segment according to the code length information;
a character segment sub-module 7042 for calculating an address of each of the at least one character segment;
and the original data sub-module 7043 is configured to decode each character segment according to the decoding table and the address of each character segment to obtain original data.
The decoding device provided by the embodiment of the disclosure directly calculates the codeword of each symbol and generates a single code table for data information encoded with the Huffman algorithm; because code tables need not be designed separately for each code length, the resources consumed during decoding are reduced. Moreover, since the decoding table is computed from the target bit stream itself, the device adapts to the target bit stream and improves decoding efficiency.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (2)

1. A method of decoding, the method comprising:
acquiring a target bit stream, wherein the target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm;
calculating a codeword for each symbol of the at least one symbol according to the code length information;
generating a decoding table according to the code word of each symbol;
decoding the data information according to the decoding table to obtain the original data;
wherein calculating a codeword for each symbol of the at least one symbol according to the code length information comprises: grouping the at least one symbol according to the code length of the at least one symbol, and grouping the symbols with the same code length into a group;
respectively calculating the codeword of the first symbol in each group of symbols according to a canonical Huffman algorithm;
respectively calculating the codewords of the symbols in each group according to the codeword of the first symbol in that group;
wherein generating the decoding table according to the codeword of each symbol comprises: respectively determining an address of each symbol in each group of symbols;
generating the decoding table according to the address of each symbol;
wherein respectively determining the address of each symbol in each group of symbols comprises: determining the address of the first symbol in the mth group of symbols, wherein m is an integer greater than 0;
calculating the addresses of the symbols other than the first symbol according to the formula Kn = K1 + (Pn - P0), wherein Kn represents the address of the nth symbol of the mth group of symbols, K1 represents the address of the first symbol, Pn represents the codeword of the nth symbol, P0 represents the codeword of the first symbol, and n is an integer greater than 0;
the decoding the data information according to the decoding table to obtain the original data includes:
dividing the data information into at least one character segment according to the code length information;
calculating an address of each of the at least one character segment;
and decoding each character segment according to the decoding table and the address of each character segment to obtain the original data.
2. A decoding apparatus, characterized in that the decoding apparatus comprises:
an acquisition module, configured to acquire a target bit stream, wherein the target bit stream comprises code length information and data information, the code length information comprises the code length of at least one symbol, and the data information is generated by encoding original data according to a Huffman algorithm;
a code word module, configured to calculate a code word of each symbol in the at least one symbol according to the code length information;
the code table module is used for generating a decoding table according to the code word of each symbol;
the decoding module is used for decoding the data information according to the decoding table to obtain the original data;
the code word module comprises a grouping submodule and a calculating submodule;
the grouping submodule is used for grouping the at least one symbol according to the code length of the at least one symbol and grouping the symbols with the same code length into a group;
the calculation submodule is used for respectively calculating the codeword of the first symbol in each group of symbols according to a canonical Huffman algorithm, and for respectively calculating the codewords of the symbols in each group according to the codeword of the first symbol in that group;
the code table module comprises an address submodule and a mapping submodule;
the address submodule is used for respectively determining the address of each symbol in each group of symbols;
the mapping submodule is used for generating the decoding table according to the address of each symbol;
the address submodule is used for determining the address of the first symbol in the mth group of symbols, m being an integer greater than 0, and for calculating the addresses of the symbols other than the first symbol according to the formula Kn = K1 + (Pn - P0), wherein Kn represents the address of the nth symbol of the mth group of symbols, K1 represents the address of the first symbol, Pn represents the codeword of the nth symbol, P0 represents the codeword of the first symbol, and n is an integer greater than 0;
the decoding module comprises a segmentation submodule, a character segment submodule and an original data submodule;
the segmentation submodule is used for dividing the data information into at least one character segment according to the code length information;
the character segment submodule is used for calculating the address of each character segment in the at least one character segment;
and the original data submodule is used for decoding each character segment according to the decoding table and the address of each character segment to obtain the original data.
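The address scheme of the claims — one flat table for all code lengths, with each symbol's address computed as Kn = K1 + (Pn - P0) inside its equal-length group — can be sketched end to end as below. The claimed apparatus is a set of hardware modules; this Python sketch only illustrates the arithmetic, and all names, the table layout, and the example codewords are assumptions for illustration.

```python
# Illustrative sketch of the claimed address scheme (hypothetical names).
# `codewords` maps symbol -> (codeword, code length) and is assumed to be a
# canonical Huffman code, e.g. the code for lengths {A:1, B:2, C:3, D:3}.

def build_decoding_table(codewords):
    """One flat address table for all code lengths, per Kn = K1 + (Pn - P0)."""
    groups = {}  # code length -> [(symbol, codeword)] in canonical order
    for sym, (code, length) in sorted(codewords.items(),
                                      key=lambda kv: (kv[1][1], kv[1][0])):
        groups.setdefault(length, []).append((sym, code))

    table = {}   # address -> symbol
    base = {}    # length -> (K1: group base address, P0: first codeword, count)
    next_addr = 0
    for length, members in sorted(groups.items()):
        k1, p0 = next_addr, members[0][1]
        base[length] = (k1, p0, len(members))
        for sym, code in members:
            table[k1 + (code - p0)] = sym    # Kn = K1 + (Pn - P0)
        next_addr += len(members)
    return table, base

def decode(bits, codewords):
    """Split `bits` into codeword-length segments and look each one up."""
    table, base = build_decoding_table(codewords)
    out, i = [], 0
    while i < len(bits):
        for length, (k1, p0, count) in sorted(base.items()):
            code = int(bits[i:i + length], 2)
            # A prefix of this length is a valid codeword iff it lies in the
            # group's codeword range (a property of canonical codes).
            if p0 <= code < p0 + count:
                out.append(table[k1 + (code - p0)])
                i += length
                break
    return out

# Canonical codes for lengths {A:1, B:2, C:3, D:3}: 0, 10, 110, 111.
codes = {"A": (0, 1), "B": (2, 2), "C": (6, 3), "D": (7, 3)}
print(decode("010110111", codes))   # ['A', 'B', 'C', 'D']
```

Because the address of any symbol is pure arithmetic on its codeword, decoding needs no per-length tables and no tree traversal, which matches the resource argument made in the description.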
CN201710893279.4A 2017-09-28 2017-09-28 Decoding method and device Active CN107682018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710893279.4A CN107682018B (en) 2017-09-28 2017-09-28 Decoding method and device

Publications (2)

Publication Number Publication Date
CN107682018A CN107682018A (en) 2018-02-09
CN107682018B true CN107682018B (en) 2021-06-04

Family

ID=61137811

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710893279.4A Active CN107682018B (en) 2017-09-28 2017-09-28 Decoding method and device

Country Status (1)

Country Link
CN (1) CN107682018B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1268810A (en) * 1999-03-30 2000-10-04 松下电器产业株式会社 Decoding device
CN102438145A (en) * 2011-11-22 2012-05-02 广州中大电讯科技有限公司 Image lossless compression method on basis of Huffman code

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1889366A (en) * 2006-07-13 2007-01-03 浙江大学 Hafman decoding method
CN100525450C (en) * 2007-03-13 2009-08-05 北京中星微电子有限公司 Method and device for realizing Hoffman decodeng
CN101051846A (en) * 2007-05-09 2007-10-10 上海广电(集团)有限公司中央研究院 Quick Huffman decoding method based on context
CN100578943C (en) * 2007-05-22 2010-01-06 北京中星微电子有限公司 Optimized Huffman decoding method and device
CN101741392B (en) * 2008-11-27 2013-01-09 安凯(广州)微电子技术有限公司 Huffman decoding method for fast resolving code length
KR101118089B1 (en) * 2008-12-10 2012-03-09 서울대학교산학협력단 Apparatus and system for Variable Length Decoding
TW201143306A (en) * 2010-05-19 2011-12-01 Hon Hai Prec Ind Co Ltd Method for storing information of nodes in a huffman tree and method for decoding data using an array of the huffman tree
JP5656593B2 (en) * 2010-12-07 2015-01-21 インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation Apparatus and method for decoding encoded data
US9923576B2 (en) * 2014-09-16 2018-03-20 Cisco Technology, Inc. Decoding techniques using a programmable priority encoder

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant