CN114429200A - Standardized Huffman coding and decoding method and neural network computing chip - Google Patents

Standardized Huffman coding and decoding method and neural network computing chip

Info

Publication number
CN114429200A
CN114429200A (application CN202111639628.2A)
Authority
CN
China
Prior art keywords
coding
character
sequence
huffman
characters
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111639628.2A
Other languages
Chinese (zh)
Inventor
王秉睿 (Bingrui Wang)
支天 (Tian Zhi)
郭崎 (Qi Guo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Computing Technology of CAS
Original Assignee
Institute of Computing Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Computing Technology of CAS filed Critical Institute of Computing Technology of CAS
Priority to CN202111639628.2A
Publication of CN114429200A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00: Computing arrangements based on biological models
    • G06N3/02: Neural networks
    • G06N3/06: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/063: Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/10: Text processing
    • G06F40/12: Use of codes for handling textual entities
    • G06F40/126: Character encoding

Abstract

The invention provides a normalized Huffman coding method and a neural network computing chip. Run-length all-zero coding is performed on neural network data to obtain run-length compressed data, wherein run-length all-zero coding run-length codes only the zero characters in the neural network data. Huffman coding is then performed on the run-length compressed data, and add-1 and add-1-then-left-shift operations are applied to the leaf node codes at each level of the coding result from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, generating the canonical Huffman code of the coding result as the compression result of the neural network data.

Description

Standardized Huffman coding and decoding method and neural network computing chip
Technical Field
The invention relates to the field of neural network computation, and in particular to a normalized Huffman coding and decoding method and a neural network computing chip.
Background
Entropy coding is a branch of lossless compression: in accordance with the entropy principle, no information is lost during compression and decompression, where the entropy of information measures the uncertainty of a source. There are many entropy coding algorithms; two representative ones are Huffman coding and arithmetic coding, of which the former is described here.
Huffman coding, first proposed by Huffman in 1952, is a variable-length coding based on an optimal binary tree (also called a Huffman tree). Its main idea is to make each character (literal) of the sequence to be coded a leaf node of a binary tree, which guarantees the prefix coding property: no character's code is a prefix of any other character's code, which reduces the complexity of decoding.
In a concrete implementation, Huffman coding first gathers frequency statistics for each character, then repeatedly merges the two nodes of minimum frequency into a parent node whose frequency is their sum, recursing until a root node is produced (a code sketch of this construction follows the two characteristics below). Growing the binary tree is also the process of generating the Huffman code table: the code of each character corresponds to the path from the root node to its leaf node in the Huffman tree, appending 0 on a left branch and 1 on a right branch. FIG. 1 shows a Huffman tree for the sequence {AABBCCCDDDEEE}, and FIG. 2 shows the encoding result corresponding to this Huffman tree.
From the above construction process of the huffman tree, it can be seen that the coding has two characteristics:
(1) The higher a character's frequency, the shorter its code, so the weighted code length of the entire character sequence is minimized; Huffman coding is therefore regarded as the entropy coding with the optimal compression ratio (among character-wise prefix codes).
(2) The Huffman tree constructed from a given character sequence is not unique; for example, the parent node of characters A and B may have the same frequency as the leaf node of character D. Huffman coding therefore has many variants, and the hardware overhead of different implementations can differ greatly.
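For concreteness, the following minimal Python sketch (an illustration, not taken from the patent) builds a Huffman tree by the merge procedure just described and reports the resulting code length of each character; the 0/1 bit assignment is deliberately left out because, as noted in characteristic (2), the tree itself is not unique.

```python
import heapq
from collections import Counter

def huffman_code_lengths(sequence):
    """Repeatedly merge the two lowest-frequency nodes into a parent whose
    frequency is their sum, until a single root remains; return each
    character's depth in the tree, i.e. its code length."""
    freq = Counter(sequence)
    # Heap entries: (frequency, unique tie-breaker, {char: depth so far}).
    heap = [(f, i, {c: 0}) for i, (c, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {c: d + 1 for c, d in {**left, **right}.items()}
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

print(huffman_code_lengths("AABBCCCDDDEEE"))
# {'A': 3, 'B': 3, 'C': 2, 'D': 2, 'E': 2} -- one optimal set of lengths;
# ties during merging mean other, equally optimal trees exist
```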
Disclosure of Invention
Specifically, the invention provides a normalized Huffman coding method, which comprises the following steps:
step 1, obtaining Huffman tree data to be normalized;
step 2, applying add-1 and add-1-then-left-shift operations to the leaf node codes at each level of nodes in the Huffman tree data from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, and generating and storing the canonical coded data of the Huffman tree data.
In the normalized Huffman coding method, the Huffman tree data is a character sequence, and step 2 comprises:
step 21, arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
step 22, extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
step 23, extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
step 24, extracting the numeric value of the last codeword of each effective code length to form a fourth table;
step 25, subtracting the third table from the fourth table, element by element, to form a fifth table;
step 26, sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
step 27, collecting the canonical codes of all characters of the character sequence in order to obtain the canonical coded data.
The invention further provides a normalized Huffman decoding method, comprising the following steps:
step S1, acquiring the canonical coded data to be decoded; traversing the fourth table and the second table with a subscript accumulated from 0 until the value in the fourth table is greater than or equal to the leading len bits of the code stream, where len is the corresponding effective code length in the second table; recording the current codeword value (the leading len bits), the subscript value, the len value and the corresponding value in the fourth table;
step S2, accessing the fifth table, subtracting the value at that subscript in the fifth table from the current codeword value to obtain the character rank, and accessing the first table with that rank to obtain the corresponding character.
The invention also provides a neural network computing chip based on normalized Huffman coding, comprising: an input circuit, an operation circuit and a storage circuit, wherein the operation circuit comprises a main circuit and a slave circuit;
an input circuit for acquiring neural network data;
the operation circuit, in response to a quantization instruction, performs run-length all-zero coding on the neural network data to obtain run-length compressed data, wherein the run-length all-zero coding run-length codes only the zero characters in the neural network data; it then performs Huffman coding on the run-length compressed data and applies add-1 and add-1-then-left-shift operations to the leaf node codes at each level of the coding result from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, generating the canonical Huffman code of the coding result as the compression result of the neural network data;
the storage circuit is used for storing the compression result.
Wherein the coding result is a character sequence, the operation circuit is further used for arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
extracting the numeric value of the last codeword of each effective code length to form a fourth table;
subtracting the third table from the fourth table, element by element, to form a fifth table;
sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
and collecting the canonical codes of all characters of the character sequence in order to obtain the canonical Huffman code.
The invention also provides a storage medium storing a program for executing any one of the above normalized Huffman coding methods.
According to the scheme, the invention has the advantages that:
1. aiming at the sparsity of quantized neural network data, run-length coding is improved into run-length all-zero coding, which compresses neural network data efficiently and losslessly (a code sketch follows this list);
2. the run-length all-zero coding of the invention includes second-order character replacement, which further improves compression efficiency, reduces the number of 0s in the data, and leaves more compression room for the subsequent Huffman coding;
3. the Huffman tree is reformed from top to bottom, dispensing with storage of the complete Huffman tree structure and significantly reducing the complexity of table lookup operations.
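As an illustration of advantage 1, the sketch below run-length codes only the zero runs, passing non-zero characters through unchanged. The patent does not fix an output format (and its second-order character replacement is not detailed here), so the (0, run length) pairing used in this Python sketch is an assumption for illustration only.

```python
def run_length_all_zero_encode(data):
    """Run-length code only the zeros: each maximal run of zeros becomes the
    pair (0, run_length); non-zero characters pass through unchanged.
    NOTE: this pairing format is an illustrative assumption, not the patent's."""
    out, i = [], 0
    while i < len(data):
        if data[i] == 0:
            j = i
            while j < len(data) and data[j] == 0:
                j += 1
            out += [0, j - i]          # zero marker followed by the run length
            i = j
        else:
            out.append(data[i])
            i += 1
    return out

print(run_length_all_zero_encode([5, 0, 0, 0, 7, 0, 9]))
# [5, 0, 3, 7, 0, 1, 9] -- zero runs shrink; non-zero data is untouched
```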
Drawings
FIG. 1 is an exemplary diagram of a Huffman tree;
FIG. 2 is a table of the encoding result corresponding to the Huffman tree of FIG. 1;
FIGS. 3 and 4 are exemplary diagrams of normalized Huffman trees.
Detailed Description
The invention provides a normalized Huffman coding method, which comprises the following steps:
step 1, obtaining Huffman tree data to be normalized;
step 2, applying add-1 and add-1-then-left-shift operations to the leaf node codes at each level of nodes in the Huffman tree data from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, and generating and storing the canonical coded data of the Huffman tree data.
In the normalized Huffman coding method, the Huffman tree data is a character sequence, and step 2 comprises:
step 21, arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
step 22, extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
step 23, extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
step 24, extracting the numeric value of the last codeword of each effective code length to form a fourth table;
step 25, subtracting the third table from the fourth table, element by element, to form a fifth table;
step 26, sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
step 27, collecting the canonical codes of all characters of the character sequence in order to obtain the canonical coded data.
The normalized Huffman decoding method comprises the following steps:
step S1, acquiring the canonical coded data to be decoded; traversing the fourth table and the second table with a subscript accumulated from 0 until the value in the fourth table is greater than or equal to the leading len bits of the code stream, where len is the corresponding effective code length in the second table; recording the current codeword value (the leading len bits), the subscript value, the len value and the corresponding value in the fourth table;
step S2, accessing the fifth table, subtracting the value at that subscript in the fifth table from the current codeword value to obtain the character rank, and accessing the first table with that rank to obtain the corresponding character.
In order to make the aforementioned features and effects of the present invention more comprehensible, embodiments accompanied with figures are described in detail below.
Normalized Huffman coding. The Huffman code corresponding to a given data distribution is not unique, because nodes of equal frequency may arise while the Huffman tree is being constructed. To apply Huffman coding well, a fixed and efficient method of constructing the Huffman tree is needed; the unique Huffman code generated by such a method is called the normalized Huffman code.
Huffman tree reformation. Specifically, the invention adopts HSF coding as the normalized Huffman coding. Its main idea is to reform the Huffman tree from top to bottom, preferentially moving the leaf nodes at each level to the left side of the binary tree. Without changing the frequency distribution, every code can then be obtained with add-1 and add-1-then-left-shift operations, so complex binary tree traversal or full table lookup is replaced by comparison and addition, greatly reducing the storage and computation cost of encoding and decoding. As shown in FIG. 3, FIG. 4 and Table 1, given the character sequence {u1, u2, u3, u4, u5} and any two Huffman trees for it, their normalized forms can be found using this reformation method.
Table 1 normalized Huffman coding example (reconstructed for example (a) from the worked values below; the original image, which also showed a second example (b), could not be recovered)

character  rank  code length  normalized code
u1         0     2            00
u4         1     2            01
u5         2     2            10
u2         3     3            110
u3         4     3            111
Redefining the code table. It can be seen that each Huffman tree corresponds to a unique normalized form, and the normalized codewords obey a hardware-friendly rule: within the same code length, each codeword is the previous one plus 1; when the code length increases, each codeword is the previous one plus 1 and then shifted left by one bit per unit of length difference. This rule can be used to redefine the code table, dispensing entirely with storage of the whole Huffman tree structure and significantly reducing the complexity of table lookup.
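The rule can be written down directly. The following minimal Python sketch (an illustration, not the patent's literal procedure) assigns normalized codewords from (character, code length) pairs listed in frequency-sorted order, and reproduces the codeword values of example (a):

```python
def canonical_codes(chars_with_lengths):
    """Assign normalized codewords by the rule above: within the same code
    length the next codeword is the previous one + 1; when the length grows,
    it is (previous + 1) shifted left by the length difference."""
    codes = {}
    code, prev_len = -1, chars_with_lengths[0][1]
    for char, length in chars_with_lengths:   # sorted by descending frequency
        code = (code + 1) << (length - prev_len)
        prev_len = length
        codes[char] = format(code, f"0{length}b")
    return codes

# Example (a): lengths 2,2,2,3,3 for u1,u4,u5,u2,u3 (first-table order)
print(canonical_codes([("u1", 2), ("u4", 2), ("u5", 2), ("u2", 3), ("u3", 3)]))
# {'u1': '00', 'u4': '01', 'u5': '10', 'u2': '110', 'u3': '111'}
```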
Taking the above code sequence (a) as an example, the encoding and decoding processes first require the following code tables (a table-construction sketch follows this list):
(1) CharTable (first table): all characters to be coded, in descending order of occurrence frequency, i.e., {u1, u4, u5, u2, u3} in example (a); shared by encoding and decoding.
(2) LenTable (second table): all effective code lengths in ascending order. The HSF code in example (a) uses only 2-bit and 3-bit codewords, so LenTable is {2, 3}; shared by encoding and decoding.
(3) RangeTable (third table): for each effective code length, the rank in CharTable of the character carrying the last codeword of that length. In example (a), the last 2-bit and 3-bit codewords correspond to u5 and u3 respectively, so RangeTable is {2, 4}; used only in the encoding stage.
(4) LimitTable (fourth table): the numeric value of the last codeword of each effective code length. In example (a), the last 2-bit and 3-bit codewords are 10 and 111, so LimitTable is {2, 7}; used only in the decoding stage.
(5) BaseTable (fifth table): LimitTable minus RangeTable, element by element. In example (a), BaseTable is {0, 3}; shared by encoding and decoding.
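Following these definitions, the five tables of example (a) can be derived mechanically from the frequency-sorted characters and their code lengths. A minimal Python sketch under the example's values (an illustration, not the patent's literal procedure):

```python
def build_tables(chars_by_freq, code_len):
    """Derive LenTable, RangeTable, LimitTable and BaseTable from characters
    sorted by descending frequency and their code lengths, per the
    definitions in the text."""
    len_table, range_table, limit_table = [], [], []
    code, prev_len = -1, code_len[chars_by_freq[0]]
    for rank, c in enumerate(chars_by_freq):
        L = code_len[c]
        code = (code + 1) << (L - prev_len)   # canonical +1 / +1-then-shift rule
        prev_len = L
        if not len_table or len_table[-1] != L:   # a new effective code length
            len_table.append(L)
            range_table.append(rank)
            limit_table.append(code)
        else:                                     # update the "last codeword" entries
            range_table[-1] = rank
            limit_table[-1] = code
    base_table = [l - r for l, r in zip(limit_table, range_table)]
    return len_table, range_table, limit_table, base_table

char_table = ["u1", "u4", "u5", "u2", "u3"]           # CharTable (first table)
lens = {"u1": 2, "u4": 2, "u5": 2, "u2": 3, "u3": 3}
print(build_tables(char_table, lens))
# ([2, 3], [2, 4], [2, 7], [0, 3]) -- LenTable, RangeTable, LimitTable, BaseTable
```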
On this basis, the flow that generates the HSF code of character u4 from the code tables of example (a) can be divided into the following three steps (a code sketch follows the steps):
Lookup: access CharTable to obtain the rank of u4, i.e., rank(u4) = 1;
Compare: access RangeTable to obtain the subscript of the first entry greater than or equal to rank(u4); since rank(u4) ≤ 2, index(u4) = 0;
Add: access BaseTable and LenTable to obtain the base value base and code length len at subscript index; the sum of base and rank is the codeword value, which expressed in len bits is the HSF code of u4. Since base(u4) = 0 and len(u4) = 2, code(u4) = 0 + 1 = 1, so the final encoding result is 01.
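The three encoding steps map directly onto a lookup, a comparison and an addition. A minimal Python sketch of the flow, using the example (a) tables (the function name and the use of bisect are illustrative assumptions):

```python
import bisect

def hsf_encode_char(char, char_table, len_table, range_table, base_table):
    """Encode one character: look up its rank in CharTable, compare against
    RangeTable to find the first entry >= rank, then add base + rank."""
    rank = char_table.index(char)                  # Lookup
    index = bisect.bisect_left(range_table, rank)  # Compare: first entry >= rank
    code = base_table[index] + rank                # Add: codeword value
    return format(code, f"0{len_table[index]}b")   # express in len bits

tables = (["u1", "u4", "u5", "u2", "u3"], [2, 3], [2, 4], [0, 3])
print(hsf_encode_char("u4", *tables))   # '01', matching the worked example
```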
Correspondingly, the decoding flow that parses the first character u4 out of an HSF code stream such as 01xxx... is as follows (a code sketch follows the steps):
Compare: access LimitTable and LenTable, traversing a subscript accumulated from 0 until limit is greater than or equal to the leading len bits of the code stream. Since the limit at subscript 0 of LimitTable (2) is greater than or equal to the leading 2 bits of the stream (01, value 1), we obtain index = 0, limit = 2, len = 2 and code = 1;
Sub: access BaseTable and subtract the base value at that subscript from the code value to obtain the rank, i.e., rank = 1 - 0 = 1;
Lookup: access CharTable to obtain the character at that rank; the final decoding result is the entry of rank 1 in CharTable, namely character u4.
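Likewise, a minimal Python sketch of the three decoding steps, again with the example (a) tables (names are illustrative assumptions); it returns the decoded character and the number of bits consumed, so it can be iterated over a longer stream:

```python
def hsf_decode_first(bits, char_table, len_table, limit_table, base_table):
    """Decode the first character of a bit string: compare the leading len
    bits against LimitTable, subtract the base to get the rank, look it up."""
    for index, length in enumerate(len_table):     # Compare
        code = int(bits[:length], 2)               # leading `length` bits
        if limit_table[index] >= code:
            rank = code - base_table[index]        # Sub
            return char_table[rank], length        # Lookup (+ bits consumed)
    raise ValueError("invalid HSF code stream")

char_table, len_table = ["u1", "u4", "u5", "u2", "u3"], [2, 3]
limit_table, base_table = [2, 7], [0, 3]
print(hsf_decode_first("01101", char_table, len_table, limit_table, base_table))
# ('u4', 2), matching the worked example
```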
This introduction to the normalized Huffman coding algorithm shows that HSF coding simplifies the storage and operation structure of Huffman coding, while the encoding and decoding flows naturally decompose into three-stage pipelines, providing an efficient and reasonable implementation scheme for hardware deployment.
Normalized Huffman coding is thus provided: through Huffman tree reformation and code table redefinition, an entropy coding with greatly simplified storage and computation is designed to further compress the characters produced by run-length all-zero coding.
The invention also provides a neural network computing chip based on normalized Huffman coding, comprising: an input circuit, an operation circuit and a storage circuit, wherein the operation circuit comprises a main circuit and a slave circuit;
an input circuit for acquiring neural network data;
the operation circuit, in response to a quantization instruction, performs run-length all-zero coding on the neural network data to obtain run-length compressed data, wherein the run-length all-zero coding run-length codes only the zero characters in the neural network data; it then performs Huffman coding on the run-length compressed data and applies add-1 and add-1-then-left-shift operations to the leaf node codes at each level of the coding result from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, generating the canonical Huffman code of the coding result as the compression result of the neural network data;
the storage circuit is used for storing the compression result.
Wherein the coding result is a character sequence, the operation circuit is further used for arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
extracting the numeric value of the last codeword of each effective code length to form a fourth table;
subtracting the third table from the fourth table, element by element, to form a fifth table;
sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
and collecting the canonical codes of all characters of the character sequence in order to obtain the canonical Huffman code.

Claims (6)

1. A normalized Huffman coding method, comprising:
step 1, obtaining Huffman tree data to be normalized;
step 2, applying add-1 and add-1-then-left-shift operations to the leaf node codes at each level of nodes in the Huffman tree data from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, and generating and storing the canonical coded data of the Huffman tree data.
2. The normalized Huffman coding method of claim 1, wherein the Huffman tree data is a character sequence, and step 2 comprises:
step 21, arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
step 22, extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
step 23, extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
step 24, extracting the numeric value of the last codeword of each effective code length to form a fourth table;
step 25, subtracting the third table from the fourth table, element by element, to form a fifth table;
step 26, sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
step 27, collecting the canonical codes of all characters of the character sequence in order to obtain the canonical coded data.
3. A normalized Huffman decoding method according to claim 2, comprising:
step S1, acquiring the canonical coded data to be decoded; traversing the fourth table and the second table with a subscript accumulated from 0 until the value in the fourth table is greater than or equal to the leading len bits of the code stream, where len is the corresponding effective code length in the second table; recording the current codeword value (the leading len bits), the subscript value, the len value and the corresponding value in the fourth table;
step S2, accessing the fifth table, subtracting the value at that subscript in the fifth table from the current codeword value to obtain the character rank, and accessing the first table with that rank to obtain the corresponding character.
4. A neural network computing chip based on normalized Huffman coding, characterized by comprising: an input circuit, an operation circuit and a storage circuit, wherein the operation circuit comprises a main circuit and a slave circuit;
the input circuit is used for acquiring neural network data;
the operation circuit, in response to a quantization instruction, performs run-length all-zero coding on the neural network data to obtain run-length compressed data, wherein the run-length all-zero coding run-length codes only the zero characters in the neural network data; it then performs Huffman coding on the run-length compressed data and applies add-1 and add-1-then-left-shift operations to the leaf node codes at each level of the coding result from top to bottom, so that the leaf nodes at each level move to the left side of the tree structure, generating the canonical Huffman code of the coding result as the compression result of the neural network data;
the storage circuit is used for storing the compression result.
5. The neural network computing chip based on normalized Huffman coding of claim 4, wherein the coding result is a character sequence, and the operation circuit is used for arranging all characters to be coded in the character sequence in descending order of occurrence frequency to form a first table;
extracting all effective code lengths involved in the character sequence and arranging them in ascending order to form a second table;
extracting, for each effective code length, the subscript in the first table of the character corresponding to the last codeword of that length, to form a third table;
extracting the numeric value of the last codeword of each effective code length to form a fourth table;
subtracting the third table from the fourth table, element by element, to form a fifth table;
sequentially taking each character of the character sequence as the current character: obtaining the rank of the current character by accessing the first table; accessing the third table to obtain the subscript of the first entry greater than or equal to the rank; accessing the fifth table and the second table to obtain the base value and code length at that subscript; adding the base value and the rank to give the codeword value of the current character, and expressing that value in the given code length as the canonical code of the current character;
and collecting the canonical codes of all characters of the character sequence in order to obtain the canonical Huffman code.
6. A storage medium, storing a program for executing the normalized Huffman coding method of any one of claims 1 to 3.
CN202111639628.2A 2021-12-29 2021-12-29 Standardized Huffman coding and decoding method and neural network computing chip Pending CN114429200A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111639628.2A CN114429200A (en) 2021-12-29 2021-12-29 Standardized Huffman coding and decoding method and neural network computing chip


Publications (1)

Publication Number Publication Date
CN114429200A (en) 2022-05-03

Family

ID=81310817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111639628.2A Pending CN114429200A (en) 2021-12-29 2021-12-29 Standardized Huffman coding and decoding method and neural network computing chip

Country Status (1)

Country Link
CN (1) CN114429200A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116489369A (en) * 2023-06-26 2023-07-25 深圳市美力高集团有限公司 Driving digital video compression processing method
CN116489369B (en) * 2023-06-26 2023-09-08 深圳市美力高集团有限公司 Driving digital video compression processing method


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination