CN113271109A - Iterative cycle data storage method and system in LDPC decoding process - Google Patents

Iterative cycle data storage method and system in LDPC decoding process

Info

Publication number
CN113271109A
Authority
CN
China
Prior art keywords
node confidence
variable node
vector
coefficient vector
confidence coefficient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110418530.8A
Other languages
Chinese (zh)
Inventor
刘扬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Mengxin Technology Co ltd
Original Assignee
Wuhan Mengxin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Mengxin Technology Co ltd filed Critical Wuhan Mengxin Technology Co ltd
Priority to CN202110418530.8A priority Critical patent/CN113271109A/en
Publication of CN113271109A publication Critical patent/CN113271109A/en
Pending legal-status Critical Current

Classifications

    • H  ELECTRICITY
    • H03  ELECTRONIC CIRCUITRY
    • H03M  CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M13/00  Coding, decoding or code conversion, for error detection or error correction; Coding theory basic assumptions; Coding bounds; Error probability evaluation methods; Channel models; Simulation or testing of codes
    • H03M13/03  Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words
    • H03M13/05  Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words using block codes, i.e. a predetermined number of check bits joined to a predetermined number of information bits
    • H03M13/11  Error detection or forward error correction by redundancy in data representation, i.e. code words containing more digits than the source words using block codes, i.e. a predetermined number of check bits joined to a predetermined number of information bits using multiple parity bits
    • H03M13/1102  Codes on graphs and decoding on graphs, e.g. low-density parity check [LDPC] codes
    • H03M13/1105  Decoding

Landscapes

  • Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Error Detection And Correction (AREA)

Abstract

The invention relates to a method and a system for storing iteration cycle data in an LDPC decoding process. In the method, after an odd number of iterations, V2C is stored in a cache space in a preset storage format; the V2C stored in the cache space is read and C2V is updated from it; V2C and the hard decision result are then updated according to the updated C2V and stored back into the cache space in reverse order. After an even number of iterations, V2C is stored in the cache space in the format opposite to the preset storage format; the V2C in the cache space is read, its sequence numbers are processed in reverse order, and C2V is updated; V2C and the hard decision result are then updated according to the updated C2V and stored back into the cache space in forward order. The invention enables the updating processes of C2V and V2C to run in parallel, greatly improving the efficiency of hardware decoding; at the same time, the parallelized architecture allows C2V to dispense with a large block of cache space.

Description

Iterative cycle data storage method and system in LDPC decoding process
Technical Field
The invention relates to the field of satellite communication, in particular to a method and a system for storing iteration cycle data in an LDPC decoding process.
Background
The B-CNAV1 navigation message is broadcast in the B1C signal; its subframe 2 is encoded with a 64-ary LDPC(200,100) code, each codeword symbol of which consists of 6 bits defined over a finite field generated by a primitive polynomial. The information length k is 100 codeword symbols, i.e. 600 bits. The check matrix is a 100 × 200 sparse matrix; the first 100 × 100 part corresponds to information symbols and the last 100 × 100 part corresponds to check symbols. The B-CNAV2 navigation message is broadcast in the B2a signal and is encoded with a 64-ary LDPC(96,48) code. The information length k is 48 codeword symbols, i.e. 288 bits. The check matrix is a 48 × 96 sparse matrix; the first 48 × 48 part corresponds to information symbols and the last 48 × 48 part corresponds to check symbols. The B-CNAV3 navigation message is broadcast in the B2b signal and is encoded with a 64-ary LDPC(162,81) code. The information length k is 81 codeword symbols, i.e. 486 bits. The check matrix is an 81 × 162 sparse matrix; the first 81 × 81 part corresponds to information symbols and the last 81 × 81 part corresponds to check symbols.
All three LDPC codes, despite their different lengths, have a code rate of 0.5, so they can share the same decoding module in an implementation; only different parameters, such as the code length and the check matrix H, need to be configured.
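For reference, the three configurations above fit into a single parameter table; the sketch below is illustrative only, and the field names are assumptions, with only the numeric values coming from the text.

```python
# Illustrative parameter table for the three rate-1/2 64-ary LDPC codes described above.
# Field names are assumptions; the numbers come from the text.
BITS_PER_SYMBOL = 6   # r, bits per GF(64) codeword symbol

LDPC_CONFIGS = {
    "B-CNAV1 (B1C)": {"n": 200, "k": 100, "info_bits": 600},
    "B-CNAV2 (B2a)": {"n": 96,  "k": 48,  "info_bits": 288},
    "B-CNAV3 (B2b)": {"n": 162, "k": 81,  "info_bits": 486},
}

for name, cfg in LDPC_CONFIGS.items():
    assert cfg["info_bits"] == cfg["k"] * BITS_PER_SYMBOL
    rate = cfg["k"] / cfg["n"]
    rows, cols = cfg["n"] - cfg["k"], cfg["n"]
    print(f"{name}: LDPC({cfg['n']},{cfg['k']}), rate {rate}, H is {rows} x {cols}")
```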
The extended min-sum (EMS) algorithm is a commonly used iterative belief-propagation decoding algorithm for estimating the transmitted codeword c. The codeword c = (c_0, c_1, ..., c_{n-1}) produced by the non-binary LDPC encoder is transmitted over the channel, and the receiving end obtains the received sequence y = (y_0, y_1, ..., y_{n-1}), where y_j = (y_{j,0}, y_{j,1}, ..., y_{j,r-1}) is the received information corresponding to codeword symbol c_j, with c_j ∈ GF(q), q = 2^r and 0 ≤ j < n. For a 64-ary LDPC code, r = 6 and q = 64.
The traditional LDPC algorithm requires many serial iterations of the check node confidence vectors C2V, the variable node confidence vectors V2C and other quantities to obtain the final decoding result, which consumes a large amount of time. Meanwhile, because of the serial operation structure, the intermediate results of the check node confidence vector C2V and the variable node confidence vector V2C must be cached separately, occupying a large amount of hardware resources.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method and a system for storing iteration cycle data in an LDPC decoding process, so that the updating processes of a check node confidence coefficient vector C2V and a variable node confidence coefficient vector V2C can be operated in parallel, and the hardware decoding efficiency is greatly improved; meanwhile, the parallelized architecture may allow the check node confidence vector C2V to eliminate the need for large blocks of cache space, thereby reducing chip area.
The technical scheme for solving the technical problems is as follows: an iteration cycle data storage method in the LDPC decoding process is characterized in that a variable node confidence coefficient vector V2C, code word calculation and one round of updating of a check node confidence coefficient vector C2V in the LDPC decoding process are called as one iteration; the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage method comprises the following steps,
s1, after odd iterations, the variable node confidence coefficient vector V2C is stored in the variable node confidence coefficient vector V2C cache space according to a preset storage format, and the next iteration is performed after the odd iterations;
s2, based on the S1, reading the variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the read variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space;
s3, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into a variable node confidence coefficient vector V2C cache space in an inverted sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is opposite to the preset storage format;
s4, after even iterations, the variable node confidence coefficient vector V2C is stored in the variable node confidence coefficient vector V2C cache space according to a storage format opposite to the preset storage format, and the next iteration is performed after the even iterations;
s5, based on the S4, reading the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space, performing reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and S6, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into the variable node confidence coefficient vector V2C cache space in a positive sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is the same as the preset storage format.
Based on the method for storing the iteration cycle data in the LDPC decoding process, the invention also provides a system for storing the iteration cycle data in the LDPC decoding process.
An iteration cycle data storage system in the LDPC decoding process refers to one round of updating of a variable node confidence coefficient vector V2C, code word calculation and a check node confidence coefficient vector C2V in the LDPC decoding process as one iteration; the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage system comprises the following modules,
the variable node confidence coefficient vector V2C cache space module is used for storing the variable node confidence coefficient vector V2C according to a preset storage format after odd iterations;
a check node confidence vector C2V updating module for reading the variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module during an even number of iterations, and updating the check node confidence vector C2V according to the read variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module;
a variable node confidence coefficient vector V2C updating module, configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in an inverted order, so as to complete a single iteration, where at this time, a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is opposite to the preset storage format;
the variable node confidence coefficient vector V2C cache space module is further used for storing the variable node confidence coefficient vector V2C according to a storage format opposite to the preset storage format after even iterations;
a check node confidence coefficient vector C2V updating module, which is further configured to, in an odd number of iterations, read the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module, perform reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space module, and update the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and the variable node confidence coefficient vector V2C updating module is further configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in a positive order to complete a single iteration, where a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is the same as the preset storage format.
Based on the method for storing the iteration cycle data in the LDPC decoding process, the invention also provides a computer storage medium.
A computer storage medium comprising a memory and a computer program stored in the memory, the computer program when executed by a processor implementing the iterative cycle data storage method in an LDPC decoding process as described above.
The invention has the beneficial effects that: aiming at the characteristic that the row repetition of an LDPC check matrix in a GNSS system is 2, the data storage is optimized in a targeted manner in the LDPC decoding process, so that the updating processes of a check node confidence coefficient vector C2V and a variable node confidence coefficient vector V2C can run in parallel, the algorithm running time is reduced, and the hardware decoding efficiency is greatly improved; meanwhile, the parallelized architecture may allow the check node confidence vector C2V to eliminate the need for large blocks of cache space, thereby reducing chip area.
Drawings
FIG. 1 is an overall flow diagram of an EMS decoding algorithm;
FIG. 2 is a flow chart of an iterative cycle data storage method in an LDPC decoding process according to the present invention;
FIG. 3 is a block diagram of an iterative cycle data storage system in an LDPC decoding process according to the present invention;
fig. 4 is a hardware implementation architecture diagram of an iterative cycle data storage system in an LDPC decoding process according to the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
In the EMS decoding algorithm, the check matrix H gives the connection relation between the check nodes and the variable nodes of the LDPC code, and confidence information can be transmitted between check nodes and variable nodes that are connected to each other. Each row in the check matrix H corresponds to one check node CN, and each column corresponds to one variable node VN. Let the check matrix H have m rows and n columns, with row index i and column index j, and define the following two index sets:
M_j = {i : 0 ≤ i < m, h_{i,j} ≠ 0}, 0 ≤ j < n;  N_i = {j : 0 ≤ j < n, h_{i,j} ≠ 0}, 0 ≤ i < m;
where h_{i,j} is the element in row i and column j of the check matrix H.
If h_{i,j} ≠ 0, check node CN_i and variable node VN_j are connected and may exchange confidence information with each other. The confidence vector passed from variable node VN_j to a connected check node CN_i (i ∈ M_j) is denoted V2C_{j,i} and can be used to calculate the checksum of check node CN_i; the confidence vector passed from check node CN_i to a connected variable node VN_j (j ∈ N_i) is denoted C2V_{i,j} and can be used to estimate the symbol value of variable node VN_j. Iteratively updating V2C_{j,i} and C2V_{i,j} with the belief-propagation decoding algorithm corrects the received sequence y, thereby realizing the estimation of the transmitted codeword c.
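As a small illustration of these definitions, the sketch below builds the index sets M_j and N_i from a check matrix H stored as a dense m x n array of GF(q) elements (0 meaning no connection); the helper name and toy matrix are illustrative, not taken from the patent.

```python
# Sketch: building the index sets M_j and N_i defined above from a check matrix H,
# stored here as a dense m x n list of GF(q) elements with 0 meaning "no edge".
def build_index_sets(H):
    m, n = len(H), len(H[0])
    M = [[i for i in range(m) if H[i][j] != 0] for j in range(n)]  # M_j: check nodes connected to VN_j
    N = [[j for j in range(n) if H[i][j] != 0] for i in range(m)]  # N_i: variable nodes connected to CN_i
    return M, N

# Toy 2 x 4 example (not one of the GNSS matrices):
H_toy = [[1, 3, 0, 5],
         [0, 2, 7, 1]]
M, N = build_index_sets(H_toy)
print(M)  # [[0], [0, 1], [1], [0, 1]]
print(N)  # [[0, 1, 3], [1, 2, 3]]
```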
Fig. 1 is an overall flowchart of the EMS decoding algorithm. The EMS decoding algorithm flow is as follows:
Initialization: set the maximum number of iterations itr_max and the current iteration count itr = 0. For each codeword symbol, compute a confidence vector L_j (0 ≤ j < n) from the corresponding received vector y_j, and use L_j to initialize all V2C_{j,i} vectors of variable node VN_j. All q finite field elements x ∈ GF(q), q = 2^r, together with their corresponding log-likelihood ratios LLR(x), form the confidence vector L_j (the formula appears only as an image in the original). In that formula, x̂_j denotes the finite field element that maximizes the probability P(y_j | x) in GF(q), i.e. the element obtained by making a bit-by-bit hard decision directly on the received symbol y_j. The bit sequences corresponding to the finite field elements x and x̂_j are x = (x_0, x_1, ..., x_{r-1}) and x̂_j = (x̂_{j,0}, x̂_{j,1}, ..., x̂_{j,r-1}), and Δ_{j,b} = x_b XOR x̂_{j,b}, where XOR is the exclusive-or operation: if a bit of x and x̂_j is the same then Δ_{j,b} = 0, otherwise Δ_{j,b} = 1.
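To make the initialization concrete, the sketch below performs the bit-by-bit hard decision on one received symbol y_j and computes Δ_{j,b} for every candidate field element x. Since the LLR expression itself appears only as an image in the original, the magnitude-sum used here is an assumption (a common EMS-style choice), not the patent's formula.

```python
# Minimal sketch of the initialization described above, assuming a soft-bit convention in
# which a negative value hard-decides to bit 1. The LLR expression is an assumption:
# summing |soft bit| over the bits where x differs from the hard decision x_hat.
R = 6           # bits per symbol, r
Q = 1 << R      # field order q = 2^r

def init_confidence_vector(y_soft):
    """y_soft: list of r signed soft values for one received symbol y_j."""
    hard = [1 if s < 0 else 0 for s in y_soft]               # bit-by-bit hard decision
    x_hat = int("".join(map(str, hard)), 2)                  # hard-decision field element
    llr = []
    for x in range(Q):
        bits = [(x >> (R - 1 - b)) & 1 for b in range(R)]
        delta = [bits[b] ^ hard[b] for b in range(R)]        # Delta_{j,b} = x_b XOR x_hat_{j,b}
        llr.append(sum(abs(y_soft[b]) for b in range(R) if delta[b]))   # placeholder LLR(x)
    return x_hat, llr

x_hat, llr = init_confidence_vector([-90, 35, -12, 77, 5, -64])
print(x_hat, llr[x_hat] == min(llr))   # the hard-decision element has the smallest LLR (0)
```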
Step 1: for each variable node VN_j (0 ≤ j < n), calculate the decision symbol and the confidence vectors V2C_{j,i} according to the variable node update rule.
(1-A) If the current iteration count itr = 0, sort the LLR values of the q finite field elements of each codeword symbol in ascending order, take the first n_m elements as output, and initialize V2C_{j,i} with them (the formulas appear only as images in the original); the field elements of the truncated n_m entries are multiplied by h_{i,j} in the finite field.
(1-B) If itr ≠ 0, use all the confidence vectors C2V_{f,j} (f ∈ M_j, f ≠ i) received by VN_j to calculate the confidence vector V2C_{j,i} passed from VN_j to CN_i (formula (4), shown as an image in the original). The addition operation in that formula adds only LLR values belonging to the same field element; the sorting operation sorts the LLR values in ascending order and truncates the first n_m entries as output, the n_m finite field elements being distinct from one another.
(1-C) Decision: each variable node makes one decision per computation, selecting the finite field element corresponding to LLR_min as the decision value (formula (5), shown as an image in the original).
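A rough sketch of the variable node computation in (1-B) and (1-C): LLRs belonging to the same field element are accumulated across the incoming C2V vectors, the list is sorted in ascending order, and only the n_m smallest entries (with distinct field elements) are kept. Whether the channel vector L_j enters the sum is shown only in the image formula; it is included below, as is typical for EMS. Names and the dictionary-based accumulation are illustrative.

```python
# Sketch of the variable node update (1-B) and decision (1-C) described above. Each
# confidence vector is a list of (field_element, llr) pairs sorted by ascending LLR.
# Field elements missing from a vector are simply skipped here; a real EMS decoder
# would substitute an offset value (offset_value in the embodiment) instead.
N_M = 16  # number of entries kept after truncation

def variable_node_update(incoming_c2v, L_j):
    """incoming_c2v: the C2V_{f,j} vectors with f in M_j, f != i; L_j: channel vector."""
    acc = {}
    for vec in [L_j] + list(incoming_c2v):
        for x, llr in vec:
            acc[x] = acc.get(x, 0) + llr          # add only LLRs of the same field element
    ranked = sorted(acc.items(), key=lambda t: t[1])
    v2c = ranked[:N_M]                            # keep the n_m smallest, elements all distinct
    decision = ranked[0][0]                       # the element with LLR_min is the decision value
    return v2c, decision
```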
Step 2: calculate the checksum using the check matrix H of the multi-system LDPC code (the formula appears only as an image in the original). If the checksum s = 0, the decision value sequence is used as the decoding output and decoding terminates; otherwise, go to step 3.
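The step-2 checksum is a GF(q) syndrome check: for every row of H, the nonzero entries multiply the current decision symbols in the finite field and the products are accumulated with XOR (GF(2^r) addition). The sketch below assumes a gf_mul helper implementing the code's GF(64) multiplication, which is not reproduced in the text.

```python
# Sketch of step 2: s_i = sum over j in N_i of h_{i,j} * c_hat_j in GF(q), for every row i.
# gf_mul is an assumed helper for GF(2^6) multiplication with the code's primitive
# polynomial; GF(2^r) addition is bitwise XOR.
def checksum_ok(H, N, c_hat, gf_mul):
    for i, cols in enumerate(N):                 # N[i] is the index set N_i built earlier
        s = 0
        for j in cols:
            s ^= gf_mul(H[i][j], c_hat[j])       # finite field multiply, then XOR-accumulate
        if s != 0:
            return False                         # some syndrome is nonzero: go to step 3
    return True                                  # s = 0 for all rows: output decisions, stop
```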
Step 3: for each check node CN_i (0 ≤ i < m), calculate the confidence vector C2V_{i,j} according to the check node update rule (formula (7), shown as an image in the original).
Finite field elements from different confidence vectors are added to obtain candidate elements, the corresponding LLR values are calculated, all LLR values are sorted in ascending order, and the n_m smallest LLR values and their finite field elements are output. Let the input confidence vectors be (U_s, U) and (Q_s, Q) and the output confidence vector be (V_s, V), where U, Q and V are n_m-long LLR vectors in ascending order and U_s, Q_s and V_s are the corresponding finite field element vectors. Build an n_m × n_m confidence matrix M and finite field element matrix M_s (shown as images in the original); per the construction just described, M holds the pairwise LLR sums and M_s the corresponding finite field sums, where the sum of field elements is finite field addition. The basic check node computation (also shown as an image) can be carried out with a register S of size n_m as follows:
Initialization: store the first column of the confidence matrix M in register S, i.e. S[ζ] = M[ζ, 0], 0 ≤ ζ < n_m, and let ε = 0.
Step 31: find the minimum value in register S (assume it corresponds to M[d, ρ]).
Step 32: if the finite field element corresponding to the minimum value in register S is not yet present in the vector V_s, assign the minimum value in register S to V[ε] and the corresponding finite field element to V_s[ε], and let ε = ε + 1; otherwise, do nothing.
Step 33: replace the minimum value in register S with the element to its right in the confidence matrix M (i.e., if the current minimum in register S is M[d, ρ], replace it with M[d, ρ + 1]).
Step 34: go to step 31 until ε = n_m.
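The register-S procedure in steps 31 to 34 is a sorted merge over the n_m x n_m candidate matrix that outputs the n_m smallest LLRs with distinct field elements. The sketch below follows those steps, building M and M_s as described above and using XOR for the finite field addition; names and the early-exit guard are illustrative.

```python
# Sketch of the register-S procedure (steps 31-34) for merging two confidence vectors
# (U_s, U) and (Q_s, Q) into (V_s, V), with M[a][b] = U[a] + Q[b] (LLR sums) and
# Ms[a][b] = U_s[a] XOR Q_s[b] (finite field sums), as described above.
import math

def check_node_merge(U_s, U, Q_s, Q, n_m):
    M  = [[U[a] + Q[b]     for b in range(n_m)] for a in range(n_m)]
    Ms = [[U_s[a] ^ Q_s[b] for b in range(n_m)] for a in range(n_m)]
    S   = [M[z][0] for z in range(n_m)]          # init: column 0 of M goes into register S
    col = [0] * n_m                              # current column of M held in S, per row
    V, V_s, eps = [], [], 0
    while eps < n_m:
        d = min(range(n_m), key=lambda z: S[z])  # step 31: find the minimum of register S
        if S[d] == math.inf:                     # guard: every candidate exhausted (not in the text)
            break
        elem = Ms[d][col[d]]
        if elem not in V_s:                      # step 32: keep only new finite field elements
            V.append(S[d]); V_s.append(elem); eps += 1
        col[d] += 1                              # step 33: slide to the right-hand neighbour
        S[d] = M[d][col[d]] if col[d] < n_m else math.inf
    return V_s, V                                # step 34: loop until eps = n_m

print(check_node_merge([5, 2], [0, 3], [1, 6], [0, 4], 2))  # ([4, 3], [0, 3])
```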
Step 4: let itr = itr + 1. If itr = itr_max, decoding is terminated and a failure is declared; otherwise, go to step 1.
The traditional LDPC algorithm requires many serial iterations of the check node confidence vectors C2V, the variable node confidence vectors V2C and other quantities to obtain the final decoding result, which consumes a large amount of time. Meanwhile, because of the serial operation structure, the intermediate results of the check node confidence vector C2V and the variable node confidence vector V2C must be cached separately, occupying a large amount of hardware resources.
Therefore, aiming at the characteristic that the row repetition of the LDPC check matrix in the GNSS system is 2, the data storage is optimized in a targeted manner in the LDPC decoding process, so that the updating processes of the check node confidence coefficient vector C2V and the variable node confidence coefficient vector V2C can be operated in parallel, the algorithm operation time is reduced, and the hardware decoding efficiency is greatly improved; meanwhile, the parallelized architecture may allow the check node confidence vector C2V to eliminate the need for large blocks of cache space, thereby reducing chip area.
The specific scheme of the iterative cycle data storage method in the LDPC decoding process is as follows.
As shown in fig. 2, in an LDPC decoding process, an iteration cycle data storage method refers to a round of updating of a variable node confidence vector V2C, a codeword calculation, and a check node confidence vector C2V in an LDPC decoding process as one iteration; the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage method comprises the following steps,
s1, after odd iterations, the variable node confidence coefficient vector V2C is stored in the variable node confidence coefficient vector V2C cache space according to a preset storage format, and the next iteration (even iterations) is performed after the odd iterations;
s2, based on the S1, reading the variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the read variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space;
s3, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into a variable node confidence coefficient vector V2C cache space in an inverted sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is opposite to the preset storage format; (S2 and S3 are even iterations of the process)
S4, after even iterations, storing the variable node confidence coefficient vector V2C in a variable node confidence coefficient vector V2C cache space according to a storage format opposite to the preset storage format, and performing next iteration (odd iterations) after the even iterations;
s5, based on the S4, reading the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space, performing reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and S6, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into the variable node confidence coefficient vector V2C cache space in a positive sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is the same as the preset storage format. (S5 and S6 are odd iterations)
In this embodiment, formula (3) corresponds to the initialization of the LDPC decoder, and formulas (4), (5) and (7) are optimized as an iterative loop. Without loss of generality, formula (7) can be taken as the beginning of a cycle.
In order to ensure that the check node confidence vector C2V and the variable node confidence vector V2C can share a cache space in the parallelization process, the invention provides an improved variable node confidence vector V2C storage mode, and the storage structure can be changed according to the parity of the iteration times.
Specifically, after an odd number of iterations, the variable node confidence vector V2C is stored in the variable node confidence vector V2C cache space in the conventional (existing) storage format, and the next iteration is performed. At this time, the check node confidence vector C2V updating module reads the variable node confidence vector V2C in the i-th column of the V2C cache space, calculates the result of the check node confidence vector C2V at row i, column j from it, and outputs that result to the variable node confidence vector V2C updating module, where j ∈ N_i. Because the check node confidence vector C2V updating module has no independent cache space, the variable node confidence vector V2C updating module must process the output of the C2V updating module immediately, i.e. calculate the result of the variable node confidence vector V2C at row j, column k according to formula (4), where k ∈ M_j and k ≠ i. However, since the data at row j, column k of V2C may not yet have been processed and must not be overwritten by the check node confidence vector C2V computation, the result is stored in reverse order at row j, column i of the V2C cache space and row j is marked. When the second piece of data required by column j of the variable node confidence vector V2C updating module arrives (i.e. the result of the check node confidence vector C2V at row k, column j output by the C2V updating module), the result of the variable node confidence vector V2C at row j, column i and the hard decision result are calculated together through formula (4) and formula (5), and they are stored in reverse order at row j, column k of the V2C cache space. This is repeated until a single iteration is completed; at this time, the storage format of the variable node confidence vector V2C in the V2C cache space is opposite to the preset storage format.
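A toy sketch of the alternating write pattern just described: because every variable node exchanges messages with exactly two check nodes (the sets M_j here contain two entries), the freshly computed V2C value is written into the slot of the other check node so that the value still needed is never overwritten, and the next iteration mirrors its read addresses accordingly. The class and slot layout are illustrative only; this is not the ram400x72_V2C word format.

```python
# Toy sketch of the reverse-order storage scheme described above. Each variable node VN_j
# has two V2C slots (one per connected check node); writing in "reversed" format swaps
# the two slots, so the reader in the next iteration must mirror its slot index.
# Layout and names are illustrative, not the actual cache organisation.
class V2CBuffer:
    def __init__(self, n_vars):
        self.slots = [[None, None] for _ in range(n_vars)]   # two V2C entries per VN_j

    def write(self, j, slot, value, reversed_format):
        self.slots[j][1 - slot if reversed_format else slot] = value

    def read(self, j, slot, reversed_format):
        return self.slots[j][1 - slot if reversed_format else slot]

buf = V2CBuffer(n_vars=4)
buf.write(2, 0, "V2C for CN i", reversed_format=True)   # physically lands in slot 1
print(buf.read(2, 0, reversed_format=True))              # mirrored read recovers it
```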
After even iterations, the check node confidence vector C2V update module needs to process the data sequence number when loading the variable node confidence vector V2C, so as to ensure the correctness of the calculated data, and the specific calculation process is the same as that of the odd iterations. For the variable node confidence vector V2C updating module, since the initial storage mode is reverse to the conventional mode, the calculation result can be directly stored in the corresponding position, and the storage mode of the variable node confidence vector V2C returns to normal.
In order to illustrate the calculation flow of the invention more clearly, this embodiment takes 64-ary LDPC(200,100) coding as an example: 1200 soft bits are input and 600 0/1 hard bits are output after decoding, each soft bit being an 8-bit signed number with values from -127 to 127. Each symbol corresponds to 6 soft bits, so the decoder can equivalently be regarded as taking 200 symbols in and putting 100 decoded symbols out. The number of initial LLR values stored per symbol is n_l = 16, the LLR offset value offset_value = 11, the number of LLR values updated per symbol in the C2V and V2C stages is n_m = 16, the LLR offset offset_nm = 11, the length of register S is n_b = 8, and the maximum number of iterations is set to 30. These parameters can be modified dynamically in various ways according to the algorithm requirements and the scenario; the invention is not limited to this parameter setting.
The decoder input is 1200 demodulated soft bits. The soft bits are first hard-decided one by one to obtain 1200 0/1 decoded code bits, which are then verified. If the verification succeeds, the first 600 bits of the current hard decision are output. If the verification fails, the initial L_j values are calculated, and the variable node confidence vector V2C, the decoded codeword and the check node confidence vector C2V are updated in sequence. One update of each of V2C, the codeword calculation and C2V is referred to as one iteration of the algorithm. When the maximum number of iterations is reached, the algorithm terminates unconditionally regardless of whether decoding has succeeded. The optimization strategy proposed by the invention does not change the basic flow of the algorithm; it only optimizes the implementation details, in order to reduce storage space consumption and speed up the calculation.
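Condensed into a sketch, the control flow of that paragraph looks as follows; every helper on the ops object is a placeholder name for a module described elsewhere in the text, not a real API.

```python
# Condensed control flow of the decoding procedure described above. The helpers bundled
# in `ops` (hard_decide, passes_check, init_Lj, init_v2c, update_c2v,
# update_v2c_and_codeword) are placeholder names, not an actual interface.
def decode(soft_bits, ops, max_iterations=30, out_bits=600):
    codeword = ops.hard_decide(soft_bits)            # bit-by-bit hard decision on the soft bits
    if ops.passes_check(codeword):
        return codeword[:out_bits]                   # verification succeeded: output first 600 bits
    Lj = ops.init_Lj(soft_bits)                      # otherwise compute the initial L_j vectors
    v2c = ops.init_v2c(Lj)
    for itr in range(max_iterations):                # one V2C update + codeword + C2V = one iteration
        odd = (itr % 2 == 0)                         # iteration number itr+1 is odd
        c2v = ops.update_c2v(v2c, odd_iteration=odd)
        v2c, codeword = ops.update_v2c_and_codeword(c2v, Lj, odd_iteration=odd)
        if ops.passes_check(codeword):
            return codeword[:out_bits]
    return None                                      # maximum iterations reached: decoding failed
```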
Based on the method for storing the iteration cycle data in the LDPC decoding process, the invention also provides a system for storing the iteration cycle data in the LDPC decoding process.
As shown in fig. 3, in an LDPC decoding process, an iteration cycle data storage system refers to a variable node confidence vector V2C, a codeword calculation, and a round of updating of a check node confidence vector C2V in an LDPC decoding process as one iteration; the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage system comprises the following modules,
the variable node confidence coefficient vector V2C cache space module is used for storing the variable node confidence coefficient vector V2C according to a preset storage format after odd iterations;
a check node confidence vector C2V updating module for reading the variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module during an even number of iterations, and updating the check node confidence vector C2V according to the read variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module;
a variable node confidence coefficient vector V2C updating module, configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in an inverted order, so as to complete a single iteration, where at this time, a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is opposite to the preset storage format;
the variable node confidence coefficient vector V2C cache space module is further used for storing the variable node confidence coefficient vector V2C according to a storage format opposite to the preset storage format after even iterations;
a check node confidence coefficient vector C2V updating module, which is further configured to, in an odd number of iterations, read the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module, perform reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space module, and update the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and the variable node confidence coefficient vector V2C updating module is further configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in a positive order to complete a single iteration, where a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is the same as the preset storage format.
The check node confidence vector C2V updating module is specifically configured to, during even iterations, read the variable node confidence vector V2C in the i-th column of the variable node confidence vector V2C cache space module, calculate the result of the check node confidence vector C2V at row i, column j according to the read V2C in the i-th column of the cache space module, and output that result to the variable node confidence vector V2C updating module; wherein j ∈ N_i.
The variable node confidence vector V2C updating module is specifically configured to calculate the result of the variable node confidence vector V2C at row j, column k and the hard decision result according to the result output by the check node confidence vector C2V updating module, store the result of V2C at row j, column k and the hard decision result in reverse order at row j, column i of the variable node confidence vector V2C cache space module, and mark row j, wherein k ∈ M_j, k ≠ i; when the result of the check node confidence vector C2V at row k, column j output by the check node confidence vector C2V updating module is received, calculate the result of the variable node confidence vector V2C at row j, column i and the hard decision result according to it, and store them in reverse order at row j, column k of the variable node confidence vector V2C cache space module; this process is repeated until the single iteration is completed, at which time the storage format of the variable node confidence vector V2C in the cache space module is opposite to the preset storage format.
Fig. 4 is a hardware implementation architecture diagram of the iterative cycle data storage system in the LDPC decoding process according to the present invention. As can be seen from fig. 4, the C2V module (check node confidence vector C2V updating module) is not equipped with a cache space, but directly transmits its output to the V2C module (variable node confidence vector V2C updating module). The ram400x72_V2C module (variable node confidence vector V2C cache space module) is used to store the output of the V2C module, and the C2V module may load the variable node confidence vector V2C from the ram400x72_V2C module. The other modules in fig. 4 (e.g., the GF_MUL module, ram200x50_Lj module, SORT_6 module, INIT_LIR module, ram200x48_softbit module, ram400x23_h module, LDPC_REG_FILE module, ram200x6_Csym module, CHK_SUM module, ram20x30_Csym module and MCU module) are used to complete the decoding work together with the V2C module and the C2V module.
Based on the method for storing the iteration cycle data in the LDPC decoding process, the invention also provides a computer storage medium.
A computer storage medium comprising a memory and a computer program stored in the memory, the computer program when executed by a processor implementing the iterative cycle data storage method in an LDPC decoding process as described above.
Aiming at the characteristic that the row repetition of an LDPC check matrix in a GNSS system is 2, the data storage is optimized in a targeted manner in the LDPC decoding process, so that the updating processes of a check node confidence coefficient vector C2V and a variable node confidence coefficient vector V2C can run in parallel, the algorithm running time is reduced, and the hardware decoding efficiency is greatly improved; meanwhile, the parallelized architecture may allow the check node confidence vector C2V to eliminate the need for large blocks of cache space, thereby reducing chip area.
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (9)

1. An iteration cycle data storage method in the LDPC decoding process is characterized in that a variable node confidence coefficient vector V2C, code word calculation and one round of updating of a check node confidence coefficient vector C2V in the LDPC decoding process are called as one iteration; the method is characterized in that: the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage method comprises the following steps,
s1, after odd iterations, the variable node confidence coefficient vector V2C is stored in the variable node confidence coefficient vector V2C cache space according to a preset storage format, and the next iteration is performed after the odd iterations;
s2, based on the S1, reading the variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the read variable node confidence coefficient vector V2C stored in the variable node confidence coefficient vector V2C cache space;
s3, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into a variable node confidence coefficient vector V2C cache space in an inverted sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is opposite to the preset storage format;
s4, after even iterations, the variable node confidence coefficient vector V2C is stored in the variable node confidence coefficient vector V2C cache space according to a storage format opposite to the preset storage format, and the next iteration is performed after the even iterations;
s5, based on the S4, reading the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space, performing reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space, and updating the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and S6, updating the variable node confidence coefficient vector V2C and the hard judgment result according to the updated check node confidence coefficient vector C2V, and storing the updated variable node confidence coefficient vector V2C and the hard judgment result into the variable node confidence coefficient vector V2C cache space in a positive sequence to finish single iteration, wherein at the moment, the storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space is the same as the preset storage format.
2. The method for storing iterative cycle data in the LDPC decoding process according to claim 1, wherein: setting a check matrix H in the LDPC decoding process as m rows and n columns, wherein the row sequence number is i, and the column sequence number is j, and defining the following two ordinal number sets:
M_j = {i : 0 ≤ i < m, h_{i,j} ≠ 0}, 0 ≤ j < n;  N_i = {j : 0 ≤ j < n, h_{i,j} ≠ 0}, 0 ≤ i < m;
wherein h_{i,j} is the element in row i and column j of the check matrix H.
3. The method for storing iterative cycle data in the LDPC decoding process according to claim 2, wherein: specifically, the step S2 is,
based on the S1, the check node confidence vector C2V updating module reads the variable node confidence vector V2C in the ith column in the variable node confidence vector V2C cache space, and according to the read variable node confidence vector V2C in the ith column in the variable node confidence vector V2C cache space, calculates the result of the check node confidence vector C2V at the ith row and the jth column, and outputs the result of the check node confidence vector C2V at the ith row and the jth column to the variable node confidence vector V2C updating module; wherein j ∈ Ni
4. The method for storing iterative cycle data in the LDPC decoding process according to claim 3, wherein: specifically, the step S3 is,
S31, the variable node confidence coefficient vector V2C updating module calculates the result of the variable node confidence coefficient vector V2C at row j, column k and the hard judgment result according to the result output by the check node confidence coefficient vector C2V updating module, stores the result of the variable node confidence coefficient vector V2C at row j, column k and the hard judgment result in reverse order at row j, column i of the variable node confidence coefficient vector V2C cache space, and marks row j; wherein k ∈ M_j, k ≠ i;
S32, when the variable node confidence coefficient vector V2C updating module receives the result of the check node confidence coefficient vector C2V at row k, column j output by the check node confidence coefficient vector C2V updating module, the result of the variable node confidence coefficient vector V2C at row j, column i and the hard judgment result are calculated according to that result, and the result of the variable node confidence coefficient vector V2C at row j, column i and the hard judgment result are stored in reverse order at row j, column k of the variable node confidence coefficient vector V2C cache space;
s33, repeating the step S32 until the single iteration is finished; at this time, the storage format of the variable node confidence vector V2C in the variable node confidence vector V2C buffer space is opposite to the preset storage format.
5. An iteration cycle data storage system in the LDPC decoding process refers to one round of updating of a variable node confidence coefficient vector V2C, code word calculation and a check node confidence coefficient vector C2V in the LDPC decoding process as one iteration; it is characterized in that the iteration times in the LDPC decoding process are divided into odd iteration times and even iteration times, the iteration cycle data storage system comprises the following modules,
the variable node confidence coefficient vector V2C cache space module is used for storing the variable node confidence coefficient vector V2C according to a preset storage format after odd iterations;
a check node confidence vector C2V updating module for reading the variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module during an even number of iterations, and updating the check node confidence vector C2V according to the read variable node confidence vector V2C stored in the variable node confidence vector V2C cache space module;
a variable node confidence coefficient vector V2C updating module, configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in an inverted order, so as to complete a single iteration, where at this time, a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is opposite to the preset storage format;
the variable node confidence coefficient vector V2C cache space module is further used for storing the variable node confidence coefficient vector V2C according to a storage format opposite to the preset storage format after even iterations;
a check node confidence coefficient vector C2V updating module, which is further configured to, in an odd number of iterations, read the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module, perform reverse order processing on the sequence number of the variable node confidence coefficient vector V2C in the read variable node confidence coefficient vector V2C cache space module, and update the check node confidence coefficient vector C2V according to the variable node confidence coefficient vector V2C subjected to sequence number reverse order processing;
and the variable node confidence coefficient vector V2C updating module is further configured to update the variable node confidence coefficient vector V2C and the hard decision result according to the updated check node confidence coefficient vector C2V, and store the updated variable node confidence coefficient vector V2C and the hard decision result in the variable node confidence coefficient vector V2C cache space module in a positive order to complete a single iteration, where a storage format of the variable node confidence coefficient vector V2C in the variable node confidence coefficient vector V2C cache space module is the same as the preset storage format.
6. The system for storing iterative cycle data in an LDPC decoding process according to claim 5, wherein: setting a check matrix H in the LDPC decoding process as m rows and n columns, wherein the row sequence number is i, and the column sequence number is j, and defining the following two ordinal number sets:
M_j = {i : 0 ≤ i < m, h_{i,j} ≠ 0}, 0 ≤ j < n;  N_i = {j : 0 ≤ j < n, h_{i,j} ≠ 0}, 0 ≤ i < m;
wherein h_{i,j} is the element in row i and column j of the check matrix H.
7. The system for storing iterative cycle data in an LDPC decoding process according to claim 6, wherein: the check node confidence vector C2V update module is specifically configured to,
in the process of an even number of iterations, reading the variable node confidence vector V2C in the i-th column of the variable node confidence vector V2C cache space module, calculating the result of the check node confidence vector C2V at row i, column j according to the read variable node confidence vector V2C in the i-th column of the cache space module, and outputting the result of the check node confidence vector C2V at row i, column j to the variable node confidence vector V2C updating module; wherein j ∈ N_i.
8. The system for storing iterative cycle data in an LDPC decoding process according to claim 7, wherein: The variable node confidence vector V2C update module is specifically configured to,
according to the result output by the check node confidence coefficient vector C2V updating module, calculating the result of the variable node confidence coefficient vector V2C at row j, column k and the hard judgment result, and storing the result of the variable node confidence coefficient vector V2C at row j, column k in reverse order at row j, column i of the variable node confidence coefficient vector V2C cache space module, and marking row j; wherein k ∈ M_j, k ≠ i;
when the result of the check node confidence vector C2V at row k, column j output by the check node confidence vector C2V updating module is received, calculating the result of the variable node confidence vector V2C at row j, column i according to that result, and storing the result of the variable node confidence vector V2C at row j, column i in reverse order at row j, column k of the variable node confidence vector V2C cache space module, until a single iteration is completed; at this time, the storage format of the variable node confidence vector V2C in the variable node confidence vector V2C cache space module is opposite to the preset storage format.
9. A computer storage medium, characterized in that: comprising a memory and a computer program stored in the memory, which when executed by a processor implements the method of iterative cycle data storage in an LDPC decoding process as claimed in any one of claims 1 to 4.
CN202110418530.8A 2021-04-19 2021-04-19 Iterative cycle data storage method and system in LDPC decoding process Pending CN113271109A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110418530.8A CN113271109A (en) 2021-04-19 2021-04-19 Iterative cycle data storage method and system in LDPC decoding process

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110418530.8A CN113271109A (en) 2021-04-19 2021-04-19 Iterative cycle data storage method and system in LDPC decoding process

Publications (1)

Publication Number Publication Date
CN113271109A true CN113271109A (en) 2021-08-17

Family

ID=77228996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110418530.8A Pending CN113271109A (en) 2021-04-19 2021-04-19 Iterative cycle data storage method and system in LDPC decoding process

Country Status (1)

Country Link
CN (1) CN113271109A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050005231A1 (en) * 2003-07-03 2005-01-06 Feng-Wen Sun Method and system for generating parallel decodable low density parity check (LDPC) codes
US20110179333A1 (en) * 2008-05-21 2011-07-21 The Regents Of The University Of California Lower-complexity layered belief propagation decoding ldpc codes
US20140281787A1 (en) * 2013-03-15 2014-09-18 Lsi Corporation Min-Sum Based Hybrid Non-Binary Low Density Parity Check Decoder
US20150326249A1 (en) * 2014-05-08 2015-11-12 Sandisk Enterprise Ip Llc Modified trellis-based min-max decoder for non-binary low-density parity-check error-correcting codes
CN108933604A (en) * 2018-02-28 2018-12-04 和芯星通科技(北京)有限公司 A kind of variable node processing method and processing device
CN109802688A (en) * 2018-12-28 2019-05-24 杭州中科微电子有限公司 A kind of m-ary LDPC decoding system and method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050005231A1 (en) * 2003-07-03 2005-01-06 Feng-Wen Sun Method and system for generating parallel decodable low density parity check (LDPC) codes
US20110179333A1 (en) * 2008-05-21 2011-07-21 The Regents Of The University Of California Lower-complexity layered belief propagation decoding ldpc codes
US20140281787A1 (en) * 2013-03-15 2014-09-18 Lsi Corporation Min-Sum Based Hybrid Non-Binary Low Density Parity Check Decoder
US20150326249A1 (en) * 2014-05-08 2015-11-12 Sandisk Enterprise Ip Llc Modified trellis-based min-max decoder for non-binary low-density parity-check error-correcting codes
CN108933604A (en) * 2018-02-28 2018-12-04 和芯星通科技(北京)有限公司 A kind of variable node processing method and processing device
CN109802688A (en) * 2018-12-28 2019-05-24 杭州中科微电子有限公司 A kind of m-ary LDPC decoding system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
L. Zhou, J. Sha, Y. Chen and Z. Wang, "Memory efficient EMS decoding for non-binary LDPC codes", 2013 IEEE International Symposium on Circuits and Systems (ISCAS) *
何凯 (He Kai), "非二进制LDPC码解码器设计研究" [Research on the design of non-binary LDPC decoders], 中国硕博论文期刊 [China Master's and Doctoral Theses Journal] *

Similar Documents

Publication Publication Date Title
US10567010B2 (en) Flexible polar encoders and decoders
Sarkis et al. Fast polar decoders: Algorithm and implementation
US7181676B2 (en) Layered decoding approach for low density parity check (LDPC) codes
US7373581B2 (en) Device, program, and method for decoding LDPC codes
US7631241B2 (en) Apparatus and method for decoding low density parity check codes
TWI699977B (en) Method employed in ldpc decoder and the decoder
JP4320418B2 (en) Decoding device and receiving device
CN106330203B (en) LDPC decoding method
CN106936444B (en) Set decoding method and set decoder
US20180076831A1 (en) Partial sum computation for polar code decoding
CN110741558B (en) Polarization encoder with logic three-dimensional memory, communication unit, integrated circuit and method thereof
US20220255560A1 (en) Method and apparatus for vertical layered decoding of quasi-cyclic low-density parity check codes built from clusters of circulant permutation matrices
CN109802688B (en) Multilevel LDPC decoding system and method
CN112134570A (en) Multi-mode LDPC decoder applied to deep space communication
CN112865812B (en) Multi-element LDPC decoding method, computer storage medium and computer
CN106856406B (en) Method for updating check node in decoding method and decoder
CN112953553B (en) Improved multi-system LDPC decoding method, device and medium in GNSS system
US20160049962A1 (en) Method and apparatus of ldpc encoder in 10gbase-t system
CN109245775B (en) Decoder and method for realizing decoding
CN113271109A (en) Iterative cycle data storage method and system in LDPC decoding process
CN110708077B (en) LDPC code large number logic decoding method, device and decoder
CN113285723B (en) Check node updating method, system and storage medium in LDPC decoding process
CN102136842B (en) Decoder and decoding method thereof
CN113742898A (en) LDPC decoder logic design method applied to low-earth-orbit satellite Internet system
KR101268060B1 (en) Method for encoing and decoding using unitive state-check code

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210817