Li Yingge, Hu Haihua
Information Engineering, Guangdong University of Technology, Guangzhou 510006, China.
Entropy (Basel). 2024 Sep 12;26(9):781. doi: 10.3390/e26090781.
As high-speed big-data communications impose new requirements on storage latency, low-density parity-check (LDPC) codes have become a widely used technology in flash-memory channels. However, iterative LDPC decoding suffers from high decoding latency due to its mechanism of iterative message passing. Motivated by the unbalanced bit reliability within a codeword, this paper proposes two techniques: serial entropy feature-based layered normalized min-sum (S-EFB-LNMS) decoding and parallel entropy feature-based layered normalized min-sum (P-EFB-LNMS) decoding. First, we construct an entropy feature vector that reflects the real-time bit reliability of the codeword. Then, the reliability of the output information of each layered processing unit (LPU) is evaluated by analyzing the similarity between the corresponding rows of the check matrix and the entropy feature vector. Based on this evaluation, LPUs can be dynamically allocated and scheduled during the decoding iterations, optimizing the entire decoding process. Experimental results show that these techniques significantly reduce decoding latency.
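To make the scheduling idea concrete, the following is a minimal sketch (not the authors' implementation; all function names, the use of binary entropy of LLR-derived bit probabilities as the "entropy feature", and an inner product as the similarity measure are assumptions for illustration). It builds a per-bit entropy feature vector from log-likelihood ratios and ranks decoding layers so that layers touching the least reliable bits are processed first:

```python
import numpy as np

def bit_entropy(llr):
    """Per-bit binary entropy from LLR magnitudes.

    A small |LLR| means an unreliable bit, which maps to an
    error probability near 0.5 and hence entropy near 1.
    """
    p = 1.0 / (1.0 + np.exp(np.abs(llr)))   # estimated bit-error probability
    p = np.clip(p, 1e-12, 1 - 1e-12)        # guard log2(0)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def layer_priority(H_layers, llr):
    """Rank layers (row groups of the check matrix) for scheduling.

    Similarity is taken here as the inner product between a layer's
    row support and the entropy feature vector; layers covering more
    unreliable bits score higher and are scheduled earlier.
    """
    e = bit_entropy(np.asarray(llr, dtype=float))
    scores = np.array([row @ e for row in H_layers])
    return np.argsort(scores)[::-1]  # highest-entropy layer first

# Toy example: bits 0 and 2 are unreliable (|LLR| small),
# so the layer checking them gets priority.
H_layers = np.array([[1, 0, 1, 0],
                     [0, 1, 0, 1]])
llr = np.array([0.1, 5.0, 0.1, 5.0])
order = layer_priority(H_layers, llr)
```

In this toy case `order` puts layer 0 first, since it covers both low-reliability bits; a real S-EFB-LNMS/P-EFB-LNMS decoder would recompute the feature vector as messages update across iterations.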