• Stochastic Resonance Decoding for Quantum LDPC Codes

      Raveendran, Nithin; Nadkarni, Priya J.; Garani, Shayan Srinivasa; Vasic, Bane; Univ Arizona, Dept Elect & Comp Engn (IEEE, 2017)
      We introduce a stochastic resonance-based decoding paradigm for quantum codes using an error correction circuit made of a combination of noisy and noiseless logic gates. The quantum error correction circuit is based on iterative syndrome decoding of quantum low-density parity-check codes, and uses the positive effect of gate errors to correct errors due to decoherence. We analyze how the proposed stochastic algorithm can escape from short-cycle trapping sets present in dual-containing Calderbank-Shor-Steane (CSS) codes. Simulation results show improved performance of the stochastic algorithm over the deterministic decoder.
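      The escape mechanism described above can be illustrated with a toy sketch (not the authors' circuit: the matrix `H`, the decoder `noisy_bit_flip`, and the gate-error probability `q` are all illustrative assumptions). A deterministic bit-flipping decoder oscillates forever on an alternating error pattern around a short cycle, while randomly perturbed flip decisions, standing in for noisy gates, let the decoder escape:

```python
import numpy as np

# Toy 4-cycle Tanner graph (4 variable nodes, 4 degree-2 checks): the
# deterministic bit-flipping decoder oscillates forever on the alternating
# error pattern, while injected gate noise lets the decoder escape.
H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]], dtype=int)

def noisy_bit_flip(H, y, q=0.3, max_iter=2000, seed=0):
    """Bit flipping with noisy gates: the deterministic flip decision of each
    variable node is inverted with probability q, modeling a faulty gate.
    Setting q = 0 recovers the deterministic decoder."""
    rng = np.random.default_rng(seed)
    v = y.copy()
    deg = H.sum(axis=0)
    for _ in range(max_iter):
        s = H @ v % 2
        if not s.any():
            return v, True                  # zero syndrome: decoder halts
        decision = (s @ H) == deg           # flip bits failing all their checks
        noise = rng.random(len(v)) < q      # gate errors invert some decisions
        v[decision ^ noise] ^= 1
    return v, False
```

      On the alternating pattern [1, 0, 1, 0], every variable node fails both of its checks, so with q = 0 all four bits flip in lockstep and the decoder cycles between two states; with q > 0 the symmetry is broken and the decoder reaches a zero syndrome.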
    • A Sub-Graph Expansion-Contraction Method for Error Floor Computation

      Raveendran, Nithin; Declercq, David; Vasic, Bane; Univ Arizona, Dept Elect & Comp Engn (IEEE, 2020-04-20)
      In this paper, we present a computationally efficient method for estimating error floors of low-density parity-check (LDPC) codes over the binary symmetric channel (BSC) without any prior knowledge of their trapping sets (TSs). Given the Tanner graph G of a code and the decoding algorithm D, the method starts from a list of short cycles in G and expands each cycle by including its sufficiently large neighborhood in G. The variable nodes of each expanded sub-graph G_EXP are then corrupted exhaustively by all possible error patterns and decoded by D operating on G_EXP. The union of the supports of the error patterns for which D fails on a given G_EXP defines a subset of variable nodes that is a TS. The knowledge of the minimal error patterns and their strengths in each TS is used to compute an estimate of the frame error rate. This estimate captures the contribution of error events localized on TSs, and therefore serves as an accurate estimate of the error floor performance of D at low BSC cross-over probabilities. We also discuss trade-offs between accuracy and computational complexity. Our analysis shows that in some cases the proposed method provides a million-fold improvement in computational complexity over standard Monte Carlo simulation.
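      The contraction step, exhaustively corrupting a sub-graph and collecting the supports of failed patterns, can be sketched as follows. This is a minimal illustration under stated assumptions: `H_exp` is a fixed stand-in for one expanded sub-graph rather than a neighborhood grown from a real code, and a plain parallel bit-flipping decoder stands in for D.

```python
import itertools
import numpy as np

# Stand-in for one expanded sub-graph G_EXP: a single 8-cycle (4 variable
# nodes, 4 checks). In the paper the neighborhood is grown around each short
# cycle of the full Tanner graph; here it is fixed for illustration.
H_exp = np.array([[1, 1, 0, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 1],
                  [1, 0, 0, 1]], dtype=int)

def bit_flip_decode(H, y, max_iter=5):
    """Plain parallel bit-flipping decoder used as the stand-in decoder D."""
    v = y.copy()
    for _ in range(max_iter):
        s = H @ v % 2
        if not s.any():
            return True
        v[(s @ H) > H.sum(axis=0) / 2] ^= 1   # flip majority-unsatisfied bits
    return not (H @ v % 2).any()

def failing_support_union(H, max_weight=2):
    """Exhaustively corrupt the sub-graph's variable nodes and return the
    union of supports of the error patterns the decoder fails on, i.e. a
    trapping-set candidate."""
    n = H.shape[1]
    ts = set()
    for w in range(1, max_weight + 1):
        for supp in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(supp)] = 1
            if not bit_flip_decode(H, e):
                ts.update(supp)
    return ts
```

      For this toy sub-graph every weight-2 pattern defeats the decoder, so all four variable nodes end up in the candidate trapping set; on real expanded sub-graphs the enumeration isolates much smaller failure-inducing subsets.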
    • Towards the Exact Rate-Memory Trade-off for Uncoded Caching with Secure Delivery

      Bahrami, Mohsen; Attia, Mohamed Adel; Tandon, Ravi; Vasic, Bane; Univ Arizona, Dept Elect & Comp Engn (IEEE, 2017)
      We consider the problem of secure delivery in a single-hop caching network with a central server connected to multiple end-users via an insecure multi-cast link. The server has a database of a set of files (content), and each user, which is equipped with a local cache memory, requests access to one of the files. In addition to delivering users' requests, the server needs to keep the files (information-theoretically) secure from an external eavesdropper observing the underlying communications between the server and users. We focus on an important class of content placement strategies in which the pre-fetching is required to be uncoded and caches are filled with uncoded fragments of files. In this paper, we establish the exact characterization of the secure rate-memory trade-off for the worst-case communication rate through a matching converse under the constraint of uncoded cache placement, where the number of users is no larger than the number of files in the database.
    • Trapping Set Analysis of Finite-Length Quantum LDPC Codes

      Raveendran, Nithin; Vasic, Bane; Center for Quantum Networks, University of Arizona, Department of Electrical and Computer Engineering (IEEE, 2021-07-12)
      Iterative decoders for finite-length quantum low-density parity-check (QLDPC) codes are impacted by short cycles, by detrimental graphical configurations known as trapping sets (TSs) present in the code graph, and by the symmetric degeneracy of errors. In this paper, we develop a systematic methodology by which quantum trapping sets (QTSs) can be defined and categorized according to their topological structure. The conventional definition of a TS from classical error correction is generalized to address the syndrome decoding scenario for QLDPC codes. We show that QTS information can be used to design better QLDPC codes and decoders. For certain finite-length QLDPC codes, frame error rate improvements of two orders of magnitude in the error floor regime are demonstrated without needing any post-processing steps.
    • Trapping set ontology

      Vasic, Bane; Chilappagari, Shashi Kiran; Nguyen, Dung Viet; Planjery, Shiva Kumar; Univ Arizona, Dept Elect & Comp Engn (IEEE, 2009-09)
      The failures of iterative decoders for low-density parity-check (LDPC) codes on the additive white Gaussian noise channel (AWGNC) and the binary symmetric channel (BSC) can be understood in terms of combinatorial objects known as trapping sets. In this paper, we present a systematic method to identify the most relevant trapping sets for decoding over the BSC in the error floor region. We elaborate on the notion of the critical number of a trapping set and derive a classification of trapping sets. We then develop the trapping set ontology, a database of trapping sets that summarizes the topological relations among trapping sets. We elucidate the usefulness of the trapping set ontology in predicting the error floor as well as in designing better codes.
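      The critical number mentioned above, informally, the smallest number of channel errors on a trapping set's variables that defeats the decoder, can be computed by brute force on a small sub-graph. The sketch below is a simplified reading under stated assumptions: a 4-cycle toy sub-graph and a basic parallel bit-flipping decoder, neither taken from the paper.

```python
import itertools
import numpy as np

# Toy trapping-set sub-graph: a 4-cycle with degree-2 variable nodes.
H = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1]], dtype=int)

def bit_flip(H, y, max_iter=10):
    """Parallel bit flipping on the BSC: flip bits failing all their checks."""
    v = y.copy()
    for _ in range(max_iter):
        s = H @ v % 2
        if not s.any():
            return True
        v[(s @ H) == H.sum(axis=0)] ^= 1
    return not (H @ v % 2).any()

def critical_number(H, ts_vars):
    """Smallest number of channel errors placed on the trapping-set variables
    that makes the decoder fail; returns None if no such pattern exists."""
    for w in range(1, len(ts_vars) + 1):
        for supp in itertools.combinations(ts_vars, w):
            e = np.zeros(H.shape[1], dtype=int)
            e[list(supp)] = 1
            if not bit_flip(H, e):
                return w
    return None
```

      Here every single error is corrected but some pairs of errors leave all flip decisions deadlocked, so the toy trapping set has critical number 2; ranking trapping sets by this quantity is what makes the ontology useful for error floor prediction.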
    • Two-Bit Bit Flipping Algorithms for LDPC Codes and Collective Error Correction

      Nguyen, Dung Viet; Vasic, Bane; Univ Arizona, Dept Elect & Comp Engn (IEEE, 2014-04)
      A new class of bit flipping algorithms for low-density parity-check codes over the binary symmetric channel is proposed. Compared to the regular (parallel or serial) bit flipping algorithms, the proposed algorithms employ one additional bit at a variable node to represent its "strength." The introduction of this additional bit allows an increase in the guaranteed error correction capability. An additional bit is also employed at a check node to capture information that is beneficial to decoding. A framework for failure analysis and selection of two-bit bit flipping algorithms is provided. The main component of this framework is the (re)definition of trapping sets, which are the most "compact" Tanner graphs that cause decoding failures of an algorithm. A recursive procedure to enumerate trapping sets is described. This procedure is the basis for selecting a collection of algorithms that work well together. It is demonstrated that decoders which employ a properly selected group of the proposed algorithms operating in parallel can offer high-speed, low-error-floor decoding.
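      The variable-node "strength" bit can be illustrated with a minimal sketch. This is not the authors' exact update rule: the specific policy below (a strong bit with the most unsatisfied checks is first weakened, and only an already-weak bit is flipped) and the small Hamming-code stand-in `H` are illustrative assumptions.

```python
import numpy as np

# Toy (7,4) Hamming parity-check matrix as a small stand-in code.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]], dtype=int)

def two_bit_flip(H, y, max_iter=10):
    """Simplified two-bit bit-flipping sketch (illustrative rule, not the
    paper's): each variable node keeps a value bit and a 'strength' bit.
    A strong bit with the most unsatisfied checks is first weakened; only
    an already-weak bit is actually flipped, becoming strong again."""
    v = y.copy()
    strong = np.ones(len(y), dtype=bool)      # all bits start out strong
    for _ in range(max_iter):
        s = H @ v % 2
        if not s.any():
            return v, True                    # all checks satisfied
        unsat = s @ H                         # unsatisfied checks per variable
        worst = unsat == unsat.max()
        flip = worst & ~strong
        weaken = worst & strong
        v[flip] ^= 1
        strong[flip] = True
        strong[weaken] = False
    return v, False
```

      The extra bit delays each flip by one iteration, which damps the premature flips that one-bit decoders make on marginally suspicious variables; selecting several such rules with complementary trapping sets and running them in parallel is the collective scheme the paper analyzes.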