Advisor: Tandon, Ravi
Affiliation: Department of Electrical and Computer Engineering, University of Arizona
Issue Date: 2023-10
Citation:
Adiga, S., Tandon, R., Vasić, B., & Bose, T. (2023). Generalization Bounds for Neural Normalized Min-Sum Decoders. International Telemetering Conference Proceedings, 58.
Additional Links: https://telemetry.org/
Abstract:
Machine learning-based decoding algorithms such as neural belief propagation (NBP) have been shown to improve upon conventional belief propagation (BP) decoders. The NBP decoder unfolds the BP iterations into a deep neural network (DNN), whose parameters are trained in a data-driven manner. The Neural Normalized Min-Sum (NNMS) decoder and the Offset Min-Sum (OMS) decoder with learnable offsets are related adaptations that require fewer learnable parameters than the NBP decoder. In this paper, we study the generalization capabilities of the neural decoder when the check node messages are scaled by parameters learned by optimizing over the training data. Specifically, we show how the generalization gap (i.e., the difference between the empirical and expected BER) depends on the block length, message length, variable/check node degrees, number of decoding iterations, and the training dataset size.
Type:
Proceedings; text
Language: en
ISSN:
1546-2188; 0884-5123; 0074-9079
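The scaled check-node update that the abstract refers to can be illustrated with a minimal sketch. This is not the authors' implementation: in NNMS the scaling factor is a learnable parameter optimized over training data (often per iteration or per edge), whereas here a single fixed `alpha` stands in for the learned value.

```python
import numpy as np

def check_node_update(llrs, alpha=0.8):
    """Normalized min-sum check-node update (illustrative sketch).

    For each edge i, the outgoing message is the product of the signs
    of the other incoming LLRs times the minimum of their magnitudes,
    scaled by alpha (a learned parameter in NNMS; fixed here).
    """
    llrs = np.asarray(llrs, dtype=float)
    out = np.empty(len(llrs))
    for i in range(len(llrs)):
        others = np.delete(llrs, i)          # all incoming LLRs except edge i
        sign = np.prod(np.sign(others))      # parity of the other signs
        out[i] = alpha * sign * np.min(np.abs(others))
    return out
```

For example, `check_node_update([2.0, -1.0, 0.5], alpha=0.8)` returns approximately `[-0.4, 0.4, -0.8]`. Setting `alpha = 1` recovers the plain min-sum update; learning `alpha` lets the decoder compensate for min-sum's overestimation of message magnitudes relative to BP.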
