Output statistics, equivocation, and state masking
Abstract
Given a discrete memoryless channel and a target distribution on its output alphabet, one wishes to construct a length-$ n $ rate-$ R $ codebook such that the output distribution—computed over a codeword chosen uniformly at random—is close to the $ n $-fold tensor product of the target distribution. Here 'close' means that the relative entropy between the output distribution and said $ n $-fold product is small. We characterize the smallest achievable relative entropy divided by $ n $ as $ n $ tends to infinity. We then demonstrate two applications of this result. The first is an alternative proof of the achievability of the rate-equivocation region of the wiretap channel. The second is a new capacity result for communication subject to state masking in the scenario where the decoder has access to channel-state information.
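The central quantity in the abstract can be illustrated numerically. The sketch below (not from the paper; the channel, blocklength, and codebook are illustrative assumptions) takes a binary symmetric channel, a small codebook, and a uniform target output distribution $ Q $, computes the output distribution induced by a uniformly drawn codeword, and evaluates the per-letter relative entropy $ \frac{1}{n} D(P_{Y^n} \,\|\, Q^{\otimes n}) $:

```python
from itertools import product
from math import log2

# Illustrative toy example (not the paper's construction):
# BSC with assumed crossover probability p, a hand-picked rate-1/2
# codebook of blocklength n, and target Q = Bernoulli(1/2).
p = 0.1
n = 4
codebook = [(0, 0, 0, 0), (0, 1, 1, 0), (1, 0, 1, 1), (1, 1, 0, 1)]
M = len(codebook)  # M = 2^{nR} with R = 1/2

def W(y, x):
    """Channel law W(y|x) of the binary symmetric channel."""
    return 1 - p if y == x else p

def output_prob(yn):
    """P_{Y^n}(y^n) induced by a codeword drawn uniformly from the codebook."""
    total = 0.0
    for c in codebook:
        w = 1.0
        for y, x in zip(yn, c):
            w *= W(y, x)
        total += w / M
    return total

# Relative entropy D(P_{Y^n} || Q^{⊗n}); with Q uniform, Q^{⊗n}(y^n) = 2^{-n}.
D = 0.0
for yn in product((0, 1), repeat=n):
    P = output_prob(yn)
    if P > 0:
        D += P * log2(P * 2 ** n)

print(D / n)  # per-letter divergence, in bits
```

The paper characterizes the smallest value this per-letter divergence can approach as $ n \to \infty $ over all rate-$ R $ codebooks; the toy computation above only evaluates it for one fixed small codebook.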
Date
01-06-2025
Author
Ligong Wang
URI
https://www.aimspress.com/article/doi/10.3934/math.2025590
http://digilib.fisipol.ugm.ac.id/repo/handle/15717717/39120
