Novel Coding Inequalities for Mean Codeword Length and Generalized Entropy using Noiseless Communication
Abstract
Shannon's entropy forms the basis for almost every aspect of information theory and is the cornerstone of the classical source coding theorems, which assume statistical independence and extensive systems. This work investigates the derivation of novel entropy measures by means of the noiseless coding theorem; the results obtained find widespread application in information theory and applied mathematics. To this end, a novel expression for the mean codeword length is introduced, and a relation between the entropy measure and its corresponding codeword length is established. The results open a new avenue for entropy-based coding in non-extensive, information-rich environments.
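For background only, and not as a statement of the paper's novel inequality: the classical noiseless coding theorem relates Shannon entropy to the mean codeword length of any uniquely decodable code. A minimal LaTeX sketch of that standard relation is given below, assuming a D-ary code alphabet and codeword lengths \ell_i satisfying the Kraft inequality; the generalized entropy and length measures studied in the paper extend this kind of relationship to non-extensive settings.

% Classical noiseless coding theorem (baseline only; the paper's
% generalized length and entropy appear in the full text).
% Shannon entropy of a source P = (p_1, ..., p_n), D-ary logarithm:
H(P) = -\sum_{i=1}^{n} p_i \log_D p_i
% Mean codeword length for lengths \ell_i obeying the Kraft
% inequality \sum_{i=1}^{n} D^{-\ell_i} \le 1:
L = \sum_{i=1}^{n} p_i \ell_i
% The theorem bounds the achievable mean length by the entropy:
H(P) \le L < H(P) + 1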
Copyright (c) 2025 Boletim da Sociedade Paranaense de Matemática

This work is licensed under a Creative Commons Attribution 4.0 International License.
When a manuscript is accepted for publication, the authors automatically agree to transfer copyright to the SPM.
The journal uses the Creative Commons Attribution (CC-BY 4.0) license.