A Two-Parameter Generalization of Renyi’s Entropy and Its New Mean Code-word Length
Subject Areas: Electrical engineering (electronics, telecommunications, power, control)
Alireza Hosseinzadeh 1, Mehdi Yaghoobi Avval Riabi 2
1 - Department of Statistics, Gonabad Branch, Islamic Azad University, Gonabad, Iran
2 - Department of Statistics, Gonabad Branch, Islamic Azad University, Gonabad, Iran
Keywords: Shannon’s entropy, Renyi’s entropy, mean code-word length, Kraft’s inequality, Huffman code.
Abstract:
At all levels of society, systems have been introduced that deal with the transmission, storage, and processing of information; indeed, the society we live in is often called the information society. It therefore seems natural to ask how information can be defined and measured. Claude Shannon (1948) was the first to link the concepts of information and probability, in his famous article "A Mathematical Theory of Communication", introducing a probability-based mathematical model for evaluating the amount of information contained in the probability distribution of a random variable.
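In modern notation, for a discrete random variable with probability distribution P = (p_1, ..., p_n), Shannon’s measure of information is the entropy

\[ H(P) \;=\; -\sum_{i=1}^{n} p_i \log_{2} p_i , \]

measured in bits when the logarithm is taken to base 2; for example, a fair coin toss has entropy \(\log_{2} 2 = 1\) bit.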
In communication theory it was believed, until 1948, that increasing the rate of information transmission over a communication channel necessarily increases the probability of error. Shannon proved that this claim is false as long as the transmission rate stays below the capacity of the channel.
Owing to the importance of this field of study, numerous generalizations of Shannon’s entropy have been proposed, each with properties that lend it wide application in different fields. Renyi’s entropy was introduced by Alfred Renyi (1961) as a one-parameter generalization of Shannon’s entropy, and this generalized information measure shares many of the properties of Shannon’s entropy.
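For a parameter α > 0 with α ≠ 1, Renyi’s entropy of order α is

\[ H_{\alpha}(P) \;=\; \frac{1}{1-\alpha}\,\log_{2}\!\left(\sum_{i=1}^{n} p_i^{\alpha}\right), \]

which reduces to Shannon’s entropy H(P) in the limit α → 1.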
In this paper, we first review Shannon’s and Renyi’s entropies and some of their important properties. We then study a two-parameter generalization of Renyi’s entropy, together with its corresponding new mean code-word length, proposed by Bhatt et al. (2023), and its important properties. In particular, for selected values of the parameters, we evaluate the efficiency of the Huffman code with respect to this generalized entropy and its new mean code-word length.
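As a concrete illustration of the kind of efficiency evaluation described above, the following minimal Python sketch builds a binary Huffman code for a small distribution, checks Kraft’s inequality (Σ_i 2^(-l_i) ≤ 1 for any binary prefix code), and compares Campbell’s mean code-word length of order t [11] with Renyi’s entropy of order α = 1/(1+t). This is our own sketch of the classical one-parameter special case, not the two-parameter quantities of Bhatt et al. (2023), whose exact definitions appear in the body of the paper; the function names (huffman_lengths, renyi_entropy, campbell_length) are ours.

# Illustrative sketch (not the authors' implementation): Huffman code
# efficiency with respect to Renyi's entropy and Campbell's length.
import heapq
import math

def huffman_lengths(probs):
    """Return binary Huffman code-word lengths for a probability list."""
    # Each heap item: (subtree probability, unique id, symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol they contain.
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

def renyi_entropy(probs, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in bits."""
    return math.log2(sum(p ** alpha for p in probs)) / (1 - alpha)

def campbell_length(probs, lengths, t):
    """Campbell's mean code-word length of order t > 0 (base 2)."""
    return math.log2(sum(p * 2 ** (t * l) for p, l in zip(probs, lengths))) / t

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_lengths(probs)
print("code-word lengths:", lengths)                # [1, 2, 3, 3]
print("Kraft sum:", sum(2 ** -l for l in lengths))  # <= 1 for a prefix code

t = 0.5
alpha = 1 / (1 + t)
H = renyi_entropy(probs, alpha)
L = campbell_length(probs, lengths, t)
# Campbell's coding theorem gives H_alpha <= L_t, so the ratio is <= 1.
print(f"H_alpha = {H:.4f}, L_t = {L:.4f}, efficiency = {H / L:.4f}")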
References:
[1] H. Nyquist, "Certain Factors Affecting Telegraph Speed," Bell System Technical Journal, vol. 3, no. 2, pp. 324-346, 1924.
[2] R. V. L. Hartley, "Transmission of Information," Bell System Technical Journal, vol. 7, pp. 535-563, 1928.
[3] C. E. Shannon, "A Mathematical Theory of Communication," Bell System Technical Journal, vol. 27, pp. 379-423, 1948.
[4] T. Cover and J. Thomas, Elements of Information Theory. Wiley, 1991. ISBN 0-471-06259-6.
[5] R. W. Hamming, "Error Detecting and Error Correcting Codes," Bell System Technical Journal, vol. 29, no. 2, pp. 147-160, 1950.
[6] D. A. Huffman, "A Method for the Construction of Minimum-Redundancy Codes," Proceedings of the IRE, vol. 40, pp. 1098-1101, 1952.
[7] A. Renyi, "On Measures of Entropy and Information," in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, Volume 1: Contributions to the Theory of Statistics, University of California Press, pp. 547-562, 1961.
[8] J. Havrda and F. Charvat, "Quantification Method of Classification Processes. Concept of Structural a-Entropy," Kybernetika, vol. 3, no. 1, pp. 30-35, 1967.
[9] B. D. Sharma and D. P. Mittal, "New Non-additive Measures of Entropy for Discrete Probability Distributions," J. Math. Sci., vol. 10, pp. 28-40, 1975.
[10] A. H. Bhat, N. A. Siddiqui, I. A. Mageed, S. Alkhazaleh, V. R. Das, and M. A. K. Baig, "Generalization of Renyi’s Entropy and its Application in Source Coding," Appl. Math. Inf. Sci., vol. 17, no. 5, pp. 941-948, 2023.
[11] L. L. Campbell, "A Coding Theorem and Renyi’s Entropy," Information and Control, vol. 8, no. 4, pp. 423-429, 1965.