Research Article

A Coding Theorem on Havrda-Charvat and Tsallis's Entropy

Published on May 2012 by Satish Kumar, Arvind Kumar
National Workshop-Cum-Conference on Recent Trends in Mathematics and Computing 2011
Foundation of Computer Science USA
RTMC - Number 1
May 2012
Authors: Satish Kumar, Arvind Kumar

Satish Kumar, Arvind Kumar. A Coding Theorem on Havrda-Charvat and Tsallis's Entropy. National Workshop-Cum-Conference on Recent Trends in Mathematics and Computing 2011. RTMC, 1 (May 2012), 11-15.

@article{
author = { Satish Kumar, Arvind Kumar },
title = { A Coding Theorem on Havrda-Charvat and Tsallis's Entropy },
journal = { National Workshop-Cum-Conference on Recent Trends in Mathematics and Computing 2011 },
issue_date = { May 2012 },
volume = { RTMC },
number = { 1 },
month = { May },
year = { 2012 },
issn = { 0975-8887 },
pages = { 11-15 },
numpages = { 5 },
url = { /proceedings/rtmc/number1/6620-1003/ },
publisher = {Foundation of Computer Science (FCS), NY, USA},
address = {New York, USA}
}
%0 Proceeding Article
%1 National Workshop-Cum-Conference on Recent Trends in Mathematics and Computing 2011
%A Satish Kumar
%A Arvind Kumar
%T A Coding Theorem on Havrda-Charvat and Tsallis's Entropy
%J National Workshop-Cum-Conference on Recent Trends in Mathematics and Computing 2011
%@ 0975-8887
%V RTMC
%N 1
%P 11-15
%D 2012
%I International Journal of Computer Applications
Abstract

A relation between Shannon entropy and Kerridge inaccuracy, known as the Shannon inequality, is well known in information theory. In this communication, we first generalize the Shannon inequality and then give its application in coding theory.
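The quantities underlying this result can be stated concretely. The sketch below, with illustrative definitions and sample distributions not taken from the paper, computes Shannon entropy, Kerridge inaccuracy, and the Kraft sum for a set of codeword lengths; the Shannon inequality H(P) <= H(P; Q) and the Kraft inequality are visible in the numbers.

```python
import math

def shannon_entropy(p):
    """H(P) = -sum_i p_i log2 p_i, the Shannon entropy of distribution P."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kerridge_inaccuracy(p, q):
    """H(P; Q) = -sum_i p_i log2 q_i, Kerridge's inaccuracy of Q relative to P."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kraft_sum(lengths, D=2):
    """sum_i D^(-l_i); the Kraft inequality requires this to be <= 1
    for a uniquely decipherable D-ary code."""
    return sum(D ** (-l) for l in lengths)

p = [0.5, 0.25, 0.125, 0.125]        # true distribution
q = [0.25, 0.25, 0.25, 0.25]         # estimated distribution
lengths = [1, 2, 3, 3]               # codeword lengths l_i = -log2 p_i

H = shannon_entropy(p)               # 1.75 bits
Hpq = kerridge_inaccuracy(p, q)      # 2.0 bits; H(P) <= H(P; Q)
K = kraft_sum(lengths)               # 1.0, so the Kraft inequality holds
L = sum(pi * l for pi, l in zip(p, lengths))  # mean codeword length = 1.75
```

Here the mean codeword length attains the entropy because each l_i equals -log2 p_i exactly; for general distributions the noiseless coding theorem gives H(P) <= L < H(P) + 1.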

References
  1. C. Arndt, Information Measure-Information and its description in Science and Engineering, Springer, Berlin, 2001.
  2. M. A. K. Baig and Rayees Ahmad Dar, Coding theorems on a generalized information measures, J. KSIAM, vol. 11 (2), 2007, 3-8.
  3. L. L. Campbell, A coding theorem and Renyi's entropy, Information and Control, vol. 8, 1965, 423-429.
  4. Z. Daroczy, Generalized Information Functions, Information and Control, vol. 16, 1970, 36-51.
  5. A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York, 1958.
  6. J. F. Havrda and F. Charvat, Quantification Methods of Classification Process, The Concept of structural α-entropy, Kybernetika, vol. 3, 1967, 30-35.
  7. D. F. Kerridge, Inaccuracy and inference, J. Roy. Statist. Soc., Ser. B, vol. 23, 1961, 184-194.
  8. A. B. Khan, B. A. Bhat and S. Pirzada, Some Results on a Generalized Useful Information Measure, Journal of Inequalities in Pure and Applied Mathematics, Vol. 6, Issue 4, Article 117, 2005.
  9. P. Nath, An axiomatic characterization of inaccuracy for finite generalized discrete probability Distributions, Opsearch, vol. 7, 1970, 115-133.
  10. P. Nath and D. P. Mittal, A generalization of Shannon's inequality and its application in coding theory, Inform. and Control, vol. 23, 1973, 439-445.
  11. S. Pirzada and B. A. Bhat, Some more results in coding theory, J. KSIAM, vol. 10(2), 2006, 123-131.
  12. A. Renyi, On measures of entropy and information, Proc. 4th Berkeley Symp. Math. Stat. Prob., vol. 1, 1961, 547-561.
  13. Satish Kumar, Some More Results on R-Norm Information Measure, Tamkang Journals of Mathematics, Vol. 40(1), 2009, 41-58.
  14. C. E. Shannon, A mathematical theory of communication, Bell System Techn. J., vol. 27, 1948, 379-423, 623-656.
  15. B. D. Sharma and D. P. Mittal, New non-additive measures of entropy for discrete probability distributions, J. Math. Sci., vol. 10, 1975, 28-40.
  16. B. D. Sharma and Ram Autar, On characterization of a generalized inaccuracy measure in information theory, Journal of Applied Probability, vol. 10, 1973, 464-468.
  17. O. Shisha, Inequalities, Academic Press, New York, 1967.
  18. R. P. Singh, R. Kumar and R. K. Tuteja, Application of Holder's Inequality in Information Theory, Information Sciences, vol. 152, 2003, 145-154.
  19. C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, J. Stat. Phys., vol. 52, 1988, 479.
  20. I. Vajda, On measure of entropy and information, Proc. Fourth Berk. Symp. in Math., Stat. and Prob., vol. 1, 1961, 547-561.
  21. J. C. A. van der Lubbe, On certain coding theorems for the information of order α and of type β. In: Trans. Eighth Prague Conf. Inform. Theory, Statist. Dec. Functions, Random Processes, vol. C, Academia, Prague, 1978, pp. 253-266.
Index Terms

Computer Science
Information Sciences

Keywords

Shannon Inequality, Codeword Length, Holder's Inequality, Kraft Inequality, Optimal Code Length