Information theory is a branch of applied mathematics, electrical engineering, and computer science concerned with the quantification of information. It is a broad and deep mathematical theory. Shannon introduced the quantitative and qualitative model of communication as a statistical process that underlies information theory. Entropy optimization includes both maximization and minimization. Maximizing entropy is easy and can be done by Lagrange's method, since entropy is a concave function. Because of this concavity, however, minimization of entropy is not so simple. Yet calculating the minimum-entropy probability distribution is necessary, because knowledge of both the maximum- and minimum-entropy probability distributions gives complete information. In the present book, Shannon entropy is minimized subject to any two given moments as constraints. As a particular case, the minimum Shannon entropy for the two moments Harmonic Mean and Harmonic Mean has been calculated for any n, and also for a six-faced die.
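To illustrate the maximization side mentioned above (this is a generic sketch, not the book's own derivation): by Lagrange's method, the maximum-entropy distribution over the faces 1..6 of a die with a prescribed mean takes the exponential form p_i ∝ exp(λ·i), where λ is fixed by the mean constraint. The helper name `maxent_die` and the bisection approach below are illustrative assumptions, not anything from the book:

```python
import math

def maxent_die(target_mean, tol=1e-10):
    """Maximum-entropy distribution over die faces 1..6 with a given mean.

    By Lagrange's method the maximizer has the form p_i ∝ exp(lam * i);
    lam is found by bisection, since the mean is increasing in lam.
    (Illustrative sketch; not taken from the book itself.)
    """
    faces = range(1, 7)

    def mean_for(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0           # bracket for the multiplier lam
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    p = [wi / z for wi in w]
    entropy = -sum(pi * math.log(pi) for pi in p)
    return p, entropy

# With mean 3.5 the constraint is inactive: the maximizer is the
# uniform distribution and the entropy equals ln 6.
p, h = maxent_die(3.5)
```

For an unconstrained mean of 3.5 the result recovers the uniform die, which is the familiar sanity check that maximum entropy with no binding constraints is uniform; minimization under the same constraints, by contrast, pushes probability mass to extreme points and is the harder problem the book addresses.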
|Number of Pages|156|
|Country of Manufacture|India|
|Product Brand|LAP LAMBERT Academic Publishing|
|Product Packaging Info|Box|
|In The Box|1 Piece|
|Product First Available On ClickOnCare.com|2015-07-08|