Description: Shannon entropy measures the average information content that is missing when one does not know the value of a random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating the messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation of the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in that alphabet.
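As a quick illustration of both statements, here is a minimal MATLAB sketch (this is not the package's shannon_entro.m, whose interface is not shown here): it computes the Shannon entropy of a discrete probability vector in bits, then the source-coding lower bound on the average code length when encoding into an alphabet of D symbols.

    p = [0.5 0.25 0.125 0.125];   % example distribution (must sum to 1)
    p = p(p > 0);                 % drop zero-probability outcomes (0*log(0) counts as 0)
    H = -sum(p .* log2(p));       % Shannon entropy in bits; 1.75 for this p
    D = 4;                        % size of the target coding alphabet
    minAvgLen = H / log2(D);      % lower bound: alphabet symbols per source symbol (0.875 here)

With a binary alphabet (D = 2) the bound is the entropy itself, which is why entropy in bits is read directly as "bits per symbol" for optimal binary codes.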
File list (check whether you need any of these files):
entropy\escorTsallis_entro.m
entropy\K_q_escorTsallis.m
entropy\K_q_renyi.m
entropy\K_q_Tsallis.m
entropy\read me.txt
entropy\renyi_entro.m
entropy\shannon_entro.m
entropy\Tsallis_entro.m
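The file names suggest the package also covers the Rényi and Tsallis generalizations of Shannon entropy. For orientation only, here are minimal MATLAB sketches of the standard textbook definitions; these are illustrative and are not the package's renyi_entro.m or Tsallis_entro.m, whose interfaces may differ.

    p = [0.5 0.25 0.125 0.125];                % discrete probability vector
    q = 2;                                     % entropic index (q ~= 1)
    H_renyi   = log(sum(p.^q)) / (1 - q);      % Renyi entropy (nats)
    S_tsallis = (1 - sum(p.^q)) / (q - 1);     % Tsallis entropy

Both reduce to the Shannon entropy in the limit q -> 1, which is the usual sanity check when comparing outputs across the functions in this package.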