
What is the Entropy function?

I have a problem with the Entropy function. A first example:

entro1 = Entropy[{1, 1, 1, 1, 1, 1}] // N
entro2 = Entropy[{0, 1, 1, 4, 1, 1}] // N
entro3 = Entropy[{0, 1, 1, 8, 1, 1}] // N
entro4 = Entropy[{0, 0, 10, 0, 0, 0}] // N
maxentropy = Log[6] // N

Out[369]= 0.
Out[370]= 0.867563
Out[371]= 0.867563
Out[372]= 0.450561
Out[373]= 1.79176

OK for the results entro1 and entro2, but why is entro2 = entro3, and why is entro4, with the maximum concentration, not equal to maxentropy?
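A quick check with Tally makes the pattern visible (just a sketch for reference): Entropy treats each list element as a symbol, so only the counts of repeated values matter, not the values themselves, and Log[6] would be reached only if all six entries were distinct:

Tally[{0, 1, 1, 4, 1, 1}]   (* {{0, 1}, {1, 4}, {4, 1}} -> counts {1, 4, 1} *)
Tally[{0, 1, 1, 8, 1, 1}]   (* {{0, 1}, {1, 4}, {8, 1}} -> the same counts {1, 4, 1} *)
Tally[{0, 0, 10, 0, 0, 0}]  (* {{0, 5}, {10, 1}} -> probabilities {5/6, 1/6} *)
p = {5/6, 1/6};
-Total[p Log[p]] // N       (* 0.450561, matching entro4 *)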

And now a more difficult problem.

country = "France";
ny = ToExpression@
   DialogInput[
    DynamicModule[{name = ""},
     Column[{"Combien de villes retenir?", (* "How many cities to keep?" *)
       InputField[Dynamic[name], String],
       ChoiceButtons[{DialogReturn[name], DialogReturn[]}]}]]];

don = Take[
    QuantityMagnitude[CityData[#, "Population"]] & /@
     CityData[{All, country}], ny] // N;

(* Compute the total entropy *)

entrotot = Entropy[don] // N

hmaxtot = Log[ny] // N
hrelativetot = N[(entrotot/hmaxtot)*100]

The result is always entrotot = hmaxtot, whatever value of ny (the number of cities) I enter. What is the formula used by Entropy[]?
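The likely reason, as a sketch (assuming the retrieved populations are all distinct values): with ny distinct values, Entropy assigns each the probability 1/ny, which gives exactly Log[ny]:

SeedRandom[1];
test = RandomReal[{10^4, 10^6}, 20];   (* stand-in for 20 distinct populations *)
Length[Union[test]] == Length[test]    (* True: no two values coincide *)
Entropy[test] == Log[20]               (* True: each value has probability 1/20 *)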

POSTED BY: André Dauphiné
7 Replies

Sorry, I meant to say EntropyFilter[].

POSTED BY: André Dauphiné

Yes, EntropyFilter uses exactly the same information (or Shannon) entropy formula as Entropy, but over a local neighborhood whose range is given as the 2nd argument of the function. If you read the Details section of the docs, they even give the Shannon formula there. Not sure about conditional entropy.
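A small one-dimensional sketch of that local behavior:

EntropyFilter[{1, 1, 1, 2, 2, 2}, 1] // N
(* 0 inside the constant runs; positive only around the 1 -> 2 transition *)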

POSTED BY: Sam Carrettie

Hello, thank you for your explanations. If I understand correctly, when all objects in a list (numbers or letters) are identical, Entropy returns 0, and it returns Log[n] when all n objects are distinct. Finally, when some objects are identical, the result varies depending on the repetitions.
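A quick check of both limiting cases (a sketch):

Entropy[{x, x, x, x}]            (* 0: all elements identical *)
Entropy[{a, b, c, d}] == Log[4]  (* True: all four elements distinct *)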

In climatology, if over 30 years we have 2950 rainy days and 8000 days without rain, we say that the probability of a given day being rainy is 2950/10950 ≈ 0.27, and the probability of a dry day is therefore ≈ 0.73.
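For such a two-outcome distribution, the Shannon entropy (in base e, as Entropy uses) can be computed directly from those frequencies; note that Entropy takes a list of outcomes rather than probabilities, so a sketch looks like:

p = {2950/10950, 8000/10950};   (* rainy, dry *)
-Total[p Log[p]] // N           (* about 0.583 *)
(* equivalently, feed Entropy a list with those proportions: *)
Entropy[Join[ConstantArray["rain", 2950], ConstantArray["dry", 8000]]] // N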

Two questions, please: Can we calculate a conditional entropy using Entropy? Does ImageEntropy operate on the same principle as Entropy?
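On the first question: as far as I know there is no built-in conditional entropy, but one possible sketch uses the chain rule $H(Y\mid X)=H(X,Y)-H(X)$, with the joint entropy obtained from the list of pairs (the observation lists x and y here are hypothetical):

x = {"wet", "wet", "dry", "dry", "dry", "wet"};    (* hypothetical observations *)
y = {"cold", "cold", "warm", "warm", "cold", "cold"};
Entropy[Transpose[{x, y}]] - Entropy[x] // N       (* H(Y|X) via the chain rule *)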

Thank you again for your support.

POSTED BY: André Dauphiné

What is ImageEntropy? Do you have a link or some reference? You wrote it as a Mathematica function, but I do not see it in the docs.

POSTED BY: Sam Carrettie

This is information (or Shannon) entropy, defined as usual but with the natural logarithm (base $e$):

$$H(X)=-\sum_{x}p(x)\log_{e}p(x).$$

You can see it clearly by computing

Entropy[{a, b, b}]

$$\frac{2}{3} \log \left(\frac{3}{2}\right)+\frac{\log (3)}{3}$$

The probability of $b$ is $P(b)=2/3$ and of $a$ is $P(a)=1/3$. Similarly with

Entropy[{a, b, b, c, c, c, d}]

$$\frac{3}{7} \log \left(\frac{7}{3}\right)+\frac{2}{7} \log \left(\frac{7}{2}\right)+\frac{2 \log(7)}{7}$$
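One can check that this agrees with the count-based formula (a sketch):

data = {a, b, b, c, c, c, d};
p = (Last /@ Tally[data])/Length[data];       (* {1/7, 2/7, 3/7, 1/7} *)
Simplify[Entropy[data] == -Total[p Log[p]]]   (* True *)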

POSTED BY: Sam Carrettie
