
What is the Entropy function?

I have a problem with the Entropy function. A first example:

entro1 = Entropy[{1, 1, 1, 1, 1, 1}] // N
entro2 = Entropy[{0, 1, 1, 4, 1, 1}] // N
entro3 = Entropy[{0, 1, 1, 8, 1, 1}] // N
entro4 = Entropy[{0, 0, 10, 0, 0, 0}] // N
maxentropy = Log[6] // N

Out[369]= 0.
Out[370]= 0.867563
Out[371]= 0.867563
Out[372]= 0.450561
Out[373]= 1.79176

OK for the results entro1 and entro2, but why is entro2 equal to entro3? And why is entro4, with maximum concentration, not equal to maxentropy?
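
A minimal sketch of what may be happening, assuming Entropy[list] computes the Shannon entropy of the frequencies of the distinct elements of list, rather than treating the values themselves as probabilities (empiricalEntropy is a hypothetical helper, introduced here only for illustration):

(* Shannon entropy of the frequencies of the distinct elements of a list *)
empiricalEntropy[list_] := 
 With[{p = N[(Last /@ Tally[list])/Length[list]]}, -Total[p Log[p]]]

empiricalEntropy[{0, 1, 1, 4, 1, 1}]  (* 0.867563, same as entro2 *)
empiricalEntropy[{0, 1, 1, 8, 1, 1}]  (* 0.867563: the element counts {1, 4, 1} are the same, so the entropy is too *)
empiricalEntropy[{0, 0, 10, 0, 0, 0}] (* 0.450561, same as entro4 *)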

And now a more difficult problem:

country = "France";
ny = ToExpression@
   DialogInput[
    DynamicModule[{name = ""}, 
     Column[{"How many cities should be kept?", 
       InputField[Dynamic[name], String], 
       ChoiceButtons[{DialogReturn[name], DialogReturn[]}]}]]];

don = Take[
    QuantityMagnitude[CityData[#, "Population"]] & /@ 
     CityData[{All, country}], ny] // N;

(*Compute the total entropy*)

entrotot = Entropy[don] // N

hmaxtot = Log[ny] // N
hrelativetot = N[(entrotot/hmaxtot)*100]

The result is always entrotot = hmaxtot, whatever value of ny (the number of cities) I enter. What is the formula that Entropy[] uses?
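
If Entropy indeed works on the frequencies of distinct elements, this is expected: city populations are almost always pairwise distinct, so each of the ny values occurs exactly once and the entropy is -ny (1/ny) Log[1/ny] = Log[ny]. A small check with made-up, pairwise distinct population figures:

pops = {2200000., 860000., 510000., 490000., 340000.};  (* hypothetical values *)
Entropy[pops] // N      (* 1.60944 *)
Log[Length[pops]] // N  (* 1.60944, i.e. Log[5] *)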

POSTED BY: André Dauphiné
7 Replies

Sorry, I meant EntropyFilter[].

POSTED BY: André Dauphiné

Yes, EntropyFilter uses exactly the same information (Shannon) entropy formula as Entropy, but over a neighborhood whose range is given as the 2nd argument of the function. If you read the Details section of the docs, they even give the Shannon formula there. I am not sure about conditional entropy.
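
For instance, a small sketch of that local behavior on a toy 1D list (made-up data): each output entry should be the entropy of the values in the radius-1 window around the corresponding position.

data = {0, 0, 0, 1, 1, 1, 0, 0, 0};
EntropyFilter[data, 1] // N
(* windows containing a single distinct value give 0.;
   windows mixing 0s and 1s give a positive entropy *)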

POSTED BY: Sam Carrettie

Hello, thank you for your explanations. If I understood correctly, when all objects in a list (numbers or letters) are identical, the entropy calculated with the Entropy function is 0. And the function gives Log[n] when all n objects are different. Finally, when some objects are identical, the result varies depending on the similarities.
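
A quick check of that summary on toy symbolic lists:

Entropy[{a, a, a, a}] // N  (* 0.: all objects identical *)
Entropy[{a, b, c, d}] // N  (* 1.38629 = Log[4]: all objects different *)
Entropy[{a, a, b, c}] // N  (* 1.03972: some objects identical *)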

In climatology, if over 30 years we have 2950 rainy days and 8000 days without rain, we say that the probability of a rainy day is 2950/10950 ≈ 0.27, and the probability of a dry day is therefore ≈ 0.73.
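
As a worked example, the Shannon entropy of that two-outcome distribution, computed once from the probabilities and once with Entropy on a day-by-day sample list (the two agree here, since the frequencies of the distinct labels are exactly those probabilities):

p = {2950/10950, 8000/10950};
-Total[p Log[p]] // N  (* 0.582666 *)

days = Join[ConstantArray["rain", 2950], ConstantArray["dry", 8000]];
Entropy[days] // N     (* 0.582666 *)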

Two questions, please: Can we calculate a conditional entropy using the Entropy function? And does the ImageEntropy function operate on the same principle as Entropy?
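
For the first question, one possible route (a sketch, not a built-in feature) is the chain rule H(Y|X) = H(X,Y) - H(X); since Entropy applied to a list of pairs counts distinct pairs, this gives, on made-up paired samples:

xy = {{0, 0}, {0, 1}, {1, 0}, {1, 1}, {0, 0}, {0, 0}};  (* toy {x, y} samples *)
N[Entropy[xy] - Entropy[xy[[All, 1]]]]  (* H(y|x) by the chain rule *)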

Again, thank you for your support.

POSTED BY: André Dauphiné

What is ImageEntropy? Do you have a link or a reference? You wrote it as a Mathematica function, but I do not see it in the docs.

POSTED BY: Sam Carrettie

Thank you, but my problem is not solved. Consider these four data sets, each of which sums to 1, because they are probabilities:

data1 = {0.20, 0.2, 0.2, 0.2, 0.2};
data2 = {0, 0, 1, 0, 0};
data3 = {0.2, 0.4, 0.1, 0., 0.3};
data4 = {0.15, 0.5, 0.05, 0, 0.3};

When I do the calculation with Shannon's formula, I get 1.60944 for data1 and 0 for data2:

-(0.2 Log[0.2] + 0.2 Log[0.2] + 0.2 Log[0.2] + 0.2 Log[0.2] + 
   0.2 Log[0.2])

But with the Entropy[] function, for data1 the Shannon value comes out as Log[5] - Entropy[data1] (here Entropy[data1] = 0, since all five entries are identical).

And now for data2: Shannon's formula gives 0, but the Entropy[] function gives 0.500402 for data2, so this time the Shannon value is not Log[5] - Entropy[data2].

With data3 and data4 it is even more complicated. Shannon's formula gives 1.27985 for data3, but Entropy[data3] gives 1.60944, equal to Log[5]. And if I change the probabilities, as in data4, the result remains equal to Log[5] with Entropy[data4], while Shannon's formula gives 1.14212.
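
For reference, the 0.500402 for data2 matches the Shannon formula applied to the frequencies of the distinct values in data2 (0 appears four times, 1 once), which is the distinct-element reading sketched earlier:

Tally[{0, 0, 1, 0, 0}]  (* {{0, 4}, {1, 1}} *)
Entropy[{0, 0, 1, 0, 0}] // N                  (* 0.500402 *)
With[{p = {4/5, 1/5}}, -Total[p Log[p]]] // N  (* 0.500402 *)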

So I verified with the little program proposed by Fred Garber:

(* LogProduct[x, y] gives x Log[y], with the convention 0 Log[0] = 0 *)
SetAttributes[LogProduct, Listable];
LogProduct[x_, y_] := x Log[y] /; x != 0 || y != 0
LogProduct[x_, y_] := 0.0 /; x == 0 && y == 0
(* Shannon entropy of a probability vector: -Sum of p_i Log[p_i] *)
entro[list_] := -Plus @@ LogProduct[list, list]

And I get the same results as I calculated with Shannon's formula.
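
For example, with the data sets above:

entro[{0.2, 0.4, 0.1, 0., 0.3}]   (* 1.27985, the Shannon value for data3 *)
entro[{0.15, 0.5, 0.05, 0, 0.3}]  (* 1.14212, the Shannon value for data4 *)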

I think that Entropy[] is not the Shannon entropy.

POSTED BY: André Dauphiné