Thank you, but my problem is not solved.
Consider these four data sets, each of which sums to 1 because the entries are probabilities:
data1 = {0.2, 0.2, 0.2, 0.2, 0.2};
data2 = {0, 0, 1, 0, 0};
data3 = {0.2, 0.4, 0.1, 0., 0.3};
data4 = {0.15, 0.5, 0.05, 0, 0.3};
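A quick check that each list sums to 1:

Total /@ {data1, data2, data3, data4}  (* all equal to 1, up to machine precision *)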
When I do the calculation with Shannon's formula, I get 1.60944 for data1 and 0 for data2:
-(0.2 Log[0.2] + 0.2 Log[0.2] + 0.2 Log[0.2] + 0.2 Log[0.2] +
0.2 Log[0.2])
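For data2 the same formula relies on the convention 0 Log[0] = 0; here is a minimal sketch of that hand calculation, skipping the zero terms with If:

-Total[If[# == 0, 0, # Log[#]] & /@ data2]  (* 0: only 1 Log[1] survives, and Log[1] = 0 *)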
But with the Entropy[] function, Entropy[data1] returns 0, so my result from Shannon's formula equals Log[5] - Entropy[data1].
Now for data2, Shannon's formula gives 0, but Entropy[data2] returns 0.500402, so the result is no longer Log[5] - Entropy[data2].
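If I understand correctly, Entropy[] seems to treat the list as data and to work on the frequencies of the distinct values, not on the values themselves as probabilities. Here is a sketch under that assumption, using Tally to count how often each value occurs in data2; it reproduces 0.500402:

counts = Tally[data2][[All, 2]];  (* {4, 1}: four 0s and one 1 *)
probs = counts/Total[counts];     (* {4/5, 1/5} *)
N[-Total[probs Log[probs]]]       (* 0.500402 *)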
With data3 and data4 it is even more complicated. Shannon's formula gives 1.27985 for data3, but Entropy[data3] gives 1.60944, which is equal to Log[5]. And if I change the probabilities, as in data4, the result of Entropy[data4] remains Log[5], while Shannon's formula gives 1.14212.
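Under the same assumption, Entropy[data3] and Entropy[data4] would both return Log[5] simply because each list contains five distinct values, each occurring once:

N[Entropy[data3]]  (* 1.60944, i.e. Log[5] *)
N[Entropy[data4]]  (* 1.60944 again: all five values are distinct *)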
So I checked with the little program proposed by Fred Garber:
(* x Log[y], with the convention that 0 Log[0] = 0 *)
SetAttributes[LogProduct, Listable];
LogProduct[x_, y_] := x Log[y] /; x != 0 || y != 0
LogProduct[x_, y_] := 0.0 /; x == 0 && y == 0
(* Shannon entropy of a probability vector *)
entro[list_] := -Plus @@ LogProduct[list, list]
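For a quick check, I applied entro to all four lists at once:

entro /@ {data1, data2, data3, data4}
(* {1.60944, 0., 1.27985, 1.14212} *)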
And these are the same results that I calculated with Shannon's formula. I think that Entropy[] is not the Shannon entropy.