
What is the Entropy function?

POSTED BY: André Dauphiné
7 Replies

Yes, EntropyFilter uses exactly the same Information (or Shannon) Entropy formula as Entropy, but over a given range, supplied as the second argument of the function. If you read the Details section of the docs, they even give the Shannon formula there. Not sure about conditional entropy.
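A minimal sketch of that comparison (the data list and the radius 1 below are my own arbitrary choices, not from the original post): Entropy returns one global value for the whole list, while EntropyFilter returns the local entropy of each element's neighborhood.

data = {1, 1, 2, 3, 3, 3};
Entropy[data]            (* global Shannon entropy of the whole list, base e *)
EntropyFilter[data, 1]   (* local entropy over each radius-1 neighborhood *)

On the conditional-entropy point, one possible sketch builds it from Entropy itself via the identity $H(Y\mid X)=H(X,Y)-H(X)$; the paired lists xs and ys below are hypothetical.

xs = {1, 1, 2, 2, 2, 3};
ys = {a, b, a, a, b, b};
Entropy[Transpose[{xs, ys}]] - Entropy[xs]   (* joint entropy of the pairs minus H(X) *)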

POSTED BY: Sam Carrettie

Hello, thank you for your explanations. If I understood correctly, when all objects in a list (numbers or letters) are identical, the entropy calculated with the Entropy function is 0, and it gives Log[n] if all objects are distinct. Finally, when some objects are identical, the result varies depending on the degree of similarity.
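A quick check of those two limiting cases (the symbols below are only illustrative):

Entropy[{a, a, a, a}]   (* 0, since all elements are identical *)
Entropy[{a, b, c, d}]   (* Log[4], since all four elements are distinct *)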

In climatology, if over 30 years we have 2950 rainy days and 8000 days without rain, we say that the probability of a rainy day is 2950/10950 ≈ 0.27, and the probability of a dry day is therefore ≈ 0.73.
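As an illustration (this computation is mine, not part of the original post), the entropy of this rain / dry-day distribution follows directly from the Shannon formula:

p = 2950/10950;   (* probability of a rainy day *)
q = 8000/10950;   (* probability of a dry day *)
-(p Log[p] + q Log[q]) // N   (* ≈ 0.58 nats *)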

Two questions, please: Can we calculate a conditional entropy using the Entropy function? Does the ImageEntropy function operate on the same principle as Entropy?

Again, thank you for your support.

POSTED BY: André Dauphiné

This is Information (or Shannon) Entropy, defined as usual but with the logarithm taken in base $e$:

$$H(X)=-\sum_{x}p(x)\log_{e}p(x).$$

You can see it clearly by computing

Entropy[{a, b, b}]

$$\frac{2}{3} \log \left(\frac{3}{2}\right)+\frac{\log (3)}{3}$$

The probability of $b$ is $P(b)=2/3$ and that of $a$ is $P(a)=1/3$. Similarly, with

Entropy[{a, b, b, c, c, c, d}]

$$\frac{3}{7} \log \left(\frac{7}{3}\right)+\frac{2}{7} \log \left(\frac{7}{2}\right)+\frac{2 \log(7)}{7}$$
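Since the default is the natural logarithm, the base can be changed with the two-argument form Entropy[k, list]; a small numerical check (values are approximate):

Entropy[{a, b, b}] // N      (* ≈ 0.637 nats *)
Entropy[2, {a, b, b}] // N   (* ≈ 0.918 bits *)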

POSTED BY: Sam Carrettie