Message Boards

What is the Entropy function?

POSTED BY: André Dauphiné
7 Replies

Sorry, I meant EntropyFilter[].

POSTED BY: André Dauphiné

Yes, EntropyFilter uses exactly the same information (Shannon) entropy formula as Entropy, but applied locally over the range given as the second argument of the function. If you read the Details section of the docs, they even give the Shannon formula there. I am not sure about conditional entropy.
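To make the "local range" idea concrete outside Mathematica, here is a minimal Python sketch (the names `shannon_entropy` and `entropy_filter_1d` are my own, hypothetical helpers, not Wolfram functions): it computes the natural-log Shannon entropy of the empirical distribution inside a sliding window of radius `r`, which is the same idea EntropyFilter applies to each neighborhood.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (natural log) of the empirical distribution of values."""
    n = len(values)
    return -sum(c / n * math.log(c / n) for c in Counter(values).values())

def entropy_filter_1d(data, r):
    """For each position, entropy of the window of radius r (clipped at the
    edges) -- a rough 1D analogue of EntropyFilter's local computation."""
    return [
        shannon_entropy(data[max(0, i - r): i + r + 1])
        for i in range(len(data))
    ]
```

A constant list gives 0 everywhere, while positions whose window straddles a change in value get positive entropy.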

POSTED BY: Sam Carrettie

What is ImageEntropy? Do you have a link or some reference? You wrote it as a Mathematica function, but I do not see it in the docs.

POSTED BY: Sam Carrettie

$0.2$ is not the probability and should not go into the Log. The fact that $0.2$ is listed 5 times also has nothing to do with the calculation. In reality, $0.2$ is the only distinct element of data1, so its probability is 1 when you pick elements randomly from the list (not $0.2$). So you have $1 \cdot \log 1$, and that is zero:

data1 = {0.20, 0.2, 0.2, 0.2, 0.2};
Entropy[data1]
(*0*)
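The same computation, spelled out step by step in Python as a sanity check (the formula is standard Shannon entropy; the variable names are mine): the list has one distinct value with empirical probability 1, so the sum collapses to $1 \cdot \log 1 = 0$.

```python
import math
from collections import Counter

data1 = [0.20, 0.2, 0.2, 0.2, 0.2]
counts = Counter(data1)                            # one distinct value, count 5
probs = [c / len(data1) for c in counts.values()]  # [1.0]
H = -sum(p * math.log(p) for p in probs)           # 1 * log(1) = 0
```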

Please consider these two identical results:

Entropy[Append[Table[0.2, 35], 0.3]]

$$\frac{35}{36} \log \left(\frac{36}{35}\right)+\frac{\log (36)}{36}$$

Entropy[Append[Table[a, 35], b]]

$$\frac{35}{36} \log \left(\frac{36}{35}\right)+\frac{\log (36)}{36}$$
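One can check numerically that the closed form above matches, and that the symbolic list gives the same value: only the multiplicities $\{35, 1\}$ enter the formula, not the values themselves. A quick Python check (helper name `H` is my own):

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy (natural log) of the empirical distribution of xs."""
    n = len(xs)
    return -sum(c / n * math.log(c / n) for c in Counter(xs).values())

data = [0.2] * 35 + [0.3]
closed_form = 35 / 36 * math.log(36 / 35) + math.log(36) / 36
# "a" and "b" have the same multiplicities {35, 1}, hence the same entropy
symbolic = ["a"] * 35 + ["b"]
```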

POSTED BY: Sam Carrettie

This is the information (or Shannon) entropy, defined as usual but with the natural logarithm (base $e$):

$$H(X)=-\sum_{{x}}p(x)\log_{e}p(x).$$

You can see it clearly by computing

Entropy[{a, b, b}]

$$\frac{2}{3} \log \left(\frac{3}{2}\right)+\frac{\log (3)}{3}$$

The probability of $b$ is $P(b)=2/3$, and that of $a$ is $P(a)=1/3$. Similarly with

Entropy[{a, b, b, c, c, c, d}]

$$\frac{3}{7} \log \left(\frac{7}{3}\right)+\frac{2}{7} \log \left(\frac{7}{2}\right)+\frac{2 \log(7)}{7}$$
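Both closed forms can be verified numerically from the definition. Here is a short Python check (the helper `H` is my own name for the natural-log Shannon entropy of a list): for `{a, b, b, c, c, c, d}` the probabilities are $P(a)=P(d)=1/7$, $P(b)=2/7$, $P(c)=3/7$, and the two $1/7$ terms combine into $\frac{2\log 7}{7}$.

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy (natural log) of the empirical distribution of xs."""
    n = len(xs)
    return -sum(c / n * math.log(c / n) for c in Counter(xs).values())

# {a, b, b}: P(a) = 1/3, P(b) = 2/3
h1 = H(list("abb"))
expected1 = 2 / 3 * math.log(3 / 2) + math.log(3) / 3

# {a, b, b, c, c, c, d}
h2 = H(list("abbcccd"))
expected2 = 3 / 7 * math.log(7 / 3) + 2 / 7 * math.log(7 / 2) + 2 * math.log(7) / 7
```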

POSTED BY: Sam Carrettie