How to calculate the entropy of a discrete probability distribution?

POSTED BY: Jeremy Murphy
14 Replies

Jeremy,

one problem is syntax: it probably should read:

pmf = If[i == n, q, p];

And: where is the dependence on n, and what is "i"? If it is some constant parameter, it ought to appear in your wanted result.
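
For what it is worth, here is a sketch of the corrected computation: summing the plain geometric term over i (rather than n), and assuming T > 1, reproduces the closed form quoted below.

q = ((T - 1)/T)^i;
p = (1/T) q;  (* geometric pmf with success probability 1/T *)

Assuming[T > 1,
 -Sum[#, {i, 0, Infinity}] & /@ (p Log[p] // PowerExpand // Expand) // Simplify]

(* should reduce to T Log[T] - (T - 1) Log[T - 1] *)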

POSTED BY: Henrik Schachner

I successfully solved another pmf, but then I accidentally lost the equation after copying down the result. Now I can't recreate the result!

This is what I'm trying:

q = ((T - 1)/T)^i;
p = (1/T) q;
pmf = if[i = n, q, p];
-Sum[#, {n, 0, Infinity}] & /@ (pmf Log[pmf] // PowerExpand // Expand)

And I previously got a result of:

-((-1 + T) Log[-1 + T]) + T Log[T]

which I verified numerically against other data I have. But now when I run the commands above, I get an unwieldy result that is hardly simplified at all.

What am I doing wrong in trying to recreate the result? It seemed so easy the first time.
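
As a quick numeric spot-check of that closed form (T = 5 is an arbitrary illustrative value):

(* truncated direct sum vs. the closed form; both entries should agree, ≈ 2.502 *)
With[{T = 5.},
 {-Sum[(1/T) ((T - 1)/T)^i Log[(1/T) ((T - 1)/T)^i], {i, 0, 1000}],
  T Log[T] - (T - 1) Log[T - 1]}]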

POSTED BY: Jeremy Murphy

Thanks, Vitaliy. Yes, you would think that generalization of Shannon entropy would do the job, but I had a quick play with it and wasn't able to get a result. I probably just don't know what I'm doing, though.
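
For reference, one natural built-in route along those lines (a guess at the sort of thing meant; it typically does not come back as a closed form for the Poisson case, since none exists):

(* entropy as an expectation of -Log[pmf] under the distribution: *)
Expectation[-Log[PDF[PoissonDistribution[\[Lambda]], k]],
 k \[Distributed] PoissonDistribution[\[Lambda]]]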

POSTED BY: Jeremy Murphy

Thanks for the quick intro, Henrik!

POSTED BY: Jeremy Murphy

Simpler:

 p = \[Lambda]^k Exp[-\[Lambda]]/k!;
 -Sum[#, {k, 0, Infinity}] & /@ (p Log[p] // PowerExpand // Expand)

 (*\[Lambda] - \[Lambda]*Log[\[Lambda]] - Sum[-((\[Lambda]^k*Log[k!])/(E^\[Lambda]*k!)), {k, 0, Infinity}]*)

Test:

Integrate[Sin[x] + Cos[x] + Exp[-Pi*x^2 - 1/x], x](*Can't integrate all terms*)

but:

Integrate[#, x] & /@ (Sin[x] + Cos[x] + Exp[-Pi*x^2 - 1/x])
(*-Cos[x] + Integrate[E^(-x^(-1) - Pi*x^2), x] + Sin[x]*)
POSTED BY: Mariusz Iwaniuk

Henrik, I'm completely new to Mathematica. I understand most of what you've written there, but could you briefly explain, or link to an explanation of, a few things:

  • After the end of FunctionExpand, what are that "/." operator and the expression with Gamma doing? (I know what the Gamma function itself is.)
  • What is the # symbol in the last Sum doing?
  • What is the meaning of "& /@" in the last Sum?

Thanks.
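
For future readers, here is a toy illustration of those three constructs (the expressions are invented purely for demonstration):

(* /. applies replacement rules; :> is a delayed rule whose pattern n_ matches anything: *)
x^2 + Gamma[1 + m] /. Gamma[1 + n_] :> n!
(* x^2 + m! *)

(* # is the argument slot of a pure function; & closes the function: *)
(#^2 &)[3]
(* 9 *)

(* /@ maps a function over the elements (here, the terms of a sum): *)
f /@ (a + b + c)
(* f[a] + f[b] + f[c] *)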

POSTED BY: Updating Name

Thanks, Henrik. Any example is helpful at the moment; it might help me to derive a general solution.

POSTED BY: Jeremy Murphy

Thanks, Mariusz. That question on Mathematica Stack Exchange is closely related, and it helped me find other people asking the same question there!

POSTED BY: Jeremy Murphy

Jeremy,

the entropy for a distribution $p(k)$ is defined as

$$ -\sum_{k=0}^\infty p(k)\ln p(k) $$

This is what I did:

(* definition of Poisson probability: *)
p = \[Lambda]^k Exp[-\[Lambda]]/k!; 

(* entropy - does not give a closed expression: *)
-Sum[p Log[p], {k, 0, Infinity}]  

(* expanding Log[p]: *)
logp = FunctionExpand[Log[p], Assumptions -> {k >= 0, \[Lambda] > 0}] /. Gamma[1 + n_] :> n!

(* doing summation term-wise: *)
-Simplify[Sum[p #, {k, 0, Infinity}] & /@ logp]

This finally gives your wanted result:

(image of the resulting closed form)
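
As a sanity check, a direct numeric summation at a specific value (λ = 2, chosen arbitrarily) should agree with that expression evaluated at the same λ:

(* truncated direct sum of -p Log[p] at \[Lambda] = 2: *)
-Sum[(p Log[p]) /. \[Lambda] -> 2., {k, 0, 100}]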

It is hard to tell how to do this for an arbitrary distribution, so I do not know if this really helps.

POSTED BY: Henrik Schachner

Maybe this helps.

POSTED BY: Mariusz Iwaniuk