Dear Christian,
What changed in 11.3 is that Mathematica has aligned the behavior of machine-precision arithmetic with the rest of the industry: a machine-precision underflow no longer triggers an exception and a fall-back to Mathematica's extended-precision arithmetic; instead, it flushes to zero.
For example, using SciPy:
In [12]: scipy.stats.distributions.norm.pdf(39.0) # PDF flushes to zero
Out[12]: 0.0
In [13]: scipy.stats.distributions.norm.logpdf(39.0) # solution: use logPDF
Out[13]: -761.4189385332047
With this in mind, your example can easily be made to work by replacing Log[PDF[dist, x]] with LogLikelihood[dist, {x}].
Let me also reparameterize your distribution somewhat, and work in terms of gbdist[n, hh/(qq + hh), 1/(hh + 1 + qq)] rather than gbdist[n, q, del], since the inequality constraints on q and del then translate into simple positivity of hh and qq. Once the parameters are fitted, we can easily recover q and del.
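The change of variables is easy to invert by hand: since qq + hh = 1/del - 1, we get hh = q (1/del - 1) and qq = (1 - q)(1/del - 1). A small Python sketch (Python rather than Mathematica, to match the SciPy snippet above; the helper names are mine) verifying the round trip:

```python
import math

def to_internal(q, delta):
    """Map (q, del) to the positivity-constrained parameters (qq, hh)."""
    s = 1.0 / delta - 1.0           # s = qq + hh
    return (1.0 - q) * s, q * s     # (qq, hh)

def to_natural(qq, hh):
    """Recover (q, del) from (qq, hh)."""
    return hh / (qq + hh), 1.0 / (1.0 + qq + hh)

qq, hh = to_internal(0.39, 0.043)
q, delta = to_natural(qq, hh)
print(q, delta)  # round-trips back to 0.39, 0.043
```

Note that 0 < q < 1 and 0 < del < 1 hold exactly when qq > 0 and hh > 0, which is what makes the constraints for the optimizer so simple.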
gbdist[n_, q_, del_] :=
ParameterMixtureDistribution[BinomialDistribution[n, p],
Distributed[p,
BetaDistribution[q (-1 + 1/del), ((-1 + q) (-1 + del))/del]]]
loglikelihood[data_, {qq_, hh_}] :=
Sum[LogLikelihood[
gbdist[Part[data, i, 1], hh/(qq + hh),
1/(hh + 1 + qq)], {Part[data, i, 2]}], {i, Length[data]}]
antdata = {{200, 40}, {200, 78}, {200, 97}, {200, 81}, {200, 96}};
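As a side note, gbdist is exactly the beta-binomial distribution, and in the (qq, hh) parameterization the Beta parameters simplify to a = q (1/del - 1) = hh and b = (1 - q)(1/del - 1) = qq. So the same log-likelihood can be written in SciPy (version 1.4 or later, which ships scipy.stats.betabinom) as a sketch like:

```python
from scipy.stats import betabinom  # requires SciPy >= 1.4

antdata = [(200, 40), (200, 78), (200, 97), (200, 81), (200, 96)]

def loglikelihood(data, qq, hh):
    # gbdist[n, hh/(qq + hh), 1/(hh + 1 + qq)] is BetaBinomial(n, a=hh, b=qq)
    return sum(betabinom.logpmf(k, n, hh, qq) for n, k in data)

print(loglikelihood(antdata, 13.5332, 8.66654))  # about -22.3822
```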
One last thing we need to do is replace Log[Beta[a, b]] with a combination of LogGamma terms, for the same reason of extending the range of applicability of machine precision:
ll = loglikelihood[antdata, {qq, hh}] /.
HoldPattern[Log[Beta[a_, b_]]] :>
With[{aa = Simplify[a], bb = Simplify[b]},
LogGamma[aa] + LogGamma[bb] - LogGamma[Simplify[aa + bb]]];
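The same trick is available outside Mathematica: compute the log-beta directly from log-gamma rather than taking the log of Beta. A stdlib-only Python illustration (the parameter values are arbitrary, chosen to force the underflow):

```python
import math

def log_beta(a, b):
    # log B(a, b) via log-gamma; stays finite even when B(a, b) underflows
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

lb = log_beta(600.0, 600.0)
print(lb)            # about -833.7, perfectly representable
print(math.exp(lb))  # 0.0 -- B(600, 600) itself flushes to zero
```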
Now NMaximize has no issue finding the solution:
In[325]:= {maxVal, params} =
NMaximize[{ll, hh > 0 && qq > 0}, {qq, hh}]
Out[325]= {-22.3822, {qq -> 13.5332, hh -> 8.66654}}
In[326]:= {hh/(qq + hh), 1/(hh + 1 + qq)} /. params
Out[326]= {0.390389, 0.0431039}
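For what it's worth, the same maximization can be reproduced in SciPy via the beta-binomial equivalence noted above (the optimizer and starting point are my choices, not part of the original fit):

```python
from scipy.optimize import minimize
from scipy.stats import betabinom  # requires SciPy >= 1.4

antdata = [(200, 40), (200, 78), (200, 97), (200, 81), (200, 96)]

def negloglik(params):
    # gbdist[n, hh/(qq + hh), 1/(hh + 1 + qq)] is BetaBinomial(n, a=hh, b=qq)
    qq, hh = params
    return -sum(betabinom.logpmf(k, n, hh, qq) for n, k in antdata)

# enforce qq > 0 and hh > 0 via bounds, mirroring the NMaximize constraints
res = minimize(negloglik, x0=[1.0, 1.0], method="L-BFGS-B",
               bounds=[(1e-8, None), (1e-8, None)])
qq, hh = res.x
print(-res.fun)                           # about -22.3822
print(hh / (qq + hh), 1 / (1 + qq + hh))  # about 0.3904, 0.0431
```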
I hope you will give Mathematica a chance, and continue using it.
Sincerely,
Oleksandr Pavlyk