Message Boards


Can EvaluationMonitor Give FiniteDifference Gradients in FindMinimum?

I'd like to use EvaluationMonitor to see what gradient values FindMinimum is calculating when I set Gradient :> {"FiniteDifference", "DifferenceOrder" -> 1}, since that seems to work better than the gradients I calculate symbolically.

POSTED BY: Frank Kampas
4 Replies




Based on an error message about a dimension list, I was able to calculate a Jacobian, as follows:

In[39]:= nf2 = 
  Experimental`CreateNumericalFunction[{x, y}, {x^2 + 2 y^4, 
    x*y}, {2}];

In[40]:= nf2["Jacobian"[{1, 2}]]

Out[40]= {{2., 64.}, {2., 1.}}

Am I correct in assuming that the third argument is the dimensions of the function's output?

POSTED BY: Frank Kampas
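
That reading appears consistent with a quick check: the third argument seems to specify the dimensions of the result, {} for a scalar and {2} for a length-2 vector. A minimal sketch (the name nf1 is illustrative, and the Out values below are what the expressions evaluate to by hand):

    In[41]:= nf2[{1, 2}]

    Out[41]= {33., 2.}

    In[42]:= nf1 = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {}];
             nf1[{1, 2}]

    Out[42]= 33.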

Sure, for example

    nf @ "Hessian"[{1, 1}]
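
Written out self-contained, that would look something like the following sketch; the numerical result should be close to the symbolic Hessian of x^2 + 2 y^4 at {1, 1}, i.e. {{2., 0.}, {0., 24.}}:

    In[5]:= nf = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {}];
            nf @ "Hessian"[{1, 1}]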

POSTED BY: Ilian Gachevski

Thanks. That looks like a useful technique. Can a Hessian be calculated numerically in a similar fashion?

POSTED BY: Frank Kampas

In FindMinimum, it's possible to use

 Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1, EvaluationMonitor :> Sow[{x, y}]}

to find out where the gradient is being evaluated and then compute the gradient at each of these points, for example

In[4]:= nf = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {}, 
               Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1}];
        nf["Gradient"[{1, 1}]]

Out[4]= {2., 8.}
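
Putting the two steps together, a minimal sketch (the starting point {1, 1} is illustrative): Reap collects every point Sow records inside the monitor, and the finite-difference gradient is then evaluated at each of those points with the nf defined above.

    In[5]:= {res, pts} = Reap[
              FindMinimum[x^2 + 2 y^4, {{x, 1}, {y, 1}},
               Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1,
                 EvaluationMonitor :> Sow[{x, y}]}]];
            nf["Gradient"[#]] & /@ First[pts]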
POSTED BY: Ilian Gachevski