Can EvaluationMonitor Give FiniteDifference Gradients in FindMinimum?

Posted 9 years ago | 2336 Views | 4 Replies | 3 Total Likes
I'd like to use EvaluationMonitor to see what gradient values FindMinimum is computing when I set Gradient :> {"FiniteDifference", "DifferenceOrder" -> 1}, since that seems to work better than the gradients I calculate symbolically.
4 Replies
Posted 9 years ago
    ?Experimental`CreateNumericalFunction

    Experimental`CreateNumericalFunction

    Attributes[Experimental`CreateNumericalFunction] = {Protected}

    Options[Experimental`CreateNumericalFunction] =
     {Compiled -> Automatic, ErrorReturn -> Automatic, Evaluated -> Automatic,
      EvaluationMonitor -> None, Gradient -> Automatic, Hessian -> Automatic,
      Jacobian -> Automatic, Message -> Automatic, SampleArgument -> False,
      StepMonitor -> None, WorkingPrecision -> MachinePrecision}

Based on an error message about a dimension list, I was able to calculate a Jacobian as follows:

    In[39]:= nf2 = Experimental`CreateNumericalFunction[{x, y}, {x^2 + 2 y^4, x*y}, {2}];

    In[40]:= nf2["Jacobian"[{1, 2}]]

    Out[40]= {{2., 64.}, {2., 1.}}

Am I correct in assuming that the third argument is the dimensions of the function?
Posted 9 years ago
Sure, for example

    nf@"Hessian"[{1, 1}]
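For completeness, here is a self-contained sketch of my own (not from the original reply), assuming the Hessian option of Experimental`CreateNumericalFunction accepts the same finite-difference method specification as the Gradient option:

```mathematica
(* Create a numerical function whose Hessian is computed by finite differences.
   The {} third argument means the function is scalar-valued. *)
nf = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {},
   Hessian -> {"FiniteDifference"}];

(* Request the numerical Hessian at the point {1, 1}; it should closely
   approximate the symbolic Hessian {{2, 0}, {0, 24 y^2}} evaluated there. *)
nf@"Hessian"[{1, 1}]
```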
Posted 9 years ago
 Thanks. That looks like a useful technique. Can a Hessian be calculated numerically in a similar fashion?
Posted 9 years ago
In FindMinimum, it's possible to use

    Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1, EvaluationMonitor :> Sow[{x, y}]}

to find out where the gradient is being evaluated, and then compute the gradient at each of these points. For example:

    In[4]:= nf = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {},
        Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1}];
    nf["Gradient"[{1, 1}]]

    Out[4]= {2., 8.}
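Putting the two pieces of this reply together, the workflow might look like the following sketch (my own illustration, combining Reap/Sow with the option shown above; the objective function and starting point are chosen for the example):

```mathematica
(* Collect every point at which FindMinimum evaluates the finite-difference
   gradient. Reap returns {result, {sownPoints}}. *)
{res, {pts}} = Reap[
   FindMinimum[x^2 + 2 y^4, {{x, 1}, {y, 1}},
    Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1,
      EvaluationMonitor :> Sow[{x, y}]}]];

(* Build a numerical function with the same finite-difference gradient spec. *)
nf = Experimental`CreateNumericalFunction[{x, y}, x^2 + 2 y^4, {},
   Gradient -> {"FiniteDifference", "DifferenceOrder" -> 1}];

(* Recompute the gradient at each recorded evaluation point. *)
grads = nf["Gradient"[#]] & /@ pts;
```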