Hi,
Numerical solvers in Mathematica usually analyze the symbolic form of the objective function. To do this analysis, NMinimize[] evaluates radpatt[xx] (in the OP at the top). The analysis may be used for method selection, singularity handling, and perhaps other adjustments. Mathematica is designed to do this automatically and to allow users to override its automatic choices. It uses heuristics that are not always perfect, and symbolic processing takes time, so user overrides sometimes are needed and can speed things up when the user knows what they are doing.
Using _?NumericQ in defining the objective function prevents symbolic analysis, and the automatic choice of method is more likely to be suboptimal. Symbolic analysis is mainly done on formulas, as far as I know. So if your objective function calls a numerical routine, such as NIntegrate[] or FindRoot[], then you do not lose much by using _?NumericQ. And if you need to specify singularities or a method, you can do that through options. (One irritating-to-me exception: Plot[f[x], {x, 0, 1}, WorkingPrecision -> 32] evaluates f[x] on a machine-precision number at the beginning, and sometimes this causes problems.)
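To illustrate the NIntegrate[] case, here is a minimal sketch (the names obj2 and a are mine): the integral has no closed symbolic form anyway, so the _?NumericQ guard costs little and keeps NIntegrate[] from being called on a symbolic argument.

```mathematica
(* Sketch: an objective that calls NIntegrate[].  The _?NumericQ guard
   ensures NIntegrate[] only ever sees a numeric value of a. *)
obj2 // ClearAll;
obj2[a_?NumericQ] := NIntegrate[Sin[a x^2], {x, 0, 1}];
NMinimize[{obj2[a], 0 <= a <= 10}, a]
```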
Below are variations on an example. The method, speed, and quality of the result are altered by using _?NumericQ in obj[]. The last variant also shows that the superior method cannot be applied to obj[].
(* Uses Couenne NLP *)
NMinimize[{Cos[x] - Exp[(x - 0.5) y], x^2 + y^2 < 1}, {x, y}] // AbsoluteTiming
(* {0.03809, {-1.60047, {x -> -0.657103, y -> -0.753805}}} *)
(* Ends up using diff. evol. after rejecting Couenne *)
obj // ClearAll; (* I always start function definitions this way (or with ClearAll[obj]) *)
obj[x_?NumericQ, y_?NumericQ] := Cos[x] - Exp[(x - 0.5) y];
NMinimize[{obj[x, y], x^2 + y^2 < 1}, {x, y}] // AbsoluteTiming
(* {0.154981, {-1.60046, {x -> -0.657221, y -> -0.753698}}} *)
NMinimize[{obj[x, y], x^2 + y^2 < 1}, {x, y}, Method -> "Couenne"]
(* NMinimize::unproc: Could not process the objective and constraints in a form suitable for Couenne. *)
(* NMinimize[{obj[x, y], x^2 + y^2 < 1}, {x, y}, Method -> "Couenne"] *)
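Conversely, you can specify a black-box-friendly method explicitly and skip the failed Couenne attempt. "DifferentialEvolution" is one of the documented NMinimize methods and is the one the fallback ends up using here anyway; repeating the definition of obj[] makes the snippet self-contained:

```mathematica
(* Going straight to differential evolution avoids the rejected
   Couenne attempt on the black-box objective. *)
obj // ClearAll;
obj[x_?NumericQ, y_?NumericQ] := Cos[x] - Exp[(x - 0.5) y];
NMinimize[{obj[x, y], x^2 + y^2 < 1}, {x, y},
  Method -> "DifferentialEvolution"] // AbsoluteTiming
```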
If you want to know how I know what is going on, here are some undocumented ways to peek under the hood of NMinimize/NMaximize:
Optimization`Debug`SetPrintLevel[2];
Block[{Optimization`NMinimizeDump`$DiagnosticLevel = 2},
(* Uses Couenne NLP *)
NMinimize[{Cos[x] - Exp[(x - 0.5) y], x^2 + y^2 < 1}, {x, y}]
]
Optimization`Debug`SetPrintLevel[0];
Optimization`Debug`SetPrintLevel[2];
Block[{Optimization`NMinimizeDump`$DiagnosticLevel = 2},
(* Ends up using diff. evol. after rejecting Couenne *)
NMinimize[{obj[x, y], x^2 + y^2 < 1}, {x, y}]
]
Optimization`Debug`SetPrintLevel[0];