Can we control the behavior of NonlinearModelFit?

Posted 10 years ago

I have been using NonlinearModelFit to determine the parameters of a model that is sensitive to its parameters. I began by determining manually, using Manipulate, what is really a very good guess for the values. These are input as the initial values for the fit, and constraints are used to further limit the range of values.

I use Monitor and StepMonitor to follow the solution process. At each step the sum of squared residuals is calculated, and the parameter values are printed together with that error value, along with a plot superimposing the function at the current parameters on a ListPlot of the data.
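A minimal sketch of that monitoring setup. Here data, model, and the start values a0, b0 are placeholders, not the actual (proprietary) problem:

```
(* data is assumed to be {{x1, y1}, {x2, y2}, ...}; model[x, a, b]
   stands in for the real model expression *)
fit = NonlinearModelFit[data, model[x, a, b], {{a, a0}, {b, b0}}, x,
  StepMonitor :> Print[{a, b}, "  SSE = ",
    Total[(model[#[[1]], a, b] - #[[2]])^2 & /@ data]]]
```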

What I find is that NonlinearModelFit behaves like a blind dog in a butcher shop. The first step takes it to a function far removed from the good fit I gave it as a start, with an error value much larger than that of the starting condition. At best, it gets back to an equally good fit in perhaps 10 iterations and eventually improves on it. At worst, it wanders off into distant lands, never to return.

I also work with another program that uses converging iterative fitting algorithms for FEA. That tool has extensive user-defined parameters for controlling the stepping process. You can tell it the problem is extremely nonlinear and it will be conservative. You can control the initial step sizes for parameters, the maximum step sizes, and the step growth and recovery rates. And every step it takes is one that reduces the error.

Are there any similar controls for Mathematica's fitting algorithms?

(I realize it might be better to offer the notebook for examination, but the work may well be proprietary, and not mine to offer.)

Best regards,

David

POSTED BY: David Keith
5 Replies

Take a quick look at the answer here:

http://mathematica.stackexchange.com/questions/21823/methods-for-nonlinearmodelfit

So basically, you can use methods from FindMinimum and NMinimize. I'm not sure how well the NMinimize examples work.
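Following that answer, the Method option of NonlinearModelFit accepts FindMinimum-style method names, and NMinimize can be selected with its own submethod. Illustrative only; data, model, and the parameters are placeholders:

```
(* Levenberg-Marquardt, the usual choice for least-squares fitting *)
NonlinearModelFit[data, model[x, a, b], {a, b}, x,
  Method -> "LevenbergMarquardt"]

(* Hand the minimization to NMinimize and choose its submethod *)
NonlinearModelFit[data, model[x, a, b], {a, b}, x,
  Method -> {"NMinimize", Method -> "DifferentialEvolution"}]
```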

What I find is that NonlinearModelFit behaves like a blind dog in a butcher shop.

Yes. That's the nature of minimization and root finding.

POSTED BY: Sean Clarke

Using FindMinimum with a backtracking line search might help with your problem.
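If I read the Unconstrained Optimization tutorial correctly, that would look something like the following (sse here is a placeholder for the explicit sum of squared residuals, and a0, b0 for the start values):

```
(* Newton steps with a backtracking line search; the "StepControl"
   suboption names follow the Unconstrained Optimization tutorial *)
FindMinimum[sse, {{a, a0}, {b, b0}},
  Method -> {"Newton",
    "StepControl" -> {"LineSearch", Method -> "Backtracking"}}]
```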

POSTED BY: Frank Kampas
Posted 10 years ago

Thanks to both of you for your responses. I have been studying the material from the various links, much of which I have read before. I could still use more study. The problem at hand is very sensitive to the start point, so I have been looking for methods which would seek a local optimum, rather than get distracted by a global search.

I am finding it difficult to discover which methods are available to NonlinearModelFit. Many that I try, including methods from FindMinimum and NMinimize, are rejected. Either the method name is not recognized, returning:

Value of option Method -> "FindMinimum" is not Automatic, "Gradient", "ConjugateGradient", "InteriorPoint", "QuasiNewton", "Newton", "NMinimize", or "LevenbergMarquardt"

or the method is not available for constrained problems:

NonlinearModelFit::ucmtd: Method -> Gradient can only be used for unconstrained problems. >>

In fact, the only method I find accepted (other than Automatic) for my constrained problem is "InteriorPoint".

I will of course continue to study, but if anyone knows of documentation on methods specific to either NonlinearModelFit or FindFit, I would be grateful.

Thanks and kind regards,

David

POSTED BY: David Keith

InteriorPoint is the only FindMinimum method for constrained optimization. However, I don't understand where the constraints come from, since you're fitting parameters. It may be better to minimize the sum of the squares of the error terms using FindMinimum, to have more control over what's going on.
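A minimal sketch of that approach, with data and model as stand-ins for the actual problem:

```
(* Build the sum of squared residuals explicitly, then minimize it.
   This exposes FindMinimum's full range of Method and step controls. *)
sse = Total[(model[#[[1]], a, b] - #[[2]])^2 & /@ data];
FindMinimum[sse, {{a, a0}, {b, b0}}, StepMonitor :> Print[{a, b}]]
```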

POSTED BY: Frank Kampas
Posted 10 years ago

Thanks, Frank. That is what I had come to suspect. The constraints arise from two sources. The important one is physical. Many of the parameters are probabilities, which must satisfy 0 <= p <= 1. Others are initial values of counts, which cannot be negative. I am concerned that a fit could be found that violates these. I added some constraints to try to help the fit by boxing in some parameters to reasonable values. I will see how things go with no constraints -- my fears may be groundless. I had also been thinking of rolling my own fit with FindMinimum -- it sounds like a good idea.
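One way to keep those physical bounds without handing FindMinimum a constrained problem is to reparameterize, so the raw search variables are unconstrained. A sketch, where u and v are hypothetical raw parameters standing in for a probability p and a count c:

```
(* Reparameterize so the search itself is unconstrained *)
p[u_] := 1/(1 + Exp[-u])   (* maps all reals into (0, 1) *)
c[v_] := Exp[v]            (* maps all reals into (0, Infinity) *)
```

The fit then searches over u and v freely, and p and c can never leave their physical ranges.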

The workings are a bit frustrating, in that with a five-parameter Manipulate I can quickly close in on a good-to-the-eye fit myself. However, it is probably unfair to judge the Mathematica algorithms by that standard. In thinking about what I am doing, I very quickly recognize that some parameters control the form of the curve, and others its placement in y and its slope. So I manipulate the two sets alternately to home in on the fit. (Perhaps someone has tried similar things algorithmically, by fitting the first derivative, the second derivative, and the values iteratively.)

POSTED BY: David Keith