Principle of RandomSearch method in NMinimize?

Posted 8 years ago

Hello everyone,

I am using the NMinimize function with the RandomSearch method explicitly chosen to optimize a non-convex, six-dimensional problem. The six variables are non-negative and sum to one.
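
For concreteness, the call looks roughly like this (the objective obj below is just a placeholder for my actual function):

obj[v_List] := Total[v^2] - Times @@ v;  (* placeholder for my actual objective *)
vars = Array[x, 6];

NMinimize[{obj[vars], And @@ Thread[vars >= 0] && Total[vars] == 1},
 vars, Method -> "RandomSearch"]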

Can someone explain to me how the RandomSearch method in the Wolfram Language works? It is not clear from http://reference.wolfram.com/language/tutorial/ConstrainedOptimizationGlobalNumerical.html

For example: "... generating a population of random starting points ..." - how are admissible solutions obtained, and from which (multivariate) distribution are we sampling? A similar question can be asked about the remaining three methods, "NelderMead", "DifferentialEvolution", and "SimulatedAnnealing".

The method seems to differ from the one described at en.wikipedia.org/wiki/Random_search, where hypercubes are mentioned. Am I right?

Thank you for your answers!

POSTED BY: Lukas Vacek
8 Replies

Could someone possibly write out the procedure for me in steps?

Say:

  1. Sampling random points from admissible set using [some] multivariate distribution.
  2. ….
  3. ….
  4. ….
  5. Result = […].

Otherwise we are not far from having to treat the method as a black box.

Lukas.

POSTED BY: Lukas Vacek

How would you find an "admissible set"?

POSTED BY: Daniel Lichtblau

That is what I am asking for.

POSTED BY: Lukas Vacek

You can see that FindMinimum is called, presumably on random starting points, although I don't know why there are 2n+1 of them (there are 11 for me in V11.3, for n = 5):

f = (x - 1)^2 + (y + 2)^2 + Sin[x y];  (* any example objective will do *)

Trace[
 NMinimize[f, {x, y}, Method -> {"RandomSearch", "SearchPoints" -> 5}],
 _FindMinimum,
 TraceInternal -> True
 ]
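
One way to count those FindMinimum expressions (assuming the usual HoldForm wrapping that Trace produces):

trace = Trace[
   NMinimize[f, {x, y}, Method -> {"RandomSearch", "SearchPoints" -> 5}],
   _FindMinimum, TraceInternal -> True];
Count[trace, HoldForm[_FindMinimum], Infinity]  (* 11 in the V11.3 run above, i.e. 2 n + 1 for n = 5 *)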
POSTED BY: Michael Rogers

You can do much better random searches using ParametricIPOPTMinimize

http://community.wolfram.com/groups/-/m/t/1164680

POSTED BY: Frank Kampas

Thank you for your answer, Daniel!

It gave me only a little insight into the method, though. For a deeper understanding I would like to know, for example, the default settings behind "PenaltyFunction" (how is it chosen, and from which space of functions?) and "InitialPoints" (how do I find them?). How is the radius of the hypercube determined?
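
For instance, the tutorial lists an "InitialPoints" suboption through which starting points can be supplied explicitly; my assumption is that the format is simply a list of coordinate tuples in variable order:

NMinimize[{(x - 1)^2 + (y + 2)^2, -3 <= x <= 3 && -3 <= y <= 3}, {x, y},
 Method -> {"RandomSearch",
   "InitialPoints" -> {{0., 0.}, {1., 1.}, {-2., 2.}}}]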

I found this in the help: "The random search algorithm works by generating a population of random starting points and uses a local optimization method from each of the starting points to converge to a local minimum. The best local minimum is chosen to be the solution." There are no hypercubes mentioned there.
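
My reading of that sentence is a plain multistart scheme along the following lines (only a sketch; how the starting points are actually drawn, and over what region, is exactly the part I cannot find documented):

(* sketch only: sample n random starting points in a box, run FindMinimum
   from each one, and keep the best local minimum found *)
multiStart[f_, vars_, n_, {lo_, hi_}] := Module[{starts, results},
  starts = RandomReal[{lo, hi}, {n, Length[vars]}];  (* random starting points *)
  results = Quiet@Table[
     With[{spec = Transpose[{vars, pt}]},            (* e.g. {{x, 0.3}, {y, -1.2}} *)
      FindMinimum[f, spec]],
     {pt, starts}];
  First@MinimalBy[results, First]                    (* best local minimum is the answer *)
  ]

multiStart[Sin[x y] + x^2 + y^2, {x, y}, 10, {-2, 2}]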

I have no reason to believe that the method is not good, but it is hard for me to defend using a method that is not properly cited or described.

Thank you for your feedback.

POSTED BY: Lukas Vacek

If I remember correctly, hypercubes are used for this, except that if there are explicit linear inequality constraints, linear programming is used to find viable points in "random" directions. For equality constraints, I believe some variables are solved for in terms of the others, and the inequalities are changed accordingly.
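
For the six-variable constraint in the original question, that elimination would look roughly like this (the objective is a made-up placeholder):

vars6 = Array[x, 6];
obj6 = Total[(vars6 - Range[6]/10)^2];         (* placeholder objective *)
sub = First@Solve[Total[vars6] == 1, x[6]];    (* x[6] -> 1 - x[1] - ... - x[5] *)

NMinimize[{obj6 /. sub,
   And @@ Thread[Most[vars6] >= 0] && Total[Most[vars6]] <= 1},  (* x[6] >= 0 becomes a sum bound *)
  Most[vars6], Method -> "RandomSearch"]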

POSTED BY: Daniel Lichtblau