# Principle of RandomSearch method in NMinimize?

Posted 6 months ago | 866 Views | 8 Replies | 0 Total Likes
Hello everyone, I am using the NMinimize procedure with the RandomSearch method explicitly chosen, for optimization of a non-convex 6-dimensional problem. The 6 variables are non-negative and sum to one.

Can someone explain to me how the RandomSearch method in the Wolfram Language works? It is unclear from http://reference.wolfram.com/language/tutorial/ConstrainedOptimizationGlobalNumerical.html

For example: "... generating a population of random starting points ..." - how are admissible solutions obtained? From which (multivariate) distribution are we sampling? A similar question may be asked for the remaining 3 methods, "NelderMead", "DifferentialEvolution", and "SimulatedAnnealing".

The method seems to be different from the method described at en.wikipedia.org/wiki/Random_search, where hypercubes are mentioned. Am I right?

Thank you for your answers!
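For context on the simplex constraint (6 non-negative variables summing to one): one standard way to sample uniformly from that admissible set is the flat Dirichlet distribution, obtained by normalizing i.i.d. Exponential(1) draws. Whether NMinimize actually uses this distribution is exactly the open question here; the sketch below (plain Python, hypothetical function name) only illustrates what "sampling admissible points" could mean:

```python
import random

def sample_simplex(n, rng=random.Random(0)):
    """Sample a point uniformly from the standard (n-1)-simplex:
    n non-negative coordinates that sum to one.  Normalizing i.i.d.
    Exponential(1) draws yields the flat Dirichlet(1, ..., 1)
    distribution, which is uniform on the simplex."""
    draws = [rng.expovariate(1.0) for _ in range(n)]
    total = sum(draws)
    return [d / total for d in draws]

point = sample_simplex(6)
print(point)
print(sum(point))  # ~1.0 up to floating-point rounding
```

Other constructions (e.g. sorting uniform variates and taking gaps) give the same uniform law; nothing in the Wolfram documentation confirms which, if any, NMinimize uses.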
Posted 6 months ago
If I remember correctly, hypercubes are used for this, except if there are explicit linear inequality constraints, in which case linear programming is used to find viable points in "random" directions. For equality constraints, I believe variables are solved for in terms of the others, and the inequalities are changed accordingly.
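To illustrate the equality-constraint elimination described above, assuming it works like textbook variable elimination: for the original poster's constraint x1 + ... + x6 == 1 with all xi >= 0, one can solve for x6 in terms of the others, and the inequality x6 >= 0 becomes x1 + ... + x5 <= 1 on the remaining variables. A minimal Python sketch (hypothetical helper name, not Wolfram internals):

```python
def eliminate_sum_constraint(free_vars):
    """Given values for x1..x5, recover x6 from the equality
    x1 + ... + x6 == 1.  The eliminated inequality x6 >= 0 turns
    into x1 + ... + x5 <= 1 on the free variables."""
    x6 = 1.0 - sum(free_vars)
    if any(v < 0 for v in free_vars) or x6 < 0:
        raise ValueError("point violates the transformed inequalities")
    return list(free_vars) + [x6]

full = eliminate_sum_constraint([0.1, 0.2, 0.1, 0.3, 0.05])
print(full)  # last entry is x6, approximately 0.25
```

After elimination the search only has to generate the 5 free variables; the equality then holds by construction, which matches the idea of searching a lower-dimensional admissible set.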
Posted 5 months ago
Thank you for your answer, Daniel! That gives me only a little insight into the method. For a deeper understanding I would like to know, for example, the implicit starting parameters for "PenaltyFunction" (how is it chosen, and from which space of functions?) and "InitialPoints" (how do I find them?). How is the radius of the hypercube determined?

I found in the help: "The random search algorithm works by generating a population of random starting points and uses a local optimization method from each of the starting points to converge to a local minimum. The best local minimum is chosen to be the solution." There are no hypercubes mentioned.

I have no reason to believe that the method is not good, but it is hard for me to defend using a method that is not properly cited or described.

Thank you for your feedback.
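The quoted help text describes a generic multistart scheme: generate random starting points, run a local method from each, keep the best local minimum. A self-contained Python sketch of that outline follows, with a crude coordinate-descent stand-in for FindMinimum; this reproduces only the documented outline, not Wolfram's actual implementation, and the function and parameter names are invented for illustration:

```python
import random

def local_minimize(f, x, step=0.1, iters=200):
    """Very crude local descent: nudge each coordinate up/down by
    `step`, accept improvements, and halve the step whenever no
    nudge helps.  Stands in for a real local optimizer."""
    fx = f(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                y = x[:]
                y[i] += delta
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
        if not improved:
            step *= 0.5
    return x, fx

def random_search(f, bounds, n_points=10, rng=random.Random(1)):
    """The documented scheme: a population of random starting points,
    a local method from each, best local minimum wins."""
    best = None
    for _ in range(n_points):
        start = [rng.uniform(lo, hi) for lo, hi in bounds]
        x, fx = local_minimize(f, start)
        if best is None or fx < best[1]:
            best = (x, fx)
    return best

# A non-convex test function with local minima at (+/-1, 0.5).
f = lambda v: (v[0] ** 2 - 1) ** 2 + (v[1] - 0.5) ** 2
x, fx = random_search(f, [(-2, 2), (-2, 2)])
print(x, fx)  # fx should be close to 0, x near (+/-1, 0.5)
```

None of this answers how the starting points are made admissible under constraints, which remains the unresolved part of the question.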
Posted 5 months ago
You can do much better random searches using ParametricIPOPTMinimize: http://community.wolfram.com/groups/-/m/t/1164680
Posted 5 months ago
You can see that FindMinimum is called, presumably on random points, although I don't know why there are 2n+1 calls (there are 11 for me in V11.3, for n = 5):

```mathematica
Trace[
 NMinimize[f, {x, y}, Method -> {"RandomSearch", "SearchPoints" -> 5}],
 _FindMinimum, TraceInternal -> True]
```
Posted 5 months ago
Thank you for participating in this topic. Let me summarize some conclusions:

- We know that the function calls FindMinimum (a local optimization method) on a certain number of points (see Michael Rogers' post).
- The behavior of SearchPoints is strange (see Michael Rogers' post).
- How the radius is determined is unclear.
- Usage of the penalty function is unclear.
- How the pseudo-random generator of admissible solutions works is unclear.
- According to Daniel Lichtblau's post, "For equality constraints I believe variables are solved for in terms of others, and inequalities are changed accordingly." This sounds like we solve for one variable and treat the others as parameters of the solution. How do we set those parameters pseudo-randomly so that we somehow cover the admissible set?
Posted 5 months ago
Is there someone who could possibly write out the procedure in steps? Say:

1. Sample random points from the admissible set using [some] multivariate distribution.
2. ….
3. ….
4. ….
5. Result = […].

For now, we are not far from classifying the method as a black box.

Lukas