# Beware the Infinite Lagrange Multiplier

Posted 9 years ago | 5816 Views | 4 Replies | 8 Total Likes
Consider the minimization of x^2 + y^2 subject to Sin[x y] == 1. NMinimize solves it:

```
In[1]:= NMinimize[{x^2 + y^2, Sin[x y] == 1}, {x, y}]

Out[1]= {3.14159, {x -> 1.25331, y -> 1.25331}}
```

However, FindMinimum doesn't handle this problem, for some starting values close to the solution:

```
In[2]:= FindMinimum[{x^2 + y^2, Sin[x y] == 1}, {{x, 1}, {y, 1}}]

During evaluation of In[2]:= FindMinimum::eit: The algorithm does not converge to the tolerance of 4.806217383937354*^-6 in 500 iterations. The best estimated solution, with feasibility residual, KKT residual, or complementary residual of {0.158529,0.0293408,0}, is returned. >>

Out[2]= {2., {x -> 1., y -> 1.}}
```

Construct the Lagrange multiplier conditions, replacing the 1 with \[Alpha]:

```
In[3]:= lg = Join[
   Thread[Grad[x^2 + y^2 - \[Lambda] (Sin[x y] - \[Alpha]), {x, y, \[Lambda]}] == 0],
   {Sin[x y] == \[Alpha]}]

Out[3]= {2 x - y \[Lambda] Cos[x y] == 0, 2 y - x \[Lambda] Cos[x y] == 0,
  \[Alpha] - Sin[x y] == 0, Sin[x y] == \[Alpha]}
```

Solve with Reduce:

```
In[4]:= Reduce[lg]

Out[4]= ((x == 0 && y == 0) ||
    (x Cos[x y] != 0 && ((y == -x && \[Lambda] == -2 Sec[x y]) ||
       (y == x && \[Lambda] == 2 Sec[x y])))) && \[Alpha] == Sin[x y]
```

Reduce the branch with x and y equal further, with some bounds on x and y to simplify the result:

```
In[5]:= Reduce[y == x && \[Lambda] == 2 Sec[x y] && \[Alpha] == Sin[x y] &&
   0 < x <= 3/2 && 0 <= y <= 3/2, {x, y, \[Lambda]}, Reals]

Out[5]= ((0 < \[Alpha] < 1 &&
      x == Sqrt[2] Sqrt[ArcTan[(1 - \[Alpha] Sqrt[(1 - \[Alpha]^2)/\[Alpha]^2])/\[Alpha]]]) ||
     ((2 Tan[9/8])/(1 + Tan[9/8]^2) <= \[Alpha] < 1 &&
      x == Sqrt[2] Sqrt[ArcTan[(1 + \[Alpha] Sqrt[(1 - \[Alpha]^2)/\[Alpha]^2])/\[Alpha]]])) &&
  y == x && \[Lambda] == 2 Sec[x y]
```

Calculate the value of x and y at \[Alpha] -> 1:

```
In[6]:= Sqrt[2] Sqrt[ArcTan[(1 - \[Alpha] Sqrt[(1 - \[Alpha]^2)/\[Alpha]^2])/\[Alpha]]] /. \[Alpha] -> 1

Out[6]= Sqrt[\[Pi]/2]
```

With x and y both equal to Sqrt[\[Pi]/2], the value of the Lagrange multiplier is

```
In[7]:= 2 Sec[\[Pi]/2]

Out[7]= ComplexInfinity
```
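To see where the infinity comes from, here is the stationarity system written out as a short derivation (same symbols as the Reduce output above):

$$2x = \lambda\, y \cos(x y), \qquad 2y = \lambda\, x \cos(x y).$$

On the branch $y = x \neq 0$ these give $\lambda = 2 \sec(x y)$. The constraint $\sin(x y) = \alpha$ forces $|\cos(x y)| = \sqrt{1 - \alpha^2}$, so

$$|\lambda| = \frac{2}{\sqrt{1 - \alpha^2}} \to \infty \quad \text{as } \alpha \to 1^-,$$

even though the minimizer itself converges to $x = y = \sqrt{\pi/2}$.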
4 Replies
Posted 9 years ago
Yes, WR Tech Support also pointed out that NMinimize works for this problem. A simpler approach to a symbolic solution than the one I first posted is to require that the gradient of the objective function be perpendicular to the tangent space of the constraint. This works without taking a limit.

```
In[3]:= gradcons = Grad[{Sin[x y] - 1}, {x, y}]

Out[3]= {{y Cos[x y], x Cos[x y]}}

In[4]:= nl = NullSpace[gradcons]

Out[4]= {{-(x/y), 1}}

In[5]:= gradobj = Grad[x^2 + y^2, {x, y}]

Out[5]= {2 x, 2 y}

In[7]:= Reduce[nl.gradobj == 0 && Sin[x y] == 1]

Out[7]= (C[1] \[Element] Integers &&
    (y == -Sqrt[\[Pi]/2 + 2 \[Pi] C[1]] || y == Sqrt[\[Pi]/2 + 2 \[Pi] C[1]]) && x == y) ||
  (C[1] \[Element] Integers &&
    (y == -Sqrt[-(\[Pi]/2) + 2 \[Pi] C[1]] || y == Sqrt[-(\[Pi]/2) + 2 \[Pi] C[1]] ||
     y == -Sqrt[(3 \[Pi])/2 + 2 \[Pi] C[1]] || y == Sqrt[(3 \[Pi])/2 + 2 \[Pi] C[1]]) && x == -y)
```
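Written out, the perpendicularity condition above amounts to the following (same notation as the code). With $\nabla g = (y \cos(x y),\ x \cos(x y))$, a tangent vector to the constraint is $t = (-x/y,\ 1)$ for $y \neq 0$, and

$$t \cdot \nabla f = -\frac{2x^2}{y} + 2y = 0 \ \Longrightarrow\ y^2 = x^2 \ \Longrightarrow\ y = \pm x.$$

Combining with $\sin(x y) = 1$, i.e. $x y = \pi/2 + 2\pi k$, the smallest-norm solutions on the branch $y = x$ are $x = y = \pm\sqrt{\pi/2} \approx \pm 1.25331$, matching NMinimize's answer. No multiplier appears in this formulation, so nothing blows up.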
Posted 9 years ago
This Lagrange multiplier does go to infinity in order to keep $\lambda \cos(x y) = 2$. What I mean is that this procedure of $\alpha \rightarrow 1$ is always going to give two solutions to your problem, no matter how close $\alpha$ is to $1$. In particular, the following two statements are equivalent ($\epsilon$ and $\zeta$ are positive, and we confine $x y$ to $[0, 2\pi]$ here):

- find min $x^2 + y^2$ s.t. $\sin(x y) = 1 - \epsilon$, $\epsilon \rightarrow 0$
- find min $x^2 + y^2$ s.t. $x y = \pi/2 \pm \zeta$, $\zeta \rightarrow 0$

In the second statement we have two constraint surfaces, though they may be very close. The two points found by the Lagrange multiplier method do not lie on a connected path on the constraint surface; I believe this is what makes the problem special.

I prefer to use NMinimize for this case. It has a Method option, "SimulatedAnnealing", that handles this disconnected domain very well. You can give it a shot (the solver starts from the red point and ends at the blue point in the contour plot):

```
res2 = Reap@NMinimize[{x^2 + y^2, Sin[x y] == 1}, {x, y},
    EvaluationMonitor :> Sow[{x, y}], Method -> "SimulatedAnnealing"];
start = res2[[2, 1, 1]]
end = res2[[2, 1, -1]]
```
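The equivalence of the two statements above can be made explicit with a one-line derivation (same symbols). With $x y \in [0, 2\pi]$,

$$\sin(x y) = 1 - \epsilon \ \Longleftrightarrow\ x y = \frac{\pi}{2} \pm \zeta, \qquad \zeta = \arccos(1 - \epsilon),$$

since $\sin(\pi/2 \pm \zeta) = \cos\zeta = 1 - \epsilon$. As $\epsilon \to 0$, $\zeta \to 0$ and the two hyperbolas $x y = \pi/2 \pm \zeta$ collapse onto the single curve $x y = \pi/2$.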
Posted 9 years ago
Yes, setting Sin[x y] == 1 gives contradictory equations for the Lagrange multiplier method:

```
In[8]:= Reduce[lg /. \[Alpha] -> 1]

Out[8]= False
```

I omitted that in the interest of brevity.

I don't understand your statement that this is not an issue with infinity. It seems to me that the problem is difficult at \[Alpha] == 1 precisely because the Lagrange multiplier goes to infinity. If I run the problem using COIN-OR's interior-point solver IPOPT, I get a very large value for the Lagrange multiplier:

```
In[10]:= callIpOpt[x^2 + y^2, {Sin[x y] == 1}, {{x, -2, 1, 2}, {y, -2, 1, 2}}]

Out[10]= {3.14159, {x -> 1.25331, y -> 1.25331}, "cons values" -> {1.},
  "var lb \[Lambda]s" -> {7.70262*10^-10, 7.70262*10^-10},
  "var ub \[Lambda]s" -> {3.35603*10^-9, 3.35603*10^-9},
  "cons \[Lambda]s" -> {-1.15281*10^7}, "Solve_Succeeded"}
```

If I use EvaluationMonitor to look at what FindMinimum is doing internally, I see that it actually does come close to the solution:

```
In[11]:= resrs = Reap@FindMinimum[{x^2 + y^2, Sin[x y] == 1}, {{x, 1}, {y, 1}},
    EvaluationMonitor :> Sow[{x, y}]];

During evaluation of In[11]:= FindMinimum::eit: The algorithm does not converge to the tolerance of 4.806217383937354*^-6 in 500 iterations. The best estimated solution, with feasibility residual, KKT residual, or complementary residual of {0.158529,0.0293408,0}, is returned. >>

In[15]:= resrs[[2, -1, -2]]

Out[15]= {1.25331, 1.25331}
```

But it apparently discards that point because it hasn't met the convergence criteria.
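Incidentally, the reported feasibility residual appears to be exactly the constraint violation at the returned point $x = y = 1$:

$$1 - \sin(1 \cdot 1) \approx 1 - 0.841471 = 0.158529,$$

which is the first entry of the residual triple {0.158529, 0.0293408, 0} in the FindMinimum::eit message.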
Posted 9 years ago
The method fails at the very beginning. If you put $\alpha = 1$ into the reduced form

```
{2 x - y \[Lambda] Cos[x y] == 0, 2 y - x \[Lambda] Cos[x y] == 0, 1 - Sin[x y] == 0}
```

the paradox is easily caught: $\sin(x y) = 1$ implies that $\cos(x y) = 0$. Hence both $x$ and $y$ are zero, from the first and second equations. But that in turn yields $\cos(x y) = \cos 0 = 1$, which contradicts our assumption. If you replace the constraint with $x y = \pi/2$, you will not have this issue.

Why does this paradox happen? It is not an issue with infinity or Mathematica. Let's try $\sin(x y) = 0.9$: you get two hyperbolas without even needing branching, $x y \approx 2.02$ or $x y \approx 1.12$. You can then find the minimum value on each constraint curve and pick the smaller one as the result. So far so good. Yet this "pick the smaller" rule jumps between results from the two constraint surfaces, which are not connected, and this jump is entirely artificial, no matter how close the two curves are. In this example, the jump persists however close $\alpha$ approaches $1$. This makes sense, right? If the two hyperbolas had met in the first place, we would only need one of them instead of two.
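As a quick check of those two branch values (a worked computation, restricting $x y$ to $[0, \pi]$):

$$\arcsin(0.9) \approx 1.1198, \qquad \pi - \arcsin(0.9) \approx 3.1416 - 1.1198 = 2.0218,$$

matching the $1.12$ and $2.02$ above. As $0.9 \to 1$, both branches approach $\pi/2 \approx 1.5708$, which is where they collide and the multiplier description breaks down.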