# Equality-Constrained Optimization With Differential Forms

Posted 9 years ago
Equality-constrained optimization can be performed without Lagrange multipliers, using differential forms. See Frank Zizza, "Differential forms for constrained max-min problems: eliminating Lagrange multipliers", College Mathematics Journal 29 (November 1998), no. 5, 387--396.

The differential-forms condition to minimize or maximize f subject to g1 == 0, g2 == 0, etc. is

    df ^ dg1 ^ dg2 ^ ... == 0

where df, dg1, dg2, etc. are the differentials of f, g1, g2, etc. and ^ is the wedge product. This operation can be carried out in Mathematica using the TensorWedge function. For example, consider minimizing x + y subject to x^2 + y^2 == 1:

    In[1]:= f = x + y; g = x^2 + y^2 - 1; vars = {x, y};

    In[2]:= df = Grad[f, vars]; dg = Grad[g, vars];

    In[3]:= w = SymmetrizedArrayRules[TensorWedge[df, dg]]

    Out[3]= {{1, 2} -> -2 x + 2 y, {_, _} -> 0}

    In[4]:= Reduce[w[[All, 2]] == 0 && g == 0, vars, Backsubstitution -> True]

    Out[4]= (x == -(1/Sqrt[2]) && y == -(1/Sqrt[2])) || (x == 1/Sqrt[2] && y == 1/Sqrt[2])

The Lagrange multiplier approach would be

    In[5]:= Reduce[df == \[Lambda] dg && g == 0, vars, Backsubstitution -> True]

    Out[5]= (\[Lambda] == -(1/Sqrt[2]) && x == -(1/Sqrt[2]) && y == -(1/Sqrt[2])) || (\[Lambda] == 1/Sqrt[2] && x == 1/Sqrt[2] && y == 1/Sqrt[2])
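For readers without Mathematica, the same idea can be cross-checked numerically. With two variables and one constraint, df ^ dg has a single independent component, the determinant of the Jacobian [[f_x, f_y], [g_x, g_y]], and the wedge condition says it vanishes. A minimal Python sketch for the example above (the function names here are my own, not from the post):

```python
import math

def wedge_component(grad_f, grad_g):
    # Single independent component of the 2-form df ^ dg in two
    # variables: the determinant of the Jacobian of (f, g).
    return grad_f[0] * grad_g[1] - grad_f[1] * grad_g[0]

# f = x + y          ->  df = (1, 1)
# g = x^2 + y^2 - 1  ->  dg = (2x, 2y)
def condition(x, y):
    return wedge_component((1.0, 1.0), (2 * x, 2 * y))  # = 2y - 2x

# On the constraint circle, 2y - 2x == 0 forces x == y, so the
# critical points are x = y = +-1/sqrt(2), matching Out[4] above.
crit = [(s / math.sqrt(2), s / math.sqrt(2)) for s in (-1.0, 1.0)]
for x, y in crit:
    assert abs(condition(x, y)) < 1e-12      # wedge condition holds
    assert abs(x * x + y * y - 1) < 1e-12    # constraint holds
    print(f"x = y = {x:+.6f}, f = {x + y:+.6f}")
```

The determinant shortcut only works for two variables and one constraint; in general, df ^ dg1 ^ ... ^ dgk vanishing means every (k+1)-by-(k+1) minor of the stacked gradient matrix is zero, which is exactly what TensorWedge encodes.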