Speeding Up Matrix Expression Evaluation With Expand

POSTED BY: Frank Kampas
7 Replies

Yes, the QuadraticProgramming option for FindMinimum is quite nice, but it does require that the problem have a constraint, as I complained about in a separate post.

I took Boyd's online course on Convex Optimization last year. The homework had to be done with CVX.
I understand that MATLink was written by some Mathematica users who wanted to be able to access CVX from Mathematica.

POSTED BY: Frank Kampas

I was reproducing an example in Boyd's book on Convex Optimization, which stopped at the maximization. The maximization is the dual of minimizing v.v subject to a.v == b, which NMinimize can do very quickly.
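For reference, the primal minimization can be set up directly; this is a sketch, assuming the same random a and b used in the replies below:

n = 50;
SeedRandom[0];
a = RandomReal[{-1, 1}, {n, n}];
b = RandomReal[{-1, 1}, n];
x = Array[xx, n];
AbsoluteTiming[{min, sol} = NMinimize[{x.x, a.x == b}, x];]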

POSTED BY: Frank Kampas

FindMinimum can be even faster if told this is a quadratic problem.

Timing[{min, vals} = 
   FindMinimum[{x.x, a.x == b}, x, Method -> "QuadraticProgramming"];]

(* {0.011513, Null} *)

Very good book you are using. I need to get back to it some day (was going through it 2-3 years ago, got to around p. 200).

POSTED BY: Daniel Lichtblau

You can speed it up slightly more using FindMinimum, or use the quadratic formula (is there a reason you are not going that route)?

n = 50;
SeedRandom[0];
a = RandomReal[{-1, 1}, {n, n}];
b = RandomReal[{-1, 1}, n];
x = Array[xx, n];

Timing[
 aat = a.Transpose[a];
 iaat = Inverse[aat];
 vals = -2*b.iaat;
 va = vals.a;
 max = -1/4 va.va - b.vals;
 {max, vals}
 ]

(* Out[443]= {0.000278, {27.4582378854, {-13.7190746299, 5.0248966213, 
   4.27200941301, -5.27186128016, -9.34445324724, -16.7770271653, 
   -10.2969820955, 14.8082311631, 18.2881033362, 8.79472160829, 
   3.73982999401, 25.6931835927, 15.0987923009, -32.8520372134, 
   -17.6629530707, 4.61315429798, -10.4093844839, -5.91550548946, 
   2.825412681, -5.4744873435, 1.75459547275, -8.63951633905, 
   7.73932523613, 12.4463904122, 4.99696850235, 1.03233240236, 
   -1.92340140008, 0.0448214342458, -12.3600513517, 17.2642317887, 
   -9.3190511584, 16.5787124096, 9.79082324277, -9.25426886038, 
   3.42454996007, -2.00990959625, -5.07728002623, -8.45579303961, 
   10.077263217, -22.7847052432, -14.8615953665, -19.469187501, 
   -4.28859347837, -21.1896979177, 2.43729925419, 18.9547713922, 
   18.3169977856, 3.46879765417, 9.2438775981, -12.1369961499}}} *)
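For context, the closed form in the code above follows from Lagrangian duality for the primal problem, minimizing x.x subject to a.x == b (a sketch; v is the vector of dual variables):

$$L(x,v) = x \cdot x + v \cdot (a x - b), \qquad \nabla_x L = 2x + a^T v = 0 \;\Rightarrow\; x = -\tfrac{1}{2} a^T v$$

$$g(v) = -\tfrac{1}{4}\, v^T a a^T v - b \cdot v, \qquad \nabla_v g = -\tfrac{1}{2}\, a a^T v - b = 0 \;\Rightarrow\; v = -2\, (a a^T)^{-1} b$$

which is exactly vals = -2*b.iaat, and the maximum value is -1/4 va.va - b.vals.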
POSTED BY: Daniel Lichtblau

Calculating a.Transpose[a] outside the maximization works slightly better than Expand.

POSTED BY: Frank Kampas

If differentiation of the objective function is involved, I would suspect that the expanded form is easier to differentiate and that its derivative is cheaper to evaluate. Your grouping could work in that direction as well.

POSTED BY: Frank Kampas

This is pretty fast.

AbsoluteTiming[{max, vals} = 
   NMaximize[-1/4 (v.a).(Transpose[a].v) - b.v, v];]

It's all in the ordering of the operations. Grouping matrix.vector helps considerably, I guess.
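One way to see the effect is to compare the grouped objective against its Expanded form; a sketch, assuming n, a, and b from the earlier reply, with timings that will vary by machine:

v = Array[vv, n];
AbsoluteTiming[NMaximize[-1/4 (v.a).(Transpose[a].v) - b.v, v];]
AbsoluteTiming[NMaximize[Expand[-1/4 (v.a).(Transpose[a].v) - b.v], v];]

The first keeps the objective as two matrix-vector products of symbolic linear forms; the second expands it into a quadratic with on the order of n^2 terms, which is the slower route discussed above.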

POSTED BY: Daniel Lichtblau