It has to do with the fact that the system is overdetermined, combined with vagaries of variable ordering and their effect on the underlying linear algebra. The code and commentary below outline what has happened, with one caveat: it seems that Solve internals might be reordering, because what Solve does is the opposite of what we will see from linear algebra based on the corresponding variable orderings.
So here is the basic setup.
h1 = {60, -100, 50} + r*{0.18125, 1.45, -0.3625};
h2 = {59.5, -100, 50} + s*{0.1845, 1.44, -0.36};
Solve[h1 - h2 == 0, {r, s}]
Solve[h1 - h2 == 0, {s, r}]
(* During evaluation of In[57]:= RowReduce::luc: Result for RowReduce of badly conditioned matrix {{-0.1845,0.18125,0.5},{-1.44,1.45,0.},{0.36,-0.3625,0.}} may contain significant numerical errors. >>
Out[59]= {}
Out[60]= {{s -> 111.111, r -> 110.345}} *)
Now we capture the corresponding matrices and use LUDecomposition to see what row reduction has to say about them.
{rhs, mat} = Normal[CoefficientArrays[h1 - h2, {r, s}]]
newmat = Join[mat, Map[List, -rhs], 2]
LUDecomposition[newmat]
(* Out[51]= {{0.5, 0., 0.}, {{0.18125, -0.1845}, {1.45, -1.44}, {-0.3625, 0.36}}}
Out[52]= {{0.18125, -0.1845, -0.5}, {1.45, -1.44, 0.}, {-0.3625, 0.36, 0.}}
During evaluation of In[51]:= LUDecomposition::sing: Matrix {{0.18125,-0.1845,-0.5},{1.45,-1.44,0.},{-0.3625,0.36,0.}} is singular. >>
Out[53]= {{{1.45, -1.44, 0.}, {0.125, -0.0045, -0.5}, {-0.25, 0., 0.}}, {2, 1, 3}, \[Infinity]} *)
The claim that it is singular means that augmenting the matrix with the (negated) right-hand-side vector produced a matrix that row-reduced in a way that made the rhs column vanish. So this case gets solved, even though we had three equations in but two unknowns, because augmenting did not change the rank (that is to say, the rhs is in the span of the first two columns).
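To see the rank claim concretely, here is a small pure-Python sketch (an illustration, not Mathematica's actual algorithm): Gaussian elimination with partial pivoting, counting pivots above a tolerance. Both the coefficient matrix and the augmented matrix come out with rank 2, so the right-hand side lies in the column span.

```python
# Numerical rank via Gaussian elimination with partial pivoting
# (a sketch for illustration only, not Mathematica's internal method).
def rank(rows, tol=1e-10):
    rows = [row[:] for row in rows]          # work on a copy
    m, n = len(rows), len(rows[0])
    r = 0                                    # current pivot row
    for c in range(n):
        # pick the largest-magnitude entry in column c as pivot
        p = max(range(r, m), key=lambda i: abs(rows[i][c]), default=None)
        if p is None or abs(rows[p][c]) <= tol:
            continue
        rows[r], rows[p] = rows[p], rows[r]
        for i in range(r + 1, m):            # eliminate below the pivot
            f = rows[i][c] / rows[r][c]
            rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

mat = [[0.18125, -0.1845],
       [1.45,    -1.44  ],
       [-0.3625,  0.36  ]]
b = [-0.5, 0.0, 0.0]                          # negated constant terms
aug = [row + [bi] for row, bi in zip(mat, b)]

print(rank(mat), rank(aug))                   # both 2: b is in the column span
```

That the ranks agree is exactly the consistency condition the augmented row reduction is probing.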
In contrast, when we switch the variable ordering we are not quite so lucky. Or luckier, depending on one's point of view, in that we are informed of a numeric instability caused by an overdetermined system.
{rhs2, mat2} = Normal[CoefficientArrays[h1 - h2, {s, r}]]
newmat2 = Join[mat2, Map[List, -rhs2], 2]
LUDecomposition[newmat2]
(* Out[54]= {{0.5, 0., 0.}, {{-0.1845, 0.18125}, {-1.44, 1.45}, {0.36, -0.3625}}}
Out[55]= {{-0.1845, 0.18125, -0.5}, {-1.44, 1.45, 0.}, {0.36, -0.3625, 0.}}
During evaluation of In[54]:= LUDecomposition::luc: Result for LUDecomposition of badly conditioned matrix {{-0.1845,0.18125,-0.5},{-1.44,1.45,0.},{0.36,-0.3625,0.}} may contain significant numerical errors. >>
Out[56]= {{{-1.44, 1.45, 0.}, {0.128125, -0.00453125, -0.5}, {-0.25, 1.22507*10^-14, 6.12537*10^-15}}, {2, 1, 3}, 6.55289*10^16} *)
Here the matrix is deemed ill-conditioned but not singular, which means the linear algebra does not quite manage to zero out the augmented column. So the rhs is not seen to be in the span of the first two columns.
Is machine arithmetic linear algebra really hopeless here? Not at all. A function like RowReduce has but limited leeway; it does Gaussian elimination. LinearSolve, by contrast, can use rank-revealing technology, such as a singular value decomposition, when handling an overdetermined system. This is why both cases can be handled by LinearSolve without fuss.
LinearSolve[mat, rhs]
(* Out[64]= {-110.345, -111.111} *)
LinearSolve[mat2, rhs2]
(* Out[65]= {-111.111, -110.345} *)
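For comparison outside Mathematica, here is a hedged pure-Python sketch of least-squares solving via the normal equations (LinearSolve presumably uses something more robust, such as an SVD-based method; the normal-equations approach squares the condition number and is only for illustration). Moving the constants to the right-hand side, it recovers essentially the same solution as Solve, with signs opposite to the LinearSolve outputs above because those solved against the un-negated constant vector.

```python
# Least squares by normal equations A^T A x = A^T b, solved with Cramer's
# rule on the resulting 2x2 system. Illustration only; an SVD-based solver
# (as LinearSolve can use) is the numerically preferable route.
A = [[0.18125, -0.1845],
     [1.45,    -1.44  ],
     [-0.3625,  0.36  ]]
b = [-0.5, 0.0, 0.0]                 # constants moved to the right-hand side

# G = A^T A (2x2) and c = A^T b (length 2)
g11 = sum(row[0] * row[0] for row in A)
g12 = sum(row[0] * row[1] for row in A)
g22 = sum(row[1] * row[1] for row in A)
c1 = sum(row[0] * bi for row, bi in zip(A, b))
c2 = sum(row[1] * bi for row, bi in zip(A, b))

det = g11 * g22 - g12 * g12
r = (c1 * g22 - g12 * c2) / det      # Cramer's rule
s = (g11 * c2 - g12 * c1) / det

print(r, s)                          # approximately 110.345 and 111.111
```

This agrees with Solve's {s -> 111.111, r -> 110.345} from the second ordering.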