The singular value decomposition is useful and effective for floating-point matrices. It is hardly surprising that it bogs down already for 4x4 matrices in the exact case. Try this with a 20x20 matrix:
svdInverse[a_?MatrixQ] := Module[{u, s, v},
  {u, s, v} = SingularValueDecomposition[N[a]];
  v . Inverse[s] . ConjugateTranspose[u]]

MatrixForm[mat = RandomInteger[10, {20, 20}]]
svdInverse[mat] // MatrixForm
svdInverse[mat] - Inverse[mat] // Chop // MatrixForm
For exact matrices, singular value decomposition is far more complicated than inversion, since it involves solving nonlinear algebraic equations.
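To see what those nonlinear equations look like, here is a small sketch (using the 3x3 matrix that appears later in this thread): the exact singular values are the square roots of the eigenvalues of Transpose[a] . a, i.e. roots of its characteristic polynomial, whose degree grows with the matrix size.

```mathematica
(* Sketch: the exact singular values of a are square roots of the roots
   of the characteristic polynomial of Transpose[a] . a *)
a = {{1, 1, 1}, {0, 2, 3}, {5, 5, 1}};
p = CharacteristicPolynomial[Transpose[a] . a, x];
(* here the singular values are algebraic numbers of degree 3;
   for larger n the degree grows, and for degree >= 5 the roots
   generically have no closed radical form *)
singVals = Sqrt[x] /. Solve[p == 0, x]
```

This is why the exact computation blows up so quickly compared with Inverse, which only needs rational arithmetic.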
How about
svdInverse[a_?MatrixQ] := Module[{u, s, v},
  {u, s, v} = SingularValueDecomposition[a];
  v . Inverse[s] . ConjugateTranspose[u]];

svdInverse[{{1, 1, 1}, {0, 2, 3}, {5, 5, 1}}] // FullSimplify
By the way, your use of Return is redundant.
Thanks for your reply. Your solution works for 3x3 matrices, but hangs on 4x4 matrices. It aborts on TimeConstrained[svdInverse[m], 180]; in any case, it is slower than Inverse and qrInverse.
I’d not use SVD for that task unless working with approximate numbers.
Why not?
However, the point is moot, since on my computer with version 14, it has stopped working for 4x4 (and presumably larger) matrices. I tried SingularValueDecomposition because Weisberg seems to mention it (Weisberg, Sanford. Applied Linear Regression, Wiley Series in Probability and Statistics, Wiley, Kindle edition, p. 309), although I cannot immediately find where he actually recommends its use.
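If the goal is only an exact (pseudo)inverse rather than the SVD itself, a possible workaround, sketched here with the built-in PseudoInverse, avoids the symbolic SVD entirely:

```mathematica
(* Sketch: PseudoInverse accepts exact matrices without computing a
   symbolic SVD; for a nonsingular square matrix it agrees with Inverse *)
m = {{1, 1, 1}, {0, 2, 3}, {5, 5, 1}};  (* Det[m] == -8, so nonsingular *)
PseudoInverse[m] == Inverse[m]          (* True *)
```

This also degrades gracefully to the Moore-Penrose pseudoinverse for singular or rectangular exact matrices, which is the role the SVD-based formula was playing above.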