There is something known as a MahalanobisDistance. I can write a Mathematica function to calculate it for two vectors u and v with respect to some data. Here's a function that implements it.
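The original function is not shown here, so the following is a hypothetical reconstruction of what such a definition might look like, using `Covariance` and `Inverse` (the function name is my own):

```mathematica
(* Hypothetical sketch: Mahalanobis distance between vectors u and v,
   with respect to the covariance structure of data *)
mahalanobisDistance[u_?VectorQ, v_?VectorQ, data_?MatrixQ] :=
 Sqrt[(u - v) . Inverse[Covariance[data]] . (u - v)]
```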
Frequently, one is trying to find the distance from some vector u to the mean of data, so one can also write a two argument form:
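Again the original is not shown; a self-contained sketch of the two-argument form might be:

```mathematica
(* Hypothetical sketch: distance from u to the mean of data *)
mahalanobisDistance[u_?VectorQ, data_?MatrixQ] :=
 Sqrt[(u - Mean[data]) . Inverse[Covariance[data]] . (u - Mean[data])]
```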
So far, so good, I think. But it is also reported, correctly, that the MahalanobisDistance is the same as the EuclideanDistance after the points are subjected to a special linear transformation. After some poking about, it appears that one can find the transformation through the following process:
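The process described in the question is not shown; one plausible reading of it is the sketch below. For a symmetric positive definite matrix, `SchurDecomposition` returns `{q, t}` with `t` diagonal, so `w = Sqrt[t] . Transpose[q]` satisfies `Transpose[w] . w == Inverse[Covariance[data]]`, and `EuclideanDistance[w . u, w . v]` reproduces the Mahalanobis distance:

```mathematica
(* Possible reconstruction of the transform described in the question *)
mahalanobisTransform[data_?MatrixQ] :=
 Module[{q, t},
  {q, t} = SchurDecomposition[N[Inverse[Covariance[data]]]];
  Sqrt[t] . Transpose[q]]
```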
And this seems to work. But ... SchurDecomposition insists that its argument be numeric; hence the N in the code above. But surely, I say, at least where the matrix in question has the properties of an inverse covariance matrix (symmetry and who knows what else), there is some way of computing an exact or symbolic answer. I've seen some material on the internet suggesting that eigenvalues are involved, but I can't seem to get any of it to work. Any suggestions for either a special case of SchurDecomposition that works on integer or symbolic matrices, or an alternative way of generating the MahalanobisTransform? I have the sense there is a way to do this.
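For what it's worth, here is one eigenvalue-based sketch that can stay exact (the function name is mine). For exact input, `Eigensystem` returns unnormalized eigenvectors, so `Orthogonalize` is needed before rebuilding the matrix square root; with orthonormal eigenvector rows `v`, we get `Transpose[w] . w == s` where `w = DiagonalMatrix[Sqrt[vals]] . v`:

```mathematica
(* Sketch of an exact alternative for a symmetric positive definite
   inverse covariance matrix; symbolic results may be messy *)
exactMahalanobisTransform[data_?MatrixQ] :=
 Module[{s, vals, vecs},
  s = Inverse[Covariance[data]];
  {vals, vecs} = Eigensystem[s];
  DiagonalMatrix[Sqrt[vals]] . Orthogonalize[vecs]]
```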
This is all relevant to inference of causation from observational data, and to computational efficiency, if anyone wants to know why anyone might care. I'd like to be able to write a MahalanobisDistanceFunction that is something like this. It would be nice if T could be symbolic.
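The intended code is not shown; a hypothetical shape for such a function, precomputing the transform once so repeated distance queries are cheap, might be:

```mathematica
(* Hypothetical sketch: returns a pure function that measures
   Mahalanobis distance as a Euclidean distance after transforming *)
mahalanobisDistanceFunction[data_?MatrixQ] :=
 Module[{q, t, w},
  {q, t} = SchurDecomposition[N[Inverse[Covariance[data]]]];
  w = Sqrt[t] . Transpose[q];
  EuclideanDistance[w . #1, w . #2] &]
```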
Ultimately, I'd like an elegant ResourceFunction to come of this work.
I'm probably missing something, but should it not be:
From what I see at this Wikipedia page, I think you can obtain an equivalent transformation matrix using CholeskyDecomposition.
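To spell the suggestion out: `CholeskyDecomposition[s]` returns an upper triangular matrix `t` with `ConjugateTranspose[t] . t == s`, so `t` itself serves as the transform, and it works on exact and symbolic input. A minimal sketch (my own naming):

```mathematica
(* Cholesky-based transform: works for exact/symbolic covariance
   matrices, since CholeskyDecomposition does not require numeric input *)
choleskyMahalanobisTransform[data_?MatrixQ] :=
 CholeskyDecomposition[Inverse[Covariance[data]]]
```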
Thanks, Dan, that's extremely helpful. And it seems to work on my test cases involving symbolic and integer data. But ... if you take a large numeric data matrix and compute its inverse covariance matrix, small numerical errors creep into the process, so that the matrix no longer appears Hermitian, and CholeskyDecomposition balks. So to make my little MahalanobisDistanceFunction ResourceFunction work, I may need something that either forces the inverse to be symmetric or that reverts to the numeric SchurDecomposition for numeric matrices. It should not be all that hard.
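One way to force the Hermitian property before calling Cholesky, as described, is to average the computed inverse with its transpose, which removes the asymmetry introduced by rounding (a sketch with hypothetical names):

```mathematica
(* Symmetrize a nearly symmetric matrix so CholeskyDecomposition
   accepts it despite small numerical asymmetries *)
symmetrize[m_?MatrixQ] := (m + Transpose[m])/2

safeMahalanobisTransform[data_?MatrixQ] :=
 CholeskyDecomposition[symmetrize[Inverse[Covariance[data]]]]
```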
The Mahalanobis measure, in its use of the covariance matrix (or tensor), contracts measurements relative to the mean of the domain. It is appropriate for the dynamics of some systems but not others. In particular, it fails the triangle inequality for domains in several application areas, and in those cases it is invalid for use as a distance. Also, as you may have discovered, the covariance matrix can be singular for smaller data sets, which is a second thing to check before using this measure.
@Richard Frost, it is not at all clear what you mean by claiming the Mahalanobis distance can fail the triangle inequality. Bear in mind that it is a Euclidean metric after transforming coordinates by a (symmetric) positive semidefinite matrix.
In the case of a vector domain, some datasets will produce a numerically ill-conditioned covariance matrix. Some practitioners do not check, or pay no heed to error messages, when the inverse is computed, and the resulting measures can then fail the triangle inequality, and sometimes other requirements for a metric as well.
In a tensor domain one must also determine a suitable norm. I've tried the Spectral Radius without success.
Third, there is the dubious general practice of flattening tensors into vectors, which from the physics viewpoint removes dynamics inherent in the tensors, and can also introduce inappropriate statistical relations among the elements of the vector. So for the particular case of the Mahalanobis measure, it is unsuitable for tensors unless someone determines an appropriate norm.