Nice! One of the things that confused me as a beginner was mat.v.w, for a matrix mat and two vectors v and w.
SeedRandom[99];
mat = RandomReal[1, {4, 3}];
v = RandomReal[1, 3];
w = RandomReal[1, 4];
The following are all OK. It is tempting to say that is because matrix multiplication is associative, but I think that justification sounds right while actually being wrong:
(w.mat).v
w.(mat.v)
w.mat.v
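Continuing with the definitions above, a quick sanity check (a sketch; Equal compares machine reals to tolerance, which is the appropriate test here) confirms all three forms agree:

```mathematica
(* all three groupings of w, mat, v yield the same scalar *)
(w.mat).v == w.(mat.v) == w.mat.v
(* True *)
```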
But below, the first is OK, while the second two print an error. However, since Dot[]
has the attribute Flat
, the second two evaluate to the first form, which then yields the correct value. So, strictly speaking, it seems to me, Dot[]
is not associative. That made me uncertain about the best way to use Dot[]
. That was partly because I hadn't really grokked its being a tensor operation rather than a strictly matrix one.
mat.v.w (* OK *)
mat.(v.w) (* Dot::dotsh error; then OK *)
v.w.mat (* Dot::dotsh error; then OK *)
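To see the mechanics (a sketch, using the definitions above): Dot indeed carries the Flat attribute, and it is the flattening that rescues the last two forms after the message is issued.

```mathematica
Attributes[Dot]
(* {Flat, OneIdentity, Protected} *)

(* v.w fails (lengths 3 and 4), stays unevaluated as Dot[v, w],
   then Flat flattens mat.Dot[v, w] into Dot[mat, v, w] *)
Quiet[mat.(v.w)] == mat.v.w
(* True *)
```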
What Dot[]
actually seems to do is, in effect, contract a tensorial product of its arguments. It has the effect of (though it must do it more efficiently than) the following:
mat.v === TensorContract[Outer[Times, mat, v], {2, 3}]
w.mat === TensorContract[Outer[Times, w, mat], {1, 2}]
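Tracing the shapes makes the contraction pairs concrete (a sketch, with mat and v as defined above): the outer product has rank 3, and contracting slots 2 and 3 pairs mat's column index with v's index.

```mathematica
Dimensions[Outer[Times, mat, v]]
(* {4, 3, 3} *)
Dimensions[TensorContract[Outer[Times, mat, v], {2, 3}]]
(* {4} *)
```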
If you feel you can't get enough of tensors (well, more power to you, first off), here's a generic version:
amat = RandomReal[1, {3, 4}];
bten = RandomReal[1, {4, 5, 2}];
cvec = RandomReal[1, {2}];
dvec = RandomReal[1, {5}];
Dot[amat, bten, cvec, dvec] === Fold[
TensorContract[Outer[Times, ##],
{TensorRank[#1], TensorRank[#1] + 1}] &,
{amat, bten, cvec, dvec}]
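As a shape-level sanity check on the generic version (a sketch): each Fold step contracts the last slot of the accumulated tensor with the first slot of the next argument, so the 4, 2, and 5 indices are successively contracted away.

```mathematica
Dimensions[Dot[amat, bten, cvec, dvec]]
(* {3}: (3x4).(4x5x2) -> 3x5x2, then .(2) -> 3x5, then .(5) -> 3 *)
```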