I have the following piece of code. I declare assumptions that some symbols are real matrices of given dimensions, then I define two equations and substitute one into the other.
ClearAll["Global`*"]
$Assumptions = {
  Element[mGamma, Matrices[{nip, nip}, Reals]],
  Element[G, Matrices[{noc, nsc}, Reals]],
  Element[H, Matrices[{noc, nip}, Reals]]
}
plantinput = ut[k] == (IdentityMatrix[nip] - mGamma).ut[k - 1] + mGamma.u[k]
controloutput = u[k] == G.w[k] + H.et[k]
s1 = ExpandAll[plantinput /. Rule @@ controloutput]
This gives me the following result:
ut[k] == mGamma.(G.w[k] + H.et[k]) + (-mGamma + IdentityMatrix[nip]).ut[-1 + k]
However, I want:
ut[k] == mGamma.G.w[k] + mGamma.H.et[k] + (-mGamma + IdentityMatrix[nip]).ut[-1 + k]
How can I achieve this?
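For reference, a minimal sketch of one direction I have considered, assuming TensorExpand (which, unlike ExpandAll, knows that Dot is linear and respects the Matrices assumptions) is the right tool here; I am not sure whether mapping it over both sides of the equation like this is robust:
(* distribute Dot over Plus on each side of the substituted equation *)
s2 = TensorExpand /@ (plantinput /. Rule @@ controloutput)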