Expand symbolic matrix calculation

Anonymous User
Posted 11 years ago

I have the following piece of code. I declare assumptions that certain symbols are matrices of given sizes, then define two equations and substitute one into the other.

ClearAll["Global`*"]

$Assumptions = {
  Element[mGamma, Matrices[{nip, nip}, Reals]],
  Element[G, Matrices[{noc, nsc}, Reals]],
  Element[H, Matrices[{noc, nip}, Reals]]
  }

plantinput = ut[k] == (IdentityMatrix[nip] - mGamma).ut[k - 1] + mGamma.u[k]
controloutput = u[k] == G.w[k] + H.et[k]

s1 = ExpandAll[plantinput /. Rule @@ controloutput]

This gives me the following result:

ut[k] == mGamma.(G.w[k] + H.et[k]) + (-mGamma + IdentityMatrix[nip]).ut[-1 + k]

However, I want

ut[k] == mGamma.G.w[k] + mGamma.H.et[k] + (-mGamma + IdentityMatrix[nip]).ut[-1 + k]

How can I achieve this?
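One way to get the desired form is a repeated replacement rule that distributes `Dot` over `Plus` (a minimal sketch, not from the original thread; `ExpandAll` does not distribute `Dot` on its own):

```mathematica
(* distribute Dot over sums everywhere in the equation;
   Dot is Flat, so Dot[mGamma, G.w[k]] flattens to mGamma.G.w[k] *)
s2 = s1 //. Dot[x___, a_ + b__, y___] :> Dot[x, a, y] + Dot[x, Plus[b], y]
```

In more recent Wolfram Language versions, `TensorExpand` (which honors `$Assumptions`) may achieve the same expansion, e.g. applied to each side of the equation with `TensorExpand /@ s1`; availability depends on your version.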

POSTED BY: Anonymous User
Posted 11 years ago
POSTED BY: Bill Simpson