| <<< Note 1226.0 by XANADU::BISWAS >>>
-< Differentiation of matrix equations >-
>> I have a very basic question:
>> For scalars, if x= a**2, then dx/dt = 2a*(da/dt)
>> Now, if I have matrices, X = A**2 (where A and X are matrices)
>> Then, is dX/dt = 2A (dA/dt), or is it
>> dX/dt = A(dA/dt) + (dA/dt)A, because order matters in matrices?
>> The real problem that I have is the following:
>> Let A'(t) = dA/dt
>> Given, A' = K(t)A(t)
>> Now say, x(t) = A(t)x(0)A(t), where x(0) is not time dependent.
>> *** I need, x'(t)? ***
>> Is, x'(t) = A' x(0)A + Ax(0) A'
>> Therefore, is x'(t) = KAx(0)A + Ax(0)KA (substituting, A'= KA)
>> (I would like it to be in the form x'(t) = KAx(0)A + Ax(0)AK, because then
>> that simplifies to x' = Kx + xK )
>> What am I doing wrong and what is the right way? Thanks a pile.
Let's go back to the definition and play around with it a little and see what we
get.
By definition, dx/dt is lim ((x(t+zt)-x(t))/zt) as zt goes to zero. Let's use this
on your function A*x0*A.
Forming the expression corresponding to x(t+zt)-x(t), we get
A(t+zt)*x0*A(t+zt) - A(t)*x0*A(t), which can be rewritten as
A(t+zt)*x0*A(t+zt) - A(t)*x0*A(t+zt) + A(t)*x0*A(t+zt) - A(t)*x0*A(t), which can
be rewritten as
(A(t+zt)-A(t))*x0*A(t+zt) + A(t)*x0*(A(t+zt)-A(t)).
We know that A(t) is differentiable (and thus continuous), so we can divide by
zt and take the limit. We get (remember that zt is a scalar!):
((A(t+zt)-A(t))/zt)*x0*A(t+zt) + A(t)*x0*((A(t+zt)-A(t))/zt)
Now let zt go to zero and we get
A'*x0*A + A*x0*A'.
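(A quick numerical sanity check of this product rule, not part of the original note: pick an arbitrary smooth matrix function A(t) whose derivative we know exactly, and compare a centered finite difference of x(t) = A(t)*x0*A(t) against A'*x0*A + A*x0*A'. The particular A(t) below is my own made-up example.)

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
C = rng.standard_normal((3, 3))
x0 = rng.standard_normal((3, 3))

def A(t):
    # an arbitrary smooth matrix-valued function of t (illustrative choice)
    return B + t * C + 0.5 * t * t * (B @ C)

def Aprime(t):
    # its exact derivative, term by term
    return C + t * (B @ C)

t, h = 0.7, 1e-6
# centered finite difference of x(t) = A(t) x0 A(t)
fd = (A(t + h) @ x0 @ A(t + h) - A(t - h) @ x0 @ A(t - h)) / (2 * h)
# the product rule derived above
exact = Aprime(t) @ x0 @ A(t) + A(t) @ x0 @ Aprime(t)
print(np.max(np.abs(fd - exact)))   # small: finite-difference error only
```

Note that `exact` keeps A' on the left in one term and on the right in the other; replacing it with 2*A*x0*A' makes the discrepancy large, since these matrices do not commute.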
|
| Great, that derivation from first principles was very clear.
Now as far as the second part goes:
We have established, given x = A(t) x(0) A(t)
then, x' = A'x(0)A + Ax(0)A'
Now, if it is given, A' = K(t)A(t)
Then, x' = KAx(0)A + Ax(0)KA
(As I had mentioned, I would have liked to have it in the form:
x' = KAx(0)A + Ax(0)AK, because that would have given me
x' = Kx + xK ( original equation, x = Ax(0)A)
Any way to get there? )
|
| <<< Note 1226.2 by XANADU::BISWAS >>>
>> ... x' = A'x(0)A + Ax(0)A'
>> Now, if it is given, A' = K(t)A(t)
>> Then, x' = KAx(0)A + Ax(0)KA
>> (As I had mentioned, I would have liked to have it in the form:
>> x' = KAx(0)A + Ax(0)AK, because that would have given me
>> x' = Kx + xK ( original equation, x = Ax(0)A)
>> Any way to get there? )
I see no obvious reason why a matrix K with the property you look for should
exist (in general). If, however, det(A) is not zero for all values of t, the
matrix K exists and is equal to A'*A**(-1). Assuming that K exists (and,
equally important, it is feasible to calculate what the elements of K are), we
get (I'm doing it slowly here so as to convince myself that I'm not stumbling)
x' = K*A*x0*A + A*x0*K*A, or x' = K*x + A*x0*K*A (1)
What you want is
x' = K*x + A*x0*A*K (2)
In order for what you want (i.e., eqn (2)) to follow from what we know (i.e.,
eqn (1)), the matrices A and K must commute (i.e., K*A == A*K). A necessary and
sufficient condition for that is that A (for all values of t) has a set of
eigenvectors that span R**n (if the matrix A is not square, the two matrices
cannot commute at all). (Strictly speaking, the condition for two square matrices to
commute is that they have a common set of eigenvectors that span R**n, but in
this case [because of the relation between K and A], if a given vector is an
eigenvector of A it is also an eigenvector of K [I am disregarding the
possibility of eigenvalues of zero].) A sufficient condition that
A's eigenvectors span R**n is that A is Hermitian (i.e., that A(i,j) =
CONJG(A(j,i)) for all i and j; for a real matrix A this is equivalent to A being
symmetric).
So far, we have established a number of conditions that A must satisfy:
1. The elements of A must be differentiable functions of t (from note
1226.2)
2. The determinant of A must be non-zero for all (interesting) values of
t (from definition of K).
3. A's eigenvectors must span R**n (a condition that guarantees that A's
eigenvectors span R**n is that A is Hermitian), because the matrices A
and K must commute.
What you need to do is to find out whether A satisfies these conditions and
whether it is practical to go this route (e.g., if A is of order 1000 you will
have problems regardless of how well-behaved the matrix A is).
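(A sketch of the commuting special case, my own example rather than anything from the note: if K is constant and A(t) = exp(tK), the matrix exponential, then A' = K*A holds, K and A commute, and x' = K*x + x*K does come out as hoped. `scipy.linalg.expm` computes the matrix exponential.)

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
K = rng.standard_normal((3, 3))     # a constant K (the special case assumed here)
x0 = rng.standard_normal((3, 3))

t, h = 0.5, 1e-6
A = expm(t * K)                     # A(t) = exp(tK), so A' = K A
print(np.max(np.abs(K @ A - A @ K)))          # ~0: K and A commute

x = A @ x0 @ A
# centered finite difference of x(t) = A(t) x0 A(t)
fd = (expm((t + h) * K) @ x0 @ expm((t + h) * K)
      - expm((t - h) * K) @ x0 @ expm((t - h) * K)) / (2 * h)
print(np.max(np.abs(fd - (K @ x + x @ K))))   # ~0: x' = Kx + xK holds here
```

For a time-varying K(t) this shortcut generally fails, which is exactly the commutation obstacle described above.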
|
| Thanks a lot -- that gets me thinking in the right direction.
The matrix A is a state transition matrix of a system of the form:
x' = Kx
If L is the fundamental matrix (its columns comprise linearly independent
solutions of the equation)
of the above system, then
A(t, t0) = L(t) * L**(-1)(t0)    (where L**(-1)(t0) denotes the inverse of L(t0))
A property of the state transition matrix is that it satisfies the underlying
differential equation; that is how we got A' = K A.
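(An illustrative check of that property, assuming a constant K for simplicity, which is my simplification, not the note's: with fundamental matrix L(t) = exp(tK), the transition matrix A(t, t0) = L(t)*L**(-1)(t0) equals exp((t-t0)K) and satisfies A' = K*A.)

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
K = rng.standard_normal((3, 3))     # constant K assumed for this sketch
t, t0, h = 1.2, 0.3, 1e-6

def A(t):
    # state transition matrix A(t, t0) = L(t) L(t0)^(-1) with L(t) = exp(tK)
    return expm(t * K) @ np.linalg.inv(expm(t0 * K))

print(np.max(np.abs(A(t) - expm((t - t0) * K))))   # ~0: A(t,t0) = exp((t-t0)K)
fd = (A(t + h) - A(t - h)) / (2 * h)               # finite-difference A'
print(np.max(np.abs(fd - K @ A(t))))               # ~0: A' = K A
```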
I'll go back and read up on the other properties.
Thanks for all the help.
Prabuddha
|