Downloaded 12/31/12 to 128.148.252.35. Redistribution subject to SIAM license or copyright; see http://www.siam.org/journals/ojsa.php
SIAM J. APPL. MATH. Vol. 17, No. 3, May 1969
A NOTE ON KRONECKER MATRIX PRODUCTS AND MATRIX EQUATION SYSTEMS

H. NEUDECKER†
1. If $A = (a_{ij})$ is an $(m, n)$ matrix and $B$ is an $(s, t)$ matrix, the $(ms, nt)$ Kronecker product $A \otimes B$ is defined as $A \otimes B = (a_{ij}B)$.
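As a concrete illustration (not part of the original paper), NumPy's `np.kron` implements exactly this block definition:

```python
import numpy as np

# Illustration only: np.kron realises the block definition A ⊗ B = (a_ij B).
A = np.array([[1, 2],
              [3, 4]])          # (2, 2)
B = np.array([[0, 5, 6]])       # (1, 3)

K = np.kron(A, B)               # shape (2*1, 2*3) = (2, 6)

# The (i, j) block of A ⊗ B is a_ij * B.
assert np.array_equal(K[:1, 0:3], A[0, 0] * B)
assert np.array_equal(K[:1, 3:6], A[0, 1] * B)
assert np.array_equal(K[1:, 0:3], A[1, 0] * B)
```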
One of the important properties of this product is that it enables us to convert matrices into column vectors. For this we need the following definition and theorem.
DEFINITION 1. Let $A_{\cdot j}$ denote the $j$th column of an $(m, n)$ matrix $A$. The $mn$-column vector $\operatorname{vec} A$ is then defined as
$$\operatorname{vec} A = \begin{pmatrix} A_{\cdot 1} \\ A_{\cdot 2} \\ \vdots \\ A_{\cdot n} \end{pmatrix}.$$

THEOREM 1. Let $A$ be an $(m, n)$ matrix and $B$ an $(n, p)$ matrix. Then
$$\operatorname{vec} AB = (B' \otimes I_m)\operatorname{vec} A = (I_p \otimes A)\operatorname{vec} B,$$
where $B'$ is the transpose of $B$.

It has been shown that Definition 1 and Theorem 1 can fruitfully be applied to problems of matrix differentiation [2]. In this note it will be shown that they can be applied to a more general class of linear matrix equations, including linear matrix differential equations.

First, four standard properties of Kronecker products have to be recalled, all of which may be proved in an elementary fashion [1, p. 223 ff.]. The matrices involved can have any appropriate orders. In Property 4 it is assumed that $A$ and $B$ are square of order $m$ and $s$, respectively. (The same order assumption will be made in Theorems 2 and 3, which will be presented further on.)

PROPERTY 1. $(A \otimes B)(C \otimes D) = (AC) \otimes (BD)$.

PROPERTY 2. $(A \otimes B)^{-1} = A^{-1} \otimes B^{-1}$.

PROPERTY 3. $(A + B) \otimes (C + D) = A \otimes C + A \otimes D + B \otimes C + B \otimes D$.

PROPERTY 4. If $A$ has characteristic roots $\alpha_i$, $i = 1, \ldots, m$, and if $B$ has characteristic roots $\beta_j$, $j = 1, \ldots, s$, then $A \otimes B$ has characteristic roots $\alpha_i \beta_j$. Further, $I_s \otimes A + B \otimes I_m$ has characteristic roots $\alpha_i + \beta_j$.

For the treatment of differential equations we need the matrix exponential
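Definition 1 and Theorem 1 are easy to verify numerically. The following NumPy sketch (not in the paper) uses column-major flattening, which is precisely the column-stacking vec of Definition 1:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, p = 3, 4, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

def vec(M):
    """Stack the columns of M (Definition 1)."""
    return M.flatten(order='F')

# Theorem 1: vec AB = (B' ⊗ I_m) vec A = (I_p ⊗ A) vec B.
lhs = vec(A @ B)
assert np.allclose(lhs, np.kron(B.T, np.eye(m)) @ vec(A))
assert np.allclose(lhs, np.kron(np.eye(p), A) @ vec(B))
```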
$$e^A = I + A + \frac{A^2}{2!} + \frac{A^3}{3!} + \cdots,$$
Received by the editors October 1, 1968.
† Department of Econometrics and Social Statistics, The University of Birmingham, Birmingham, England.
which is known to converge for all square $A$. It is known that $e^{A+B} = e^A e^B$ if and only if $A$ and $B$ commute [1, p. 167]. We can now establish the following theorems.

THEOREM 2.
$$\exp(I_s \otimes A + B \otimes I_m) = \exp(I_s \otimes A)\exp(B \otimes I_m),$$
$m$ and $s$ being the orders of square $A$ and $B$, respectively.

Proof. $I_s \otimes A$ and $B \otimes I_m$ commute, and therefore the theorem is true. (We used Property 1.)

THEOREM 3.
$$e^{I_p \otimes A} = I_p \otimes e^A, \qquad e^{B \otimes I_q} = e^B \otimes I_q,$$
$p$ and $q$ arbitrary.
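A quick numerical check of Theorems 2 and 3 (an illustration, not in the paper, assuming SciPy's `expm` for the matrix exponential; combining both theorems with Property 1 also gives $\exp(I_s \otimes A + B \otimes I_m) = e^B \otimes e^A$):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

rng = np.random.default_rng(1)
m, s = 3, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((s, s))

IsA = np.kron(np.eye(s), A)   # I_s ⊗ A
BIm = np.kron(B, np.eye(m))   # B ⊗ I_m

# Theorem 2: the two summands commute, so the exponential factors.
assert np.allclose(expm(IsA + BIm), expm(IsA) @ expm(BIm))
# Theorem 3: exp(I_s ⊗ A) = I_s ⊗ e^A and exp(B ⊗ I_m) = e^B ⊗ I_m.
assert np.allclose(expm(IsA), np.kron(np.eye(s), expm(A)))
assert np.allclose(expm(BIm), np.kron(expm(B), np.eye(m)))
# Combining with Property 1: exp(I_s ⊗ A + B ⊗ I_m) = e^B ⊗ e^A.
assert np.allclose(expm(IsA + BIm), np.kron(expm(B), expm(A)))
```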
Proof. The first part of the theorem is obvious, since the matrices involved are block diagonal. Further,
$$e^{B \otimes I_q} = I_s \otimes I_q + B \otimes I_q + \frac{(B \otimes I_q)^2}{2!} + \cdots = I_s \otimes I_q + B \otimes I_q + \frac{B^2}{2!} \otimes I_q + \cdots = \left(I_s + B + \frac{B^2}{2!} + \cdots\right) \otimes I_q = e^B \otimes I_q,$$
by virtue of Properties 1 and 3.

We further recall that the equation
$$\frac{dX}{dt} = AX, \qquad X(0) = C,$$
$A$ constant, all matrices having order $(n, n)$, has the solution [1, p. 167]
$$X = e^{tA}C.$$
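As a sanity check (illustration only, using SciPy's `expm`), $X(t) = e^{tA}C$ indeed satisfies the initial condition and, up to finite-difference error, the differential equation:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
n = 3
A = rng.standard_normal((n, n))
C = rng.standard_normal((n, n))

X = lambda t: expm(t * A) @ C   # the claimed solution X = e^{tA} C

t, h = 0.7, 1e-6
dX_dt = (X(t + h) - X(t - h)) / (2 * h)   # central difference

assert np.allclose(X(0.0), C)             # initial condition
assert np.allclose(dX_dt, A @ X(t), atol=1e-5)   # dX/dt = AX
```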
2. We shall now solve the equation
$$\frac{dX}{dt} = AX + XB, \qquad X(0) = C,$$
$A$ and $B$ constant, having orders $(m, m)$ and $(p, p)$, respectively, $X$ having order $(m, p)$. The equation can be rewritten as
$$\frac{d \operatorname{vec} X}{dt} = (I_p \otimes A + B' \otimes I_m)\operatorname{vec} X, \qquad \operatorname{vec} X(0) = \operatorname{vec} C,$$
by means of Theorem 1. This transformed equation has the standard solution
$$\operatorname{vec} X = \exp[t(I_p \otimes A + B' \otimes I_m)]\operatorname{vec} C,$$
which can be rewritten as
$$\operatorname{vec} X = \exp\{t(I_p \otimes A)\}\exp\{t(B' \otimes I_m)\}\operatorname{vec} C = (I_p \otimes e^{tA})(e^{tB'} \otimes I_m)\operatorname{vec} C = (I_p \otimes e^{tA})\operatorname{vec}(Ce^{tB}) = \operatorname{vec}(e^{tA}Ce^{tB}),$$
by means of Theorems 1, 2 and 3, and Property 1. Therefore the original equation has the solution $X = e^{tA}Ce^{tB}$. This equation has been solved by Bellman [1, p. 175] along different lines.
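The agreement between the vec-form and the matrix-form solution is easy to confirm numerically (an illustrative sketch using SciPy's `expm`, not part of the paper):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
m, p = 3, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((p, p))
C = rng.standard_normal((m, p))
t = 0.5

vec = lambda M: M.flatten(order='F')   # column-stacking vec (Definition 1)

# vec X = exp[t(I_p ⊗ A + B' ⊗ I_m)] vec C ...
K = np.kron(np.eye(p), A) + np.kron(B.T, np.eye(m))
vecX = expm(t * K) @ vec(C)

# ... equals vec of the matrix-form solution X = e^{tA} C e^{tB}.
X = expm(t * A) @ C @ expm(t * B)
assert np.allclose(vecX, vec(X))
```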
3. The same sort of technique can clearly be applied to the equation
$$AX + XB = C,$$
all matrices having the same orders as in the previous section. We rewrite the equation as
$$(I_p \otimes A + B' \otimes I_m)\operatorname{vec} X = \operatorname{vec} C,$$
which has the unique solution
$$\operatorname{vec} X = (I_p \otimes A + B' \otimes I_m)^{-1}\operatorname{vec} C$$
if and only if $I_p \otimes A + B' \otimes I_m$ is nonsingular. The solution for $X$ can then be derived by using Definition 1. Using Property 4 we can state: necessary and sufficient for the equation to have a unique solution for all $C$ is that $\alpha_i + \beta_j \neq 0$ for all $i$ and $j$. This condition has been established by Bellman [1, p. 231] along analogous lines. In case $C = 0$ we can state: necessary and sufficient for the equation to have a nontrivial solution is that $\alpha_i + \beta_j = 0$ for some $i$ and $j$.
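The vec approach translates directly into code. The sketch below is illustrative only (for larger problems one would call a dedicated solver such as SciPy's `solve_sylvester` rather than form the $mp \times mp$ system); it solves $AX + XB = C$ and checks the eigenvalue condition from Property 4:

```python
import numpy as np

rng = np.random.default_rng(4)
m, p = 3, 2
A = rng.standard_normal((m, m))
B = rng.standard_normal((p, p))
C = rng.standard_normal((m, p))

# Solve (I_p ⊗ A + B' ⊗ I_m) vec X = vec C ...
K = np.kron(np.eye(p), A) + np.kron(B.T, np.eye(m))
vecX = np.linalg.solve(K, C.flatten(order='F'))
# ... and undo Definition 1 to recover X.
X = vecX.reshape((m, p), order='F')

assert np.allclose(A @ X + X @ B, C)

# Property 4: unique solvability for all C <=> alpha_i + beta_j != 0 for all i, j.
alpha, beta = np.linalg.eigvals(A), np.linalg.eigvals(B)
assert np.abs(alpha[:, None] + beta[None, :]).min() > 1e-8
```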
4. Another interesting example is the equation
$$AX - XA = \lambda X.$$
It will be proved that it possesses a nontrivial solution for $X$ if and only if $\lambda = \alpha_i - \alpha_j$ for some $i$ and $j$, where the $\alpha_i$ are the characteristic roots of $A$. This theorem is due to Lappo-Danilevsky, as reported by Bellman in [1, pp. 236 and 249]. We transform the equation into
$$(I_n \otimes A - A' \otimes I_n)\operatorname{vec} X = \lambda \operatorname{vec} X,$$
which in fact is the characteristic equation of $I_n \otimes A - A' \otimes I_n$. Necessary and sufficient for a nontrivial solution for $\operatorname{vec} X$ (the characteristic column vector associated with $I_n \otimes A - A' \otimes I_n$) is that $\lambda$ be a characteristic root of $I_n \otimes A - A' \otimes I_n$. As we know, this matrix has the characteristic roots $\alpha_i - \alpha_j$. This establishes the theorem.
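This eigenvalue statement can be checked numerically (illustration, not part of the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 3
A = rng.standard_normal((n, n))

# Characteristic roots of I_n ⊗ A − A' ⊗ I_n ...
M = np.kron(np.eye(n), A) - np.kron(A.T, np.eye(n))
eigM = np.linalg.eigvals(M)

# ... are the differences alpha_i − alpha_j of the roots of A.
alpha = np.linalg.eigvals(A)
diffs = (alpha[:, None] - alpha[None, :]).ravel()

# Counts agree (both n^2) and every difference matches some root of M.
assert eigM.size == diffs.size == n * n
assert all(np.abs(eigM - d).min() < 1e-8 for d in diffs)
```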
5. It is required to find an expression for the derivative $dX^{1/2}/dt$ (cf. Bellman [1, p. 161]). $X$ is necessarily square, of order $n$, say. We start from $X^{1/2}X^{1/2} = X$ and get the standard equation
$$\frac{dX^{1/2}}{dt}X^{1/2} + X^{1/2}\frac{dX^{1/2}}{dt} = \frac{dX}{dt}.$$
After rearranging we obtain
$$[(X^{1/2})' \otimes I_n + I_n \otimes X^{1/2}]\operatorname{vec}\frac{dX^{1/2}}{dt} = \operatorname{vec}\frac{dX}{dt},$$
and then
$$\operatorname{vec}\frac{dX^{1/2}}{dt} = [(X^{1/2})' \otimes I_n + I_n \otimes X^{1/2}]^{-1}\operatorname{vec}\frac{dX}{dt}.$$
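Numerically, the recipe amounts to one linear solve. The following sketch assumes SciPy's `sqrtm` for the principal square root, and takes $X(t) = X_0 + t\dot{X}_0$ symmetric positive definite so that $X^{1/2}$ is smooth; none of this code is in the paper:

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(6)
n = 3
G = rng.standard_normal((n, n))
X0 = G @ G.T + n * np.eye(n)        # symmetric positive definite
dX0 = rng.standard_normal((n, n))
dX0 = dX0 + dX0.T                   # keep X(t) = X0 + t*dX0 symmetric

vec = lambda Z: Z.flatten(order='F')
S = sqrtm(X0).real                  # X^{1/2}

# Solve [(X^{1/2})' ⊗ I_n + I_n ⊗ X^{1/2}] vec(dX^{1/2}/dt) = vec(dX/dt).
K = np.kron(S.T, np.eye(n)) + np.kron(np.eye(n), S)
dS = np.linalg.solve(K, vec(dX0)).reshape((n, n), order='F')

# Compare with a central finite difference of t -> (X0 + t*dX0)^{1/2} at t = 0.
h = 1e-6
dS_fd = (sqrtm(X0 + h * dX0) - sqrtm(X0 - h * dX0)).real / (2 * h)
assert np.allclose(dS, dS_fd, atol=1e-5)
```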
From this we can derive $dX^{1/2}/dt$ using Definition 1. $X^{1/2}$ clearly is not unique. (It may not be differentiable even if $X$ is differentiable.)

Acknowledgment. The author wishes to thank the referee for his critical remarks.

REFERENCES

[1] R. BELLMAN, Introduction to Matrix Analysis, McGraw-Hill, New York, 1960.
[2] H. NEUDECKER, Some theorems on matrix differentiation with special reference to Kronecker matrix products, J. Amer. Statist. Assoc., to appear.