re .1,.2,.3
Thanks for your advice on this matrix problem, I'm
new to Linear Algebra and need advice as to how to go
about attempting to prove formulas. I have learned
there are three ways of solving, but supposedly the
best way to attempt any problem is to prove it
through mathematical induction.
Do most mathematicians prove through mathematical
induction?
Michele
< Note 763.3 by CIMNET::KOLKER "Conan the Librarian" >
-< Need a step >-
re .2
I think you left out a step. You have to show that:
det(A x B) = det(A) * det(B) where A,B are matrices of the appropriate
order.

Here's an elementary proof that DET(A*B) = DET(A)*DET(B) that can be
used for the base problem.
It's useful to recall the definition of the 'wedge product'
(also called the exterior product, alternating product, Grassmann
product, etc.) This is a generalization of the cross product in
3 dimensions. The wedge product of two n-dimensional vectors gives
the oriented area spanned by the vectors. If the vectors are linearly
dependent (collinear), then they don't span any area and the product
vanishes. The wedge product of 3 vectors gives the oriented 3-volume in
n-space, and again vanishes if the vectors are coplanar or collinear
(not linearly independent).
Let W(V1, V2, ...) be the wedge product of the vectors Vi.
It has the properties:
a) It is multilinear, so that
W(a*V1a + b*V1b, V2, V3, ...) = a*W(V1a, V2, ...) + b*W(V1b, V2, ...).
b) It is antisymmetric, so transposing any pair of vectors negates
the sign.
c) By using properties (a) and (b), it vanishes if the vectors are
linearly dependent. In particular, repeating a vector makes
the product vanish.
d) It is associative, so that
W(V1, V2, V3) = W(V1, W(V2, V3)) = W(W(V1, V2), V3)
e) The wedge product of k n-dimensional vectors is C(n,k) dimensional,
the number of ways of choosing k things out of n.
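As a concrete instance of these properties, the wedge of two vectors in
3-space can be identified with the familiar cross product (by property (e)
it has C(3,2) = 3 components), and the antisymmetry and vanishing
properties are easy to check numerically. A minimal sketch in Python
(the function and variable names are just for illustration):

```python
def cross(u, v):
    # Wedge of two vectors in R^3, identified with the cross product.
    # Property (e): the result has C(3,2) = 3 components.
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

u, v = [1, 2, 3], [4, 5, 6]
# Property (b): transposing the pair negates the sign.
assert cross(u, v) == [-x for x in cross(v, u)]
# Property (c): a repeated vector makes the product vanish.
assert cross(u, u) == [0, 0, 0]
```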
Think of a square matrix, A, as a set of n row (or column) vectors.
Then DET(A) is the wedge product of all n vectors. It is a scalar
because there is only one way of choosing n things out of n.
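Written out explicitly, the wedge product of all n row vectors is the
familiar permutation (Leibniz) expansion of the determinant: a sum over
all n! permutations of signed products of entries. A short sketch,
assuming matrices are plain nested lists:

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    # Sign of a permutation: -1 raised to the number of inversions.
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p))
              if p[i] > p[j])
    return -1 if inv % 2 else 1

def det(A):
    # The scalar wedge product of all n row vectors:
    # a signed sum of products over all permutations.
    n = len(A)
    return sum(perm_sign(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))
```

Note that a matrix with a repeated row gives zero, exactly as property (c)
predicts for a repeated vector in a wedge product.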
By definition the row vectors of the matrix product C = A*B can be
thought of as weighted sums of the row vectors of B, with the weights
taken from the rows of A:
c[i] = SUM(k) a[i,k]*b[k]
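In code that definition reads off directly: row i of C is the vector sum
of the rows of B weighted by the entries of row i of A. A sketch with
plain nested lists (the function name is illustrative):

```python
def product_rows(A, B):
    # c[i] = SUM(k) a[i,k] * b[k] -- each row of C = A*B is a
    # weighted sum of the row vectors of B.
    n = len(A)
    C = []
    for i in range(n):
        ci = [0] * len(B[0])
        for k in range(n):
            for j in range(len(B[0])):
                ci[j] += A[i][k] * B[k][j]
        C.append(ci)
    return C
```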
If you write out DET(C) as the wedge product of its row vectors, c[i],
and plug in the individual sums above you'll get a huge summation of
n^n wedge products. But the key thing is that most of them vanish!
The only ones surviving will be those with all distinct b[k]'s, (by
property (c) above) and there are n! of those. Further, all these
products will involve permutations of the b[k]'s and can be permuted
into order, multiplied by +/- 1 depending on whether the permutation is
even or odd, and factored out of the sum. But this is just DET(B), and the
remaining sum of products of the A elements is just DET(A).
Thus DET(A*B) = DET(A)*DET(B).
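The conclusion is easy to sanity-check numerically. A self-contained
sketch, using the permutation expansion of the determinant and exact
integer arithmetic so the equality holds without rounding error:

```python
from itertools import permutations
from math import prod

def det(A):
    # Determinant via the permutation (Leibniz) expansion.
    def sign(p):
        inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p))
                  if p[i] > p[j])
        return -1 if inv % 2 else 1
    n = len(A)
    return sum(sign(p) * prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

def matmul(A, B):
    # c[i,j] = SUM(k) a[i,k] * b[k,j]
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 0], [3, 1, 4], [0, 2, 5]]
B = [[2, 1, 0], [1, 3, 1], [4, 0, 2]]
assert det(matmul(A, B)) == det(A) * det(B)
```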
It's important to think geometrically when reasoning about linear
algebra. Even though geometric intuition doesn't constitute a
proof, it guides you in constructing one, and gives a sanity check
on it, since a result that doesn't make geometric sense is probably
wrong. Note also that most of the above argument is a matter of accurate
definitions, the final proof itself being short.
- Jim