Last edited October 9th, 2023 - John Halleck
Last significant edit was on June 4th, 2022 - John Halleck
Requires: Basic familiarity with matrices (transposes, inverses, identities, ...)
Assumptions:
If we multiply matrices below, we assume that they are conformable.
If we take the inverse of a matrix below, we assume that the matrix has an inverse.
Notation review:
Identity Matrix: I
Zero Matrix: 0
Matrix [multiplicative] inverse of A: A^-1
Matrix transpose of A: A^T
Both inverse and transpose applied to A: A^-T
(The order doesn't matter, since the inverse of the transpose of a matrix equals the transpose of the inverse of the matrix.)
Definitions:
A matrix Q is orthogonal if it, multiplied by its transpose (on either side), is the identity matrix:
Q^T Q = I and Q Q^T = I
It follows that the transpose of an orthogonal matrix is equal to its inverse:
Q^T = Q^-1, and so it follows that Q = Q^-T
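These identities are easy to check numerically. A minimal sketch in NumPy, using a rotation matrix as the orthogonal Q (the angle here is arbitrary, my own choice for illustration):

```python
import numpy as np

theta = 0.3                      # arbitrary angle
c, s = np.cos(theta), np.sin(theta)
Q = np.array([[c, -s],
              [s,  c]])          # a rotation matrix: a classic orthogonal matrix

# Q^T Q = I and Q Q^T = I
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.allclose(Q @ Q.T, np.eye(2))

# Hence Q^T = Q^-1
assert np.allclose(Q.T, np.linalg.inv(Q))
```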
In a prior post I introduced Orthogonal Matrices, and it is probably past time to say something interesting about them in the context of factoring matrices.
Factoring a matrix just means finding two or more matrices (called factors) that, when multiplied together, give the original matrix as a product. (This is also called a decomposition of the matrix.) An LU factorization, for example, produces a unit lower triangular matrix L multiplied by an upper triangular matrix U. This particular factorization is one of the most famous in Linear Algebra.
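As a concrete sketch (my own illustration, not part of the post), here is a bare-bones Doolittle-style LU factorization in NumPy; it does no pivoting, so it assumes the leading principal minors of A are nonzero:

```python
import numpy as np

def lu_doolittle(A):
    """Factor A into unit lower triangular L and upper triangular U.
    No pivoting: assumes every leading principal minor of A is nonzero."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # multiplier that zeroes U[i, k]
            U[i, :] -= L[i, k] * U[k, :]
    return L, U

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
L, U = lu_doolittle(A)
assert np.allclose(L @ U, A)              # the factors multiply back to A
assert np.allclose(np.diag(L), 1.0)       # L is *unit* lower triangular
```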
I have heard it argued that if you have an orthogonal matrix, there is little point in forming an LU factorization of it, because you *already* have what you need to solve problems with it efficiently. Well, that does happen to be true, but I'm *not* solving systems... I'm playing with these matrices looking for interesting relationships. :)
So, let's examine the LU factorization (also called a decomposition) of an orthogonal matrix Q: we know that Q = Q^-T, so if Q = LU, we have
LU = (LU)^-T = (U^-1 L^-1)^T = L^-T U^-T
It is true in general, for any matrix that has an LU decomposition, that
(LU)^-T = L^-T U^-T
*BUT* with an orthogonal matrix both sides are also equal to LU.
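A numerical check of that claim, using NumPy and a 2x2 rotation matrix whose LU factors can be worked by hand (note that c + s^2/c = 1/c, since c^2 + s^2 = 1):

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
Q = np.array([[c, -s],
              [s,  c]])                 # orthogonal

# LU factorization of Q, worked by hand for the 2x2 case:
L = np.array([[1.0,   0.0],
              [s / c, 1.0]])            # unit lower triangular
U = np.array([[c,  -s],
              [0.0, 1.0 / c]])          # upper triangular
assert np.allclose(L @ U, Q)

# Since Q = Q^-T, the product L^-T U^-T gives back Q as well:
LiT = np.linalg.inv(L).T                # L^-T (upper triangular)
UiT = np.linalg.inv(U).T                # U^-T (lower triangular)
assert np.allclose(LiT @ UiT, Q)
```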
Side note: There was nothing in that little derivation that depended on the factors being triangular matrices... This can be done with ANY matrix factorization of Q: given a factorization ABCD... of Q, you would get the obvious A^-T B^-T C^-T D^-T...
The LU demo did show that if LU is orthogonal, then L^-T U^-T is also orthogonal (it equals LU). In fact, L^-T U^-T is a UL factorization (with a different L and U) of the *same* matrix, rather than an LU factorization, when the matrix is orthogonal: L is lower triangular, so L^-T (because of the transpose) is upper triangular, while U^-T is (because of the transpose) lower triangular. And, for what it is worth, if L is a unit lower triangular matrix, then L^-T is still unit triangular (upper triangular, now) after the inverse and the transpose.
---- LU versus QR:
QR to LU: Solving a matrix problem like Ax = d is often done by finding a factorization of A of the form QR, where Q is orthogonal and R is upper triangular. If there is an LU factorization of Q, then this is (LU)R, and if you re-associate it you have L(UR). The U and R are both upper triangular, so they can be multiplied together into a single upper triangular matrix, giving the result the form of an LU factorization.
*This side of the argument is a bit of a "hand wave", I need to make this a bit more rigorous -JH*
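The re-association itself can at least be checked numerically. A sketch in NumPy (the matrix A is my own arbitrary example; the 2x2 LU of Q is computed inline and assumes Q's top-left entry is nonzero):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
Q, R = np.linalg.qr(A)                  # Q orthogonal, R upper triangular

# LU of the orthogonal factor Q (2x2 Doolittle, assuming Q[0, 0] != 0):
m = Q[1, 0] / Q[0, 0]                   # the single multiplier
L = np.array([[1.0, 0.0],
              [m,   1.0]])
U = np.array([[Q[0, 0], Q[0, 1]],
              [0.0,     Q[1, 1] - m * Q[0, 1]]])
assert np.allclose(L @ U, Q)

# A = (L U) R = L (U R), and U @ R is a single upper triangular matrix:
assert np.allclose(L @ (U @ R), A)
assert np.allclose(np.tril(U @ R, -1), 0.0)
```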
LU to QR: And, if you had Ax = d and A = LU, you could convert this into a QR factorization by finding an upper triangular matrix E such that LE is orthogonal. Then insert EE^-1 into LU to get L E E^-1 U. Re-associate this as (LE)(E^-1 U), which you can see (by inspection) is a QR factorization. [Note that E^-1 U is upper triangular because both of its factors are.] Finding such an E is probably as hard as, or harder than, producing a QR factorization from scratch... an interesting theoretical fact, but not worth it practically.
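Purely as an illustration that such an E exists (this construction is my own, and, as noted above, it is not practical): take a QR factorization of L itself, L = Q_L R_L; then E = R_L^-1 is upper triangular, and LE = Q_L is orthogonal:

```python
import numpy as np

# Start from an LU factorization A = L U:
L = np.array([[1.0, 0.0],
              [2.0, 1.0]])
U = np.array([[2.0, 1.0],
              [0.0, 3.0]])
A = L @ U

# One way to get an upper triangular E with L E orthogonal:
QL, RL = np.linalg.qr(L)                # L = QL RL, with QL orthogonal
E = np.linalg.inv(RL)                   # inverse of upper triangular is upper triangular
assert np.allclose(np.tril(E, -1), 0.0)

# Insert E E^-1:  A = L E E^-1 U = (L E)(E^-1 U), a QR factorization
Qnew = L @ E                            # equals QL, hence orthogonal
Rnew = np.linalg.inv(E) @ U             # upper triangular
assert np.allclose(Qnew.T @ Qnew, np.eye(2))
assert np.allclose(Qnew @ Rnew, A)
```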