Friday, March 20, 2020

A note on Rotational Commutativity

Topics: Orthogonal matrices, Commutativity, rotational commutativity

Requires: Basic Familiarity with Matrices (transposes, inverses, identities, ...)

   Notation:

Identity Matrix: I
Zero Matrix: 0
Matrix [multiplicative] inverse of A: A^-1
Matrix transpose of A: A^T
Both inverse and transpose applied to A: A^-T
  (Since the inverse of the transpose of a matrix equals the transpose of the inverse of the matrix.)

   Assumptions:

If we multiply matrices below, we assume that they are conformable.
If we take the inverse of a matrix, we assume that the matrix has an inverse.

   Definitions:

If we multiply an expression by a matrix on the left, we call this premultiplying by the matrix.
If we multiply on the right, we call it postmultiplying.
A matrix Q is orthogonal if, when it is multiplied by its transpose, the result is the identity.
Examples:
  Q^T Q = I
  Q Q^T = I
Another way to say this is that, for an orthogonal matrix, its transpose is equal to its inverse.
  Q^T = Q^-1
And so it follows that
  Q = Q^-T
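As a quick sanity check, here's a small pure-Python sketch (no libraries assumed; the matrix helpers and the particular angle are mine, not from the original note) verifying Q^T Q = I for a 2x2 rotation matrix, a standard example of an orthogonal matrix:

```python
import math

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    """Swap rows and columns."""
    return [list(row) for row in zip(*A)]

# A rotation matrix is a classic orthogonal matrix (arbitrary angle chosen here).
t = math.radians(30)
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

QtQ = matmul(transpose(Q), Q)  # should be (numerically) the identity
print(QtQ)
```

The entries come out as floats a rounding error away from [[1, 0], [0, 1]].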

Preliminaries:

In my prior post, I talked about the fact that, in general, matrix multiplication doesn't commute. And I mentioned, in passing, that some special cases do commute.
Some cases that do commute are:
    IA = AI = A        Identity commutes with anything.
    0A = A0 = 0        Zero commutes with anything.
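These two special cases are easy to confirm numerically. A minimal pure-Python check (the 2x2 matrix A here is an arbitrary example of mine):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

A = [[1, 2], [3, 4]]   # an arbitrary matrix
I = [[1, 0], [0, 1]]   # identity matrix
Z = [[0, 0], [0, 0]]   # zero matrix

print(matmul(I, A) == matmul(A, I) == A)  # True: IA = AI = A
print(matmul(Z, A) == matmul(A, Z) == Z)  # True: 0A = A0 = 0
```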

Since matrices don't generally commute, one has to be careful, when multiplying through by some matrix, to make it clear whether you are premultiplying or postmultiplying. If you multiply by a matrix and don't state which kind you are doing, it is assumed you are premultiplying.

Rotational Commutativity:

I can't find a proper name in the literature that describes the property I discuss here. I'm calling it "Rotational Commutativity" for lack of a better term. If someone knows of a proper term for it in the mathematics literature, please drop me a line. Put "Matrices" in the subject line of the note.

You can see that I was able to exchange Q and Q^T in the definition of orthogonal above.
You might be tempted to say that orthogonal matrices commute. This is not correct. Those two only commuted because their product was I.

Let's make this clearer by considering a list of matrices that, when multiplied together, have a product of I.

Example: ABC = I
We can premultiply the equation by A^-1, removing the A at the start and replacing the I on the right hand side with A^-1.
    BC = A^-1
We then postmultiply both sides by A, which appends it to the end of the left hand side and cancels the A^-1 on the right hand side.
    BCA = I
So what did we do? We rotated the list by one item.
This shows that for a list of matrices that have the product I, we can produce any rotation. For ABC, we can produce ABC, BCA, and CAB, and all are equal to the identity.
If we had only two matrices AB, and their product was I, rotating the list produces BA, which is the same as swapping them.
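The rotation property can be sketched numerically. In this pure-Python example (the specific matrices A and B are arbitrary choices of mine), C is constructed as (AB)^-1 so that ABC = I, and then all three rotations are checked:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]
B = [[1, 3], [0, 1]]
C = inv2(matmul(A, B))   # chosen so that ABC = I

I = [[1, 0], [0, 1]]
ABC = matmul(matmul(A, B), C)
BCA = matmul(matmul(B, C), A)
CAB = matmul(matmul(C, A), B)
print(ABC == I, BCA == I, CAB == I)  # → True True True
```

All three rotations of the list come out as the identity, exactly as the derivation above predicts.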

Aside: This only worked because the collection had the product I. If we started with anything else there, the two steps would have produced some other result. In the ABC case, if the right hand side of the equation were D, we would have ended up with BCA = A^-1 D A, which is a conjugate of D rather than D itself.
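The aside can also be checked numerically. In this pure-Python sketch (arbitrary integer matrices of my choosing), D is simply whatever ABC happens to be, and rotating the list yields A^-1 D A rather than D:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 1]]
B = [[1, 3], [0, 1]]
C = [[1, 0], [2, 1]]
D = matmul(matmul(A, B), C)        # ABC = D, some matrix other than I

BCA = matmul(matmul(B, C), A)      # the rotated product
conj = matmul(matmul(inv2(A), D), A)  # A^-1 D A
print(BCA == conj)   # True: the rotation is a conjugate of D
print(BCA == D)      # False: it is not D itself
```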


However, this rotation property does not mean that the collection is the identity in any other order.
For example, the technique used cannot generate CBA, BAC, or ACB, because they are not rotations of ABC.
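A pure-Python sketch of a non-rotation failing (the matrices here are arbitrary integer choices of mine, with C set to (AB)^-1 so that ABC = I):

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]

A = [[2, 1], [1, 1]]
B = [[1, 3], [0, 1]]
C = [[4, -7], [-1, 2]]   # (AB)^-1, so ABC = I

I = [[1, 0], [0, 1]]
BCA = matmul(matmul(B, C), A)   # a rotation of ABC: still I
CBA = matmul(matmul(C, B), A)   # a reversal, not a rotation
print(BCA == I)  # → True
print(CBA == I)  # → False
```

Here CBA comes out as [[13, 9], [-3, -2]], nowhere near the identity, even though every rotation of ABC is exactly I.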


