Inverse of a 3x3 or 4x4 matrix using the progressive product

[Image: a 4x4 matrix equation. Image copyright 2014 Jason Wood. All rights reserved.]

I implemented a pleasantly simple function to compute the inverse of a 3x3 matrix using the progressive (wedge) product.

As Lengyel states, the ith row of the inverse of 3x3 matrix M is (1 / D) times the wedge product of all but the ith column of M, where D is the determinant of M. 

Implemented in code, this looks like:

Mat3 inverse(Mat3 const& m) {
    // get m's column vectors
    auto c0 = m.getCol(0);
    auto c1 = m.getCol(1);
    auto c2 = m.getCol(2);

    auto tmp = c0 ^ c1;
    Real d = (tmp ^ c2)[0]; // compute determinant

    if (isZero(d)) {
        return Mat3(0); // no inverse!
    }

    d = 1 / d;

    // create 3x3 inverse from row vectors
    return Mat3(dual(c1 ^ c2) * d, dual(c2 ^ c0) * d, dual(tmp) * d);
}

One thing Lengyel fails to mention (probably just an oversight) is that the wedge products, which are bivectors, need to be dualized so that they are indeed row vectors; that's what the dual() calls in the return statement above are doing.
Note also the complete lack of concern about things like matrix minors / cofactors / adjugate / transpose, etc. It just works! This m.o. extends to 4x4 matrices as well.
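(Aside: in three dimensions the dual of a bivector is just the familiar cross product, so the same recipe can be written with no GA machinery at all. Below is a minimal self-contained sketch using plain std::array storage; Matrix3/Vec3 here are just local typedefs, not my Mat3 class above, and the names inverse3, cross, dot and the singularity tolerance are purely illustrative. The three cross products are exactly dual(c1 ^ c2), dual(c2 ^ c0), and dual(c0 ^ c1) from the function above.)

#include <array>
#include <cmath>

using Vec3 = std::array<double, 3>;
using Matrix3 = std::array<Vec3, 3>; // row-major storage

// cross(a, b) is the 3-D dual of the bivector a ^ b
static Vec3 cross(Vec3 const& a, Vec3 const& b) {
    return { a[1]*b[2] - a[2]*b[1],
             a[2]*b[0] - a[0]*b[2],
             a[0]*b[1] - a[1]*b[0] };
}

static double dot(Vec3 const& a, Vec3 const& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// rows of the inverse are the cross products of the other two columns, over det(M)
bool inverse3(Matrix3 const& m, Matrix3& out) {
    Vec3 c0{ m[0][0], m[1][0], m[2][0] };   // column vectors of m
    Vec3 c1{ m[0][1], m[1][1], m[2][1] };
    Vec3 c2{ m[0][2], m[1][2], m[2][2] };

    Vec3 r0 = cross(c1, c2);                // dual(c1 ^ c2)
    Vec3 r1 = cross(c2, c0);                // dual(c2 ^ c0)
    Vec3 r2 = cross(c0, c1);                // dual(c0 ^ c1)

    double d = dot(r2, c2);                 // det(M) = dual(c0 ^ c1) . c2
    if (std::fabs(d) < 1e-12) return false; // (arbitrary tolerance) no inverse!

    for (int j = 0; j < 3; ++j) {
        out[0][j] = r0[j] / d;
        out[1][j] = r1[j] / d;
        out[2][j] = r2[j] / d;
    }
    return true;
}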

+++++++++++++++++++++++++++++++++++++++

In case you're interested, have a look at the attached short PDF showing how to use this method to compute a 4x4 matrix inverse. (I tested it in code and it works, although it's inefficient for inverting the transformation matrices used in 3-d graphics, where quick-and-dirty methods are more appropriate.)


Basically, the general procedure for an N x N matrix M (with N >= 3 and column vectors c0, c1, ..., c(N-1)) is:
row i of inv(M) = dual(c0 ^ c1 ^ ... ^ c(N-1)) / det(M), leaving column i out of the wedge product, and negating the odd-numbered rows afterwards (the row-wise remnant of the pow(-1, i+j) signs that show up with matrix minors when computing the adjugate).

Note that det(M) can be computed as the dual of the progressive product of all of M's column vectors (the dual gives a scalar since said progressive product is an antiscalar).

I.e., for a 3x3 matrix, det(M) = dual(c0 ^ c1 ^ c2), and for a 4x4 matrix, det(M) = dual(c0 ^ c1 ^ c2 ^ c3), and so forth.
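To make the 4x4 case concrete, here's a rough self-contained sketch of that procedure. It is not the code from the attached PDF; the names Vec4, Matrix4, dualWedge3, inverse4 and the singularity tolerance are purely illustrative, and the dual of the wedge of three 4-d vectors is expanded by hand as four signed 3x3 determinants.

#include <array>
#include <cmath>

using Vec4 = std::array<double, 4>;
using Matrix4 = std::array<Vec4, 4>; // row-major storage

// determinant of the 3x3 matrix with rows (a0,a1,a2), (b0,b1,b2), (c0,c1,c2)
static double det3(double a0, double a1, double a2,
                   double b0, double b1, double b2,
                   double c0, double c1, double c2) {
    return a0*(b1*c2 - b2*c1) - a1*(b0*c2 - b2*c0) + a2*(b0*c1 - b1*c0);
}

// dual(a ^ b ^ c) in 4 dimensions: a vector of four signed 3x3 minors
static Vec4 dualWedge3(Vec4 const& a, Vec4 const& b, Vec4 const& c) {
    return { +det3(a[1],a[2],a[3], b[1],b[2],b[3], c[1],c[2],c[3]),
             -det3(a[0],a[2],a[3], b[0],b[2],b[3], c[0],c[2],c[3]),
             +det3(a[0],a[1],a[3], b[0],b[1],b[3], c[0],c[1],c[3]),
             -det3(a[0],a[1],a[2], b[0],b[1],b[2], c[0],c[1],c[2]) };
}

static double dot(Vec4 const& a, Vec4 const& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2] + a[3]*b[3];
}

bool inverse4(Matrix4 const& m, Matrix4& out) {
    Vec4 c0{ m[0][0], m[1][0], m[2][0], m[3][0] };   // column vectors of m
    Vec4 c1{ m[0][1], m[1][1], m[2][1], m[3][1] };
    Vec4 c2{ m[0][2], m[1][2], m[2][2], m[3][2] };
    Vec4 c3{ m[0][3], m[1][3], m[2][3], m[3][3] };

    // row i of the inverse: dual of the wedge of the other three columns,
    // with the odd-numbered rows negated below
    Vec4 r0 = dualWedge3(c1, c2, c3);
    Vec4 r1 = dualWedge3(c0, c2, c3);
    Vec4 r2 = dualWedge3(c0, c1, c3);
    Vec4 r3 = dualWedge3(c0, c1, c2);

    // det(M) = dual(c0 ^ c1 ^ c2 ^ c3), computed here as c0 . dual(c1 ^ c2 ^ c3)
    double d = dot(c0, r0);
    if (std::fabs(d) < 1e-12) return false; // (arbitrary tolerance) no inverse!

    for (int j = 0; j < 4; ++j) {
        out[0][j] =  r0[j] / d;
        out[1][j] = -r1[j] / d;   // negate odd row
        out[2][j] =  r2[j] / d;
        out[3][j] = -r3[j] / d;   // negate odd row
    }
    return true;
}

Row 0 comes from dual(c1 ^ c2 ^ c3), row 1 from -dual(c0 ^ c2 ^ c3), and so on, exactly as described above.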


BTW, I spent hours combing the internet for this info. Apart from Lengyel briefly mentioning that the procedure exists (and leaving out some important details), there's some extremely math-heavy material on finding the determinant using progressive products, but nothing on how to compute the actual inverse itself. So some of this came down to trial and error, guesswork, and working it out for myself.

-Jason Wood
