# Exterior Algebra Notes #4: The Interior Product

[January 27, 2019]

Vector spaces are assumed to be finite-dimensional and over $\bb{R}$. The grade of a multivector $\alpha$ will be written $\| \alpha \|$, while its magnitude will be written $\vert \alpha \vert$. Bold letters like $\b{u}$ will refer to (grade-1) vectors, while Greek letters like $\alpha$ refer to arbitrary multivectors with grade $\| \alpha \|$.

More notes on exterior algebra. This time, the interior product $\alpha \cdot \beta$, with a lot more concrete intuition than you’ll see anywhere else.

I am not the only person who has had trouble figuring out what the interior product is for. This is what I have so far…

## 1. The Interior Product

The last main tool of exterior algebra is the interior product, written $\alpha \cdot \beta$ or $\iota_{\alpha} \beta$. It subtracts grades ($\| \alpha \cdot \beta \| = \| \beta \| - \| \alpha \|$) and, conceptually, does something akin to ‘dividing $\alpha$ out of $\beta$’. It’s also called the ‘contraction’ or ‘insertion’ operator. We use the same symbol as the inner product because we think of it as a generalization of the inner product: when $\| \alpha \| = \| \beta \|$, then $\alpha \cdot \beta = \langle \alpha, \beta \rangle$.

Its abstract definition is that it is adjoint to the wedge product with respect to the inner product:

$$\langle \alpha \^ \gamma, \beta \rangle = \langle \gamma, \alpha \cdot \beta \rangle$$

In practice this means that it sort of ‘undoes’ wedge products, as we will see.

When we looked at the inner product we had a procedure for computing $\langle \alpha, \beta \rangle$. We switched from the $\^$ inner product to the $\o$ inner product, by writing both sides as tensor products, with the right side antisymmetrized using $\text{Alt}$:[^1]

$$\langle \b{a} \^ \b{b}, \b{c} \^ \b{d} \rangle = \langle \b{a} \o \b{b}, \text{Alt}(\b{c} \o \b{d}) \rangle = (\b{a} \cdot \b{c})(\b{b} \cdot \b{d}) - (\b{a} \cdot \b{d})(\b{b} \cdot \b{c})$$

Interior products directly generalize inner products to cases where the left side has a lower grade[^2] (which is why we use $\cdot$ for both), and can be computed with the exact same procedure, contracting the left side into the leading tensor slots:

$$\b{a} \cdot (\b{b} \^ \b{c}) = \b{a} \cdot (\b{b} \o \b{c} - \b{c} \o \b{b}) = (\b{a} \cdot \b{b}) \, \b{c} - (\b{a} \cdot \b{c}) \, \b{b}$$

A general formula for the interior product of a vector with a multivector, which can be deduced from the above, is

$$\b{v} \cdot (\b{w}_1 \^ \b{w}_2 \^ \cdots \^ \b{w}_k) = \sum_{i=1}^{k} (-1)^{i-1} (\b{v} \cdot \b{w}_i) \, \b{w}_1 \^ \cdots \^ \widehat{\b{w}_i} \^ \cdots \^ \b{w}_k$$

where the hat means that factor is omitted.
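This alternating-sum expansion is easy to check numerically. Here is a minimal numpy sketch (the helper name `vector_interior` is mine, not the article's) that expands $\b{v} \cdot (\b{w}_1 \^ \cdots \^ \b{w}_k)$ into signed terms, one per deleted factor, and confirms the bivector case $\b{a} \cdot (\b{b} \^ \b{c}) = (\b{a} \cdot \b{b})\,\b{c} - (\b{a} \cdot \b{c})\,\b{b}$:

```python
import numpy as np

def vector_interior(v, factors):
    """Expand v . (w1 ^ ... ^ wk) as a list of (coefficient, remaining-factors)
    terms: one term per deleted factor, with alternating signs."""
    terms = []
    for i, w in enumerate(factors):
        coeff = (-1) ** i * float(np.dot(v, w))
        rest = [f for j, f in enumerate(factors) if j != i]
        terms.append((coeff, rest))
    return terms

# Bivector case: a . (b ^ c) should equal (a.b) c - (a.c) b
a = np.array([1.0, 2.0, 0.0])
b = np.array([0.0, 1.0, 1.0])
c = np.array([3.0, 0.0, 1.0])

terms = vector_interior(a, [b, c])
result = sum(coeff * rest[0] for coeff, rest in terms)
expected = np.dot(a, b) * c - np.dot(a, c) * b
```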

The intuitive meaning of the interior product is related to projection. We can construct the projection and rejection of a multivector $\beta$ with respect to a vector $\b{a}$ with:

$$\beta_{\b{a}} = \frac{\b{a} \^ (\b{a} \cdot \beta)}{\vert \b{a} \vert^2} \qquad \beta_{\perp \b{a}} = \frac{\b{a} \cdot (\b{a} \^ \beta)}{\vert \b{a} \vert^2}$$

To understand this, recall that the classic formula for projecting a vector $\b{b}$ onto a unit vector $\b{a}$ is:

$$\b{b}_{\b{a}} = (\b{a} \cdot \b{b}) \, \b{a}$$

That is, we find the scalar coordinate along $\b{a}$, then multiply by $\b{a}$ once again. With multivectors, $\b{a} \cdot \beta$ is not a scalar, so we can’t just use scalar multiplication – so it makes some sense that it would be replaced with $\^$.[^3]

The classic vector rejection formula (again for unit $\b{a}$) is

$$\b{b}_{\perp \b{a}} = \b{b} - (\b{a} \cdot \b{b}) \, \b{a}$$

Using the interior product we can write this as

$$\b{b}_{\perp \b{a}} = \b{a} \cdot (\b{a} \^ \b{b})$$

The multivector version $\b{a} \^ \beta$ is only non-zero if $\beta$ has a component which does not contain $\b{a}$ – all $\b{a}$-ness is removed by the wedge product, leaving something like $\b{a} \^ \beta_{\perp \b{a}}$. Then $\b{a} \cdot (\b{a} \^ \beta_{\perp \b{a}}) = \beta_{\perp \b{a}}$ (for unit $\b{a}$).
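The vector case of this rejection formula can be verified directly, representing a bivector $\b{u} \^ \b{v}$ as the antisymmetric matrix $\b{u} \o \b{v} - \b{v} \o \b{u}$ and contracting a vector into its first tensor slot. A small numpy sketch (helper names `wedge` and `interior` are mine):

```python
import numpy as np

def wedge(u, v):
    """u ^ v represented as an antisymmetric matrix (u⊗v - v⊗u)."""
    return np.outer(u, v) - np.outer(v, u)

def interior(a, M):
    """a . (u ^ v) for a bivector matrix M: contract a into the first slot."""
    return M.T @ a

a = np.array([1.0, 0.0, 0.0])          # a unit vector
b = np.array([2.0, 3.0, -1.0])

rejection = interior(a, wedge(a, b))   # a . (a ^ b)
classic = b - np.dot(a, b) * a         # b minus its a-component
```

Both computations produce the component of $\b{b}$ perpendicular to $\b{a}$.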

The correct interpretation of $\b{a} \cdot \beta$, then, is a lot like what it means when $\beta = \b{b}$: it’s finding the ‘$\b{a}$-component’ of $\beta$. It’s just that, when $\beta$ is a multivector, the ‘$\b{a}$-coordinate’ is no longer a scalar.

For example this is the ‘$\b{x}$‘-component of a bivector $\b{b \^ c}$:

$$\b{x} \cdot (\b{b} \^ \b{c}) = b_x \b{c} - c_x \b{b}$$

Note that the result doesn’t have any $\b{x}$ factors in it: its $\b{x}$-component is $b_x c_x - c_x b_x = 0$.

What about $\alpha \cdot \beta$, where $\alpha$ is a multivector? It’s still true that $\alpha \cdot \beta$ gives the ‘$\alpha$-coordinate’ of $\beta$, if there is one. But the rejection formula doesn’t work – we can only use $\beta_{\perp \alpha} = \beta - \frac{1}{\vert \alpha \vert^2} \alpha \^ (\alpha \cdot \beta)$. The problem is that there are cases where both $\alpha \^ \beta = \alpha \cdot \beta = 0$, such as for $\b{x \^ y}$ and $\b{y \^ z}$.[^4]

If we consider our projection/rejection operations as operators, writing $L_{\b{a}} \beta = \b{a} \^ \beta$ and $\iota_{\b{a}} \beta = \b{a} \cdot \beta$, then:

$$\iota_{\b{a}} L_{\b{a}} + L_{\b{a}} \iota_{\b{a}} = \vert \b{a} \vert^2 \, I$$

Since $\iota^2 = L^2 = 0$, this could also be written as

$$(\iota_{\b{a}} + L_{\b{a}})^2 = \vert \b{a} \vert^2 \, I$$

And in fact this works (although the interpretation is trickier) with different vectors for each term:

$$\iota_{\b{a}} L_{\b{b}} + L_{\b{b}} \iota_{\b{a}} = (\b{a} \cdot \b{b}) \, I$$
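This anticommutation relation can be checked numerically on grade-1 arguments, using the expansions $\iota_{\b{a}} L_{\b{b}} \b{v} = \b{a} \cdot (\b{b} \^ \b{v}) = (\b{a} \cdot \b{b})\b{v} - (\b{a} \cdot \b{v})\b{b}$ and $L_{\b{b}} \iota_{\b{a}} \b{v} = (\b{a} \cdot \b{v})\b{b}$ (the wedge with a scalar is just scalar multiplication). A quick numpy sketch on random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, v = rng.standard_normal((3, 5))  # three random vectors in R^5

# iota_a L_b v = a . (b ^ v) = (a.b) v - (a.v) b
iota_L = np.dot(a, b) * v - np.dot(a, v) * b
# L_b iota_a v = b ^ (a . v) = (a.v) b
L_iota = np.dot(a, v) * b

anticommutator = iota_L + L_iota
expected = np.dot(a, b) * v
```

Setting $\b{b} = \b{a}$ recovers the projection/rejection decomposition above as a special case.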

There is a lot of interesting structure here which is worth diving into in the future. It turns out to be related to a lot of other mathematics. The short version is that $\iota$ is, technically, a “graded derivation” on the exterior algebra, and the property that $\iota L + L \iota = I$ is the exterior-algebra equivalent of the fact that $\p_x x - x \p_x = 1$ on derivatives (in the sense that $(xf)' - x f' = f$).
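The derivative-side identity $\p_x x - x \p_x = 1$ can itself be checked concretely by acting on polynomials. A self-contained sketch (the operator names `D` and `X` are mine), representing a polynomial by its coefficient list:

```python
def D(coeffs):
    """Derivative operator on a polynomial [c0, c1, c2, ...]."""
    return [i * c for i, c in enumerate(coeffs)][1:] or [0]

def X(coeffs):
    """Multiplication-by-x operator: shifts coefficients up one degree."""
    return [0] + list(coeffs)

f = [3, 1, 4, 1, 5]                  # an arbitrary polynomial
lhs = D(X(f))                        # (x f)'
rhs = X(D(f))                        # x f'
# (x f)' - x f' should be f itself, i.e. DX - XD = I
commutator = [p - q for p, q in zip(lhs, rhs + [0] * len(lhs))]
```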

If we are keeping track of vector space duality, the left side of an interior product $\alpha \cdot \beta$ should transform like a dual multivector. (It certainly seems like it should, because the left side of an inner product $\langle \alpha, \beta \rangle$ does.) More on that later.

The discussion about projection above seems to me to strongly suggest that we define $\b{a}^{-1} = \frac{\iota_{\b{a}}}{\vert \b{a} \vert^2}$ as a sort of ‘multiplicative inverse’ of $\b{a}$. It’s not a complete inverse, because $\b{a} \^ (\b{a}^{-1} \beta) = \beta_{\b{a}}$, not $\beta$. Instead of being invertible, dividing and then multiplying pulls out the projection onto $\b{a}$. There is a certain elegance to it.

In fact there is an argument to be made that interior products $\iota_{\alpha}$, and dual vectors in general, should be considered as negative-grade multivectors, so $\iota_{\alpha} \in \^^{- \| \alpha \|} V$. Then we could write that $\alpha \cdot \beta \in \^^{\| \beta \| - \| \alpha \|} V$ even if $\alpha$ has the higher grade. This is also compelling because it explains why dual vectors transform according to the inverse of a transformation: if $\alpha \ra A^{\^k}(\alpha)$, of course $\iota_{\alpha} \ra A^{-\^ k} (\iota_{\alpha})$. Something to think about! I hope to look into it in a later article.

## 2. More identities

We can use $\iota$ to prove a few more vector identities.

Here’s the vector triple product:

$$\b{a} \times (\b{b} \times \b{c}) = (\b{a} \cdot \b{c}) \, \b{b} - (\b{a} \cdot \b{b}) \, \b{c} = - \, \b{a} \cdot (\b{b} \^ \b{c})$$

The Jacobi Identity:

$$\b{a} \times (\b{b} \times \b{c}) + \b{b} \times (\b{c} \times \b{a}) + \b{c} \times (\b{a} \times \b{b}) = 0$$

The Jacobi Identity can also be rearranged into the following intriguing form, which we will have to figure out someday (it has some relationship to Lie algebras):

$$\begin{aligned} \b{a} \times (\b{b} \times \b{c}) &= (\b{a} \times \b{b}) \times \b{c} + \b{b} \times (\b{a} \times \b{c}) \\ - \, \b{a} \cdot (\b{b} \^ \b{c}) &= (\b{a} \cdot \b{c}) \, \b{b} - (\b{a} \cdot \b{b}) \, \b{c} \end{aligned}$$

The second line is our previous expansion of $\b{a} \cdot (\b{b \^ c})$. How could these be equal?

One case where the interior product is already being used in mathematics is when multiplying by an antisymmetric matrix. A bivector $\b{b \^ c}$ can be represented as a tensor product $\b{b \o c - c \o b}$, which can be treated as an antisymmetric matrix. The interior product $\b{a} \cdot (\b{b \^ c})$ is then equivalent to matrix multiplication:

$$\b{a} \cdot (\b{b} \^ \b{c}) = (\b{b} \o \b{c} - \b{c} \o \b{b})^T \, \b{a} = (\b{a} \cdot \b{b}) \, \b{c} - (\b{a} \cdot \b{c}) \, \b{b}$$

For instance this is one way of writing a rotation operator which rotates vectors by $\frac{\pi}{2}$ in the $\b{bc}$ plane (if $\b{b}, \b{c}$ are orthogonal unit vectors):

$$R_{\b{bc}}(\b{v}) = \b{v} \cdot (\b{b} \^ \b{c})$$
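A small numpy check of this, assuming the same antisymmetric-matrix representation of a bivector and contraction into its first slot: for orthonormal $\b{b}, \b{c}$, the map $\b{v} \mapsto \b{v} \cdot (\b{b} \^ \b{c})$ should send $\b{b} \to \b{c}$ and $\b{c} \to -\b{b}$.

```python
import numpy as np

b = np.array([1.0, 0.0, 0.0])   # orthonormal vectors spanning the bc-plane
c = np.array([0.0, 1.0, 0.0])

# Matrix of v -> v . (b ^ c): transpose of the bivector matrix b⊗c - c⊗b
R = (np.outer(b, c) - np.outer(c, b)).T

rotated_b = R @ b               # should be  c
rotated_c = R @ c               # should be -b
```

Note that this operator sends vectors orthogonal to the plane to zero, so it acts as a quarter-turn on the $\b{bc}$-plane itself rather than on all of space.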

The Hodge Star can be written as an interior product with the pseudoscalar. In $\bb{R}^3$:

$$\star \alpha = \alpha \cdot (\b{x} \^ \b{y} \^ \b{z})$$
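Under the conventions used here (unnormalized antisymmetrization, contraction into the leading tensor slots), this can be checked numerically: $\star \b{x} = \b{x} \cdot (\b{x} \^ \b{y} \^ \b{z})$ should come out to $\b{y} \^ \b{z}$. A numpy sketch, with a trivector represented as a fully antisymmetric order-3 tensor (helper names `perm_sign` and `alt3` are mine):

```python
import numpy as np
from itertools import permutations

def perm_sign(p):
    """Sign of a permutation, via its inversion count."""
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def alt3(u, v, w):
    """u ^ v ^ w as a fully antisymmetrized order-3 tensor (no 1/3! factor)."""
    vecs = (u, v, w)
    T = np.zeros((3, 3, 3))
    for p in permutations(range(3)):
        T += perm_sign(p) * np.einsum('i,j,k->ijk', vecs[p[0]], vecs[p[1]], vecs[p[2]])
    return T

x, y, z = np.eye(3)
pseudoscalar = alt3(x, y, z)

# star(x) = x . (x ^ y ^ z): contract x into the first slot, leaving a bivector
star_x = np.einsum('i,ijk->jk', x, pseudoscalar)
```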

This is probably the better definition. One reason is that it suggests that $\star$ is not so special, and, for instance, we might allow ourselves to take a $\star$ in a subspace. For instance, while working in $\bb{R}^3$, here is another way to write the rotation operator $R_{\b{xy}}$:

$$R_{\b{xy}}(\b{v}) = \b{v} \cdot (\b{x} \^ \b{y})$$


[^1]: Recall that we basically elect to antisymmetrize one side because, if we did both, we would need an extra factor of $1/n!$ for the same result. It might be that there are abstractions of this where you do need to do both sides (for instance if $a \cdot b \neq b \cdot a$?).

[^2]: It is probably possible to generalize to either side having the lower grade, but it’s not normally done that way. I want to investigate it sometime.

[^3]: The other candidate would be $\o$, but we’d like the result to also be a multivector, so it makes sense to only consider $\^$.

[^4]: I think there’s a way to make it work. It looks something like: for each basis multivector of lower grade, remove it from both sides, like $(\b{x} \cdot \alpha) \cdot (\b{x} \cdot \beta)$. But that’s complicated and will have to be saved for the future.