This will be the last post on symmetric polynomials, at least for now. (They’ll continue to come up when I get to representation theory and my true love, algebraic geometry, but only as part of other theories.)

I want to discuss the Hall inner product on the symmetric function ring $\Lambda$ and its interaction with the $\omega$-involution. As a side benefit, we’ll get the “dual” Jacobi-Trudi and Pieri rules, with $e$ and $h$ swapped.

**The Inner Product on $\Lambda$**

Let’s define a bilinear form on $\Lambda$ by setting the homogeneous symmetric polynomials $h_\lambda$ to be a dual basis to the monomial symmetric polynomials $m_\mu$.

In other words, we define $\langle -, - \rangle : \Lambda \times \Lambda \to \mathbb{Z}$ by setting

$$\langle h_\lambda, m_\mu \rangle = \delta_{\lambda \mu}.$$

By extension, $\langle h_\lambda, f \rangle$ is the coefficient of $m_\lambda$ in $f$. On the other hand, $\langle f, m_\mu \rangle$ is the coefficient of $h_\mu$ in $f$. This is not obviously “nice” in any way, at least a priori, but is a convenient way to talk about certain change-of-basis coefficients.

Still, harkening back to my third post on symmetric polynomials, we already know a couple of values of the pairing:

- $\langle h_\lambda, e_\mu \rangle = A_{\lambda \mu}$, the number of 0-1 matrices with row sums $\lambda$ and column sums $\mu$.
- $\langle h_\lambda, s_\mu \rangle = K_{\mu \lambda}$, the Kostka number (the number of SSYTs of shape $\mu$ and content $\lambda$).
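These pairings are easy to sanity-check by brute force. Here’s a small Python sketch (the function names are mine, not from any library): it expands $e_\mu$ as an honest polynomial and compares the coefficient of $x^\lambda$ with the 0-1 matrix count. Note that the count is the same whether $\lambda$ indexes the rows or the columns, since transposing a 0-1 matrix swaps the two.

```python
from itertools import combinations, product

def poly_mul(p, q):
    """Multiply two polynomials stored as {exponent tuple: coefficient}."""
    r = {}
    for a, ca in p.items():
        for b, cb in q.items():
            key = tuple(x + y for x, y in zip(a, b))
            r[key] = r.get(key, 0) + ca * cb
    return r

def e_mu(mu, n):
    """The product e_mu = e_{mu_1} e_{mu_2} ... in n variables."""
    poly = {(0,) * n: 1}
    for k in mu:
        e_k = {tuple(1 if i in c else 0 for i in range(n)): 1
               for c in combinations(range(n), k)}
        poly = poly_mul(poly, e_k)
    return poly

def zero_one_matrices(rows, cols):
    """Count 0-1 matrices with the given row sums and column sums."""
    count = 0
    for choice in product(*[list(combinations(range(len(cols)), r)) for r in rows]):
        sums = [0] * len(cols)
        for row in choice:
            for j in row:
                sums[j] += 1
        if sums == list(cols):
            count += 1
    return count

# <h_lam, e_mu> is the coefficient of m_lam (i.e. of x^lam) in e_mu,
# which matches the 0-1 matrix count in either order:
coeff = e_mu((2, 1), 3).get((2, 1, 0), 0)
assert coeff == zero_one_matrices((2, 1), (2, 1)) == 1

coeff = e_mu((2, 1), 3).get((1, 1, 1), 0)
assert coeff == zero_one_matrices((1, 1, 1), (2, 1)) == zero_one_matrices((2, 1), (1, 1, 1)) == 3
```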

By similar reasoning to the matrix-style argument I used for expanding the $h$’s in the $m$-basis, we can find:

- $\langle h_\mu, p_\lambda \rangle$, the number of nonnegative integer matrices with $i$-th row containing one entry $\lambda_i$ and all other entries zero; and with column sums $\mu$.
- $\langle h_\lambda, h_\mu \rangle$, the number of nonnegative integer matrices with row sums $\lambda$ and column sums $\mu$.
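The matrix counts above invite a computer check. Here is a minimal brute-force sketch (function names are mine) that tabulates the nonnegative-matrix counts computing $\langle h_\lambda, h_\mu \rangle$ and confirms they don’t care about the order of $\lambda$ and $\mu$:

```python
from itertools import product

def compositions(total, k):
    """All k-tuples of nonnegative integers summing to total."""
    if k == 1:
        yield (total,)
        return
    for first in range(total + 1):
        for rest in compositions(total - first, k - 1):
            yield (first,) + rest

def nn_matrices(lam, mu):
    """Count nonnegative-integer matrices with row sums lam, column sums mu."""
    count = 0
    for rows in product(*[list(compositions(r, len(mu))) for r in lam]):
        if [sum(col) for col in zip(*rows)] == list(mu):
            count += 1
    return count

def partitions(n, largest=None):
    """All partitions of n, as weakly decreasing tuples."""
    largest = n if largest is None else largest
    if n == 0:
        yield ()
        return
    for k in range(min(n, largest), 0, -1):
        for rest in partitions(n - k, k):
            yield (k,) + rest

# <h_lam, h_mu> counts these matrices -- and the count is symmetric,
# since transposing a matrix swaps its row and column sums:
for lam in partitions(4):
    for mu in partitions(4):
        assert nn_matrices(lam, mu) == nn_matrices(mu, lam)
```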

Oops: something surprising just happened. These last numbers are symmetric in $\lambda$ and $\mu$, even though we didn’t define our inner product to be symmetric! But then symmetry must hold for the $h$’s, and therefore (since they’re a basis) for everything else as well:

**Proposition**. The bilinear form $\langle -, - \rangle$ is symmetric.

But more is true:

**Proposition**. The Schur polynomials form an orthonormal basis for $\Lambda$: $\langle s_\lambda, s_\mu \rangle = \delta_{\lambda \mu}$.

*Proof*. Let’s consider an inner product $\langle s_\lambda, s_\mu \rangle$. By definition, to compute this we should expand one term in the $h$ basis and the other in the $m$ basis:

$$\langle s_\lambda, s_\mu \rangle = \Big\langle \sum_\nu B_{\lambda \nu} h_\nu, \ \sum_\rho K_{\mu \rho} m_\rho \Big\rangle = \sum_{\nu, \rho} B_{\lambda \nu} K_{\mu \rho} \langle h_\nu, m_\rho \rangle.$$

These last inner products vanish by definition unless $\nu = \rho$, so we’re left with

$$\langle s_\lambda, s_\mu \rangle = \sum_\nu B_{\lambda \nu} K_{\mu \nu},$$

which we wish to show equals $\delta_{\lambda \mu}$. Here, we’re just defining $B = (B_{\lambda \nu})$ to be the (hitherto unknown) matrix for converting Schur polynomials to the $h$ basis, and $K = (K_{\mu \nu})$ is the matrix of Kostka numbers, for changing the Schurs to the monomial basis. In other words, we want $B$ and $K^T$ to be inverse matrices.

Equivalently, we could show that the inverse matrix – the one converting the $h$ basis to the Schur basis – is given by the (transposed) Kostka numbers:

$$h_\mu = \sum_\lambda K_{\lambda \mu} \, s_\lambda.$$

In fact, we’ve already shown this (I hinted at it in the second post): this equality is none other than the RSK correspondence. More specifically, the RSK correspondence is the above equality when written in the $m$ basis. To see this, let’s convert both sides to monomials:

$$\sum_\nu N_{\mu \nu} \, m_\nu = \sum_\lambda K_{\lambda \mu} \sum_\nu K_{\lambda \nu} \, m_\nu.$$

The coefficient of $m_\nu$ on the left, $N_{\mu \nu}$, is the number of nonnegative integer matrices with row sums $\mu$ and column sums $\nu$; the coefficient on the right is the number of pairs of SSYTs of the same shape ($\lambda$, summing over all choices), the first tableau having weight $\mu$ and the second having weight $\nu$. The fact that these agree is exactly what we proved in the RSK correspondence.
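Since the proof leans on RSK, it’s reassuring to verify this equality of coefficients numerically in small degrees. A brute-force sketch (Kostka numbers computed by enumerating SSYTs directly; all names mine):

```python
from itertools import product

def compositions(total, k):
    if k == 1:
        yield (total,)
        return
    for first in range(total + 1):
        for rest in compositions(total - first, k - 1):
            yield (first,) + rest

def nn_matrices(mu, nu):
    """Nonnegative-integer matrices with row sums mu and column sums nu."""
    count = 0
    for rows in product(*[list(compositions(r, len(nu))) for r in mu]):
        if [sum(c) for c in zip(*rows)] == list(nu):
            count += 1
    return count

def kostka(lam, mu):
    """Number of SSYT of shape lam and content mu, by brute force."""
    cells = [(i, j) for i, row in enumerate(lam) for j in range(row)]
    def fill(idx, tableau, left):
        if idx == len(cells):
            return 1
        i, j = cells[idx]
        total = 0
        for v in range(len(mu)):
            if left[v] == 0:
                continue
            if j > 0 and tableau[(i, j - 1)] > v:   # rows weakly increase
                continue
            if i > 0 and tableau[(i - 1, j)] >= v:  # columns strictly increase
                continue
            tableau[(i, j)] = v
            left[v] -= 1
            total += fill(idx + 1, tableau, left)
            left[v] += 1
            del tableau[(i, j)]
        return total
    return fill(0, {}, list(mu))

def partitions(n, largest=None):
    largest = n if largest is None else largest
    if n == 0:
        yield ()
        return
    for k in range(min(n, largest), 0, -1):
        for rest in partitions(n - k, k):
            yield (k,) + rest

# RSK: matrices with row sums mu, column sums nu <-> pairs of SSYT
# of a common shape lam, with weights mu and nu:
for mu in partitions(4):
    for nu in partitions(4):
        pairs = sum(kostka(lam, mu) * kostka(lam, nu) for lam in partitions(4))
        assert nn_matrices(mu, nu) == pairs
```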

**Involving the Involution**

Hilarious, I know. Let’s take a look at the effect of the involution $\omega$ on the inner product. We previously established that $\omega$ swaps the $e$ and the $h$ basis. Since we determined above that our inner product is symmetric, we get for free,

**Proposition**. The involution $\omega$ is an isometry: $\langle \omega f, \omega g \rangle = \langle f, g \rangle$ for all $f, g \in \Lambda$.

*Proof*. We can check this on the bases: since the $e$’s and the $h$’s are each a basis, it suffices to check the pairs $\langle e_\lambda, h_\mu \rangle$. We know from above that $\langle e_\lambda, h_\mu \rangle = \langle h_\mu, e_\lambda \rangle = A_{\mu \lambda}$. On the other hand, $\langle \omega e_\lambda, \omega h_\mu \rangle = \langle h_\lambda, e_\mu \rangle = A_{\lambda \mu}$. It’s clear from the definition above that $A_{\mu \lambda} = A_{\lambda \mu}$: transposing a 0-1 matrix swaps its row and column sums.

Next, let’s see what $\omega$ does to the Schur polynomials, bearing in mind that the result will again be an orthonormal basis. Since we just worked hard establishing the $h$-to-Schur conversion (with the Kostka numbers), let’s use that. We have

$$h_\mu = \sum_\lambda K_{\lambda \mu} \, s_\lambda.$$

Applying $\omega$, we get

$$e_\mu = \sum_\lambda K_{\lambda \mu} \, \omega(s_\lambda).$$

But last post’s Pieri rule already told us how to express the $e$’s in the Schur basis: we had

$$e_\mu = \sum_\lambda K_{\lambda' \mu} \, s_\lambda.$$

Since the Kostka matrix is invertible, these equations must match up term-by-term. The indices are permuted (by conjugation) on the Schur polynomial side; if we reindex the second sum, swapping $\lambda$ and $\lambda'$, we get

$$e_\mu = \sum_\lambda K_{\lambda \mu} \, s_{\lambda'}.$$

Now, by inverting the system, we have shown:

**Proposition**. The involution sends $s_\lambda \mapsto s_{\lambda'}$.
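We can corroborate this computationally: in finitely many variables, the derivation above says $e_\mu = \sum_\lambda K_{\lambda \mu} s_{\lambda'}$, and both sides are concrete polynomials we can expand by brute force. A sketch (all names mine):

```python
from itertools import combinations

def poly_mul(p, q):
    """Multiply polynomials stored as {exponent tuple: coefficient}."""
    r = {}
    for a, ca in p.items():
        for b, cb in q.items():
            key = tuple(x + y for x, y in zip(a, b))
            r[key] = r.get(key, 0) + ca * cb
    return r

def e_mu(mu, n):
    """e_mu = e_{mu_1} e_{mu_2} ... in n variables."""
    poly = {(0,) * n: 1}
    for k in mu:
        e_k = {tuple(1 if i in c else 0 for i in range(n)): 1
               for c in combinations(range(n), k)}
        poly = poly_mul(poly, e_k)
    return poly

def schur_poly(lam, n):
    """s_lam in n variables, summing x^content over SSYT of shape lam."""
    cells = [(i, j) for i, row in enumerate(lam) for j in range(row)]
    poly = {}
    def fill(idx, tableau):
        if idx == len(cells):
            expo = [0] * n
            for v in tableau.values():
                expo[v] += 1
            poly[tuple(expo)] = poly.get(tuple(expo), 0) + 1
            return
        i, j = cells[idx]
        for v in range(n):
            if j > 0 and tableau[(i, j - 1)] > v:   # rows weakly increase
                continue
            if i > 0 and tableau[(i - 1, j)] >= v:  # columns strictly increase
                continue
            tableau[(i, j)] = v
            fill(idx + 1, tableau)
            del tableau[(i, j)]
    fill(0, {})
    return poly

def kostka(lam, mu):
    """K_{lam, mu}: read the coefficient of x^mu off of s_lam."""
    n = max(len(mu), len(lam))
    padded = tuple(mu) + (0,) * (n - len(mu))
    return schur_poly(lam, n).get(padded, 0)

def conjugate(lam):
    return tuple(sum(1 for part in lam if part > j) for j in range(lam[0]))

def partitions(n, largest=None):
    largest = n if largest is None else largest
    if n == 0:
        yield ()
        return
    for k in range(min(n, largest), 0, -1):
        for rest in partitions(n - k, k):
            yield (k,) + rest

# Check e_mu = sum_lam K_{lam, mu} s_{lam'} in 3 variables:
n = 3
for mu in partitions(3):
    rhs = {}
    for lam in partitions(3):
        c = kostka(lam, mu)
        for expo, coeff in schur_poly(conjugate(lam), n).items():
            rhs[expo] = rhs.get(expo, 0) + c * coeff
    rhs = {a: c for a, c in rhs.items() if c}
    assert e_mu(mu, n) == rhs
```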

In particular, we can now translate the Jacobi-Trudi and Pieri rules from last post to their ‘dual’ forms:

**Theorem (Dual Jacobi-Trudi)**. The Schur polynomial satisfies

$$s_\lambda = \det \big( h_{\lambda_i - i + j} \big)_{1 \leq i, j \leq \ell(\lambda)},$$

with the conventions $h_0 = 1$ and $h_k = 0$ for $k < 0$.
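As a sanity check, we can evaluate both sides at a concrete point: the left side from the SSYT definition of a Schur polynomial, the right side as an actual determinant. A brute-force sketch (all names mine):

```python
from itertools import combinations_with_replacement

def h_val(k, xs):
    """Complete homogeneous polynomial h_k evaluated at the point xs."""
    total = 0
    for c in combinations_with_replacement(xs, k):
        p = 1
        for x in c:
            p *= x
        total += p
    return total

def det(m):
    """Determinant by Laplace expansion (fine for small matrices)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def schur_val(lam, xs):
    """Schur polynomial s_lam at xs, summed over SSYT of shape lam."""
    cells = [(i, j) for i, row in enumerate(lam) for j in range(row)]
    def fill(idx, tableau):
        if idx == len(cells):
            p = 1
            for v in tableau.values():
                p *= xs[v]
            return p
        i, j = cells[idx]
        total = 0
        for v in range(len(xs)):
            if j > 0 and tableau[(i, j - 1)] > v:   # rows weakly increase
                continue
            if i > 0 and tableau[(i - 1, j)] >= v:  # columns strictly increase
                continue
            tableau[(i, j)] = v
            total += fill(idx + 1, tableau)
            del tableau[(i, j)]
        return total
    return fill(0, {})

def jacobi_trudi_val(lam, xs):
    """det(h_{lam_i - i + j}) evaluated at xs (0-indexed i, j)."""
    n = len(lam)
    m = [[h_val(lam[i] - i + j, xs) if lam[i] - i + j >= 0 else 0
          for j in range(n)] for i in range(n)]
    return det(m)

xs = (1, 2, 3)
for lam in [(2, 1), (3, 1), (2, 2), (3, 2, 1)]:
    assert schur_val(lam, xs) == jacobi_trudi_val(lam, xs)
```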

**Theorem (Dual Pieri)**. The Schur polynomials satisfy

$$s_\lambda \cdot h_k = \sum_\mu s_\mu,$$

where the sum runs over all partitions $\mu \supseteq \lambda$ such that $\mu \setminus \lambda$ consists of $k$ boxes in a (collection of) **horizontal strip**(s). That is, no two of the added boxes should be in the same column. (To visualize this, keep in mind that the partition $(k)$ is, itself, a horizontal strip.)
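Here’s a small check of this rule in three variables, a brute-force sketch with names of my own choosing. (In $n$ variables any $s_\mu$ with more than $n$ rows vanishes, which is harmless for the shapes below.)

```python
def schur_poly(lam, n):
    """s_lam in n variables as {exponent tuple: coefficient}, via SSYT."""
    cells = [(i, j) for i, row in enumerate(lam) for j in range(row)]
    poly = {}
    def fill(idx, tableau):
        if idx == len(cells):
            expo = [0] * n
            for v in tableau.values():
                expo[v] += 1
            poly[tuple(expo)] = poly.get(tuple(expo), 0) + 1
            return
        i, j = cells[idx]
        for v in range(n):
            if j > 0 and tableau[(i, j - 1)] > v:   # rows weakly increase
                continue
            if i > 0 and tableau[(i - 1, j)] >= v:  # columns strictly increase
                continue
            tableau[(i, j)] = v
            fill(idx + 1, tableau)
            del tableau[(i, j)]
    fill(0, {})
    return poly

def poly_mul(p, q):
    r = {}
    for a, ca in p.items():
        for b, cb in q.items():
            key = tuple(x + y for x, y in zip(a, b))
            r[key] = r.get(key, 0) + ca * cb
    return r

def poly_sum(polys):
    r = {}
    for p in polys:
        for a, c in p.items():
            r[a] = r.get(a, 0) + c
    return r

n = 3
# s_(2,1) * h_2, with h_2 = s_(2):
lhs = poly_mul(schur_poly((2, 1), n), schur_poly((2,), n))
# partitions obtained from (2,1) by adding a 2-box horizontal strip
# (note (2,1,1,1) is excluded: its two new boxes share a column):
strips = [(4, 1), (3, 2), (3, 1, 1), (2, 2, 1)]
rhs = poly_sum([schur_poly(mu, n) for mu in strips])
assert lhs == rhs
```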

That’s very nice.

**The Neglected Basis**

Since this is my last post on symmetric functions, I should briefly mention a few last details on the power symmetric polynomials $p_\lambda$.

First, by replaying the $p$-to-$h$ change of basis (the one with the generating functions) with a few extra minus signs, we can get the $p$-to-$e$ change of basis. It’s basically the same, up to a sign – in particular, it looks almost the same under the involution. By inspection, we can therefore find

$$\omega(p_n) = (-1)^{n-1} \, p_n.$$

This also tells us $\omega(p_\lambda) = (-1)^{|\lambda| - \ell(\lambda)} \, p_\lambda$, since $\omega$ is a ring map.
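Assuming the change of basis referred to is the Newton-identity recursion, the “few extra minus signs” are easy to see numerically: evaluate everything at a sample point and compare the $h$- and $e$-versions. A sketch (names mine):

```python
from itertools import combinations, combinations_with_replacement

def prod(it):
    p = 1
    for x in it:
        p *= x
    return p

def e_val(k, xs):
    """Elementary symmetric polynomial e_k at the point xs."""
    return sum(prod(c) for c in combinations(xs, k))

def h_val(k, xs):
    """Complete homogeneous symmetric polynomial h_k at xs."""
    return sum(prod(c) for c in combinations_with_replacement(xs, k))

def p_val(k, xs):
    """Power sum p_k at xs."""
    return sum(x ** k for x in xs)

# Newton's identities: the h- and e-versions differ only by the signs
# (-1)^(i-1), which is where omega(p_n) = (-1)^(n-1) p_n comes from.
xs = (1, 2, 3, 5)
for n in range(1, 5):
    assert n * h_val(n, xs) == sum(p_val(i, xs) * h_val(n - i, xs)
                                   for i in range(1, n + 1))
    assert n * e_val(n, xs) == sum((-1) ** (i - 1) * p_val(i, xs) * e_val(n - i, xs)
                                   for i in range(1, n + 1))
```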

Second, the inner product satisfies

$$\langle p_\lambda, p_\mu \rangle = z_\lambda \, \delta_{\lambda \mu}.$$

In particular, the power symmetric polynomials are self-dual (each is dual to a scalar multiple of itself) and orthogonal to each other, but are not normal. The coefficient is $z_\lambda = \prod_{i \geq 1} i^{m_i} \, m_i!$, where $m_i$ is the number of times $i$ occurs in $\lambda$.