
# Everything you wanted to know about symmetric polynomials, part V

This will be the last post on symmetric polynomials, at least for now. (They’ll continue to come up when I get to representation theory and my true love, algebraic geometry, but only as part of other theories.)

I want to discuss the Hall inner product on the symmetric function ring $\Lambda$ and its interaction with the $\omega$-involution. As a side benefit, we’ll get the “dual” Jacobi-Trudi and Pieri rules, with $e$ and $h$ swapped.

## The Inner Product on $\Lambda$

Let’s define a bilinear form on $\Lambda$ by declaring the complete homogeneous symmetric polynomials to be a dual basis to the monomial symmetric polynomials.

In other words, we define $\langle\ ,\ \rangle$ by setting

$\langle m_\lambda, h_\mu \rangle = \delta_{\lambda,\mu}$.

By bilinearity, $\langle f, h_\mu \rangle$ is the coefficient of $m_\mu$ when $f$ is expanded in the monomial basis. On the other hand, $\langle m_\lambda, f \rangle$ is the coefficient of $h_\lambda$ when $f$ is expanded in the $h$ basis. This is not obviously “nice” in any way, at least a priori, but it is a convenient way to talk about certain change-of-basis coefficients.
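For a quick worked example in degree 2: expanding in the monomial basis gives $h_{(2)} = m_{(2)} + m_{(1,1)}$ and $h_{(1,1)} = h_1^2 = m_{(2)} + 2m_{(1,1)}$, so

$\displaystyle{ \langle h_{(1,1)}, h_{(2)} \rangle = 1, \qquad \langle h_{(1,1)}, h_{(1,1)} \rangle = 2. }$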

Still, harkening back to my third post on symmetric polynomials, we already know a couple of values of the pairing:

• $\langle e_\lambda, h_\mu \rangle = A_{\lambda \mu}$, the number of 0-1 matrices with row sums $\lambda$ and column sums $\mu$.
• $\langle s_\lambda, h_\mu \rangle = K_{\lambda \mu}$, the Kostka number (the number of SSYTs of shape $\lambda$ and content $\mu$).

By reasoning similar to the matrix-style argument I used for expanding the $e$’s in the $m$-basis, we can find:

• $\langle p_\lambda, h_\mu \rangle = C_{\lambda \mu}$, the number of nonnegative integer matrices whose $i$-th row has a single nonzero entry, equal to $\lambda_i$, and whose column sums are $\mu$; and
• $\langle h_\lambda, h_\mu \rangle = B_{\lambda \mu}$, the number of nonnegative integer matrices with row sums $\lambda$ and column sums $\mu$.
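All of these pairings reduce to coefficient extraction in the $m$-basis, so they are easy to sanity-check by brute force. Here is a small Python sketch (the helper names are my own, not from any library): it expands $e_\lambda$ and $h_\lambda$ in $N$ variables as dictionaries of exponent vectors, reads off $\langle -, h_\mu \rangle$ as the coefficient of $x_1^{\mu_1} x_2^{\mu_2} \cdots$, and compares against a direct count of matrices.

```python
from itertools import combinations, product
from collections import Counter

N = 4  # number of variables; enough for pairings in degree <= 4

def poly_mul(p, q):
    """Multiply polynomials stored as {exponent tuple: coefficient}."""
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(x + y for x, y in zip(a, b))] += ca * cb
    return out

def e_poly(k):
    """Elementary symmetric polynomial e_k in N variables."""
    out = Counter()
    for idxs in combinations(range(N), k):
        out[tuple(1 if i in idxs else 0 for i in range(N))] = 1
    return out

def h_poly(k):
    """Complete homogeneous symmetric polynomial h_k in N variables."""
    out = Counter()
    for expo in product(range(k + 1), repeat=N):
        if sum(expo) == k:
            out[expo] = 1
    return out

def basis_product(f, lam):
    """f_lambda = f_{lambda_1} f_{lambda_2} ... for f in {e_poly, h_poly}."""
    out = Counter({(0,) * N: 1})
    for part in lam:
        out = poly_mul(out, f(part))
    return out

def pair_with_h(p, mu):
    """<f, h_mu> = coefficient of m_mu in f = coefficient of x_1^mu_1 x_2^mu_2 ..."""
    return p[tuple(mu) + (0,) * (N - len(mu))]

def count_matrices(lam, mu, max_entry):
    """Matrices with row sums lam, column sums mu, entries in {0, ..., max_entry}."""
    nrows, ncols = len(lam), len(mu)
    total = 0
    for flat in product(range(max_entry + 1), repeat=nrows * ncols):
        rows = [flat[i * ncols:(i + 1) * ncols] for i in range(nrows)]
        if all(sum(r) == s for r, s in zip(rows, lam)) and \
           all(sum(c) == s for c, s in zip(zip(*rows), mu)):
            total += 1
    return total

lam, mu = (2, 1), (1, 1, 1)
# <e_lam, h_mu> counts 0-1 matrices; <h_lam, h_mu> counts nonnegative integer matrices
assert pair_with_h(basis_product(e_poly, lam), mu) == count_matrices(lam, mu, 1) == 3
assert pair_with_h(basis_product(h_poly, lam), mu) == count_matrices(lam, mu, max(lam)) == 3
```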

Oops: something surprising just happened. These last numbers are symmetric in $\lambda, \mu$, even though we didn’t define our inner product to be symmetric! But symmetry holds on the $h$’s (transposing a matrix swaps its row and column sums), and since the $h$’s are a basis, bilinearity propagates it to everything else:

Proposition. The bilinear form $\langle\ ,\ \rangle$ is symmetric.

But more is true:

Proposition. The Schur polynomials form an orthonormal basis for $\Lambda$.

Proof. Let’s consider an inner product $\langle s_\lambda, s_\mu \rangle$. By definition, to compute this we should expand one term in the $h$ basis and the other in the $m$ basis:

$\displaystyle{ \langle s_\lambda, s_\mu \rangle = \langle \sum_\alpha L_{\lambda \alpha} h_\alpha, \sum_\beta K_{\mu \beta} m_\beta \rangle = \sum_{\alpha,\beta} L_{\lambda \alpha} K_{\mu \beta} \langle h_\alpha, m_\beta \rangle. }$

These last inner products vanish by definition unless $\alpha = \beta$, so we’re left with

$\displaystyle{ \langle s_\lambda, s_\mu \rangle = \sum_\alpha L_{\lambda \alpha} K_{\mu \alpha} = (L \cdot K^T)_{\lambda \mu}, }$

which we wish to show equals $\delta_{\lambda \mu}$. Here, we’re just defining $L$ to be the (hitherto unknown) matrix for converting Schur polynomials to the $h$ basis, and $K$ is the matrix of Kostka numbers, for changing the Schurs to the monomial basis. In other words, we want $L$ and $K^T$ to be inverse matrices.

Equivalently, we could show that the inverse of $L$, the matrix converting the $h$ basis to the Schur basis, is given by the (transposed) Kostka numbers:

$\displaystyle{ h_\lambda = \sum_\mu K_{\mu \lambda} s_\mu.}$

In fact, we’ve already shown this (I hinted at it in the second post): this equality is none other than the RSK correspondence. More specifically, the RSK correspondence is the above equality written out in the $m$ basis. To see this, let’s convert both sides to monomials:

$\displaystyle{ \sum_\nu B_{\lambda \nu} m_{\nu} = \sum_{\mu,\nu} K_{\mu \lambda} K_{\mu \nu}m_\nu.}$

The coefficient of $m_\nu$ on the left is the number of nonnegative integer matrices with row sums $\lambda$ and column sums $\nu$; the coefficient on the right is the number of pairs of SSYTs of the same shape ($\mu$, summing over all choices), the first tableau having content $\lambda$ and the second having content $\nu$. The fact that these counts agree is exactly what we proved via the RSK correspondence.
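This counting identity is also easy to test on small cases. Below is a Python sketch (all helper names are my own): the Kostka number $K_{\mu\nu}$ is computed by building SSYTs as chains of horizontal strips, adding all the 1s, then all the 2s, and so on, while the left side is a brute-force matrix enumeration.

```python
from itertools import product
from collections import Counter

def add_horizontal_strip(shape, k):
    """All partitions obtained from `shape` by adding k boxes, no two in a column."""
    n = len(shape) + 1            # a horizontal strip adds at most one new row
    padded = list(shape) + [0] * (n - len(shape))
    out = []
    def rec(i, remaining, cur):
        if i == n:
            if remaining == 0:
                out.append(tuple(x for x in cur if x))
            return
        lo = padded[i]
        hi = padded[i - 1] if i > 0 else lo + remaining  # interlacing condition
        for new in range(lo, min(hi, lo + remaining) + 1):
            rec(i + 1, remaining - (new - lo), cur + [new])
    rec(0, k, [])
    return out

def kostka(mu, nu):
    """Number of SSYT of shape mu and content nu (fill in the 1s, then the 2s, ...)."""
    shapes = Counter({(): 1})
    for part in nu:
        nxt = Counter()
        for s, c in shapes.items():
            for t in add_horizontal_strip(s, part):
                nxt[t] += c
        shapes = nxt
    return shapes[tuple(mu)]

def partitions(n, max_part=None):
    """All partitions of n, largest part first."""
    if n == 0:
        yield ()
        return
    if max_part is None:
        max_part = n
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def count_matrices(lam, nu):
    """Nonnegative integer matrices with row sums lam and column sums nu."""
    nrows, ncols = len(lam), len(nu)
    total = 0
    for flat in product(range(max(nu) + 1), repeat=nrows * ncols):
        rows = [flat[i * ncols:(i + 1) * ncols] for i in range(nrows)]
        if all(sum(r) == s for r, s in zip(rows, lam)) and \
           all(sum(c) == s for c, s in zip(zip(*rows), nu)):
            total += 1
    return total

lam, nu = (2, 1), (2, 1)
lhs = count_matrices(lam, nu)                                       # B_{lam, nu}
rhs = sum(kostka(mu, lam) * kostka(mu, nu) for mu in partitions(3))
assert lhs == rhs == 2
```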

## Involving the Involution

Hilarious, I know. Let’s take a look at the effect of the $\omega$ involution on the inner product. We previously established that $\omega$ swaps the $e$ and the $h$ basis. Since we determined above that our inner product is symmetric, we get for free,

Proposition. The $\omega$ involution is an isometry: $\langle f,g \rangle = \langle \omega(f), \omega(g) \rangle$.

Proof. We can check this on the $e, h$ bases. By symmetry of the form, $\langle h_\lambda, e_\mu \rangle = \langle e_\mu, h_\lambda \rangle = A_{\mu \lambda}$. On the other hand, $\langle \omega(h_\lambda), \omega(e_\mu) \rangle = \langle e_\lambda, h_\mu \rangle = A_{\lambda \mu}$. And it’s clear from the definition above that $A_{\lambda \mu} = A_{\mu \lambda}$: transposing a 0-1 matrix swaps its row and column sums.

Next, let’s see what $\omega$ does to the Schur polynomials, bearing in mind that the result will again be an orthonormal basis. Since we just worked hard establishing the $h \to s$ conversion (with the Kostka numbers), let’s use that. We have

$\displaystyle{ h_\lambda = \sum_\nu K_{\nu \lambda} s_{\nu}.}$

Applying $\omega$, we get

$\displaystyle{ e_\lambda = \sum_\nu K_{\nu \lambda} \omega(s_{\nu}).}$

But last post’s Pieri rule already told us how to express the $e$’s in the Schur basis: we had

$\displaystyle{ e_\lambda = \sum_\nu K_{\nu^T \lambda} s_{\nu}.}$

These are two expansions of the same element $e_\lambda$. Reindexing the second sum by replacing $\nu$ with $\nu^T$ (conjugation is an involution on partitions, so this merely permutes the terms), we get

$\displaystyle{ e_\lambda = \sum_\nu K_{\nu \lambda} s_{\nu^T}.}$

Now, since the Kostka matrix $(K_{\nu \lambda})$ is invertible, comparing coefficients in the two expansions shows:

Proposition. The $\omega$ involution sends $s_\lambda \mapsto s_{\lambda^T}$.

In particular, we can now translate the Jacobi-Trudi and Pieri rules from last post to their ‘dual’ forms:

Theorem (Dual Jacobi-Trudi). The Schur polynomial satisfies

$\displaystyle{ s_{\lambda^T} = \det\left( e_{j-i+\lambda_{n+1-j}} \right)_{1 \le i,j \le n}.}$
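Here’s a quick numeric sanity check of this for $\lambda = (2,1)$, using the indexing $e_{j-i+\lambda_{n+1-j}}$ (in the more common convention the same statement reads $s_{\lambda^T} = \det(e_{\lambda_i - i + j})$). This is my own throwaway code: it evaluates both sides at sample points, computing the Schur polynomial from the bialternant formula $s_\mu = \det(x_i^{\mu_j + n - j})/\det(x_i^{n-j})$ in three variables.

```python
from itertools import combinations
from math import prod
from fractions import Fraction

xs = [2, 3, 5]   # sample values for x_1, x_2, x_3 (any distinct integers work)
lam = (2, 1)     # lambda = (2,1); note lambda^T = (2,1) as well
n = len(lam)

def e(k):
    """e_k evaluated at xs (0 outside the range 0 <= k <= #variables)."""
    if k < 0 or k > len(xs):
        return 0
    return sum(prod(c) for c in combinations(xs, k))

# right-hand side: det( e_{j - i + lam_{n+1-j}} ) for 1 <= i, j <= n (here n = 2)
a = [[e(j - i + lam[n - j]) for j in range(1, n + 1)] for i in range(1, n + 1)]
jt = a[0][0] * a[1][1] - a[0][1] * a[1][0]

# left-hand side: s_{(2,1)} at xs, via the bialternant formula in 3 variables
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

mu = (2, 1, 0)   # lambda^T, padded to 3 parts
numer = det3([[x ** (mu[j] + 2 - j) for j in range(3)] for x in xs])
denom = det3([[x ** (2 - j) for j in range(3)] for x in xs])
schur = Fraction(numer, denom)

assert schur == jt == 280   # both sides agree at the sample point
```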

Theorem (Dual Pieri). The Schur polynomials satisfy

$\displaystyle{ s_k \cdot s_\lambda = \sum_\mu s_\mu,}$

where the sum runs over all partitions $\mu \supset \lambda$ such that $\mu/\lambda$ consists of $k$ boxes, no two in the same column – that is, $\mu/\lambda$ is a horizontal strip (possibly disconnected). To visualize this, keep in mind that the partition $(k)$ is, itself, a horizontal strip.
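For a small worked example, take $k = 2$ and $\lambda = (1)$. The candidate partitions $\mu$ are $(3)$, $(2,1)$, and $(1,1,1)$; the last is excluded, since both added boxes would sit in the first column. Hence

$\displaystyle{ s_2 \cdot s_{(1)} = s_{(3)} + s_{(2,1)}. }$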

That’s very nice.

## The Neglected $p$ Basis

Since this is my last post on symmetric functions, I should briefly mention a few last details on the power symmetric polynomials.

First, by replaying the $p$-to-$e$ change of basis (the one with the generating functions) with a few extra minus signs, we get the $p$-to-$h$ change of basis. The two computations are identical up to signs – in particular, applying $\omega$ to one identity yields the other, up to a sign. Comparing them, we find

$\displaystyle{ \omega(p_k) = (-1)^{k-1} p_k.}$
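We can double-check this sign with Newton’s identity, $p_k = (-1)^{k-1} k\, e_k + \sum_{i=1}^{k-1} (-1)^{i-1} e_i\, p_{k-i}$: since $\omega$ is a ring map swapping $e_i \leftrightarrow h_i$, running the same recursion with $h$’s in place of $e$’s computes $\omega(p_k)$. A small Python sketch (my own helper names, brute-force polynomial arithmetic in $N = 4$ variables):

```python
from itertools import combinations, product
from collections import Counter

N = 4  # number of variables

def poly_mul(p, q):
    """Multiply polynomials stored as {exponent tuple: coefficient}."""
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(x + y for x, y in zip(a, b))] += ca * cb
    return out

def scale(p, c):
    return Counter({m: c * v for m, v in p.items()})

def add_into(p, q):
    out = Counter(p)
    for m, v in q.items():
        out[m] += v
    return out

def clean(p):
    """Drop zero coefficients so polynomials compare reliably."""
    return {m: v for m, v in p.items() if v}

def e_poly(k):
    out = Counter()
    for idxs in combinations(range(N), k):
        out[tuple(1 if i in idxs else 0 for i in range(N))] = 1
    return out

def h_poly(k):
    out = Counter()
    for expo in product(range(k + 1), repeat=N):
        if sum(expo) == k:
            out[expo] = 1
    return out

def p_poly(k):
    out = Counter()
    for i in range(N):
        expo = [0] * N
        expo[i] = k
        out[tuple(expo)] = 1
    return out

def newton(f, k):
    """q_k = (-1)^{k-1} k f_k + sum_{i=1}^{k-1} (-1)^{i-1} f_i q_{k-i}."""
    q = scale(f(k), (-1) ** (k - 1) * k)
    for i in range(1, k):
        q = add_into(q, scale(poly_mul(f(i), newton(f, k - i)), (-1) ** (i - 1)))
    return q

for k in range(1, 4):
    assert clean(newton(e_poly, k)) == clean(p_poly(k))        # Newton's identity
    # omega(p_k): same recursion with h in place of e; should equal (-1)^{k-1} p_k
    assert clean(newton(h_poly, k)) == clean(scale(p_poly(k), (-1) ** (k - 1)))
```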

Since $\omega$ is a ring map, this also tells us $\omega(p_\lambda) = (-1)^{|\lambda| - \ell(\lambda)} p_\lambda$, where $\ell(\lambda)$ denotes the number of parts of $\lambda$.

Second, the inner product satisfies

$\displaystyle{ \langle p_\lambda, p_\mu \rangle = z(\lambda)\delta_{\lambda,\mu}.}$

In particular, the power symmetric polynomials are pairwise orthogonal (the basis is dual to itself, up to scalars), but not orthonormal. The coefficient is $z(\lambda) = (r_1! \cdots r_n!) \cdot (\lambda_1 \cdots \lambda_k)$, where $r_i$ is the number of times $i$ occurs in $\lambda$; this agrees with the usual formula $z_\lambda = \prod_i i^{r_i} r_i!$.
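As a tiny sanity check of the formula for $z(\lambda)$ (a hypothetical helper, not a library function): for $\lambda = (2,1,1)$ we have $r_1 = 2$ and $r_2 = 1$, so $z(\lambda) = 2! \cdot 1! \cdot (2 \cdot 1 \cdot 1) = 4$.

```python
from collections import Counter
from math import factorial, prod

def z(lam):
    """z(lambda) = (r_1! r_2! ...) * (product of the parts of lambda)."""
    mult = Counter(lam)                      # r_i = multiplicity of the part i
    return prod(factorial(r) for r in mult.values()) * prod(lam)

assert z((2, 1, 1)) == 4    # 2! * 1! * (2*1*1)
assert z((1, 1, 1)) == 6    # 3! * 1
assert z((3,)) == 3         # 1! * 3
```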